[
  {
    "path": ".gitignore",
    "content": "# Byte-compiled / optimized / DLL files\n__pycache__/\n*.py[cod]\n*$py.class\n\n# C extensions\n*.so\n\n# Distribution / packaging\n.Python\nbuild/\ndevelop-eggs/\ndist/\ndownloads/\neggs/\n.eggs/\nlib/\nlib64/\nparts/\nsdist/\nvar/\nwheels/\npip-wheel-metadata/\nshare/python-wheels/\n*.egg-info/\n.installed.cfg\n*.egg\nMANIFEST\n\n# PyInstaller\n#  Usually these files are written by a python script from a template\n#  before PyInstaller builds the exe, so as to inject date/other infos into it.\n*.manifest\n*.spec\n\n# Installer logs\npip-log.txt\npip-delete-this-directory.txt\n\n# Unit test / coverage reports\nhtmlcov/\n.tox/\n.nox/\n.coverage\n.coverage.*\n.cache\nnosetests.xml\ncoverage.xml\n*.cover\n*.py,cover\n.hypothesis/\n.pytest_cache/\n\n# Translations\n*.mo\n*.pot\n\n# Django stuff:\n*.log\nlocal_settings.py\ndb.sqlite3\ndb.sqlite3-journal\n\n# Flask stuff:\ninstance/\n.webassets-cache\n\n# Scrapy stuff:\n.scrapy\n\n# Sphinx documentation\ndocs/_build/\n\n# PyBuilder\ntarget/\n\n# Jupyter Notebook\n.ipynb_checkpoints\n\n# IPython\nprofile_default/\nipython_config.py\n\n# pyenv\n.python-version\n\n# pipenv\n#   According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.\n#   However, in case of collaboration, if having platform-specific dependencies or dependencies\n#   having no cross-platform support, pipenv may install dependencies that don't work, or not\n#   install all needed dependencies.\n#Pipfile.lock\n\n# PEP 582; used by e.g. github.com/David-OConnor/pyflow\n__pypackages__/\n\n# Celery stuff\ncelerybeat-schedule\ncelerybeat.pid\n\n# SageMath parsed files\n*.sage.py\n\n# Environments\n.env\n.venv\nenv/\nvenv/\nENV/\nenv.bak/\nvenv.bak/\n\n# Spyder project settings\n.spyderproject\n.spyproject\n\n# Rope project settings\n.ropeproject\n\n# mkdocs documentation\n/site\n\n# mypy\n.mypy_cache/\n.dmypy.json\ndmypy.json\n\n# Pyre type checker\n.pyre/\n\n# checkpoint\n/ckpt\n/checkpoint\n\n# MAC\n.DS_Store\n"
  },
  {
    "path": ".readthedocs.yaml",
    "content": "# Read the Docs configuration file for Sphinx projects\n# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details\n\n# Required\nversion: 2\n\n# Set the OS, Python version and other tools you might need\nbuild:\n  os: ubuntu-22.04\n  tools:\n    python: \"3.9\"\n    # You can also specify other tool versions:\n    # nodejs: \"20\"\n    # rust: \"1.70\"\n    # golang: \"1.20\"\n\n# Build documentation in the \"docs/\" directory with Sphinx\nsphinx:\n  configuration: docs/source/conf.py\n  # You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs\n  # builder: \"dirhtml\"\n  # Fail on all warnings to avoid broken references\n  # fail_on_warning: true\n\n# Optionally build your docs in additional formats such as PDF and ePub\n# formats:\n#   - pdf\n#   - epub\n\n# Optional but recommended, declare the Python requirements required\n# to build your documentation\n# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html\npython:\n  install:\n    - requirements: docs/requirements.txt"
  },
  {
    "path": "LICENSE",
    "content": "BSD 2-Clause License\n\nCopyright (c) 2022, Zifeng\nAll rights reserved.\n\nRedistribution and use in source and binary forms, with or without\nmodification, are permitted provided that the following conditions are met:\n\n1. Redistributions of source code must retain the above copyright notice, this\n   list of conditions and the following disclaimer.\n\n2. Redistributions in binary form must reproduce the above copyright notice,\n   this list of conditions and the following disclaimer in the documentation\n   and/or other materials provided with the distribution.\n\nTHIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\"\nAND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE\nIMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\nDISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE\nFOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL\nDAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR\nSERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER\nCAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,\nOR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE\nOF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n"
  },
  {
    "path": "README.md",
    "content": "# TransTab: A flexible transferable tabular learning framework [[arxiv]](https://arxiv.org/pdf/2205.09328.pdf)\n\n\n[![PyPI version](https://badge.fury.io/py/transtab.svg)](https://badge.fury.io/py/transtab)\n[![Documentation Status](https://readthedocs.org/projects/transtab/badge/?version=latest)](https://transtab.readthedocs.io/en/latest/?badge=latest)\n[![License](https://img.shields.io/badge/License-BSD_2--Clause-orange.svg)](https://opensource.org/licenses/BSD-2-Clause)\n![GitHub Repo stars](https://img.shields.io/github/stars/ryanwangzf/transtab)\n![GitHub Repo forks](https://img.shields.io/github/forks/ryanwangzf/transtab)\n[![Downloads](https://pepy.tech/badge/transtab)](https://pepy.tech/project/transtab)\n[![Downloads](https://pepy.tech/badge/transtab/month)](https://pepy.tech/project/transtab)\n\n\nDocument is available at https://transtab.readthedocs.io/en/latest/index.html.\n\nPaper is available at https://arxiv.org/pdf/2205.09328.pdf.\n\n5 min blog to understand TransTab at [realsunlab.medium.com](https://realsunlab.medium.com/transtab-learning-transferable-tabular-transformers-across-tables-1e34eec161b8)!\n\n### News!\n- [03/12/25] Version `0.0.7` with `TransTabRegressor` available for regression. Thanks @yuxinchenNU.\n\n- [05/04/23] Check the version `0.0.5` of `TransTab`!\n\n- [01/04/23] Check the version `0.0.3` of `TransTab`!\n\n- [12/03/22] Check out our [[blog]](https://realsunlab.medium.com/transtab-learning-transferable-tabular-transformers-across-tables-1e34eec161b8) for a quick understanding of TransTab!\n\n- [08/31/22] `0.0.2` Support encode tabular inputs into embeddings directly. An example is provided [here](examples/table_embedding.ipynb). Several bugs are fixed.\n\n## TODO\n\n- [x] Table embedding.\n\n- [x] Add regression support.\n\n- [ ] Add support to direct process table with missing values.\n\n\n### Features\nThis repository provides the python package `transtab` for flexible tabular prediction model. The basic usage of `transtab` can be done in a couple of lines!\n\n```python\nimport transtab\n\n# load dataset by specifying dataset name\nallset, trainset, valset, testset, cat_cols, num_cols, bin_cols \\\n     = transtab.load_data('credit-g')\n\n# build classifier\nmodel = transtab.build_classifier(cat_cols, num_cols, bin_cols)\n\n# build regressor\n# model = transtab.build_regressor(cat_cols, num_cols, bin_cols)\n\n# start training\ntranstab.train(model, trainset, valset, **training_arguments)\n\n# make predictions, df_x is a pd.DataFrame with shape (n, d)\n# return the predictions ypred with shape (n, 1) if binary classification;\n# (n, n_class) if multiclass classification.\nypred = transtab.predict(model, df_x)\n```\n\nIt's easy, isn't it?\n\n\n\n## How to install\n\nFirst, download the right ``pytorch`` version following the guide on https://pytorch.org/get-started/locally/.\n\n~~Then try to install from pypi directly:~~ [Feb 2025: pypi version is not maintained, please try to install from github instead]\n\n~~or~~\n\n```bash\npip install git+https://github.com/RyanWangZf/transtab.git\n```\n\n\n\nPlease refer to for [more guidance on installation](https://transtab.readthedocs.io/en/latest/install.html) and troubleshooting.\n\n\n\n## Transfer learning across tables\n\nA novel feature of `transtab` is its ability to learn from multiple distinct tables. 
It is easy to trigger the training like this:\n\n```python\n# load the pretrained transtab model\nmodel = transtab.build_classifier(checkpoint='./ckpt')\n\n# load a new tabular dataset\nallset, trainset, valset, testset, cat_cols, num_cols, bin_cols \\\n     = transtab.load_data('credit-approval')\n\n# update the categorical/numerical/binary column map of the loaded model\nmodel.update({'cat':cat_cols,'num':num_cols,'bin':bin_cols})\n\n# then we just trigger the training on the new data\ntranstab.train(model, trainset, valset, **training_arguments)\n```\n\n\n\n## Contrastive pretraining on multiple tables\n\nWe can also conduct contrastive pretraining on multiple distinct tables like this:\n\n```python\n# load from multiple tabular datasets\ndataname_list = ['credit-g', 'credit-approval']\nallset, trainset, valset, testset, cat_cols, num_cols, bin_cols \\\n     = transtab.load_data(dataname_list)\n\n# build contrastive learner, set supervised=True for supervised VPCL\nmodel, collate_fn = transtab.build_contrastive_learner(\n    cat_cols, num_cols, bin_cols, supervised=True)\n\n# start contrastive pretraining\ntranstab.train(model, trainset, valset, collate_fn=collate_fn, **training_arguments)\n```\n\n\n\n## Citation\n\nIf you find this package useful, please consider citing the following paper:\n\n```bibtex\n@inproceedings{wang2022transtab,\n  title={TransTab: Learning Transferable Tabular Transformers Across Tables},\n  author={Wang, Zifeng and Sun, Jimeng},\n  booktitle={Advances in Neural Information Processing Systems},\n  year={2022}\n}\n```\n"
  },
  {
    "path": "blog/README.md",
    "content": "# NeurIPS'22 | How to perform transfer learning and zero-shot learning on tabular data?\n\n> This is our paper accepted by NeurIPS'22 with ratings 7/7/7, where we work on pretraining, transfer learning, and zero-shot learning on the tabular prediction task. The following are the links for this article and the codes.\n\nPaper: [TransTab: Learning Transferable Tabular Transformers Across Tables](https://arxiv.org/pdf/2205.09328.pdf)\n\nCode: [TransTab-github](https://github.com/RyanWangZf/transtab)\n\nDoc: [Transtab-doc](https://transtab.readthedocs.io/en/latest/)\n\n---\n\n\n\n## Tabular learning was not flexible\n\nIn this article, we refer the term *tabular learning* to the predictive task on the tabular data. For instance, we might know Kaggle competitions, where a lot of competitions are based on tabular data, e.g., house price prediction, credit fault detection, CTR prediction, etc. Basically, this type of task is on predicting the target label through a couple of features, just like in the following table\n\n| index | feature A | feature B | feature C | label |\n| ----- | --------- | --------- | --------- | ----- |\n| 0     | $x_1$     | $x_2$     | $x_3$     | $y$   |\n\none might take a linear regression to solve this problem as\n$$\ny = ax_1 + b x_2 + c x_3 +d.\n$$\nCompared with images and texts, tables are usually more frequently used in industrial applications. As we all know, recently there emerged the *pretrain+finetune* paradigm in the deep learning area, especially flourished in computer vision (CV) and natural language processing (NLP).\n\n\n\n<figure>\n<img src = \"figure/fig1.png\">\n<figcaption align = \"center\"> \n<b>Figure 1:</b> CV or NLP models usually share the same basic input unit, i.e., pixel for images and word/token for texts. However, tabular models only accept a fixed-structure table: the train and test tables should *always* have equal column sets, which prevents us from transfer learning or zero-shot learning on tabular data.\n</figcaption>\n</figure>\n\n\nIn CV & NLP, pretrained models like BERT, and ViT have become the strong baseline for almost all tasks. By contrast, in the tabular learning domain, we usually encounter the case \"xgboost is all you need\". GBDT models can achieve competent performances with less effort on data preprocessing and hyperparameter tuning than deep learning-based methods. In this circumstance, a lot of researchers have started to think about how we outperform GBDT using deep learning, especially leveraging the power of deep learning on big multi-sourced data.\n\n\n\n## Recent efforts on transfer learning for tabular learning\n\nOf course, there have been some efforts on transfer learning for deep learning-based tabular learning. For example, VIME [[1]](#1), SCARF [[2]](#2), and SubTab [[3]](#3) all employ self-supervision for tabular learning. The common self-supervision can be categorized as *generative* and *discriminative* learning. For the first venue, we mask several cells in the table and ask the model to recover the missing values; for the second, we create positive samples by deleting or replacing cells.\n\nNonetheless, they hardly apply to real application cases: all apply to fixed-structure tables. We do not have a large table without labels, instead, we often have multiple heterogenous labeled labels. 
The core challenge is how to leverage as much labeled data as possible while avoiding heavy data preprocessing and missing value imputation.\n\nBecause they only accept fixed-structure tables, all existing tabular methods are incapable of pretraining on multi-sourced tables. Once there is a minor change in the table's structure, e.g., a column named *age* changed to *ages*, the pretrained model becomes useless, and we need to roll back to the pipeline of *data processing* $\\to$ *feature engineering* $\\to$ *model training*, which is costly in terms of time and money.\n\nTherefore, we ask: is it possible to build a tabular model that encodes **arbitrary** input tables without any adaptation?\n\n\n\n## Tabular learning is flexible\n\nIn fact, if we look back at tabular data, we find that the column names are rich in semantics, which previous methods have long neglected. \n\n| index | gender | age  | is_citizen |\n| ----- | ------ | ---- | ---------- |\n| 0     | male   | 25   | 0          |\n\nIn this example, we include three common types of features: *categorical*, *numerical*, and *binary*.\n\nWe argue that interpreting features in light of their column names is necessary. We know *25* under the column *age* means 25 years old instead of 25 km or 25 kg. We know *0* under the column *is_citizen* means the person is not a citizen rather than anything else. Previous methods drop column names and force the model to learn semantics from the raw features for decision-making, which is easy to implement but not transferable.\n\nOn the contrary, we ask why not just explicitly include the column names in the modeling. Surprisingly, we do not find any prior art doing this in tabular learning.\n\nFormally, we process the three types of features as follows:\n\n- For categorical features: we concatenate the column name and the feature value, e.g., *gender is male*.\n- For numerical features: we tokenize and embed the column name, then multiply the column embedding by the feature value.\n- For binary features: we tokenize and embed the column name, and pass this embedding to the encoder only if the feature value is 1; if it is 0, we drop the embedding.\n\n\n\n<figure>\n<img src = \"figure/fig2.png\">\n<figcaption align = \"center\"> \n<b>Figure 2:</b> The input feature processing module of *TransTab*.\n</figcaption>\n</figure>\n\n\n\nWith this processing module, we can linearize, tokenize, and embed any tabular data, which serves as the input for the encoder and the predictor.\n
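\nTo make this concrete, here is a minimal, hypothetical sketch of the idea (not the actual `transtab` implementation, which uses a subword tokenizer and learned embeddings):\n\n```python\nimport pandas as pd\n\ndef linearize_row(row: pd.Series, cat_cols, num_cols, bin_cols):\n    \"\"\"Toy version of TransTab-style feature processing for one table row.\"\"\"\n    text_parts, scaled_parts = [], []\n    for col in cat_cols:\n        # categorical: concatenate column name and value, e.g. \"gender male\"\n        text_parts.append(f\"{col} {row[col]}\")\n    for col in num_cols:\n        # numerical: keep (column name, value); the column embedding is\n        # later multiplied by the feature value\n        scaled_parts.append((col, float(row[col])))\n    for col in bin_cols:\n        # binary: pass the column name only when the value is 1\n        if row[col] == 1:\n            text_parts.append(col)\n    return text_parts, scaled_parts\n\nrow = pd.Series({'gender': 'male', 'age': 25, 'is_citizen': 0})\nprint(linearize_row(row, ['gender'], ['age'], ['is_citizen']))\n# (['gender male'], [('age', 25.0)])\n```\n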
\n\n## Pretraining for TransTab\n\nThanks to its flexibility, *TransTab* is capable of learning across multiple heterogeneous tables. However, it is nontrivial to design an appropriate pretraining algorithm for it.\n\n\n\n<figure>\n<img src = \"figure/fig3.png\">\n<figcaption align = \"center\"> \n<b>Figure 3:</b> Learning across tables using naive supervised learning is harmful for representation learning.\n</figcaption>\n</figure>\n\n\n\nThe most straightforward way is illustrated above: we train a shared backbone encoder plus task-specific classifiers across tabular datasets. Nevertheless, we soon find this paradigm suboptimal. The flaw comes from the heterogeneity of the target labels: two datasets might have opposite definitions of their labels.\n\nTo account for this issue, we propose a novel **supervised contrastive learning** approach in this paper, namely **vertical partition contrastive learning (VPCL)**.\n\n\n\n<figure>\n<img src = \"figure/fig4.png\" width=\"80%\">\n<figcaption align = \"center\"> \n<b>Figure 4:</b> The proposed vertical partition contrastive learning (VPCL) approach for pretraining TransTab in our paper.\n</figcaption>\n</figure>\n\nIts principle is:\n\n- We split each row (sample) vertically into several partitions; each partition is a sample for contrastive learning.\n- Partitions coming from rows with the same label are positive pairs, and vice versa.\n\nVPCL has the following merits:\n\n- It significantly expands the number of pairs for contrastive learning.\n- It is much more efficient and robust because it does not add additional task-specific classifiers.\n
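\nAs a rough illustration, here is a standalone sketch of the vertical partitioning step (illustrative only; in `transtab` the actual partitioning is handled by `build_contrastive_learner` and its returned collate function):\n\n```python\ndef vertical_partitions(columns, num_partition=4, overlap_ratio=0.5):\n    \"\"\"Split a column list into overlapping vertical partitions (toy VPCL step).\"\"\"\n    base = len(columns) // num_partition\n    n_overlap = int(overlap_ratio * base)\n    parts = []\n    for i in range(num_partition):\n        start = i * base\n        # each partition extends into its neighbor's columns by n_overlap;\n        # the last partition absorbs any remainder columns\n        stop = len(columns) if i == num_partition - 1 else (i + 1) * base + n_overlap\n        parts.append(columns[start:stop])\n    return parts\n\ncols = ['age', 'gender', 'job', 'housing', 'credit_amount', 'duration']\nfor part in vertical_partitions(cols, num_partition=3, overlap_ratio=0.5):\n    print(part)\n# partitions of the same row (or of same-label rows under supervised VPCL)\n# form positive pairs; all other partitions serve as negatives\n```\n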
\n\n## Which new tasks can TransTab solve?\n\nThanks to its flexibility, TransTab can now handle many new tasks.\n\n\n\n<figure>\n<img src = \"figure/fig5.png\">\n<figcaption align = \"center\"> \n<b>Figure 5:</b> The new tasks that are amenable to TransTab.\n</figcaption>\n</figure>\n\n\n\n- Learning across multiple labeled datasets (sharing the same label) with supervised learning, then finetuning on each specific dataset.\n- Learning from an incremental set of features and data, which usually originates from updated measurements over time.\n- Pretraining on multiple labeled/unlabeled datasets (which can have distinct labels) with supervised VPCL, then finetuning on each dataset.\n- Learning from multiple labeled datasets (sharing the same label) with supervised learning, then making predictions on brand-new data without any further parameter updates.\n\n\n\n## Some experiment results\n\nFor the complete experiment results, please refer to [our paper](https://arxiv.org/pdf/2205.09328.pdf). Here we highlight two interesting findings.\n\n\n\n### Pretraining\n\n<figure>\n<img src = \"figure/fig6.png\">\n<figcaption align = \"center\"> \n<b>Figure 6:</b> Experiment results of the pretraining+finetuning performance of TransTab.\n</figcaption>\n</figure>\n\nThe above figure illustrates the average performance (AUC) on multiple datasets. Left: on clinical trial patient outcome prediction datasets. Right: on many open datasets. The red dotted line indicates the naive supervised learning performance. The x-axis is the number of partitions used for VPCL.\n\nWe find:\n\n- Supervised VPCL generally improves predictive performance.\n- There is no universally optimal number of partitions for VPCL.\n- Compared with the open datasets, pretraining on the clinical trial datasets (left) brings much larger improvements. This implies it is still crucial to transfer knowledge from datasets in a similar domain, whereas the open datasets are very heterogeneous.\n\n\n\n### Zero-shot prediction\n\n\n\n<figure>\n<img src = \"figure/fig7.png\">\n<figcaption align = \"center\"> \n<b>Figure 7:</b> Experiment results of the zero-shot learning performance of TransTab.\n</figcaption>\n</figure>\n\nThe above figures demonstrate the zero-shot prediction performance of TransTab. We split one dataset into two parts and vary the overlap ratio of their column sets from 0% to 100%. We find:\n\n- TransTab can even make reasonable predictions when there is **no column overlap** between the train and test data, which is really amazing.\n- When the overlap ratio increases, we see better performance, which is reasonable.\n\n\n\n\n\n## Use TransTab with our package\n\nWe open-sourced our package on [github](https://github.com/RyanWangZf/transtab) with the [documentation](https://transtab.readthedocs.io/en/latest/). It can be installed via\n\n```shell\npip install git+https://github.com/RyanWangZf/transtab.git\n```\n\n\n\nIt is rather easy to use for tabular prediction tasks on multiple distinct tables:\n\n```python\nimport transtab\n\n# load multiple datasets by passing a list of data names\nallset, trainset, valset, testset, cat_cols, num_cols, bin_cols \\\n    = transtab.load_data(['credit-g','credit-approval'])\n\n# build transtab classifier model\nmodel = transtab.build_classifier(cat_cols, num_cols, bin_cols)\n\n# specify training arguments, take validation loss for early stopping\ntraining_arguments = {\n    'num_epoch':5,\n    'eval_metric':'val_loss',\n    'eval_less_is_better':True,\n    'output_dir':'./checkpoint'\n    }\n\n# start training\ntranstab.train(model, trainset, valset[0], **training_arguments)\n```\n\n\n\nFor pretraining based on VPCL, we have\n\n```python\nimport transtab\n\n# load multiple datasets by passing a list of data names\nallset, trainset, valset, testset, cat_cols, num_cols, bin_cols \\\n    = transtab.load_data(['credit-g','credit-approval'])\n\n# build contrastive learner, set supervised=True for supervised VPCL\nmodel, collate_fn = transtab.build_contrastive_learner(\n    cat_cols, num_cols, bin_cols,\n    supervised=True, # if take supervised CL\n    num_partition=4, # num of column partitions for pos/neg sampling\n    overlap_ratio=0.5, # specify the overlap ratio of column partitions during the CL\n)\n\n# start contrastive pretraining\ntraining_arguments = {\n    'num_epoch':50,\n    'batch_size':64,\n    'lr':1e-4,\n    'eval_metric':'val_loss',\n    'eval_less_is_better':True,\n    'output_dir':'./checkpoint' # save the pretrained model\n    }\n\n# pass the collate function to the train function\ntranstab.train(model, trainset, valset, collate_fn=collate_fn, **training_arguments)\n```\n\nAfter the pretraining completes, we can build a new classifier from the pretrained model:\n\n```python\n# load the pretrained model and finetune on a target dataset\nallset, trainset, valset, testset, cat_cols, num_cols, bin_cols \\\n    = transtab.load_data('credit-approval')\n\n# build transtab classifier model, and load from the pretrained dir\nmodel = transtab.build_classifier(checkpoint='./checkpoint')\n\n# update model's categorical/numerical/binary column dict\nmodel.update({'cat':cat_cols,'num':num_cols,'bin':bin_cols})\n```\n\nIt is easy 😎 !\n\n\n\n## Conclusion\n\nReplicating the success of deep learning in CV & NLP in the tabular learning domain still requires rethinking the basic elements. In CV, we have pixels; in NLP, we have tokens/words. In this paper, we propose a simple yet effective algorithm to model tabular data. Our method explores using NLP techniques to enhance tabular learning, with the flexibility to handle arbitrary input tables. We hope it draws more attention to deep learning for tabular learning.\n\n\n\n## References\n<a id=\"1\"> [1] </a> Jinsung Yoon, Yao Zhang, James Jordon, and Mihaela van der Schaar. 
VIME: Extending the success of self-and semi-supervised learning to tabular domain. Advances in Neural Information Processing Systems, 33:11033–11043, 2020.\n\n<a id=\"2\"> [2] </a> Dara Bahri, Heinrich Jiang, Yi Tay, and Donald Metzler. SCARF: Self-supervised contrastive learning using random feature corruption. In International Conference on Learning Representations, 2022.\n\n<a id=\"3\"> [3] </a> Talip Ucar, Ehsan Hajiramezanali, and Lindsay Edwards. SubTab: Subsetting features of tabular data for self-supervised representation learning. Advances in Neural Information Processing Systems, 34, 2021.\n"
  },
  {
    "path": "docs/Makefile",
    "content": "# Minimal makefile for Sphinx documentation\n#\n\n# You can set these variables from the command line, and also\n# from the environment for the first two.\nSPHINXOPTS    ?=\nSPHINXBUILD   ?= sphinx-build\nSOURCEDIR     = source\nBUILDDIR      = build\n\n# Put it first so that \"make\" without argument is like \"make help\".\nhelp:\n\t@$(SPHINXBUILD) -M help \"$(SOURCEDIR)\" \"$(BUILDDIR)\" $(SPHINXOPTS) $(O)\n\n.PHONY: help Makefile\n\n# Catch-all target: route all unknown targets to Sphinx using the new\n# \"make mode\" option.  $(O) is meant as a shortcut for $(SPHINXOPTS).\n%: Makefile\n\t@$(SPHINXBUILD) -M $@ \"$(SOURCEDIR)\" \"$(BUILDDIR)\" $(SPHINXOPTS) $(O)\n"
  },
  {
    "path": "docs/make.bat",
    "content": "@ECHO OFF\n\npushd %~dp0\n\nREM Command file for Sphinx documentation\n\nif \"%SPHINXBUILD%\" == \"\" (\n\tset SPHINXBUILD=sphinx-build\n)\nset SOURCEDIR=source\nset BUILDDIR=build\n\n%SPHINXBUILD% >NUL 2>NUL\nif errorlevel 9009 (\n\techo.\n\techo.The 'sphinx-build' command was not found. Make sure you have Sphinx\n\techo.installed, then set the SPHINXBUILD environment variable to point\n\techo.to the full path of the 'sphinx-build' executable. Alternatively you\n\techo.may add the Sphinx directory to PATH.\n\techo.\n\techo.If you don't have Sphinx installed, grab it from\n\techo.https://www.sphinx-doc.org/\n\texit /b 1\n)\n\nif \"%1\" == \"\" goto help\n\n%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%\ngoto end\n\n:help\n%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%\n\n:end\npopd\n"
  },
  {
    "path": "docs/requirements.txt",
    "content": "sphinx-markdown-tables\nrecommonmark\nsphinx==4.2.0\nsphinx_rtd_theme==1.0.0\nreadthedocs-sphinx-search==0.1.1\nloguru\nnumpy\nscikit_learn\nsetuptools\ntransformers\ntqdm\npandas>=1.3.*\nopenml>=0.10.0\ntorch\n"
  },
  {
    "path": "docs/source/about.rst",
    "content": "About Us\n========\n\nThis package was developed and maintained by Zifeng Wang (Ph.D. Student @ UIUC).\n\nPlease refer to his `Homepage <https://zifengwang.xyz/>`_ for more details."
  },
  {
    "path": "docs/source/conf.py",
    "content": "# Configuration file for the Sphinx documentation builder.\n#\n# This file only contains a selection of the most common options. For a full\n# list see the documentation:\n# https://www.sphinx-doc.org/en/master/usage/configuration.html\n\n# -- Path setup --------------------------------------------------------------\n\n# If extensions (or modules to document with autodoc) are in another directory,\n# add these directories to sys.path here. If the directory is relative to the\n# documentation root, use os.path.abspath to make it absolute, like shown here.\n#\nimport os\nimport sys\nimport pdb\n\nsys.path.insert(0, os.path.abspath('../../'))\n\n# -- Project information -----------------------------------------------------\n\nproject = 'transtab'\ncopyright = '2022, Zifeng Wang'\nauthor = 'Zifeng Wang'\n\n# The full version, including alpha/beta/rc tags\nrelease = 'alpha'\n\n# Override the RTD default master doc\nmaster_doc = 'index'\n\n# -- General configuration ---------------------------------------------------\n\n# Add any Sphinx extension module names here, as strings. They can be\n# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom\n# ones.\nextensions = [\n    'recommonmark',\n    'sphinx_markdown_tables',\n    'sphinx.ext.intersphinx',\n    'sphinx.ext.imgmath',\n    'sphinx.ext.viewcode',\n    'sphinx.ext.napoleon',\n    'sphinx.ext.autodoc',\n]\n\nnapoleon_google_docstring = False\nnapoleon_numpy_docstring = True\n\n# Add any paths that contain templates here, relative to this directory.\ntemplates_path = ['_templates']\n\n# List of patterns, relative to source directory, that match files and\n# directories to ignore when looking for source files.\n# This pattern also affects html_static_path and html_extra_path.\nexclude_patterns = []\n\n\n# -- Options for HTML output -------------------------------------------------\n\n# The theme to use for HTML and HTML Help pages.  See the documentation for\n# a list of builtin themes.\n#\n# html_theme = 'alabaster'\nhtml_theme = 'sphinx_rtd_theme'\n# html_theme = 'furo'\n\n# Add any paths that contain custom static files (such as style sheets) here,\n# relative to this directory. They are copied after the builtin static files,\n# so a file named \"default.css\" will overwrite the builtin \"default.css\".\nhtml_static_path = ['_static']\n"
  },
  {
    "path": "docs/source/data_preparation.rst",
    "content": "Custom Dataset\n==============\n\nHere is the best practice to build your own datasets for `transtab`.\n\n::\n\n    project\n    |\n    ├── run_your_model.py\n    |\n    └─── data\n         |\n         ├── dataset1\n         |   |    data_processed.csv\n         |   |    binary_feature.txt\n         |   └─── numerical_feature.txt\n         |\n         ├── dataset2\n         |   \n        ...\n\nwhere the ``run_your_model.py`` is the code where you will load the dataset and train your models.\n\nYou should put the preprocessed table into ``data_processed.csv``, which is better to follow the protocols:\n\n* All the column names to be represented by meaningful natural languge.\n* All the categorical features to be represented by meaningful natural language.\n* All the binary features to be represented by 0 or 1.\n* All the numerical features to be represented by continuous values.\n* Store the processed table into ``data_processed.csv``.\n* Store the binary column names into ``binary_feature.txt``. No need to create this file if no binary feature.\n* Store the numerical column names into ``numerical_feature.txt``. No need to create this file if no numerical feature.\n* All the other columns will be treated as categorical or textual.\n\nAfter that, you can try to load the dataset by\n\n\n.. code-block:: python\n\n    transtab.load_data('./data/dataset1')\n\n\nAbout ``dataset_config``, an example is provided as\n\n.. code-block:: python\n\n    EXAMPLE_DATACONFIG = {\n        \"example\": { # dataset name\n            \"bin\": [\"bin1\", \"bin2\"], # binary column names\n            \"cat\": [\"cat1\", \"cat2\"], # categorical column names\n            \"num\": [\"num1\", \"num2\"], # numerical column names\n            \"cols\": [\"bin1\", \"bin2\", \"cat1\", \"cat2\", \"num1\", \"num2\"], # all column names\n            \"binary_indicator\": [\"1\", \"yes\", \"true\", \"positive\", \"t\", \"y\"], # binary indicators in the binary columns, which will be converted to 1\n            \"data_split_idx\": {\n                \"train\":[0, 1, 2, 3, 4, 5, 6, 7, 8, 9], # row indices for training set\n                \"val\":[10, 11, 12, 13, 14, 15, 16, 17, 18, 19], # row indices for validation set\n                \"test\":[20, 21, 22, 23, 24, 25, 26, 27, 28, 29], # row indices for test set\n                }\n            }\n        }\n\n"
  },
  {
    "path": "docs/source/example_encode.rst",
    "content": "Encode Tables\n=============\n\n*transtab* is able to take pd.DataFrame as inputs and outputs the encoded sample-level embeddings.\nThe full code is available at `Notebook Example <https://github.com/ryanwangzf/transtab/blob/master/examples/table_embedding.ipynb>`_.\n\n\n.. code-block:: python\n\n    import transtab\n\n    # load a dataset and start vanilla supervised training\n    allset, trainset, valset, testset, cat_cols, num_cols, bin_cols \\\n        = transtab.load_data('credit-g')\n    \n    # build transtab classifier model\n    model, collate_fn = transtab.build_contrastive_learner(cat_cols, num_cols, bin_cols)\n\n    # start training\n    training_arguments = {\n        'num_epoch':50,\n        'batch_size':64,\n        'lr':1e-4,\n        'eval_metric':'val_loss',\n        'eval_less_is_better':True,\n        'output_dir':'./checkpoint'\n        }\n    transtab.train(model, trainset, valset, collate_fn=collate_fn, **training_arguments)\n\nNow we have obtained the pretrained model saved in './checkpoint', we can load the model\nfrom this path and use it to encode tables.\n\n\n.. code-block:: python\n\n    # load the pretrained model\n    enc = transtab.build_encoder(\n        binary_columns=bin_cols,\n        checkpoint = './checkpoint'\n    )\n\nThen we can take the whole pretrained model and output the cls token embedding at the last layer's outputs\n\n.. code-block:: python\n\n    # encode tables to sample-level embeddings\n    df = trainset[0]\n    output = enc(df)\n"
  },
  {
    "path": "docs/source/example_pretrain.rst",
    "content": "Tabular Pretraining\n===================\n\nWhen encountering multiple distinct tables which may have different number of classes, performing\ncontrastive pretraining (called Vertical-Partition Contrastive Learning, VPCL in the paper) is often\na better choice. This can be done using the transtab contrastive learner model.\nThe full code is available at `Notebook Example <https://github.com/ryanwangzf/transtab/blob/master/examples/contrastive_learning.ipynb>`_.\n\n\n.. code-block:: python\n\n    import transtab\n\n    # load multiple datasets by passing a list of data names\n    allset, trainset, valset, testset, cat_cols, num_cols, bin_cols \\\n        = transtab.load_data(['credit-g','credit-approval'])\n\n    # build contrastive learner, set supervised=True for supervised VPCL\n    model, collate_fn = transtab.build_contrastive_learner(\n        cat_cols, num_cols, bin_cols, \n        supervised=True, # if take supervised CL\n        num_partition=4, # num of column partitions for pos/neg sampling\n        overlap_ratio=0.5, # specify the overlap ratio of column partitions during the CL\n    )\n\n\nThe function transtab.build_contrastive_learner returns both the CL model and the collate function\nfor the training dataloaders. We then train the model like\n\n.. code-block:: python\n\n    # start contrastive pretraining training\n    training_arguments = {\n        'num_epoch':50,\n        'batch_size':64,\n        'lr':1e-4,\n        'eval_metric':'val_loss',\n        'eval_less_is_better':True,\n        'output_dir':'./checkpoint' # save the pretrained model\n        }\n\n    # pass the collate function to the train function\n    transtab.train(model, trainset, valset, collate_fn=collate_fn, **training_arguments)\n\n    \nAfter this pretrain completes, we shall build a classifier from the checkpoint.\n\n.. code-block:: python\n\n    # load the pretrained model and finetune on a target dataset\n    allset, trainset, valset, testset, cat_cols, num_cols, bin_cols \\\n        = transtab.load_data('credit-approval')\n\n    # build transtab classifier model, and load from the pretrained dir\n    model = transtab.build_classifier(checkpoint='./checkpoint')\n\n    # update model's categorical/numerical/binary column dict\n    model.update({'cat':cat_cols,'num':num_cols,'bin':bin_cols})\n\n"
  },
  {
    "path": "docs/source/example_transfer.rst",
    "content": "Tabular Transfer Learning\n=========================\n\n*transtab* is able to leverage the knowledge learned from broad data sources than finetunes on the target\ndata. It is also easy to fulfill it by this package.\nThe full code is available at `Notebook Example <https://github.com/ryanwangzf/transtab/blob/master/examples/transfer_learning.ipynb>`_.\n\n\n.. code-block:: python\n\n    import transtab\n\n    # load a dataset and start vanilla supervised training\n    allset, trainset, valset, testset, cat_cols, num_cols, bin_cols \\\n        = transtab.load_data('credit-g')\n    \n    # build transtab classifier model\n    model = transtab.build_classifier(cat_cols, num_cols, bin_cols)\n\n    # start training\n    training_arguments = {\n        'num_epoch':50,\n        'eval_metric':'val_loss',\n        'eval_less_is_better':True,\n        'output_dir':'./checkpoint'\n        }\n    transtab.train(model, trainset, valset, **training_arguments)\n\nNow we have obtained the pretrained model saved in './checkpoint', we can load the model\nfrom this path and update the model with new samples and columns.\n\n\n.. code-block:: python\n\n    # now let's load another data and try to leverage the pretrained model for finetuning\n    allset, trainset, valset, testset, cat_cols, num_cols, bin_cols \\\n        = transtab.load_data('credit-approval')\n\n    # load the pretrained model\n    model.load('./checkpoint')\n\n    # update model's categorical/numerical/binary column dict\n    model.update({'cat':cat_cols,'num':num_cols,'bin':bin_cols})\n\n\nIt should be noted if the finetune data differs the pretrain data on the number of classes, this should\nbe explicitly claimed in the update.\n\n.. code-block:: python\n\n    model.update({'cat':cat_cols,'num':num_cols,'bin':bin_cols, 'num_class':2})\n\n\nThen we can continue to train the model just as same as done for supervised learning.\n\n.. code-block:: python\n\n    transtab.train(model, trainset, valset, **training_arguments)\n"
  },
  {
    "path": "docs/source/fast_train.rst",
    "content": "Fast Train with TransTab\n=========================\n\n*transtab* is featured for accepting variable-column tables for training and predicting. This is easy to be done\nby this package.\nThe full code is available at `Notebook Example <https://github.com/ryanwangzf/transtab/blob/master/examples/fast_train.ipynb>`_.\n\n\n.. code-block:: python\n\n    import transtab\n\n    # load multiple datasets by passing a list of data names\n    allset, trainset, valset, testset, cat_cols, num_cols, bin_cols \\\n        = transtab.load_data(['credit-g','credit-approval'])\n\n    # build transtab classifier model\n    model = transtab.build_classifier(cat_cols, num_cols, bin_cols)\n\n    # specify training arguments, take validation loss for early stopping\n    training_arguments = {\n        'num_epoch':5, \n        'eval_metric':'val_loss',\n        'eval_less_is_better':True,\n        'output_dir':'./checkpoint'\n        }\n\n\nOne can take the validation loss on the validation data of the first dataset *credit-g* only:\n\n.. code-block:: python\n\n    transtab.train(model, trainset, valset[0], **training_arguments)\n\nor take the macro average loss on the validation set of both two datasets:\n\n.. code-block:: python\n\n    transtab.train(model, trainset, valset, **training_arguments)\n\nAfter the training completes, we can load the best checkpoint judged by validation loss from the predefined *output_dir*\nand make predictions.\n\n.. code-block:: python\n\n    model.load('./checkpoint')\n\n    x_test, y_test = testset[0]\n\n    ypred = transtab.predict(x_test)\n\n\n.. warning::\n\n    Under this pure supervised learning setting, all the passed datasets should have the \n    same **number of label classes**. For instance, here *credit-g* and *credit-approval* are both\n    binary classification task. It is because the classifier of `transtab` only keeps one classification head \n    during the training and predicting.\n\n\n\n\n"
  },
  {
    "path": "docs/source/index.rst",
    "content": "Welcome to transtab documentation!\n==================================\n\n`transtab` is an easy-to-use **Python package** for flexible tabular prediction framework. **Tabular data** dominates the applications of machine learning in research & development, including healthcare, finance, advertising, engineering, etc.\n\n`transtab` is featured for the following scenarios of tabular predictions:\n\n* **Supervised learning**: the vanilla train and predict on tables with the identical columns.\n* **Transfer learning**: given multiple labeled tables partially share columns, we enhance models for each of those tables by leveraging other tables.\n* **Incremental learning**: as a table incrementally grows with more columns, we update the existing model to handle the new table with more columns.\n* **Table Pretraining**: we pretrain models on many tables with distinct columns and identifiers for the target tabular prediction task.\n* **Zero-shot inference**: we build a model for an unseen table that only has partial overlaps with training tables.\n\n.. figure:: ../images/transtab_tasks.png\n\n    The demonstration of ML modeling on different tabular data settings.\n    Previous tabular methods only do vanilla supervised training or pretraining on the same table due to they only accept\n    **fixed-column tables**. By contrast, \\method covers more new tasks (1) to (4) as it accepts **variable-column** tables.\n\n\nThe basic usage of `transtab` can be done in a couple of lines:\n\n.. code-block:: python\n\n    import transtab\n\n    # load dataset by specifying dataset name\n    allset, trainset, valset, testset, cat_cols, num_cols, bin_cols \\\n        = transtab.load_data('credit-g')\n\n    # build classifier\n    model = transtab.build_classifier(cat_cols, num_cols, bin_cols)\n\n    # start training\n    transtab.train(model, trainset, valset, **training_arguments)\n\n    # make predictions, df_x is a pd.DataFrame with shape (n, d)\n    # return the predictions ypred with shape (n, 1) if binary classification;\n    # (n, n_class) if multiclass classification.\n    ypred = transtab.predict(model, df_x)\n\n\nIt's easy, isn't it?\n\nLet's start the journey from the `installation <https://transtab.readthedocs.io/en/latest/install.html>`_ and the `first demo on supervised tabular learning <https://transtab.readthedocs.io/en/latest/fast_train.html>`_ !\n\nWe also provide the examples on `tabular transfer learning <https://transtab.readthedocs.io/en/latest/example_transfer.html>`_ and `tabular pretraining <https://transtab.readthedocs.io/en/latest/example_pretrain.html>`_ for the quick start.\n\n----\n\n**Citing transtab**:\n\nIf you use `transtab` in a scientific publication, we would appreciate citations to the following paper::\n\n    @article{wang2022transtab,\n        author = {Wang, Zifeng and Sun, Jimeng},\n        title = {TransTab: Learning Transferable Tabular Transformers Across Tables},\n        journal={arXiv preprint arXiv:2205.09328},\n        year = {2022},\n    }\n\n\n.. toctree::\n   :maxdepth: 2\n   :hidden:\n   :caption: Getting Started\n\n   install\n   fast_train\n   example_transfer\n   example_pretrain\n   example_encode\n   data_preparation\n\n.. toctree::\n    :maxdepth: 2\n    :hidden:\n    :caption: Documentation\n\n    main_func\n    models\n\n\n.. toctree::\n    :maxdepth: 2\n    :hidden:\n    :caption: Additional Information\n\n    about\n"
  },
  {
    "path": "docs/source/install.rst",
    "content": "Installation\n============\n\n*transtab* was tested on Python 3.7+, PyTorch 1.8.0+. Please follow the Installation instructions below for the\ntorch version and CUDA device you are using:\n\n`PyTorch Installation Instructions <https://pytorch.org/get-started/locally/>`_.\n\nAfter that, *transtab* can be downloaded directly using **pip**. [Feb 2025, the PyPI version is no longer maintained, please try to install it from github]:\n\n.. code-block:: bash\n\n    pip install git+https://github.com/RyanWangZf/transtab.git\n\n\nAlternatively, you can clone the project and install it from local\n\n.. code-block:: bash\n\n    git clone https://github.com/RyanWangZf/transtab.git\n    cd transtab\n    pip install .\n\n**Troubleshooting**:\n\n1. If encountering ``ERROR: Failed building wheel for tokenizers`` on MAC/Linux, please call\n\n.. code-block:: bash\n\n    curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh\n\nthen restart the terminal and call ``pip`` again.\n"
  },
  {
    "path": "docs/source/main_func.rst",
    "content": "Main Functions\n==============\n\n.. toctree::\n    load_data<transtab.load_data>\n    build_classifier<transtab.build_classifier>\n    build_contrastive_learner<transtab.build_contrastive_learner>\n    build_encoder<transtab.build_encoder>\n    build_extractor<transtab.build_extractor>\n    train<transtab.train>\n    predict<transtab.predict>"
  },
  {
    "path": "docs/source/models.rst",
    "content": "Models\n======\n\n.. toctree::\n    BaseModel<transtab.basemodel>\n    TransTabClassifier<transtab.classifier>\n    TransTabForCL<transtab.contrastive>"
  },
  {
    "path": "docs/source/transtab.basemodel.rst",
    "content": "TransTabModel\n=============\n\n.. automodule:: transtab.modeling_transtab\n    :members: TransTabModel\n    :no-undoc-members:\n    :no-show-inheritance:"
  },
  {
    "path": "docs/source/transtab.build_classifier.rst",
    "content": "build_classifier\n================\n\n.. autofunction:: transtab.build_classifier\n\n.. warning::\n    If ``categorical_columns``,  ``numerical_columns``, and ``binary_columns`` are **ALL** not specified, the model takes **ALL** as ``categorical columns``,\n    which may undermine the performance significantly.\n"
  },
  {
    "path": "docs/source/transtab.build_contrastive_learner.rst",
    "content": "build_contrastive_learner\n=========================\n\n.. autofunction:: transtab.build_contrastive_learner\n"
  },
  {
    "path": "docs/source/transtab.build_encoder.rst",
    "content": "build_extractor\n===============\n\n.. autofunction:: transtab.build_encoder\n\nThe returned feature extractor takes pd.DataFrame as inputs and outputs the\nencoded sample-level embeddings.\n\n.. code-block:: python\n\n    # build the feature extractor\n    enc = transtab.build_encoder(categorical_columns=['gender'], numerical_columns=['age'])\n\n    # build a table for inputs\n    df = pd.DataFrame({'age':[1,2], 'gender':['male','female']})\n\n    # extract the outputs\n    outputs = enc(df)\n\n    print(outputs.shape)\n\n    '''\n    torch.Size([2, 128])\n    '''"
  },
  {
    "path": "docs/source/transtab.build_extractor.rst",
    "content": "build_extractor\n===============\n\n.. autofunction:: transtab.build_extractor\n\n\nThe returned feature extractor takes pd.DataFrame as inputs and outputs the\nencoded outputs in dict.\n\n.. code-block:: python\n\n    # build the feature extractor\n    extractor = transtab.build_extractor(categorical_columns=['gender'], numerical_columns=['age'])\n\n    # build a table for inputs\n    df = pd.DataFrame({'age':[1,2], 'gender':['male','female']})\n\n    # extract the outputs\n    outputs = extractor(df)\n\n    print(outputs)\n\n    '''\n        {\n        'x_num': tensor([[1.],[2.]], dtype=torch.float64),\n        'num_col_input_ids': tensor([[2287]]),\n        'x_cat_input_ids': tensor([[5907, 3287], [5907, 2931]]),\n        'x_bin_input_ids': None,\n        'num_att_mask': tensor([[1]]),\n        'cat_att_mask': tensor([[1, 1], [1, 1]])\n        }\n    '''\n"
  },
  {
    "path": "docs/source/transtab.classifier.rst",
    "content": "TransTabClassifier\n==================\n\n.. autoclass:: transtab.modeling_transtab.TransTabClassifier\n    :members:\n    :no-undoc-members:\n    :no-show-inheritance:\n"
  },
  {
    "path": "docs/source/transtab.contrastive.rst",
    "content": "TransTabForCL\n=============\n\n.. autoclass:: transtab.modeling_transtab.TransTabForCL\n    :members:\n    :no-undoc-members:\n    :no-show-inheritance:\n"
  },
  {
    "path": "docs/source/transtab.load_data.rst",
    "content": "load_data\n=========\n\n.. autofunction:: transtab.load_data\n\n\n*transtab* provides flexible data loading function.\nIt can be used to load arbitrary datasets from `openml <https://www.openml.org/>`_ supported by `openml.datasets API <https://docs.openml.org/Python-API/>`_.\n\n.. code-block:: python\n\n    # specify the dataname\n    allset, trainset, valset, testset, cat_cols, num_cols, bin_cols \\\n        = transtab.load_data('credit-g')\n\n    # or specify the dataset index (in openml)\n    allset, trainset, valset, testset, cat_cols, num_cols, bin_cols \\\n        = transtab.load_data(31)\n\nIt can also be used to load datasets from the local device.\n\n.. code-block:: python\n\n    # specify the dataset dir\n    allset, trainset, valset, testset, cat_cols, num_cols, bin_cols \\\n        = transtab.load_data('./data/credit-g')\n\n\nAnother important feature is to use this function to load multiple datasets\n\n.. code-block:: python\n\n    # specify the dataset dir\n    allset, trainset, valset, testset, cat_cols, num_cols, bin_cols \\\n        = transtab.load_data(['./data/credit-g','./data/credit-approval'])\n\nOne can also pass ``dataset_config`` to the ``load_data`` function to manipulate the input table directly.\n\n.. code-block:: python\n\n    # customize dataset configuration\n    dataset_config = {\n        'credit-g':{\n            'columns':['a','b','c'], # specify the new columns for the table, should keep the same dimension as the original table.\n            'cat':['a'], # specify all the categorical columns\n            'bin':['b'], # specify all the binary columns\n            'num':['c']} # specify all the numerical columns\n            }\n\n    allset, trainset, valset, testset, cat_cols, num_cols, bin_cols \\\n        = transtab.load_data('credit-g', dataset_config=dataset_config)\n\n\nWhile this operation is not recommended. To avoid making errors, you'd better deposit all these configurations to the local following\nthe guidance of `custom dataset <https://transtab.readthedocs.io/en/latest/data_preparation.html>`_.\n"
  },
  {
    "path": "docs/source/transtab.predict.rst",
    "content": "predict\n=======\n\n.. autofunction:: transtab.predict\n"
  },
  {
    "path": "docs/source/transtab.train.rst",
    "content": "train\n=====\n\n.. autofunction:: transtab.train\n"
  },
  {
    "path": "docs/sphinx-commands.txt",
    "content": "# build html files\nsphinx-build -b html source build"
  },
  {
    "path": "examples/contrastive_learning.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 1,\n   \"id\": \"0c0001bb\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"import os\\n\",\n    \"os.chdir('../')\\n\",\n    \"\\n\",\n    \"import transtab\\n\",\n    \"\\n\",\n    \"# set random seed\\n\",\n    \"transtab.random_seed(42)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 2,\n   \"id\": \"865b42a8\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"########################################\\n\",\n      \"openml data index: 31\\n\",\n      \"load data from credit-g\\n\",\n      \"# data: 1000, # feat: 20, # cate: 11,  # bin: 2, # numerical: 7, pos rate: 0.70\\n\",\n      \"########################################\\n\",\n      \"openml data index: 29\\n\",\n      \"load data from credit-approval\\n\",\n      \"# data: 690, # feat: 15, # cate: 9,  # bin: 0, # numerical: 6, pos rate: 0.56\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# load multiple datasets by passing a list of data names\\n\",\n    \"allset, trainset, valset, testset, cat_cols, num_cols, bin_cols \\\\\\n\",\n    \"    = transtab.load_data(['credit-g','credit-approval'])\\n\",\n    \"\\n\",\n    \"# build contrastive learner, set supervised=True for supervised VPCL\\n\",\n    \"model, collate_fn = transtab.build_contrastive_learner(\\n\",\n    \"    cat_cols, num_cols, bin_cols, \\n\",\n    \"    supervised=True, # if take supervised CL\\n\",\n    \"    num_partition=4, # num of column partitions for pos/neg sampling\\n\",\n    \"    overlap_ratio=0.5, # specify the overlap ratio of column partitions during the CL\\n\",\n    \")\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 3,\n   \"id\": \"78d0bc6c\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"1a6a12cd244e4672b360c68222c7b7f8\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Epoch:   0%|          | 0/50 [00:00<?, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 0, test val_loss: 5.794664\\n\",\n      \"epoch: 0, train loss: 105.4182, lr: 0.000100, spent: 1.1 secs\\n\",\n      \"epoch: 1, test val_loss: 5.786065\\n\",\n      \"epoch: 1, train loss: 104.5511, lr: 0.000100, spent: 2.0 secs\\n\",\n      \"epoch: 2, test val_loss: 5.781867\\n\",\n      \"epoch: 2, train loss: 104.5076, lr: 0.000100, spent: 3.0 secs\\n\",\n      \"epoch: 3, test val_loss: 5.777907\\n\",\n      \"epoch: 3, train loss: 104.4728, lr: 0.000100, spent: 4.1 secs\\n\",\n      \"epoch: 4, test val_loss: 5.775703\\n\",\n      \"epoch: 4, train loss: 104.4284, lr: 0.000100, spent: 5.0 secs\\n\",\n      \"epoch: 5, test val_loss: 5.772933\\n\",\n      \"epoch: 5, train loss: 104.4126, lr: 0.000100, spent: 6.0 secs\\n\",\n      \"epoch: 6, test val_loss: 5.771537\\n\",\n      \"epoch: 6, train loss: 104.3681, lr: 0.000100, spent: 6.9 secs\\n\",\n      \"epoch: 7, test val_loss: 5.768374\\n\",\n      \"epoch: 7, train loss: 104.3112, lr: 0.000100, spent: 7.8 secs\\n\",\n      \"epoch: 8, test val_loss: 5.766492\\n\",\n      \"epoch: 8, train loss: 104.3186, lr: 0.000100, spent: 8.8 secs\\n\",\n      \"epoch: 9, test val_loss: 
5.763317\\n\",\n      \"epoch: 9, train loss: 104.2437, lr: 0.000100, spent: 9.7 secs\\n\",\n      \"epoch: 10, test val_loss: 5.763273\\n\",\n      \"epoch: 10, train loss: 104.2665, lr: 0.000100, spent: 10.8 secs\\n\",\n      \"epoch: 11, test val_loss: 5.758865\\n\",\n      \"epoch: 11, train loss: 104.2031, lr: 0.000100, spent: 12.0 secs\\n\",\n      \"epoch: 12, test val_loss: 5.761363\\n\",\n      \"EarlyStopping counter: 1 out of 5\\n\",\n      \"epoch: 12, train loss: 104.2412, lr: 0.000100, spent: 13.1 secs\\n\",\n      \"epoch: 13, test val_loss: 5.760094\\n\",\n      \"EarlyStopping counter: 2 out of 5\\n\",\n      \"epoch: 13, train loss: 104.2192, lr: 0.000100, spent: 14.4 secs\\n\",\n      \"epoch: 14, test val_loss: 5.756854\\n\",\n      \"epoch: 14, train loss: 104.1880, lr: 0.000100, spent: 15.7 secs\\n\",\n      \"epoch: 15, test val_loss: 5.755385\\n\",\n      \"epoch: 15, train loss: 104.1087, lr: 0.000100, spent: 17.0 secs\\n\",\n      \"epoch: 16, test val_loss: 5.755942\\n\",\n      \"EarlyStopping counter: 1 out of 5\\n\",\n      \"epoch: 16, train loss: 104.1531, lr: 0.000100, spent: 18.3 secs\\n\",\n      \"epoch: 17, test val_loss: 5.758205\\n\",\n      \"EarlyStopping counter: 2 out of 5\\n\",\n      \"epoch: 17, train loss: 104.2000, lr: 0.000100, spent: 19.4 secs\\n\",\n      \"epoch: 18, test val_loss: 5.748805\\n\",\n      \"epoch: 18, train loss: 104.0332, lr: 0.000100, spent: 20.5 secs\\n\",\n      \"epoch: 19, test val_loss: 5.748421\\n\",\n      \"epoch: 19, train loss: 104.0516, lr: 0.000100, spent: 21.8 secs\\n\",\n      \"epoch: 20, test val_loss: 5.749574\\n\",\n      \"EarlyStopping counter: 1 out of 5\\n\",\n      \"epoch: 20, train loss: 104.0346, lr: 0.000100, spent: 22.9 secs\\n\",\n      \"epoch: 21, test val_loss: 5.749054\\n\",\n      \"EarlyStopping counter: 2 out of 5\\n\",\n      \"epoch: 21, train loss: 104.0557, lr: 0.000100, spent: 23.9 secs\\n\",\n      \"epoch: 22, test val_loss: 5.752270\\n\",\n      \"EarlyStopping counter: 3 out of 5\\n\",\n      \"epoch: 22, train loss: 104.0468, lr: 0.000100, spent: 25.1 secs\\n\",\n      \"epoch: 23, test val_loss: 5.749521\\n\",\n      \"EarlyStopping counter: 4 out of 5\\n\",\n      \"epoch: 23, train loss: 104.0925, lr: 0.000100, spent: 26.1 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"2022-08-31 10:56:45.227 | INFO     | transtab.trainer:train:132 - load best at last from ./checkpoint\\n\",\n      \"2022-08-31 10:56:45.242 | INFO     | transtab.trainer:save_model:239 - saving model checkpoint to ./checkpoint\\n\",\n      \"2022-08-31 10:56:45.379 | INFO     | transtab.trainer:train:137 - training complete, cost 27.2 secs.\\n\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 24, test val_loss: 5.751015\\n\",\n      \"EarlyStopping counter: 5 out of 5\\n\",\n      \"early stopped\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# start contrastive pretraining training\\n\",\n    \"training_arguments = {\\n\",\n    \"    'num_epoch':50,\\n\",\n    \"    'batch_size':64,\\n\",\n    \"    'lr':1e-4,\\n\",\n    \"    'eval_metric':'val_loss',\\n\",\n    \"    'eval_less_is_better':True,\\n\",\n    \"    'output_dir':'./checkpoint'\\n\",\n    \"    }\\n\",\n    \"\\n\",\n    \"transtab.train(model, trainset, valset, collate_fn=collate_fn, **training_arguments)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 4,\n   \"id\": 
\"85e9ad3c\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"########################################\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"2022-08-31 10:56:48.450 | WARNING  | transtab.modeling_transtab:_check_column_overlap:254 - No cat/num/bin cols specified, will take ALL columns as categorical! Ignore this warning if you specify the `checkpoint` to load the model.\\n\",\n      \"2022-08-31 10:56:48.527 | INFO     | transtab.modeling_transtab:load:782 - missing keys: ['clf.fc.weight', 'clf.fc.bias', 'clf.norm.weight', 'clf.norm.bias']\\n\",\n      \"2022-08-31 10:56:48.528 | INFO     | transtab.modeling_transtab:load:783 - unexpected keys: ['projection_head.dense.weight']\\n\",\n      \"2022-08-31 10:56:48.528 | INFO     | transtab.modeling_transtab:load:784 - load model from ./checkpoint\\n\",\n      \"2022-08-31 10:56:48.542 | INFO     | transtab.modeling_transtab:load:222 - load feature extractor from ./checkpoint/extractor/extractor.json\\n\",\n      \"2022-08-31 10:56:48.556 | WARNING  | transtab.modeling_transtab:_check_column_overlap:254 - No cat/num/bin cols specified, will take ALL columns as categorical! Ignore this warning if you specify the `checkpoint` to load the model.\\n\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"openml data index: 29\\n\",\n      \"load data from credit-approval\\n\",\n      \"# data: 690, # feat: 15, # cate: 9,  # bin: 0, # numerical: 6, pos rate: 0.56\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"8bc6cedea8c74fa0a79a6201160b8641\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Epoch:   0%|          | 0/50 [00:00<?, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 0, test val_loss: 0.683971\\n\",\n      \"epoch: 0, train loss: 5.4453, lr: 0.000100, spent: 0.3 secs\\n\",\n      \"epoch: 1, test val_loss: 0.646593\\n\",\n      \"epoch: 1, train loss: 5.2291, lr: 0.000100, spent: 0.6 secs\\n\",\n      \"epoch: 2, test val_loss: 0.598986\\n\",\n      \"epoch: 2, train loss: 4.9122, lr: 0.000100, spent: 0.8 secs\\n\",\n      \"epoch: 3, test val_loss: 0.571086\\n\",\n      \"epoch: 3, train loss: 4.6084, lr: 0.000100, spent: 1.1 secs\\n\",\n      \"epoch: 4, test val_loss: 0.500248\\n\",\n      \"epoch: 4, train loss: 4.2688, lr: 0.000100, spent: 1.3 secs\\n\",\n      \"epoch: 5, test val_loss: 0.461829\\n\",\n      \"epoch: 5, train loss: 3.8759, lr: 0.000100, spent: 1.6 secs\\n\",\n      \"epoch: 6, test val_loss: 0.418263\\n\",\n      \"epoch: 6, train loss: 3.5448, lr: 0.000100, spent: 1.9 secs\\n\",\n      \"epoch: 7, test val_loss: 0.406784\\n\",\n      \"epoch: 7, train loss: 3.3226, lr: 0.000100, spent: 2.2 secs\\n\",\n      \"epoch: 8, test val_loss: 0.415289\\n\",\n      \"EarlyStopping counter: 1 out of 5\\n\",\n      \"epoch: 8, train loss: 3.2534, lr: 0.000100, spent: 2.5 secs\\n\",\n      \"epoch: 9, test val_loss: 0.395700\\n\",\n      \"epoch: 9, train loss: 3.1036, lr: 0.000100, spent: 2.7 secs\\n\",\n      \"epoch: 10, test val_loss: 0.477691\\n\",\n      \"EarlyStopping counter: 1 out of 5\\n\",\n      \"epoch: 
10, train loss: 2.9625, lr: 0.000100, spent: 3.2 secs\\n\",\n      \"epoch: 11, test val_loss: 0.394624\\n\",\n      \"epoch: 11, train loss: 2.9855, lr: 0.000100, spent: 3.5 secs\\n\",\n      \"epoch: 12, test val_loss: 0.395159\\n\",\n      \"EarlyStopping counter: 1 out of 5\\n\",\n      \"epoch: 12, train loss: 3.0646, lr: 0.000100, spent: 3.7 secs\\n\",\n      \"epoch: 13, test val_loss: 0.520994\\n\",\n      \"EarlyStopping counter: 2 out of 5\\n\",\n      \"epoch: 13, train loss: 3.0765, lr: 0.000100, spent: 4.0 secs\\n\",\n      \"epoch: 14, test val_loss: 0.388927\\n\",\n      \"epoch: 14, train loss: 3.0590, lr: 0.000100, spent: 4.3 secs\\n\",\n      \"epoch: 15, test val_loss: 0.447461\\n\",\n      \"EarlyStopping counter: 1 out of 5\\n\",\n      \"epoch: 15, train loss: 2.8070, lr: 0.000100, spent: 4.5 secs\\n\",\n      \"epoch: 16, test val_loss: 0.402370\\n\",\n      \"EarlyStopping counter: 2 out of 5\\n\",\n      \"epoch: 16, train loss: 2.6713, lr: 0.000100, spent: 4.7 secs\\n\",\n      \"epoch: 17, test val_loss: 0.393792\\n\",\n      \"EarlyStopping counter: 3 out of 5\\n\",\n      \"epoch: 17, train loss: 2.7131, lr: 0.000100, spent: 5.0 secs\\n\",\n      \"epoch: 18, test val_loss: 0.455256\\n\",\n      \"EarlyStopping counter: 4 out of 5\\n\",\n      \"epoch: 18, train loss: 2.7538, lr: 0.000100, spent: 5.2 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"2022-08-31 10:56:53.974 | INFO     | transtab.trainer:train:132 - load best at last from ./checkpoint\\n\",\n      \"2022-08-31 10:56:54.000 | INFO     | transtab.trainer:save_model:239 - saving model checkpoint to ./checkpoint\\n\",\n      \"2022-08-31 10:56:54.130 | INFO     | transtab.trainer:train:137 - training complete, cost 5.6 secs.\\n\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 19, test val_loss: 0.406734\\n\",\n      \"EarlyStopping counter: 5 out of 5\\n\",\n      \"early stopped\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# load the pretrained model and finetune on a target dataset\\n\",\n    \"allset, trainset, valset, testset, cat_cols, num_cols, bin_cols \\\\\\n\",\n    \"     = transtab.load_data('credit-approval')\\n\",\n    \"\\n\",\n    \"# build transtab classifier model, and load from the pretrained dir\\n\",\n    \"model = transtab.build_classifier(checkpoint='./checkpoint')\\n\",\n    \"\\n\",\n    \"# update model's categorical/numerical/binary column dict\\n\",\n    \"model.update({'cat':cat_cols,'num':num_cols,'bin':bin_cols})\\n\",\n    \"\\n\",\n    \"# start finetuning\\n\",\n    \"training_arguments = {\\n\",\n    \"    'num_epoch':50,\\n\",\n    \"    'eval_metric':'val_loss',\\n\",\n    \"    'eval_less_is_better':True,\\n\",\n    \"    'output_dir':'./checkpoint'\\n\",\n    \"    }\\n\",\n    \"transtab.train(model, trainset, valset, **training_arguments)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 5,\n   \"id\": \"ba5e5238\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"auc 0.95 mean/interval 0.8382(0.06)\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"text/plain\": [\n       \"[0.8382272091644043]\"\n      ]\n     },\n     \"execution_count\": 5,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"# evaluation\\n\",\n    \"x_test, y_test = 
testset\\n\",\n    \"ypred = transtab.predict(model, x_test)\\n\",\n    \"transtab.evaluate(ypred, y_test, metric='auc')\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"id\": \"da5d6d70\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": []\n  }\n ],\n \"metadata\": {\n  \"kernelspec\": {\n   \"display_name\": \"Python 3 (ipykernel)\",\n   \"language\": \"python\",\n   \"name\": \"python3\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.8.13\"\n  }\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 5\n}\n"
  },
  {
    "path": "examples/fast_train.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 1,\n   \"id\": \"0bc8ef17\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"import os\\n\",\n    \"os.chdir('../')\\n\",\n    \"\\n\",\n    \"import transtab\\n\",\n    \"\\n\",\n    \"# set random seed\\n\",\n    \"transtab.random_seed(42)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 2,\n   \"id\": \"e06b2eb3\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"########################################\\n\",\n      \"openml data index: 31\\n\",\n      \"load data from credit-g\\n\",\n      \"# data: 1000, # feat: 20, # cate: 11,  # bin: 2, # numerical: 7, pos rate: 0.70\\n\",\n      \"########################################\\n\",\n      \"openml data index: 29\\n\",\n      \"load data from credit-approval\\n\",\n      \"# data: 690, # feat: 15, # cate: 9,  # bin: 0, # numerical: 6, pos rate: 0.56\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# load multiple datasets by passing a list of data names\\n\",\n    \"allset, trainset, valset, testset, cat_cols, num_cols, bin_cols \\\\\\n\",\n    \"    = transtab.load_data(['credit-g','credit-approval'])\\n\",\n    \"\\n\",\n    \"# build transtab classifier model\\n\",\n    \"model = transtab.build_classifier(cat_cols, num_cols, bin_cols)\\n\",\n    \"\\n\",\n    \"# specify training arguments, take validation loss for early stopping\\n\",\n    \"training_arguments = {\\n\",\n    \"    'num_epoch':50,\\n\",\n    \"    'batch_size':128,\\n\",\n    \"    'lr':1e-4,\\n\",\n    \"    'eval_metric':'val_loss',\\n\",\n    \"    'eval_less_is_better':True,\\n\",\n    \"    'output_dir':'./checkpoint'\\n\",\n    \"    }\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 3,\n   \"id\": \"f0c84e5f\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>own_telephone</th>\\n\",\n       \"      <th>foreign_worker</th>\\n\",\n       \"      <th>duration</th>\\n\",\n       \"      <th>credit_amount</th>\\n\",\n       \"      <th>installment_commitment</th>\\n\",\n       \"      <th>residence_since</th>\\n\",\n       \"      <th>age</th>\\n\",\n       \"      <th>existing_credits</th>\\n\",\n       \"      <th>num_dependents</th>\\n\",\n       \"      <th>checking_status</th>\\n\",\n       \"      <th>credit_history</th>\\n\",\n       \"      <th>purpose</th>\\n\",\n       \"      <th>savings_status</th>\\n\",\n       \"      <th>employment</th>\\n\",\n       \"      <th>personal_status</th>\\n\",\n       \"      <th>other_parties</th>\\n\",\n       \"      <th>property_magnitude</th>\\n\",\n       \"      <th>other_payment_plans</th>\\n\",\n       \"      <th>housing</th>\\n\",\n       \"      <th>job</th>\\n\",\n       \"  
  </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>636</th>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0.294118</td>\\n\",\n       \"      <td>0.061957</td>\\n\",\n       \"      <td>1.000000</td>\\n\",\n       \"      <td>0.000000</td>\\n\",\n       \"      <td>0.160714</td>\\n\",\n       \"      <td>0.000000</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>no checking</td>\\n\",\n       \"      <td>existing paid</td>\\n\",\n       \"      <td>radio/tv</td>\\n\",\n       \"      <td>500&lt;=X&lt;1000</td>\\n\",\n       \"      <td>4&lt;=X&lt;7</td>\\n\",\n       \"      <td>female div/dep/mar</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>car</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>own</td>\\n\",\n       \"      <td>skilled</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>182</th>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0.250000</td>\\n\",\n       \"      <td>0.076868</td>\\n\",\n       \"      <td>1.000000</td>\\n\",\n       \"      <td>0.333333</td>\\n\",\n       \"      <td>0.375000</td>\\n\",\n       \"      <td>0.333333</td>\\n\",\n       \"      <td>1.0</td>\\n\",\n       \"      <td>&lt;0</td>\\n\",\n       \"      <td>all paid</td>\\n\",\n       \"      <td>new car</td>\\n\",\n       \"      <td>no known savings</td>\\n\",\n       \"      <td>1&lt;=X&lt;4</td>\\n\",\n       \"      <td>male single</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>life insurance</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>own</td>\\n\",\n       \"      <td>unskilled resident</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>736</th>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0.294118</td>\\n\",\n       \"      <td>0.622318</td>\\n\",\n       \"      <td>0.000000</td>\\n\",\n       \"      <td>1.000000</td>\\n\",\n       \"      <td>0.071429</td>\\n\",\n       \"      <td>0.333333</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0&lt;=X&lt;200</td>\\n\",\n       \"      <td>existing paid</td>\\n\",\n       \"      <td>used car</td>\\n\",\n       \"      <td>&lt;100</td>\\n\",\n       \"      <td>1&lt;=X&lt;4</td>\\n\",\n       \"      <td>female div/dep/mar</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>car</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>rent</td>\\n\",\n       \"      <td>high qualif/self emp/mgmt</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>922</th>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0.073529</td>\\n\",\n       \"      <td>0.061406</td>\\n\",\n       \"      <td>0.666667</td>\\n\",\n       \"      <td>1.000000</td>\\n\",\n       \"      <td>0.053571</td>\\n\",\n       \"      <td>0.000000</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>&lt;0</td>\\n\",\n       \"      <td>existing paid</td>\\n\",\n       \"      <td>radio/tv</td>\\n\",\n       \"      <td>&lt;100</td>\\n\",\n       \"      <td>&lt;1</td>\\n\",\n       \"      <td>female div/dep/mar</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>life insurance</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>rent</td>\\n\",\n       \"      
<td>skilled</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>511</th>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0.470588</td>\\n\",\n       \"      <td>0.244085</td>\\n\",\n       \"      <td>0.333333</td>\\n\",\n       \"      <td>0.333333</td>\\n\",\n       \"      <td>0.232143</td>\\n\",\n       \"      <td>0.000000</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>no checking</td>\\n\",\n       \"      <td>existing paid</td>\\n\",\n       \"      <td>used car</td>\\n\",\n       \"      <td>&lt;100</td>\\n\",\n       \"      <td>1&lt;=X&lt;4</td>\\n\",\n       \"      <td>male single</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>no known property</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>for free</td>\\n\",\n       \"      <td>high qualif/self emp/mgmt</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>845</th>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0.250000</td>\\n\",\n       \"      <td>0.205018</td>\\n\",\n       \"      <td>0.333333</td>\\n\",\n       \"      <td>0.666667</td>\\n\",\n       \"      <td>0.285714</td>\\n\",\n       \"      <td>0.000000</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0&lt;=X&lt;200</td>\\n\",\n       \"      <td>existing paid</td>\\n\",\n       \"      <td>furniture/equipment</td>\\n\",\n       \"      <td>no known savings</td>\\n\",\n       \"      <td>4&lt;=X&lt;7</td>\\n\",\n       \"      <td>male single</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>car</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>own</td>\\n\",\n       \"      <td>skilled</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>492</th>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0.029412</td>\\n\",\n       \"      <td>0.054308</td>\\n\",\n       \"      <td>0.000000</td>\\n\",\n       \"      <td>0.000000</td>\\n\",\n       \"      <td>0.142857</td>\\n\",\n       \"      <td>0.333333</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>no checking</td>\\n\",\n       \"      <td>critical/other existing credit</td>\\n\",\n       \"      <td>radio/tv</td>\\n\",\n       \"      <td>100&lt;=X&lt;500</td>\\n\",\n       \"      <td>1&lt;=X&lt;4</td>\\n\",\n       \"      <td>female div/dep/mar</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>life insurance</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>own</td>\\n\",\n       \"      <td>skilled</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n    
   \"      <th>849</th>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0.117647</td>\\n\",\n       \"      <td>0.025256</td>\\n\",\n       \"      <td>1.000000</td>\\n\",\n       \"      <td>1.000000</td>\\n\",\n       \"      <td>0.678571</td>\\n\",\n       \"      <td>0.000000</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>&lt;0</td>\\n\",\n       \"      <td>existing paid</td>\\n\",\n       \"      <td>radio/tv</td>\\n\",\n       \"      <td>&lt;100</td>\\n\",\n       \"      <td>&gt;=7</td>\\n\",\n       \"      <td>male single</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>real estate</td>\\n\",\n       \"      <td>stores</td>\\n\",\n       \"      <td>own</td>\\n\",\n       \"      <td>unskilled resident</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>297</th>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0.088235</td>\\n\",\n       \"      <td>0.057060</td>\\n\",\n       \"      <td>1.000000</td>\\n\",\n       \"      <td>0.333333</td>\\n\",\n       \"      <td>0.464286</td>\\n\",\n       \"      <td>0.000000</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>no checking</td>\\n\",\n       \"      <td>existing paid</td>\\n\",\n       \"      <td>new car</td>\\n\",\n       \"      <td>no known savings</td>\\n\",\n       \"      <td>&gt;=7</td>\\n\",\n       \"      <td>male single</td>\\n\",\n       \"      <td>co applicant</td>\\n\",\n       \"      <td>life insurance</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>own</td>\\n\",\n       \"      <td>unskilled resident</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>98</th>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0.470588</td>\\n\",\n       \"      <td>0.114834</td>\\n\",\n       \"      <td>1.000000</td>\\n\",\n       \"      <td>1.000000</td>\\n\",\n       \"      <td>0.303571</td>\\n\",\n       \"      <td>0.000000</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0&lt;=X&lt;200</td>\\n\",\n       \"      <td>critical/other existing credit</td>\\n\",\n       \"      <td>radio/tv</td>\\n\",\n       \"      <td>&lt;100</td>\\n\",\n       \"      <td>&gt;=7</td>\\n\",\n       \"      <td>male single</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>real estate</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>own</td>\\n\",\n       \"      <td>skilled</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"<p>700 rows × 20 columns</p>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"     own_telephone  foreign_worker  duration  credit_amount  \\\\\\n\",\n       \"636              0               1  0.294118       0.061957   \\n\",\n       \"182              0               1  0.250000       0.076868   \\n\",\n       \"736              0               1  0.294118       0.622318   \\n\",\n       \"922              0               1  0.073529       0.061406   \\n\",\n       \"511              1               1  0.470588       0.244085   \\n\",\n       \"..             ...             ...       ...            ...   
\\n\",\n       \"845              1               1  0.250000       0.205018   \\n\",\n       \"492              0               1  0.029412       0.054308   \\n\",\n       \"849              0               1  0.117647       0.025256   \\n\",\n       \"297              0               0  0.088235       0.057060   \\n\",\n       \"98               0               1  0.470588       0.114834   \\n\",\n       \"\\n\",\n       \"     installment_commitment  residence_since       age  existing_credits  \\\\\\n\",\n       \"636                1.000000         0.000000  0.160714          0.000000   \\n\",\n       \"182                1.000000         0.333333  0.375000          0.333333   \\n\",\n       \"736                0.000000         1.000000  0.071429          0.333333   \\n\",\n       \"922                0.666667         1.000000  0.053571          0.000000   \\n\",\n       \"511                0.333333         0.333333  0.232143          0.000000   \\n\",\n       \"..                      ...              ...       ...               ...   \\n\",\n       \"845                0.333333         0.666667  0.285714          0.000000   \\n\",\n       \"492                0.000000         0.000000  0.142857          0.333333   \\n\",\n       \"849                1.000000         1.000000  0.678571          0.000000   \\n\",\n       \"297                1.000000         0.333333  0.464286          0.000000   \\n\",\n       \"98                 1.000000         1.000000  0.303571          0.000000   \\n\",\n       \"\\n\",\n       \"     num_dependents checking_status                  credit_history  \\\\\\n\",\n       \"636             0.0     no checking                   existing paid   \\n\",\n       \"182             1.0              <0                        all paid   \\n\",\n       \"736             0.0        0<=X<200                   existing paid   \\n\",\n       \"922             0.0              <0                   existing paid   \\n\",\n       \"511             0.0     no checking                   existing paid   \\n\",\n       \"..              ...             ...                             ...   \\n\",\n       \"845             0.0        0<=X<200                   existing paid   \\n\",\n       \"492             0.0     no checking  critical/other existing credit   \\n\",\n       \"849             0.0              <0                   existing paid   \\n\",\n       \"297             0.0     no checking                   existing paid   \\n\",\n       \"98              0.0        0<=X<200  critical/other existing credit   \\n\",\n       \"\\n\",\n       \"                 purpose    savings_status employment     personal_status  \\\\\\n\",\n       \"636             radio/tv       500<=X<1000     4<=X<7  female div/dep/mar   \\n\",\n       \"182              new car  no known savings     1<=X<4         male single   \\n\",\n       \"736             used car              <100     1<=X<4  female div/dep/mar   \\n\",\n       \"922             radio/tv              <100         <1  female div/dep/mar   \\n\",\n       \"511             used car              <100     1<=X<4         male single   \\n\",\n       \"..                   ...               ...        ...                 ...   
\\n\",\n       \"845  furniture/equipment  no known savings     4<=X<7         male single   \\n\",\n       \"492             radio/tv        100<=X<500     1<=X<4  female div/dep/mar   \\n\",\n       \"849             radio/tv              <100        >=7         male single   \\n\",\n       \"297              new car  no known savings        >=7         male single   \\n\",\n       \"98              radio/tv              <100        >=7         male single   \\n\",\n       \"\\n\",\n       \"    other_parties property_magnitude other_payment_plans   housing  \\\\\\n\",\n       \"636          none                car                none       own   \\n\",\n       \"182          none     life insurance                none       own   \\n\",\n       \"736          none                car                none      rent   \\n\",\n       \"922          none     life insurance                none      rent   \\n\",\n       \"511          none  no known property                none  for free   \\n\",\n       \"..            ...                ...                 ...       ...   \\n\",\n       \"845          none                car                none       own   \\n\",\n       \"492          none     life insurance                none       own   \\n\",\n       \"849          none        real estate              stores       own   \\n\",\n       \"297  co applicant     life insurance                none       own   \\n\",\n       \"98           none        real estate                none       own   \\n\",\n       \"\\n\",\n       \"                           job  \\n\",\n       \"636                    skilled  \\n\",\n       \"182         unskilled resident  \\n\",\n       \"736  high qualif/self emp/mgmt  \\n\",\n       \"922                    skilled  \\n\",\n       \"511  high qualif/self emp/mgmt  \\n\",\n       \"..                         ...  
\\n\",\n       \"845                    skilled  \\n\",\n       \"492                    skilled  \\n\",\n       \"849         unskilled resident  \\n\",\n       \"297         unskilled resident  \\n\",\n       \"98                     skilled  \\n\",\n       \"\\n\",\n       \"[700 rows x 20 columns]\"\n      ]\n     },\n     \"execution_count\": 3,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"trainset[0][0]\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 4,\n   \"id\": \"058f667e\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>own_telephone</th>\\n\",\n       \"      <th>foreign_worker</th>\\n\",\n       \"      <th>duration</th>\\n\",\n       \"      <th>credit_amount</th>\\n\",\n       \"      <th>installment_commitment</th>\\n\",\n       \"      <th>residence_since</th>\\n\",\n       \"      <th>age</th>\\n\",\n       \"      <th>existing_credits</th>\\n\",\n       \"      <th>num_dependents</th>\\n\",\n       \"      <th>checking_status</th>\\n\",\n       \"      <th>credit_history</th>\\n\",\n       \"      <th>purpose</th>\\n\",\n       \"      <th>savings_status</th>\\n\",\n       \"      <th>employment</th>\\n\",\n       \"      <th>personal_status</th>\\n\",\n       \"      <th>other_parties</th>\\n\",\n       \"      <th>property_magnitude</th>\\n\",\n       \"      <th>other_payment_plans</th>\\n\",\n       \"      <th>housing</th>\\n\",\n       \"      <th>job</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>32</th>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0.205882</td>\\n\",\n       \"      <td>0.309013</td>\\n\",\n       \"      <td>0.333333</td>\\n\",\n       \"      <td>0.333333</td>\\n\",\n       \"      <td>0.196429</td>\\n\",\n       \"      <td>0.333333</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0&lt;=X&lt;200</td>\\n\",\n       \"      <td>existing paid</td>\\n\",\n       \"      <td>new car</td>\\n\",\n       \"      <td>100&lt;=X&lt;500</td>\\n\",\n       \"      <td>1&lt;=X&lt;4</td>\\n\",\n       \"      <td>male single</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>car</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>own</td>\\n\",\n       \"      <td>skilled</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>924</th>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0.294118</td>\\n\",\n       \"      <td>0.364367</td>\\n\",\n       \"      <td>0.333333</td>\\n\",\n       \"      <td>0.000000</td>\\n\",\n       \"      <td>0.642857</td>\\n\",\n       \"      <td>0.000000</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n    
   \"      <td>&lt;0</td>\\n\",\n       \"      <td>all paid</td>\\n\",\n       \"      <td>furniture/equipment</td>\\n\",\n       \"      <td>&lt;100</td>\\n\",\n       \"      <td>&lt;1</td>\\n\",\n       \"      <td>male div/sep</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>life insurance</td>\\n\",\n       \"      <td>bank</td>\\n\",\n       \"      <td>own</td>\\n\",\n       \"      <td>skilled</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>931</th>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0.073529</td>\\n\",\n       \"      <td>0.078134</td>\\n\",\n       \"      <td>1.000000</td>\\n\",\n       \"      <td>0.333333</td>\\n\",\n       \"      <td>0.053571</td>\\n\",\n       \"      <td>0.000000</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0&lt;=X&lt;200</td>\\n\",\n       \"      <td>existing paid</td>\\n\",\n       \"      <td>radio/tv</td>\\n\",\n       \"      <td>&lt;100</td>\\n\",\n       \"      <td>&lt;1</td>\\n\",\n       \"      <td>female div/dep/mar</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>car</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>own</td>\\n\",\n       \"      <td>skilled</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>796</th>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0.205882</td>\\n\",\n       \"      <td>0.399527</td>\\n\",\n       \"      <td>0.000000</td>\\n\",\n       \"      <td>1.000000</td>\\n\",\n       \"      <td>0.571429</td>\\n\",\n       \"      <td>0.000000</td>\\n\",\n       \"      <td>1.0</td>\\n\",\n       \"      <td>&lt;0</td>\\n\",\n       \"      <td>existing paid</td>\\n\",\n       \"      <td>used car</td>\\n\",\n       \"      <td>no known savings</td>\\n\",\n       \"      <td>&gt;=7</td>\\n\",\n       \"      <td>male single</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>life insurance</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>for free</td>\\n\",\n       \"      <td>skilled</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>226</th>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0.647059</td>\\n\",\n       \"      <td>0.589358</td>\\n\",\n       \"      <td>0.000000</td>\\n\",\n       \"      <td>0.333333</td>\\n\",\n       \"      <td>0.142857</td>\\n\",\n       \"      <td>0.333333</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0&lt;=X&lt;200</td>\\n\",\n       \"      <td>existing paid</td>\\n\",\n       \"      <td>radio/tv</td>\\n\",\n       \"      <td>&gt;=1000</td>\\n\",\n       \"      <td>4&lt;=X&lt;7</td>\\n\",\n       \"      <td>male single</td>\\n\",\n       \"      <td>co applicant</td>\\n\",\n       \"      <td>no known property</td>\\n\",\n       \"      <td>bank</td>\\n\",\n       \"      <td>own</td>\\n\",\n       \"      <td>skilled</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      
<td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>380</th>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0.235294</td>\\n\",\n       \"      <td>0.107956</td>\\n\",\n       \"      <td>1.000000</td>\\n\",\n       \"      <td>1.000000</td>\\n\",\n       \"      <td>0.357143</td>\\n\",\n       \"      <td>0.000000</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>&lt;0</td>\\n\",\n       \"      <td>existing paid</td>\\n\",\n       \"      <td>furniture/equipment</td>\\n\",\n       \"      <td>no known savings</td>\\n\",\n       \"      <td>4&lt;=X&lt;7</td>\\n\",\n       \"      <td>male single</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>car</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>own</td>\\n\",\n       \"      <td>skilled</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>768</th>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0.117647</td>\\n\",\n       \"      <td>0.185265</td>\\n\",\n       \"      <td>0.000000</td>\\n\",\n       \"      <td>1.000000</td>\\n\",\n       \"      <td>0.160714</td>\\n\",\n       \"      <td>0.666667</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0&lt;=X&lt;200</td>\\n\",\n       \"      <td>critical/other existing credit</td>\\n\",\n       \"      <td>furniture/equipment</td>\\n\",\n       \"      <td>&lt;100</td>\\n\",\n       \"      <td>&gt;=7</td>\\n\",\n       \"      <td>male single</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>car</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>rent</td>\\n\",\n       \"      <td>skilled</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>85</th>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0.117647</td>\\n\",\n       \"      <td>0.063937</td>\\n\",\n       \"      <td>1.000000</td>\\n\",\n       \"      <td>0.333333</td>\\n\",\n       \"      <td>0.178571</td>\\n\",\n       \"      <td>0.333333</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>no checking</td>\\n\",\n       \"      <td>critical/other existing credit</td>\\n\",\n       \"      <td>business</td>\\n\",\n       \"      <td>&lt;100</td>\\n\",\n       \"      <td>1&lt;=X&lt;4</td>\\n\",\n       \"      <td>female div/dep/mar</td>\\n\",\n       \"      <td>guarantor</td>\\n\",\n       \"      <td>real estate</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>own</td>\\n\",\n       \"      <td>high qualif/self emp/mgmt</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>527</th>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0.000000</td>\\n\",\n       \"      <td>0.068945</td>\\n\",\n       \"      <td>0.333333</td>\\n\",\n       \"      <td>0.000000</td>\\n\",\n       \"      <td>0.410714</td>\\n\",\n       \"      <td>0.333333</td>\\n\",\n       \"      <td>1.0</td>\\n\",\n       \"      <td>no checking</td>\\n\",\n       \"      <td>critical/other existing credit</td>\\n\",\n       \"      <td>radio/tv</td>\\n\",\n   
    \"      <td>&lt;100</td>\\n\",\n       \"      <td>4&lt;=X&lt;7</td>\\n\",\n       \"      <td>male single</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>real estate</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>own</td>\\n\",\n       \"      <td>unskilled resident</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>117</th>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0.088235</td>\\n\",\n       \"      <td>0.103555</td>\\n\",\n       \"      <td>0.333333</td>\\n\",\n       \"      <td>0.666667</td>\\n\",\n       \"      <td>0.142857</td>\\n\",\n       \"      <td>0.333333</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>&lt;0</td>\\n\",\n       \"      <td>critical/other existing credit</td>\\n\",\n       \"      <td>furniture/equipment</td>\\n\",\n       \"      <td>no known savings</td>\\n\",\n       \"      <td>&lt;1</td>\\n\",\n       \"      <td>female div/dep/mar</td>\\n\",\n       \"      <td>co applicant</td>\\n\",\n       \"      <td>real estate</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>rent</td>\\n\",\n       \"      <td>skilled</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"<p>100 rows × 20 columns</p>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"     own_telephone  foreign_worker  duration  credit_amount  \\\\\\n\",\n       \"32               1               1  0.205882       0.309013   \\n\",\n       \"924              1               1  0.294118       0.364367   \\n\",\n       \"931              1               1  0.073529       0.078134   \\n\",\n       \"796              1               1  0.205882       0.399527   \\n\",\n       \"226              1               1  0.647059       0.589358   \\n\",\n       \"..             ...             ...       ...            ...   \\n\",\n       \"380              1               1  0.235294       0.107956   \\n\",\n       \"768              1               1  0.117647       0.185265   \\n\",\n       \"85               1               1  0.117647       0.063937   \\n\",\n       \"527              0               1  0.000000       0.068945   \\n\",\n       \"117              0               0  0.088235       0.103555   \\n\",\n       \"\\n\",\n       \"     installment_commitment  residence_since       age  existing_credits  \\\\\\n\",\n       \"32                 0.333333         0.333333  0.196429          0.333333   \\n\",\n       \"924                0.333333         0.000000  0.642857          0.000000   \\n\",\n       \"931                1.000000         0.333333  0.053571          0.000000   \\n\",\n       \"796                0.000000         1.000000  0.571429          0.000000   \\n\",\n       \"226                0.000000         0.333333  0.142857          0.333333   \\n\",\n       \"..                      ...              ...       ...               ...   
\\n\",\n       \"380                1.000000         1.000000  0.357143          0.000000   \\n\",\n       \"768                0.000000         1.000000  0.160714          0.666667   \\n\",\n       \"85                 1.000000         0.333333  0.178571          0.333333   \\n\",\n       \"527                0.333333         0.000000  0.410714          0.333333   \\n\",\n       \"117                0.333333         0.666667  0.142857          0.333333   \\n\",\n       \"\\n\",\n       \"     num_dependents checking_status                  credit_history  \\\\\\n\",\n       \"32              0.0        0<=X<200                   existing paid   \\n\",\n       \"924             0.0              <0                        all paid   \\n\",\n       \"931             0.0        0<=X<200                   existing paid   \\n\",\n       \"796             1.0              <0                   existing paid   \\n\",\n       \"226             0.0        0<=X<200                   existing paid   \\n\",\n       \"..              ...             ...                             ...   \\n\",\n       \"380             0.0              <0                   existing paid   \\n\",\n       \"768             0.0        0<=X<200  critical/other existing credit   \\n\",\n       \"85              0.0     no checking  critical/other existing credit   \\n\",\n       \"527             1.0     no checking  critical/other existing credit   \\n\",\n       \"117             0.0              <0  critical/other existing credit   \\n\",\n       \"\\n\",\n       \"                 purpose    savings_status employment     personal_status  \\\\\\n\",\n       \"32               new car        100<=X<500     1<=X<4         male single   \\n\",\n       \"924  furniture/equipment              <100         <1        male div/sep   \\n\",\n       \"931             radio/tv              <100         <1  female div/dep/mar   \\n\",\n       \"796             used car  no known savings        >=7         male single   \\n\",\n       \"226             radio/tv            >=1000     4<=X<7         male single   \\n\",\n       \"..                   ...               ...        ...                 ...   \\n\",\n       \"380  furniture/equipment  no known savings     4<=X<7         male single   \\n\",\n       \"768  furniture/equipment              <100        >=7         male single   \\n\",\n       \"85              business              <100     1<=X<4  female div/dep/mar   \\n\",\n       \"527             radio/tv              <100     4<=X<7         male single   \\n\",\n       \"117  furniture/equipment  no known savings         <1  female div/dep/mar   \\n\",\n       \"\\n\",\n       \"    other_parties property_magnitude other_payment_plans   housing  \\\\\\n\",\n       \"32           none                car                none       own   \\n\",\n       \"924          none     life insurance                bank       own   \\n\",\n       \"931          none                car                none       own   \\n\",\n       \"796          none     life insurance                none  for free   \\n\",\n       \"226  co applicant  no known property                bank       own   \\n\",\n       \"..            ...                ...                 ...       ...   
\\n\",\n       \"380          none                car                none       own   \\n\",\n       \"768          none                car                none      rent   \\n\",\n       \"85      guarantor        real estate                none       own   \\n\",\n       \"527          none        real estate                none       own   \\n\",\n       \"117  co applicant        real estate                none      rent   \\n\",\n       \"\\n\",\n       \"                           job  \\n\",\n       \"32                     skilled  \\n\",\n       \"924                    skilled  \\n\",\n       \"931                    skilled  \\n\",\n       \"796                    skilled  \\n\",\n       \"226                    skilled  \\n\",\n       \"..                         ...  \\n\",\n       \"380                    skilled  \\n\",\n       \"768                    skilled  \\n\",\n       \"85   high qualif/self emp/mgmt  \\n\",\n       \"527         unskilled resident  \\n\",\n       \"117                    skilled  \\n\",\n       \"\\n\",\n       \"[100 rows x 20 columns]\"\n      ]\n     },\n     \"execution_count\": 4,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"valset[0][0]\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 5,\n   \"id\": \"af2eed94\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"3018579e308d4eb995ed65b3581b7f06\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Epoch:   0%|          | 0/50 [00:00<?, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 0, test val_loss: 0.624792\\n\",\n      \"epoch: 0, train loss: 6.6127, lr: 0.000100, spent: 0.5 secs\\n\",\n      \"epoch: 1, test val_loss: 0.599838\\n\",\n      \"epoch: 1, train loss: 6.3586, lr: 0.000100, spent: 1.2 secs\\n\",\n      \"epoch: 2, test val_loss: 0.593658\\n\",\n      \"epoch: 2, train loss: 6.0999, lr: 0.000100, spent: 1.8 secs\\n\",\n      \"epoch: 3, test val_loss: 0.550265\\n\",\n      \"epoch: 3, train loss: 5.8295, lr: 0.000100, spent: 2.3 secs\\n\",\n      \"epoch: 4, test val_loss: 0.527351\\n\",\n      \"epoch: 4, train loss: 5.6347, lr: 0.000100, spent: 2.8 secs\\n\",\n      \"epoch: 5, test val_loss: 0.508950\\n\",\n      \"epoch: 5, train loss: 5.5123, lr: 0.000100, spent: 3.3 secs\\n\",\n      \"epoch: 6, test val_loss: 0.485854\\n\",\n      \"epoch: 6, train loss: 5.4929, lr: 0.000100, spent: 3.9 secs\\n\",\n      \"epoch: 7, test val_loss: 0.522198\\n\",\n      \"EarlyStopping counter: 1 out of 5\\n\",\n      \"epoch: 7, train loss: 5.6552, lr: 0.000100, spent: 4.3 secs\\n\",\n      \"epoch: 8, test val_loss: 0.478467\\n\",\n      \"epoch: 8, train loss: 5.7420, lr: 0.000100, spent: 4.7 secs\\n\",\n      \"epoch: 9, test val_loss: 0.515104\\n\",\n      \"EarlyStopping counter: 1 out of 5\\n\",\n      \"epoch: 9, train loss: 5.3993, lr: 0.000100, spent: 5.3 secs\\n\",\n      \"epoch: 10, test val_loss: 0.474058\\n\",\n      \"epoch: 10, train loss: 5.3141, lr: 0.000100, spent: 5.8 secs\\n\",\n      \"epoch: 11, test val_loss: 0.473926\\n\",\n      \"epoch: 11, train loss: 5.2754, lr: 0.000100, spent: 6.3 secs\\n\",\n      \"epoch: 12, test val_loss: 0.470752\\n\",\n      \"epoch: 12, train loss: 5.1095, 
lr: 0.000100, spent: 6.8 secs\\n\",\n      \"epoch: 13, test val_loss: 0.478428\\n\",\n      \"EarlyStopping counter: 1 out of 5\\n\",\n      \"epoch: 13, train loss: 5.0845, lr: 0.000100, spent: 7.3 secs\\n\",\n      \"epoch: 14, test val_loss: 0.454532\\n\",\n      \"epoch: 14, train loss: 5.1003, lr: 0.000100, spent: 8.0 secs\\n\",\n      \"epoch: 15, test val_loss: 0.462518\\n\",\n      \"EarlyStopping counter: 1 out of 5\\n\",\n      \"epoch: 15, train loss: 5.0139, lr: 0.000100, spent: 8.5 secs\\n\",\n      \"epoch: 16, test val_loss: 0.453442\\n\",\n      \"epoch: 16, train loss: 4.9912, lr: 0.000100, spent: 9.1 secs\\n\",\n      \"epoch: 17, test val_loss: 0.459327\\n\",\n      \"EarlyStopping counter: 1 out of 5\\n\",\n      \"epoch: 17, train loss: 4.9310, lr: 0.000100, spent: 9.5 secs\\n\",\n      \"epoch: 18, test val_loss: 0.442287\\n\",\n      \"epoch: 18, train loss: 4.8740, lr: 0.000100, spent: 10.2 secs\\n\",\n      \"epoch: 19, test val_loss: 0.466330\\n\",\n      \"EarlyStopping counter: 1 out of 5\\n\",\n      \"epoch: 19, train loss: 4.8456, lr: 0.000100, spent: 10.8 secs\\n\",\n      \"epoch: 20, test val_loss: 0.436802\\n\",\n      \"epoch: 20, train loss: 4.7808, lr: 0.000100, spent: 11.2 secs\\n\",\n      \"epoch: 21, test val_loss: 0.472410\\n\",\n      \"EarlyStopping counter: 1 out of 5\\n\",\n      \"epoch: 21, train loss: 4.7860, lr: 0.000100, spent: 11.6 secs\\n\",\n      \"epoch: 22, test val_loss: 0.448208\\n\",\n      \"EarlyStopping counter: 2 out of 5\\n\",\n      \"epoch: 22, train loss: 4.9795, lr: 0.000100, spent: 12.2 secs\\n\",\n      \"epoch: 23, test val_loss: 0.426601\\n\",\n      \"epoch: 23, train loss: 4.8747, lr: 0.000100, spent: 12.8 secs\\n\",\n      \"epoch: 24, test val_loss: 0.556543\\n\",\n      \"EarlyStopping counter: 1 out of 5\\n\",\n      \"epoch: 24, train loss: 4.9586, lr: 0.000100, spent: 13.2 secs\\n\",\n      \"epoch: 25, test val_loss: 0.455203\\n\",\n      \"EarlyStopping counter: 2 out of 5\\n\",\n      \"epoch: 25, train loss: 4.9627, lr: 0.000100, spent: 13.8 secs\\n\",\n      \"epoch: 26, test val_loss: 0.581238\\n\",\n      \"EarlyStopping counter: 3 out of 5\\n\",\n      \"epoch: 26, train loss: 5.0275, lr: 0.000100, spent: 14.2 secs\\n\",\n      \"epoch: 27, test val_loss: 0.501105\\n\",\n      \"EarlyStopping counter: 4 out of 5\\n\",\n      \"epoch: 27, train loss: 5.2915, lr: 0.000100, spent: 14.7 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"2022-08-31 10:57:39.041 | INFO     | transtab.trainer:train:132 - load best at last from ./checkpoint\\n\",\n      \"2022-08-31 10:57:39.057 | INFO     | transtab.trainer:save_model:239 - saving model checkpoint to ./checkpoint\\n\",\n      \"2022-08-31 10:57:39.187 | INFO     | transtab.trainer:train:137 - training complete, cost 15.3 secs.\\n\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 28, test val_loss: 0.461543\\n\",\n      \"EarlyStopping counter: 5 out of 5\\n\",\n      \"early stopped\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# start training, using the average validation loss for early stopping\\n\",\n    \"transtab.train(model, trainset, valset, **training_arguments)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 6,\n   \"id\": \"9b65a489\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# make predictions on the first dataset 'credit-g'\\n\",\n    \"x_test, 
y_test = testset[0]\\n\",\n    \"ypred = transtab.predict(model, x_test)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 7,\n   \"id\": \"6eefaa05\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"auc 0.95 mean/interval 0.7399(0.06)\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"text/plain\": [\n       \"[0.7399011920073604]\"\n      ]\n     },\n     \"execution_count\": 7,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"# evaluate the predictions with a bootstrapped estimate\\n\",\n    \"transtab.evaluate(ypred, y_test, seed=123, metric='auc')\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"id\": \"34d19852\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": []\n  }\n ],\n \"metadata\": {\n  \"kernelspec\": {\n   \"display_name\": \"Python 3 (ipykernel)\",\n   \"language\": \"python\",\n   \"name\": \"python3\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.8.13\"\n  }\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 5\n}\n"
  },
  {
    "path": "examples/table_embedding.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 1,\n   \"id\": \"9aa34ef4\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"import os\\n\",\n    \"os.chdir('../')\\n\",\n    \"\\n\",\n    \"import transtab\\n\",\n    \"\\n\",\n    \"# set random seed\\n\",\n    \"transtab.random_seed(42)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 2,\n   \"id\": \"ce7052e8\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"########################################\\n\",\n      \"openml data index: 31\\n\",\n      \"load data from credit-g\\n\",\n      \"# data: 1000, # feat: 20, # cate: 11,  # bin: 2, # numerical: 7, pos rate: 0.70\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# load a dataset and start vanilla supervised training\\n\",\n    \"allset, trainset, valset, testset, cat_cols, num_cols, bin_cols \\\\\\n\",\n    \"    = transtab.load_data('credit-g')\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 3,\n   \"id\": \"4e709521\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"0740c9e1a09844238618d786a971d916\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Epoch:   0%|          | 0/50 [00:00<?, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 0, test val_loss: 6.349929\\n\",\n      \"epoch: 0, train loss: 72.9975, lr: 0.000100, spent: 1.3 secs\\n\",\n      \"epoch: 1, test val_loss: 6.043663\\n\",\n      \"epoch: 1, train loss: 62.8806, lr: 0.000100, spent: 2.2 secs\\n\",\n      \"epoch: 2, test val_loss: 5.999826\\n\",\n      \"epoch: 2, train loss: 61.3078, lr: 0.000100, spent: 3.0 secs\\n\",\n      \"epoch: 3, test val_loss: 5.989734\\n\",\n      \"epoch: 3, train loss: 61.0470, lr: 0.000100, spent: 3.9 secs\\n\",\n      \"epoch: 4, test val_loss: 5.986117\\n\",\n      \"epoch: 4, train loss: 60.9742, lr: 0.000100, spent: 4.8 secs\\n\",\n      \"epoch: 5, test val_loss: 5.984314\\n\",\n      \"epoch: 5, train loss: 60.9454, lr: 0.000100, spent: 5.8 secs\\n\",\n      \"epoch: 6, test val_loss: 5.983197\\n\",\n      \"epoch: 6, train loss: 60.9270, lr: 0.000100, spent: 6.7 secs\\n\",\n      \"epoch: 7, test val_loss: 5.982450\\n\",\n      \"epoch: 7, train loss: 60.9164, lr: 0.000100, spent: 7.6 secs\\n\",\n      \"epoch: 8, test val_loss: 5.981885\\n\",\n      \"epoch: 8, train loss: 60.9102, lr: 0.000100, spent: 8.5 secs\\n\",\n      \"epoch: 9, test val_loss: 5.981443\\n\",\n      \"epoch: 9, train loss: 60.9047, lr: 0.000100, spent: 9.5 secs\\n\",\n      \"epoch: 10, test val_loss: 5.981087\\n\",\n      \"epoch: 10, train loss: 60.9004, lr: 0.000100, spent: 10.3 secs\\n\",\n      \"epoch: 11, test val_loss: 5.980795\\n\",\n      \"epoch: 11, train loss: 60.8956, lr: 0.000100, spent: 11.3 secs\\n\",\n      \"epoch: 12, test val_loss: 5.980557\\n\",\n      \"epoch: 12, train loss: 60.8925, lr: 0.000100, spent: 12.3 secs\\n\",\n      \"epoch: 13, test val_loss: 5.980357\\n\",\n      \"epoch: 13, train loss: 60.8902, lr: 0.000100, spent: 13.3 secs\\n\",\n      \"epoch: 14, test val_loss: 5.980191\\n\",\n      \"epoch: 14, train loss: 60.8874, lr: 0.000100, spent: 14.5 secs\\n\",\n      
\"epoch: 15, test val_loss: 5.980050\\n\",\n      \"epoch: 15, train loss: 60.8863, lr: 0.000100, spent: 15.5 secs\\n\",\n      \"epoch: 16, test val_loss: 5.979930\\n\",\n      \"epoch: 16, train loss: 60.8836, lr: 0.000100, spent: 16.4 secs\\n\",\n      \"epoch: 17, test val_loss: 5.979825\\n\",\n      \"epoch: 17, train loss: 60.8822, lr: 0.000100, spent: 17.3 secs\\n\",\n      \"epoch: 18, test val_loss: 5.979736\\n\",\n      \"epoch: 18, train loss: 60.8821, lr: 0.000100, spent: 18.2 secs\\n\",\n      \"epoch: 19, test val_loss: 5.979657\\n\",\n      \"epoch: 19, train loss: 60.8804, lr: 0.000100, spent: 19.2 secs\\n\",\n      \"epoch: 20, test val_loss: 5.979586\\n\",\n      \"epoch: 20, train loss: 60.8802, lr: 0.000100, spent: 20.3 secs\\n\",\n      \"epoch: 21, test val_loss: 5.979523\\n\",\n      \"epoch: 21, train loss: 60.8798, lr: 0.000100, spent: 21.3 secs\\n\",\n      \"epoch: 22, test val_loss: 5.979466\\n\",\n      \"epoch: 22, train loss: 60.8791, lr: 0.000100, spent: 22.2 secs\\n\",\n      \"epoch: 23, test val_loss: 5.979416\\n\",\n      \"epoch: 23, train loss: 60.8778, lr: 0.000100, spent: 23.2 secs\\n\",\n      \"epoch: 24, test val_loss: 5.979372\\n\",\n      \"epoch: 24, train loss: 60.8776, lr: 0.000100, spent: 24.2 secs\\n\",\n      \"epoch: 25, test val_loss: 5.979331\\n\",\n      \"epoch: 25, train loss: 60.8773, lr: 0.000100, spent: 25.1 secs\\n\",\n      \"epoch: 26, test val_loss: 5.979294\\n\",\n      \"epoch: 26, train loss: 60.8763, lr: 0.000100, spent: 26.0 secs\\n\",\n      \"epoch: 27, test val_loss: 5.979260\\n\",\n      \"epoch: 27, train loss: 60.8761, lr: 0.000100, spent: 27.0 secs\\n\",\n      \"epoch: 28, test val_loss: 5.979229\\n\",\n      \"epoch: 28, train loss: 60.8761, lr: 0.000100, spent: 27.9 secs\\n\",\n      \"epoch: 29, test val_loss: 5.979202\\n\",\n      \"epoch: 29, train loss: 60.8752, lr: 0.000100, spent: 28.9 secs\\n\",\n      \"epoch: 30, test val_loss: 5.979175\\n\",\n      \"epoch: 30, train loss: 60.8755, lr: 0.000100, spent: 29.8 secs\\n\",\n      \"epoch: 31, test val_loss: 5.979153\\n\",\n      \"epoch: 31, train loss: 60.8744, lr: 0.000100, spent: 30.8 secs\\n\",\n      \"epoch: 32, test val_loss: 5.979130\\n\",\n      \"epoch: 32, train loss: 60.8744, lr: 0.000100, spent: 31.6 secs\\n\",\n      \"epoch: 33, test val_loss: 5.979110\\n\",\n      \"epoch: 33, train loss: 60.8743, lr: 0.000100, spent: 32.4 secs\\n\",\n      \"epoch: 34, test val_loss: 5.979090\\n\",\n      \"epoch: 34, train loss: 60.8736, lr: 0.000100, spent: 33.4 secs\\n\",\n      \"epoch: 35, test val_loss: 5.979072\\n\",\n      \"epoch: 35, train loss: 60.8720, lr: 0.000100, spent: 34.3 secs\\n\",\n      \"epoch: 36, test val_loss: 5.979054\\n\",\n      \"epoch: 36, train loss: 60.8724, lr: 0.000100, spent: 35.2 secs\\n\",\n      \"epoch: 37, test val_loss: 5.979037\\n\",\n      \"epoch: 37, train loss: 60.8735, lr: 0.000100, spent: 36.2 secs\\n\",\n      \"epoch: 38, test val_loss: 5.979021\\n\",\n      \"epoch: 38, train loss: 60.8723, lr: 0.000100, spent: 36.9 secs\\n\",\n      \"epoch: 39, test val_loss: 5.979005\\n\",\n      \"epoch: 39, train loss: 60.8726, lr: 0.000100, spent: 37.8 secs\\n\",\n      \"epoch: 40, test val_loss: 5.978991\\n\",\n      \"epoch: 40, train loss: 60.8719, lr: 0.000100, spent: 38.5 secs\\n\",\n      \"epoch: 41, test val_loss: 5.978974\\n\",\n      \"epoch: 41, train loss: 60.8720, lr: 0.000100, spent: 39.3 secs\\n\",\n      \"epoch: 42, test val_loss: 5.978961\\n\",\n      \"epoch: 42, train loss: 60.8717, lr: 0.000100, 
spent: 40.1 secs\\n\",\n      \"epoch: 43, test val_loss: 5.978946\\n\",\n      \"epoch: 43, train loss: 60.8721, lr: 0.000100, spent: 40.9 secs\\n\",\n      \"epoch: 44, test val_loss: 5.978931\\n\",\n      \"epoch: 44, train loss: 60.8710, lr: 0.000100, spent: 41.8 secs\\n\",\n      \"epoch: 45, test val_loss: 5.978916\\n\",\n      \"epoch: 45, train loss: 60.8711, lr: 0.000100, spent: 42.7 secs\\n\",\n      \"epoch: 46, test val_loss: 5.978899\\n\",\n      \"epoch: 46, train loss: 60.8713, lr: 0.000100, spent: 43.6 secs\\n\",\n      \"epoch: 47, test val_loss: 5.978884\\n\",\n      \"epoch: 47, train loss: 60.8702, lr: 0.000100, spent: 44.6 secs\\n\",\n      \"epoch: 48, test val_loss: 5.978869\\n\",\n      \"epoch: 48, train loss: 60.8705, lr: 0.000100, spent: 45.7 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"2022-08-31 14:15:16.839 | INFO     | transtab.trainer:train:132 - load best at last from ./checkpoint\\n\",\n      \"2022-08-31 14:15:16.853 | INFO     | transtab.trainer:save_model:239 - saving model checkpoint to ./checkpoint\\n\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 49, test val_loss: 5.978854\\n\",\n      \"epoch: 49, train loss: 60.8699, lr: 0.000100, spent: 46.8 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"2022-08-31 14:15:17.035 | INFO     | transtab.trainer:train:137 - training complete, cost 47.0 secs.\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# quickly pretrain a TransTab contrastive learning model\\n\",\n    \"# build contrastive learner, set supervised=True for supervised VPCL\\n\",\n    \"model, collate_fn = transtab.build_contrastive_learner(\\n\",\n    \"    cat_cols, num_cols, bin_cols, \\n\",\n    \"    supervised=True, # whether to use supervised CL\\n\",\n    \"    num_partition=4, # num of column partitions for pos/neg sampling\\n\",\n    \"    overlap_ratio=0.5, # overlap ratio of column partitions during CL\\n\",\n    \")\\n\",\n    \"\\n\",\n    \"# start contrastive pretraining\\n\",\n    \"training_arguments = {\\n\",\n    \"    'num_epoch':50,\\n\",\n    \"    'batch_size':64,\\n\",\n    \"    'lr':1e-4,\\n\",\n    \"    'eval_metric':'val_loss',\\n\",\n    \"    'eval_less_is_better':True,\\n\",\n    \"    'output_dir':'./checkpoint'\\n\",\n    \"    }\\n\",\n    \"\\n\",\n    \"transtab.train(model, trainset, valset, collate_fn=collate_fn, **training_arguments)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 4,\n   \"id\": \"5c87e48b\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"2022-08-31 14:15:17.125 | INFO     | transtab.modeling_transtab:load:773 - missing keys: []\\n\",\n      \"2022-08-31 14:15:17.126 | INFO     | transtab.modeling_transtab:load:774 - unexpected keys: ['projection_head.dense.weight']\\n\",\n      \"2022-08-31 14:15:17.126 | INFO     | transtab.modeling_transtab:load:775 - load model from ./checkpoint\\n\",\n      \"2022-08-31 14:15:17.159 | INFO     | transtab.modeling_transtab:load:222 - load feature extractor from ./checkpoint/extractor/extractor.json\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# There are two ways to build the encoder\\n\",\n    \"# First, take the whole pretrained model and output the cls token embedding from the last 
layer's outputs\\n\",\n    \"enc = transtab.build_encoder(\\n\",\n    \"    binary_columns=bin_cols,\\n\",\n    \"    checkpoint = './checkpoint'\\n\",\n    \")\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 5,\n   \"id\": \"b8149cfa\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"torch.Size([700, 128])\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"text/plain\": [\n       \"tensor([[ 1.2959e+00,  1.5239e+00, -1.2096e+00,  3.0303e-01,  7.4638e-01,\\n\",\n       \"          1.1758e+00,  1.1774e+00, -2.1921e-01,  4.2850e-01,  8.3295e-03,\\n\",\n       \"         -5.3477e-01,  1.4859e+00, -2.0534e+00, -9.4093e-01,  3.7010e-01,\\n\",\n       \"          1.3663e-01,  4.4837e-01,  1.3882e+00,  1.6472e+00, -1.2430e+00,\\n\",\n       \"         -4.8809e-01, -5.1914e-01, -3.3168e-01,  1.9889e+00, -4.9873e-01,\\n\",\n       \"          1.2286e+00,  8.6373e-01,  5.1300e-01,  6.7551e-01, -1.2021e+00,\\n\",\n       \"          6.3210e-01,  6.2366e-01,  5.6712e-01,  1.2275e-03, -1.5154e+00,\\n\",\n       \"          2.0082e+00, -1.2255e+00, -2.4254e-01, -5.1009e-01,  1.6733e+00,\\n\",\n       \"         -1.2059e+00, -7.0246e-01,  1.8980e-01, -7.8196e-01,  1.0777e+00,\\n\",\n       \"         -6.1830e-01, -1.1279e+00, -1.3290e+00,  9.6929e-01, -7.6388e-02,\\n\",\n       \"         -4.5835e-01, -1.1462e+00,  1.5084e+00,  5.7778e-01,  2.0644e-01,\\n\",\n       \"          4.3633e-01,  7.6116e-03,  5.2441e-01, -1.9919e-01, -1.9441e-01,\\n\",\n       \"          1.8144e+00,  2.7863e-01, -1.8727e+00, -9.4760e-01,  1.1152e+00,\\n\",\n       \"          3.5514e-01,  1.6321e+00,  4.3554e-01,  6.1438e-01,  2.2991e-01,\\n\",\n       \"          2.3567e-01,  1.0738e+00, -1.0689e+00,  1.1454e+00, -2.9430e-01,\\n\",\n       \"         -7.8866e-01,  1.7377e-01,  4.7786e-01, -1.1535e+00, -1.9210e+00,\\n\",\n       \"          5.6469e-01, -4.9142e-02, -6.4016e-01, -3.3013e-01, -3.1188e-01,\\n\",\n       \"         -7.4673e-01, -3.0021e-01, -2.0609e+00,  7.0935e-01, -6.6764e-01,\\n\",\n       \"          6.4810e-01, -8.1043e-02, -1.0044e+00, -2.1534e+00, -1.4149e+00,\\n\",\n       \"         -7.6418e-01,  1.9660e+00, -1.0766e+00, -5.2616e-01, -1.2752e+00,\\n\",\n       \"          1.1527e+00,  2.2518e-01,  1.7696e-01,  8.3931e-01, -3.5717e-01,\\n\",\n       \"          1.4251e-01,  1.6778e+00, -1.5331e+00, -1.5316e+00, -7.3143e-01,\\n\",\n       \"         -2.6362e-01, -5.3092e-01,  1.1220e+00,  9.4099e-01, -1.3653e+00,\\n\",\n       \"         -5.5385e-01, -2.5665e-01, -3.1621e-01, -1.3123e+00, -9.7127e-02,\\n\",\n       \"         -4.2603e-01,  1.8091e+00, -7.5452e-01,  1.9514e+00,  7.2433e-03,\\n\",\n       \"          3.7320e-02,  5.3549e-01, -3.9535e-01],\\n\",\n       \"        [ 1.4275e+00,  1.4772e+00, -1.1928e+00,  1.8642e-01,  8.1510e-01,\\n\",\n       \"          1.2602e+00,  1.2150e+00, -2.1353e-01,  3.9298e-01, -1.8265e-01,\\n\",\n       \"         -5.9739e-01,  1.2885e+00, -2.1044e+00, -1.0534e+00,  4.8087e-01,\\n\",\n       \"          1.2070e-01,  3.0839e-01,  1.2873e+00,  1.6255e+00, -1.0916e+00,\\n\",\n       \"         -3.2920e-01, -2.7017e-01, -3.4054e-01,  2.0612e+00, -6.5718e-01,\\n\",\n       \"          1.1547e+00,  9.0340e-01,  5.3138e-01,  7.4846e-01, -1.1599e+00,\\n\",\n       \"          6.1057e-01,  6.2320e-01,  6.3401e-01, -7.8121e-02, -1.5336e+00,\\n\",\n       \"          1.8799e+00, -1.4002e+00, -3.4578e-01, -8.7409e-01,  1.7005e+00,\\n\",\n       \"         
-1.2923e+00, -5.9172e-01,  8.2113e-02, -7.6255e-01,  9.8186e-01,\\n\",\n       \"         -5.2740e-01, -1.1055e+00, -1.3655e+00,  8.0880e-01,  6.8788e-02,\\n\",\n       \"         -5.1715e-01, -1.2682e+00,  1.6060e+00,  5.9163e-01,  3.5197e-01,\\n\",\n       \"          6.1037e-01,  1.6449e-01,  4.7828e-01, -2.3575e-01, -2.4127e-01,\\n\",\n       \"          1.8397e+00,  3.7601e-01, -1.9676e+00, -9.4222e-01,  1.1711e+00,\\n\",\n       \"          3.2122e-01,  1.7164e+00,  4.7828e-01,  7.2740e-01,  2.1730e-01,\\n\",\n       \"          2.0191e-01,  7.4816e-01, -1.1957e+00,  1.2826e+00, -3.4407e-01,\\n\",\n       \"         -8.6727e-01,  1.4943e-01,  5.4311e-01, -1.1209e+00, -1.8852e+00,\\n\",\n       \"          5.8967e-01, -2.3814e-01, -6.1390e-01, -2.7548e-01, -2.5533e-01,\\n\",\n       \"         -8.5195e-01, -2.3613e-01, -1.9835e+00,  5.6644e-01, -5.9843e-01,\\n\",\n       \"          6.8693e-01,  3.4524e-02, -1.0214e+00, -1.8806e+00, -1.4108e+00,\\n\",\n       \"         -7.1087e-01,  1.9959e+00, -1.2109e+00, -6.3984e-01, -9.7635e-01,\\n\",\n       \"          1.1544e+00,  2.3031e-01,  2.3562e-01,  6.8024e-01, -2.9665e-01,\\n\",\n       \"          1.2141e-01,  1.7590e+00, -1.4833e+00, -1.4007e+00, -9.1892e-01,\\n\",\n       \"         -1.3863e-01, -3.3393e-01,  1.0803e+00,  1.0124e+00, -1.4227e+00,\\n\",\n       \"         -6.2524e-01, -1.6816e-01, -4.6652e-01, -1.3414e+00, -1.7069e-01,\\n\",\n       \"         -2.8513e-01,  1.7853e+00, -9.1653e-01,  1.7702e+00,  2.3768e-01,\\n\",\n       \"          9.3338e-02,  5.9862e-01, -3.1038e-01]], device='cuda:0',\\n\",\n       \"       grad_fn=<SliceBackward0>)\"\n      ]\n     },\n     \"execution_count\": 5,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"# Then take the encoder to get the input embedding\\n\",\n    \"df = trainset[0]\\n\",\n    \"output = enc(df)\\n\",\n    \"print(output.shape)\\n\",\n    \"output[:2]\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 6,\n   \"id\": \"4aadae44\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>own_telephone</th>\\n\",\n       \"      <th>foreign_worker</th>\\n\",\n       \"      <th>duration</th>\\n\",\n       \"      <th>credit_amount</th>\\n\",\n       \"      <th>installment_commitment</th>\\n\",\n       \"      <th>residence_since</th>\\n\",\n       \"      <th>age</th>\\n\",\n       \"      <th>existing_credits</th>\\n\",\n       \"      <th>num_dependents</th>\\n\",\n       \"      <th>checking_status</th>\\n\",\n       \"      <th>credit_history</th>\\n\",\n       \"      <th>purpose</th>\\n\",\n       \"      <th>savings_status</th>\\n\",\n       \"      <th>employment</th>\\n\",\n       \"      <th>personal_status</th>\\n\",\n       \"      <th>other_parties</th>\\n\",\n       \"      
<th>property_magnitude</th>\\n\",\n       \"      <th>other_payment_plans</th>\\n\",\n       \"      <th>housing</th>\\n\",\n       \"      <th>job</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>636</th>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0.294118</td>\\n\",\n       \"      <td>0.061957</td>\\n\",\n       \"      <td>1.000000</td>\\n\",\n       \"      <td>0.000000</td>\\n\",\n       \"      <td>0.160714</td>\\n\",\n       \"      <td>0.000000</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>no checking</td>\\n\",\n       \"      <td>existing paid</td>\\n\",\n       \"      <td>radio/tv</td>\\n\",\n       \"      <td>500&lt;=X&lt;1000</td>\\n\",\n       \"      <td>4&lt;=X&lt;7</td>\\n\",\n       \"      <td>female div/dep/mar</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>car</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>own</td>\\n\",\n       \"      <td>skilled</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>182</th>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0.250000</td>\\n\",\n       \"      <td>0.076868</td>\\n\",\n       \"      <td>1.000000</td>\\n\",\n       \"      <td>0.333333</td>\\n\",\n       \"      <td>0.375000</td>\\n\",\n       \"      <td>0.333333</td>\\n\",\n       \"      <td>1.0</td>\\n\",\n       \"      <td>&lt;0</td>\\n\",\n       \"      <td>all paid</td>\\n\",\n       \"      <td>new car</td>\\n\",\n       \"      <td>no known savings</td>\\n\",\n       \"      <td>1&lt;=X&lt;4</td>\\n\",\n       \"      <td>male single</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>life insurance</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>own</td>\\n\",\n       \"      <td>unskilled resident</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>736</th>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0.294118</td>\\n\",\n       \"      <td>0.622318</td>\\n\",\n       \"      <td>0.000000</td>\\n\",\n       \"      <td>1.000000</td>\\n\",\n       \"      <td>0.071429</td>\\n\",\n       \"      <td>0.333333</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0&lt;=X&lt;200</td>\\n\",\n       \"      <td>existing paid</td>\\n\",\n       \"      <td>used car</td>\\n\",\n       \"      <td>&lt;100</td>\\n\",\n       \"      <td>1&lt;=X&lt;4</td>\\n\",\n       \"      <td>female div/dep/mar</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>car</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>rent</td>\\n\",\n       \"      <td>high qualif/self emp/mgmt</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>922</th>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0.073529</td>\\n\",\n       \"      <td>0.061406</td>\\n\",\n       \"      <td>0.666667</td>\\n\",\n       \"      <td>1.000000</td>\\n\",\n       \"      <td>0.053571</td>\\n\",\n       \"      <td>0.000000</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>&lt;0</td>\\n\",\n       \"      <td>existing paid</td>\\n\",\n       \"      <td>radio/tv</td>\\n\",\n       \"      <td>&lt;100</td>\\n\",\n       \"      <td>&lt;1</td>\\n\",\n       \"      <td>female div/dep/mar</td>\\n\",\n       \"      
<td>none</td>\\n\",\n       \"      <td>life insurance</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>rent</td>\\n\",\n       \"      <td>skilled</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>511</th>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0.470588</td>\\n\",\n       \"      <td>0.244085</td>\\n\",\n       \"      <td>0.333333</td>\\n\",\n       \"      <td>0.333333</td>\\n\",\n       \"      <td>0.232143</td>\\n\",\n       \"      <td>0.000000</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>no checking</td>\\n\",\n       \"      <td>existing paid</td>\\n\",\n       \"      <td>used car</td>\\n\",\n       \"      <td>&lt;100</td>\\n\",\n       \"      <td>1&lt;=X&lt;4</td>\\n\",\n       \"      <td>male single</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>no known property</td>\\n\",\n       \"      <td>none</td>\\n\",\n       \"      <td>for free</td>\\n\",\n       \"      <td>high qualif/self emp/mgmt</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"     own_telephone  foreign_worker  duration  credit_amount  \\\\\\n\",\n       \"636              0               1  0.294118       0.061957   \\n\",\n       \"182              0               1  0.250000       0.076868   \\n\",\n       \"736              0               1  0.294118       0.622318   \\n\",\n       \"922              0               1  0.073529       0.061406   \\n\",\n       \"511              1               1  0.470588       0.244085   \\n\",\n       \"\\n\",\n       \"     installment_commitment  residence_since       age  existing_credits  \\\\\\n\",\n       \"636                1.000000         0.000000  0.160714          0.000000   \\n\",\n       \"182                1.000000         0.333333  0.375000          0.333333   \\n\",\n       \"736                0.000000         1.000000  0.071429          0.333333   \\n\",\n       \"922                0.666667         1.000000  0.053571          0.000000   \\n\",\n       \"511                0.333333         0.333333  0.232143          0.000000   \\n\",\n       \"\\n\",\n       \"     num_dependents checking_status credit_history   purpose  \\\\\\n\",\n       \"636             0.0     no checking  existing paid  radio/tv   \\n\",\n       \"182             1.0              <0       all paid   new car   \\n\",\n       \"736             0.0        0<=X<200  existing paid  used car   \\n\",\n       \"922             0.0              <0  existing paid  radio/tv   \\n\",\n       \"511             0.0     no checking  existing paid  used car   \\n\",\n       \"\\n\",\n       \"       savings_status employment     personal_status other_parties  \\\\\\n\",\n       \"636       500<=X<1000     4<=X<7  female div/dep/mar          none   \\n\",\n       \"182  no known savings     1<=X<4         male single          none   \\n\",\n       \"736              <100     1<=X<4  female div/dep/mar          none   \\n\",\n       \"922              <100         <1  female div/dep/mar          none   \\n\",\n       \"511              <100     1<=X<4         male single          none   \\n\",\n       \"\\n\",\n       \"    property_magnitude other_payment_plans   housing  \\\\\\n\",\n       \"636                car                none       own   \\n\",\n       \"182     life insurance                none       own   \\n\",\n       \"736             
   car                none      rent   \\n\",\n       \"922     life insurance                none      rent   \\n\",\n       \"511  no known property                none  for free   \\n\",\n       \"\\n\",\n       \"                           job  \\n\",\n       \"636                    skilled  \\n\",\n       \"182         unskilled resident  \\n\",\n       \"736  high qualif/self emp/mgmt  \\n\",\n       \"922                    skilled  \\n\",\n       \"511  high qualif/self emp/mgmt  \"\n      ]\n     },\n     \"execution_count\": 6,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"df.head()\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 8,\n   \"id\": \"4f3e1e91\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"2022-08-31 14:16:28.124 | INFO     | transtab.modeling_transtab:load:222 - load feature extractor from ./checkpoint/extractor/extractor.json\\n\",\n      \"2022-08-31 14:16:28.134 | INFO     | transtab.modeling_transtab:load:523 - missing keys: []\\n\",\n      \"2022-08-31 14:16:28.135 | INFO     | transtab.modeling_transtab:load:524 - unexpected keys: []\\n\",\n      \"2022-08-31 14:16:28.136 | INFO     | transtab.modeling_transtab:load:525 - load model from ./checkpoint\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# Second, if we only want the embedded token-level embeddings (the embeddings before they enter the transformer layers)\\n\",\n    \"enc = transtab.build_encoder(\\n\",\n    \"    binary_columns=bin_cols,\\n\",\n    \"    checkpoint = './checkpoint',\\n\",\n    \"    num_layer = 0,\\n\",\n    \")\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 12,\n   \"id\": \"39a0172b\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"torch.Size([700, 85, 128])\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"text/plain\": [\n       \"tensor([[[ 0.1370,  0.0427, -0.0106,  ..., -0.0806,  0.0518, -0.1315],\\n\",\n       \"         [ 0.0657,  0.0341, -0.0128,  ..., -0.0207,  0.0102, -0.0046],\\n\",\n       \"         [ 0.1494,  0.4290,  0.2463,  ...,  0.1992, -0.0848, -0.0840],\\n\",\n       \"         ...,\\n\",\n       \"         [ 1.1575,  0.0165,  0.9202,  ..., -0.2052,  1.0815, -1.0268],\\n\",\n       \"         [ 1.1575,  0.0165,  0.9202,  ..., -0.2052,  1.0815, -1.0268],\\n\",\n       \"         [ 1.1575,  0.0165,  0.9202,  ..., -0.2052,  1.0815, -1.0268]],\\n\",\n       \"\\n\",\n       \"        [[ 0.1204,  0.0388, -0.0098,  ..., -0.0738,  0.0400, -0.1099],\\n\",\n       \"         [ 0.0752,  0.0383, -0.0145,  ..., -0.0174,  0.0190, -0.0085],\\n\",\n       \"         [ 0.1494,  0.4290,  0.2463,  ...,  0.1992, -0.0848, -0.0840],\\n\",\n       \"         ...,\\n\",\n       \"         [ 1.1575,  0.0165,  0.9202,  ..., -0.2052,  1.0815, -1.0268],\\n\",\n       \"         [ 1.1575,  0.0165,  0.9202,  ..., -0.2052,  1.0815, -1.0268],\\n\",\n       \"         [ 1.1575,  0.0165,  0.9202,  ..., -0.2052,  1.0815, -1.0268]]],\\n\",\n       \"       device='cuda:0', grad_fn=<SliceBackward0>)\"\n      ]\n     },\n     \"execution_count\": 12,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"output = enc(df)\\n\",\n    \"print(output['embedding'].shape)\\n\",\n    \"output['embedding'][:2]\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   
\"execution_count\": null,\n   \"id\": \"55936f1e\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": []\n  }\n ],\n \"metadata\": {\n  \"kernelspec\": {\n   \"display_name\": \"Python 3 (ipykernel)\",\n   \"language\": \"python\",\n   \"name\": \"python3\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.8.13\"\n  }\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 5\n}\n"
  },
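  {
    "path": "examples/contrastive_pretraining_sketch.py",
    "content": "# NOTE: a hypothetical companion sketch added for illustration; it is not part\n# of the original TransTab examples. It condenses the contrastive pretraining\n# and encoding flow from the notebook above, assuming the transtab API used\n# there (load_data, build_contrastive_learner, train, build_encoder).\nimport transtab\n\ntranstab.random_seed(42)\n\n# load one dataset; also returns the categorical/numerical/binary column lists\nallset, trainset, valset, testset, cat_cols, num_cols, bin_cols = transtab.load_data('credit-g')\n\n# build the contrastive learner and its collate function (supervised VPCL)\nmodel, collate_fn = transtab.build_contrastive_learner(\n    cat_cols, num_cols, bin_cols,\n    supervised=True,    # use supervised VPCL\n    num_partition=4,    # number of column partitions for pos/neg sampling\n    overlap_ratio=0.5,  # overlap ratio of the column partitions\n)\n\n# short pretraining run; the checkpoint with the lowest val_loss is kept\ntranstab.train(\n    model, trainset, valset, collate_fn=collate_fn,\n    num_epoch=5, batch_size=64, lr=1e-4,\n    eval_metric='val_loss', eval_less_is_better=True,\n    output_dir='./checkpoint',\n)\n\n# rebuild the pretrained model as an encoder: by default it returns one CLS\n# embedding per row; with num_layer=0 it returns the token-level embeddings\n# computed before the transformer layers\nenc = transtab.build_encoder(binary_columns=bin_cols, checkpoint='./checkpoint')\nx, y = trainset\nprint(enc(x).shape)  # e.g. torch.Size([700, 128]) in the notebook above\n"
  },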
  {
    "path": "examples/transfer_learning.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 6,\n   \"id\": \"134f979d\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"import os\\n\",\n    \"os.chdir('../')\\n\",\n    \"\\n\",\n    \"import transtab\\n\",\n    \"\\n\",\n    \"# set random seed\\n\",\n    \"transtab.random_seed(42)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 7,\n   \"id\": \"42c60011\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"########################################\\n\",\n      \"openml data index: 31\\n\",\n      \"load data from credit-g\\n\",\n      \"# data: 1000, # feat: 20, # cate: 11,  # bin: 2, # numerical: 7, pos rate: 0.70\\n\",\n      \"########################################\\n\",\n      \"openml data index: 29\\n\",\n      \"load data from credit-approval\\n\",\n      \"# data: 690, # feat: 15, # cate: 9,  # bin: 0, # numerical: 6, pos rate: 0.56\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"dd62a8df24d14e22a69d77088bd1b220\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Epoch:   0%|          | 0/50 [00:00<?, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 0, test val_loss: 0.574102\\n\",\n      \"epoch: 0, train loss: 3.9759, lr: 0.000100, spent: 0.4 secs\\n\",\n      \"epoch: 1, test val_loss: 0.565162\\n\",\n      \"epoch: 1, train loss: 3.7812, lr: 0.000100, spent: 0.9 secs\\n\",\n      \"epoch: 2, test val_loss: 0.576745\\n\",\n      \"EarlyStopping counter: 1 out of 5\\n\",\n      \"epoch: 2, train loss: 3.6560, lr: 0.000100, spent: 1.1 secs\\n\",\n      \"epoch: 3, test val_loss: 0.566665\\n\",\n      \"EarlyStopping counter: 2 out of 5\\n\",\n      \"epoch: 3, train loss: 3.6539, lr: 0.000100, spent: 1.4 secs\\n\",\n      \"epoch: 4, test val_loss: 0.548929\\n\",\n      \"epoch: 4, train loss: 3.6118, lr: 0.000100, spent: 1.7 secs\\n\",\n      \"epoch: 5, test val_loss: 0.545800\\n\",\n      \"epoch: 5, train loss: 3.5634, lr: 0.000100, spent: 2.2 secs\\n\",\n      \"epoch: 6, test val_loss: 0.545121\\n\",\n      \"epoch: 6, train loss: 3.5035, lr: 0.000100, spent: 2.4 secs\\n\",\n      \"epoch: 7, test val_loss: 0.529130\\n\",\n      \"epoch: 7, train loss: 3.4372, lr: 0.000100, spent: 2.7 secs\\n\",\n      \"epoch: 8, test val_loss: 0.525149\\n\",\n      \"epoch: 8, train loss: 3.3768, lr: 0.000100, spent: 3.0 secs\\n\",\n      \"epoch: 9, test val_loss: 0.518042\\n\",\n      \"epoch: 9, train loss: 3.3204, lr: 0.000100, spent: 3.5 secs\\n\",\n      \"epoch: 10, test val_loss: 0.508209\\n\",\n      \"epoch: 10, train loss: 3.2816, lr: 0.000100, spent: 3.8 secs\\n\",\n      \"epoch: 11, test val_loss: 0.497027\\n\",\n      \"epoch: 11, train loss: 3.1952, lr: 0.000100, spent: 4.1 secs\\n\",\n      \"epoch: 12, test val_loss: 0.495085\\n\",\n      \"epoch: 12, train loss: 3.1852, lr: 0.000100, spent: 4.6 secs\\n\",\n      \"epoch: 13, test val_loss: 0.479123\\n\",\n      \"epoch: 13, train loss: 3.0853, lr: 0.000100, spent: 4.9 secs\\n\",\n      \"epoch: 14, test val_loss: 0.492737\\n\",\n      \"EarlyStopping counter: 1 out of 5\\n\",\n      \"epoch: 14, train loss: 3.0682, lr: 0.000100, spent: 5.2 secs\\n\",\n      
\"epoch: 15, test val_loss: 0.477266\\n\",\n      \"epoch: 15, train loss: 2.9653, lr: 0.000100, spent: 5.5 secs\\n\",\n      \"epoch: 16, test val_loss: 0.503946\\n\",\n      \"EarlyStopping counter: 1 out of 5\\n\",\n      \"epoch: 16, train loss: 2.9797, lr: 0.000100, spent: 5.7 secs\\n\",\n      \"epoch: 17, test val_loss: 0.484869\\n\",\n      \"EarlyStopping counter: 2 out of 5\\n\",\n      \"epoch: 17, train loss: 2.9767, lr: 0.000100, spent: 6.0 secs\\n\",\n      \"epoch: 18, test val_loss: 0.467354\\n\",\n      \"epoch: 18, train loss: 2.8925, lr: 0.000100, spent: 6.5 secs\\n\",\n      \"epoch: 19, test val_loss: 0.471429\\n\",\n      \"EarlyStopping counter: 1 out of 5\\n\",\n      \"epoch: 19, train loss: 2.8963, lr: 0.000100, spent: 6.7 secs\\n\",\n      \"epoch: 20, test val_loss: 0.460370\\n\",\n      \"epoch: 20, train loss: 2.8847, lr: 0.000100, spent: 7.0 secs\\n\",\n      \"epoch: 21, test val_loss: 0.498306\\n\",\n      \"EarlyStopping counter: 1 out of 5\\n\",\n      \"epoch: 21, train loss: 2.8389, lr: 0.000100, spent: 7.4 secs\\n\",\n      \"epoch: 22, test val_loss: 0.441738\\n\",\n      \"epoch: 22, train loss: 2.8077, lr: 0.000100, spent: 7.7 secs\\n\",\n      \"epoch: 23, test val_loss: 0.479452\\n\",\n      \"EarlyStopping counter: 1 out of 5\\n\",\n      \"epoch: 23, train loss: 2.8506, lr: 0.000100, spent: 8.0 secs\\n\",\n      \"epoch: 24, test val_loss: 0.450146\\n\",\n      \"EarlyStopping counter: 2 out of 5\\n\",\n      \"epoch: 24, train loss: 2.7006, lr: 0.000100, spent: 8.5 secs\\n\",\n      \"epoch: 25, test val_loss: 0.460931\\n\",\n      \"EarlyStopping counter: 3 out of 5\\n\",\n      \"epoch: 25, train loss: 2.7361, lr: 0.000100, spent: 8.7 secs\\n\",\n      \"epoch: 26, test val_loss: 0.482305\\n\",\n      \"EarlyStopping counter: 4 out of 5\\n\",\n      \"epoch: 26, train loss: 2.6959, lr: 0.000100, spent: 9.0 secs\\n\",\n      \"epoch: 27, test val_loss: 0.440060\\n\",\n      \"epoch: 27, train loss: 2.7485, lr: 0.000100, spent: 9.3 secs\\n\",\n      \"epoch: 28, test val_loss: 0.450090\\n\",\n      \"EarlyStopping counter: 1 out of 5\\n\",\n      \"epoch: 28, train loss: 2.7765, lr: 0.000100, spent: 9.6 secs\\n\",\n      \"epoch: 29, test val_loss: 0.472720\\n\",\n      \"EarlyStopping counter: 2 out of 5\\n\",\n      \"epoch: 29, train loss: 2.6344, lr: 0.000100, spent: 9.8 secs\\n\",\n      \"epoch: 30, test val_loss: 0.438471\\n\",\n      \"epoch: 30, train loss: 2.5639, lr: 0.000100, spent: 10.3 secs\\n\",\n      \"epoch: 31, test val_loss: 0.498057\\n\",\n      \"EarlyStopping counter: 1 out of 5\\n\",\n      \"epoch: 31, train loss: 2.7224, lr: 0.000100, spent: 10.6 secs\\n\",\n      \"epoch: 32, test val_loss: 0.463493\\n\",\n      \"EarlyStopping counter: 2 out of 5\\n\",\n      \"epoch: 32, train loss: 2.6888, lr: 0.000100, spent: 11.0 secs\\n\",\n      \"epoch: 33, test val_loss: 0.435828\\n\",\n      \"epoch: 33, train loss: 2.6895, lr: 0.000100, spent: 11.3 secs\\n\",\n      \"epoch: 34, test val_loss: 0.495953\\n\",\n      \"EarlyStopping counter: 1 out of 5\\n\",\n      \"epoch: 34, train loss: 2.5385, lr: 0.000100, spent: 11.6 secs\\n\",\n      \"epoch: 35, test val_loss: 0.444737\\n\",\n      \"EarlyStopping counter: 2 out of 5\\n\",\n      \"epoch: 35, train loss: 2.5663, lr: 0.000100, spent: 12.1 secs\\n\",\n      \"epoch: 36, test val_loss: 0.449832\\n\",\n      \"EarlyStopping counter: 3 out of 5\\n\",\n      \"epoch: 36, train loss: 2.6015, lr: 0.000100, spent: 12.4 secs\\n\",\n      \"epoch: 37, test val_loss: 
0.441197\\n\",\n      \"EarlyStopping counter: 4 out of 5\\n\",\n      \"epoch: 37, train loss: 2.5011, lr: 0.000100, spent: 12.6 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"2022-10-05 08:35:04.023 | INFO     | transtab.trainer:train:136 - load best at last from ./checkpoint\\n\",\n      \"2022-10-05 08:35:04.042 | INFO     | transtab.trainer:save_model:243 - saving model checkpoint to ./checkpoint\\n\",\n      \"2022-10-05 08:35:04.167 | INFO     | transtab.trainer:train:141 - training complete, cost 13.1 secs.\\n\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 38, test val_loss: 0.503903\\n\",\n      \"EarlyStopping counter: 5 out of 5\\n\",\n      \"early stopped\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# load a dataset and start vanilla supervised training\\n\",\n    \"allset, trainset, valset, testset, cat_cols, num_cols, bin_cols = transtab.load_data(['credit-g', 'credit-approval'])\\n\",\n    \"\\n\",\n    \"# build transtab classifier model\\n\",\n    \"model = transtab.build_classifier(cat_cols, num_cols, bin_cols)\\n\",\n    \"\\n\",\n    \"# start training\\n\",\n    \"training_arguments = {\\n\",\n    \"    'num_epoch':50,\\n\",\n    \"    'eval_metric':'val_loss',\\n\",\n    \"    'eval_less_is_better':True,\\n\",\n    \"    'output_dir':'./checkpoint',\\n\",\n    \"    'batch_size':128,\\n\",\n    \"    'lr':1e-4,\\n\",\n    \"    'weight_decay':1e-4,\\n\",\n    \"    }\\n\",\n    \"transtab.train(model, trainset[0], valset[0], **training_arguments)\\n\",\n    \"\\n\",\n    \"# save model\\n\",\n    \"model.save('./ckpt/pretrained')\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 8,\n   \"id\": \"d6bdc971\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"2022-10-05 08:35:04.352 | INFO     | transtab.modeling_transtab:load:773 - missing keys: []\\n\",\n      \"2022-10-05 08:35:04.354 | INFO     | transtab.modeling_transtab:load:774 - unexpected keys: []\\n\",\n      \"2022-10-05 08:35:04.355 | INFO     | transtab.modeling_transtab:load:775 - load model from ./ckpt/pretrained\\n\",\n      \"2022-10-05 08:35:04.370 | INFO     | transtab.modeling_transtab:load:222 - load feature extractor from ./ckpt/pretrained/extractor/extractor.json\\n\",\n      \"2022-10-05 08:35:04.372 | INFO     | transtab.modeling_transtab:update:832 - Build a new classifier with num 2 classes outputs, need further finetune to work.\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# now let's take another dataset and try to leverage the pretrained model for finetuning\\n\",\n    \"# we already loaded the required dataset `credit-approval` above, so no need to load it again.\\n\",\n    \"\\n\",\n    \"# load the pretrained model\\n\",\n    \"model.load('./ckpt/pretrained')\\n\",\n    \"\\n\",\n    \"# update model's categorical/numerical/binary column dict\\n\",\n    \"# need to specify the number of classes if the new dataset has a different # of classes from the \\n\",\n    \"# pretrained one.\\n\",\n    \"model.update({'cat':cat_cols,'num':num_cols,'bin':bin_cols, 'num_class':2})\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 9,\n   \"id\": \"f399d02e\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": 
\"9b64fb45097e4061af5a0186c17d98a6\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Epoch:   0%|          | 0/50 [00:00<?, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\",\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 0, test auc: 0.282251\\n\",\n      \"epoch: 0, train loss: 3.3862, lr: 0.000200, spent: 0.2 secs\\n\",\n      \"epoch: 1, test auc: 0.865801\\n\",\n      \"epoch: 1, train loss: 2.8794, lr: 0.000200, spent: 0.3 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\",\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 2, test auc: 0.865801\\n\",\n      \"epoch: 2, train loss: 2.5943, lr: 0.000200, spent: 0.7 secs\\n\",\n      \"epoch: 3, test auc: 0.865801\\n\",\n      \"epoch: 3, train loss: 2.4300, lr: 0.000200, spent: 0.8 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\",\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 4, test auc: 0.872727\\n\",\n      \"epoch: 4, train loss: 2.2617, lr: 0.000200, spent: 1.0 secs\\n\",\n      \"epoch: 5, test auc: 0.879654\\n\",\n      \"epoch: 5, train loss: 2.0867, lr: 0.000200, spent: 1.1 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\",\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of 
pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 6, test auc: 0.880519\\n\",\n      \"epoch: 6, train loss: 1.9774, lr: 0.000200, spent: 1.3 secs\\n\",\n      \"epoch: 7, test auc: 0.883117\\n\",\n      \"epoch: 7, train loss: 1.8739, lr: 0.000200, spent: 1.4 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\",\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 8, test auc: 0.889177\\n\",\n      \"epoch: 8, train loss: 1.8919, lr: 0.000200, spent: 1.5 secs\\n\",\n      \"epoch: 9, test auc: 0.890909\\n\",\n      \"epoch: 9, train loss: 1.8794, lr: 0.000200, spent: 1.7 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\",\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 10, test auc: 0.896970\\n\",\n      \"epoch: 10, train loss: 1.8456, lr: 0.000200, spent: 2.0 secs\\n\",\n      \"epoch: 11, test auc: 0.897835\\n\",\n      \"epoch: 11, train loss: 1.8213, lr: 0.000200, spent: 2.2 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\",\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 12, test auc: 0.896104\\n\",\n      \"EarlyStopping counter: 1 out of 5\\n\",\n      \"epoch: 12, train loss: 1.8219, lr: 0.000200, spent: 2.3 secs\\n\",\n      \"epoch: 13, test auc: 0.903896\\n\",\n      \"epoch: 13, train loss: 1.7924, lr: 0.000200, spent: 2.4 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: 
In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\",\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 14, test auc: 0.905628\\n\",\n      \"epoch: 14, train loss: 1.7964, lr: 0.000200, spent: 2.6 secs\\n\",\n      \"epoch: 15, test auc: 0.904762\\n\",\n      \"EarlyStopping counter: 1 out of 5\\n\",\n      \"epoch: 15, train loss: 1.7641, lr: 0.000200, spent: 2.7 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\",\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 16, test auc: 0.904762\\n\",\n      \"EarlyStopping counter: 2 out of 5\\n\",\n      \"epoch: 16, train loss: 1.7788, lr: 0.000200, spent: 2.8 secs\\n\",\n      \"epoch: 17, test auc: 0.909091\\n\",\n      \"epoch: 17, train loss: 1.7456, lr: 0.000200, spent: 2.9 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\",\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 18, test auc: 0.910823\\n\",\n      \"epoch: 18, train loss: 1.7438, lr: 0.000200, spent: 3.3 secs\\n\",\n      \"epoch: 19, test auc: 0.912554\\n\",\n      \"epoch: 19, train loss: 1.7569, lr: 0.000200, spent: 3.4 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\",\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 20, test auc: 0.912554\\n\",\n      \"epoch: 20, train loss: 1.7533, lr: 0.000200, spent: 
3.5 secs\\n\",\n      \"epoch: 21, test auc: 0.915152\\n\",\n      \"epoch: 21, train loss: 1.7439, lr: 0.000200, spent: 3.7 secs\\n\",\n      \"epoch: 22, test auc: 0.915152\\n\",\n      \"epoch: 22, train loss: 1.7020, lr: 0.000200, spent: 3.9 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\",\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 23, test auc: 0.916883\\n\",\n      \"epoch: 23, train loss: 1.7017, lr: 0.000200, spent: 4.0 secs\\n\",\n      \"epoch: 24, test auc: 0.917749\\n\",\n      \"epoch: 24, train loss: 1.6625, lr: 0.000200, spent: 4.1 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\",\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 25, test auc: 0.918615\\n\",\n      \"epoch: 25, train loss: 1.6432, lr: 0.000200, spent: 4.3 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\",\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 26, test auc: 0.922944\\n\",\n      \"epoch: 26, train loss: 1.6299, lr: 0.000200, spent: 4.7 secs\\n\",\n      \"epoch: 27, test auc: 0.922944\\n\",\n      \"EarlyStopping counter: 1 out of 5\\n\",\n      \"epoch: 27, train loss: 1.6158, lr: 0.000200, spent: 4.8 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\",\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will 
be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 28, test auc: 0.925541\\n\",\n      \"epoch: 28, train loss: 1.5971, lr: 0.000200, spent: 4.9 secs\\n\",\n      \"epoch: 29, test auc: 0.926407\\n\",\n      \"epoch: 29, train loss: 1.5771, lr: 0.000200, spent: 5.0 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\",\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 30, test auc: 0.927273\\n\",\n      \"epoch: 30, train loss: 1.5763, lr: 0.000200, spent: 5.2 secs\\n\",\n      \"epoch: 31, test auc: 0.933333\\n\",\n      \"epoch: 31, train loss: 1.6021, lr: 0.000200, spent: 5.3 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\",\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 32, test auc: 0.936797\\n\",\n      \"epoch: 32, train loss: 1.5513, lr: 0.000200, spent: 5.5 secs\\n\",\n      \"epoch: 33, test auc: 0.938528\\n\",\n      \"epoch: 33, train loss: 1.5160, lr: 0.000200, spent: 5.6 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\",\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 34, test auc: 0.938528\\n\",\n      \"epoch: 34, train loss: 1.5250, lr: 0.000200, spent: 5.8 secs\\n\",\n      \"epoch: 35, test auc: 0.938528\\n\",\n      \"epoch: 35, train loss: 1.4732, lr: 0.000200, spent: 6.0 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be 
keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\",\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 36, test auc: 0.934199\\n\",\n      \"EarlyStopping counter: 1 out of 5\\n\",\n      \"epoch: 36, train loss: 1.4738, lr: 0.000200, spent: 6.1 secs\\n\",\n      \"epoch: 37, test auc: 0.934199\\n\",\n      \"EarlyStopping counter: 2 out of 5\\n\",\n      \"epoch: 37, train loss: 1.4667, lr: 0.000200, spent: 6.2 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\",\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 38, test auc: 0.933333\\n\",\n      \"EarlyStopping counter: 3 out of 5\\n\",\n      \"epoch: 38, train loss: 1.4209, lr: 0.000200, spent: 6.3 secs\\n\",\n      \"epoch: 39, test auc: 0.933333\\n\",\n      \"EarlyStopping counter: 4 out of 5\\n\",\n      \"epoch: 39, train loss: 1.4371, lr: 0.000200, spent: 6.4 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/home/zifengw2/outcome_predict/transtab/transtab/trainer.py:169: FutureWarning: In a future version of pandas all arguments of concat except for the argument 'objs' will be keyword-only.\\n\",\n      \"  y_test = pd.concat(y_test, 0)\\n\",\n      \"2022-10-05 08:35:10.982 | INFO     | transtab.trainer:train:136 - load best at last from ./checkpoint\\n\",\n      \"2022-10-05 08:35:10.994 | INFO     | transtab.trainer:save_model:243 - saving model checkpoint to ./checkpoint\\n\",\n      \"2022-10-05 08:35:11.142 | INFO     | transtab.trainer:train:141 - training complete, cost 6.7 secs.\\n\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 40, test auc: 0.929870\\n\",\n      \"EarlyStopping counter: 5 out of 5\\n\",\n      \"early stopped\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# start training\\n\",\n    \"training_arguments = {\\n\",\n    \"    'num_epoch':50,\\n\",\n    \"    'eval_metric':'auc',\\n\",\n    \"    'eval_less_is_better':False,\\n\",\n    \"    'output_dir':'./checkpoint',\\n\",\n    \"    'batch_size':128,\\n\",\n    \"    'lr':2e-4,\\n\",\n    \"    }\\n\",\n    \"\\n\",\n    \"transtab.train(model, trainset[1], valset[1], **training_arguments)\\n\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 10,\n   \"id\": \"3aa87021\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"auc 0.95 mean/interval 0.8757(0.05)\\n\",\n      \"0.8807749627421758\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# 
evaluation\\n\",\n    \"x_test, y_test = testset[1]\\n\",\n    \"ypred = transtab.predict(model, x_test)\\n\",\n    \"transtab.evaluate(ypred, y_test, metric='auc')\\n\",\n    \"\\n\",\n    \"from sklearn.metrics import roc_auc_score\\n\",\n    \"print(roc_auc_score(y_test, ypred))\"\n   ]\n  }\n ],\n \"metadata\": {\n  \"kernelspec\": {\n   \"display_name\": \"Python 3.8.13 ('pytrial': conda)\",\n   \"language\": \"python\",\n   \"name\": \"python3\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.8.13\"\n  },\n  \"vscode\": {\n   \"interpreter\": {\n    \"hash\": \"2f00ab411e3cfe281b54106f98420bd06c3920b043d7b3741a63d2a4ac576305\"\n   }\n  }\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 5\n}\n"
  },
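  {
    "path": "examples/transfer_learning_sketch.py",
    "content": "# NOTE: a hypothetical companion sketch added for illustration; it is not part\n# of the original TransTab examples. It condenses the transfer-learning recipe\n# of examples/transfer_learning.ipynb (pretrain on one table, then finetune and\n# evaluate on another), assuming the transtab API used there.\nimport transtab\n\ntranstab.random_seed(42)\n\n# load two tables; trainset/valset/testset are lists of per-dataset (x, y) pairs\nallset, trainset, valset, testset, cat_cols, num_cols, bin_cols = transtab.load_data(['credit-g', 'credit-approval'])\n\n# 1) supervised pretraining on the first table\nmodel = transtab.build_classifier(cat_cols, num_cols, bin_cols)\ntranstab.train(\n    model, trainset[0], valset[0],\n    num_epoch=50, batch_size=128, lr=1e-4, weight_decay=1e-4,\n    eval_metric='val_loss', eval_less_is_better=True,\n    output_dir='./checkpoint',\n)\nmodel.save('./ckpt/pretrained')\n\n# 2) reload the checkpoint and update the column dictionaries; num_class is\n# only needed when the new task has a different number of classes\nmodel.load('./ckpt/pretrained')\nmodel.update({'cat': cat_cols, 'num': num_cols, 'bin': bin_cols, 'num_class': 2})\n\n# 3) finetune on the second table, selecting the best epoch by AUC\ntranstab.train(\n    model, trainset[1], valset[1],\n    num_epoch=50, batch_size=128, lr=2e-4,\n    eval_metric='auc', eval_less_is_better=False,\n    output_dir='./checkpoint',\n)\n\n# 4) evaluate on the held-out split of the second table\nx_test, y_test = testset[1]\nypred = transtab.predict(model, x_test)\nprint(transtab.evaluate(ypred, y_test, metric='auc'))\n"
  },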
  {
    "path": "examples/transfer_learning_regressor.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 1,\n   \"id\": \"739e0cff\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"import warnings\\n\",\n    \"warnings.filterwarnings(\\\"ignore\\\")\\n\",\n    \"import os\\n\",\n    \"os.chdir('../')\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 2,\n   \"id\": \"134f979d\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"import transtab\\n\",\n    \"\\n\",\n    \"# set random seed\\n\",\n    \"transtab.random_seed(42)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 4,\n   \"id\": \"668517ad\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"import numpy as np\\n\",\n    \"import pandas as pd\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"id\": \"3a64015e\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"Requirement already satisfied: openml in /home/zifengw2/miniconda3/envs/digitaltwin/lib/python3.10/site-packages (0.15.1)\\n\",\n      \"Requirement already satisfied: liac-arff>=2.4.0 in /home/zifengw2/miniconda3/envs/digitaltwin/lib/python3.10/site-packages (from openml) (2.5.0)\\n\",\n      \"Requirement already satisfied: xmltodict in /home/zifengw2/miniconda3/envs/digitaltwin/lib/python3.10/site-packages (from openml) (0.14.2)\\n\",\n      \"Requirement already satisfied: requests in /home/zifengw2/miniconda3/envs/digitaltwin/lib/python3.10/site-packages (from openml) (2.32.3)\\n\",\n      \"Requirement already satisfied: scikit-learn>=0.18 in /home/zifengw2/miniconda3/envs/digitaltwin/lib/python3.10/site-packages (from openml) (1.6.1)\\n\",\n      \"Requirement already satisfied: python-dateutil in /home/zifengw2/miniconda3/envs/digitaltwin/lib/python3.10/site-packages (from openml) (2.9.0.post0)\\n\",\n      \"Requirement already satisfied: pandas>=1.0.0 in /home/zifengw2/miniconda3/envs/digitaltwin/lib/python3.10/site-packages (from openml) (2.2.3)\\n\",\n      \"Requirement already satisfied: scipy>=0.13.3 in /home/zifengw2/miniconda3/envs/digitaltwin/lib/python3.10/site-packages (from openml) (1.15.2)\\n\",\n      \"Requirement already satisfied: numpy>=1.6.2 in /home/zifengw2/miniconda3/envs/digitaltwin/lib/python3.10/site-packages (from openml) (2.2.3)\\n\",\n      \"Requirement already satisfied: minio in /home/zifengw2/miniconda3/envs/digitaltwin/lib/python3.10/site-packages (from openml) (7.2.15)\\n\",\n      \"Requirement already satisfied: pyarrow in /home/zifengw2/miniconda3/envs/digitaltwin/lib/python3.10/site-packages (from openml) (19.0.1)\\n\",\n      \"Requirement already satisfied: tqdm in /home/zifengw2/miniconda3/envs/digitaltwin/lib/python3.10/site-packages (from openml) (4.67.1)\\n\",\n      \"Requirement already satisfied: packaging in /home/zifengw2/miniconda3/envs/digitaltwin/lib/python3.10/site-packages (from openml) (24.2)\\n\",\n      \"Requirement already satisfied: pytz>=2020.1 in /home/zifengw2/miniconda3/envs/digitaltwin/lib/python3.10/site-packages (from pandas>=1.0.0->openml) (2025.1)\\n\",\n      \"Requirement already satisfied: tzdata>=2022.7 in /home/zifengw2/miniconda3/envs/digitaltwin/lib/python3.10/site-packages (from pandas>=1.0.0->openml) (2025.1)\\n\",\n      \"Requirement already satisfied: six>=1.5 in /home/zifengw2/miniconda3/envs/digitaltwin/lib/python3.10/site-packages (from python-dateutil->openml) 
(1.17.0)\\n\",\n      \"Requirement already satisfied: joblib>=1.2.0 in /home/zifengw2/miniconda3/envs/digitaltwin/lib/python3.10/site-packages (from scikit-learn>=0.18->openml) (1.4.2)\\n\",\n      \"Requirement already satisfied: threadpoolctl>=3.1.0 in /home/zifengw2/miniconda3/envs/digitaltwin/lib/python3.10/site-packages (from scikit-learn>=0.18->openml) (3.5.0)\\n\",\n      \"Requirement already satisfied: certifi in /home/zifengw2/miniconda3/envs/digitaltwin/lib/python3.10/site-packages (from minio->openml) (2025.1.31)\\n\",\n      \"Requirement already satisfied: urllib3 in /home/zifengw2/miniconda3/envs/digitaltwin/lib/python3.10/site-packages (from minio->openml) (2.3.0)\\n\",\n      \"Requirement already satisfied: argon2-cffi in /home/zifengw2/miniconda3/envs/digitaltwin/lib/python3.10/site-packages (from minio->openml) (23.1.0)\\n\",\n      \"Requirement already satisfied: pycryptodome in /home/zifengw2/miniconda3/envs/digitaltwin/lib/python3.10/site-packages (from minio->openml) (3.21.0)\\n\",\n      \"Requirement already satisfied: typing-extensions in /home/zifengw2/miniconda3/envs/digitaltwin/lib/python3.10/site-packages (from minio->openml) (4.12.2)\\n\",\n      \"Requirement already satisfied: charset-normalizer<4,>=2 in /home/zifengw2/miniconda3/envs/digitaltwin/lib/python3.10/site-packages (from requests->openml) (3.4.1)\\n\",\n      \"Requirement already satisfied: idna<4,>=2.5 in /home/zifengw2/miniconda3/envs/digitaltwin/lib/python3.10/site-packages (from requests->openml) (3.10)\\n\",\n      \"Requirement already satisfied: argon2-cffi-bindings in /home/zifengw2/miniconda3/envs/digitaltwin/lib/python3.10/site-packages (from argon2-cffi->minio->openml) (21.2.0)\\n\",\n      \"Requirement already satisfied: cffi>=1.0.1 in /home/zifengw2/miniconda3/envs/digitaltwin/lib/python3.10/site-packages (from argon2-cffi-bindings->argon2-cffi->minio->openml) (1.17.1)\\n\",\n      \"Requirement already satisfied: pycparser in /home/zifengw2/miniconda3/envs/digitaltwin/lib/python3.10/site-packages (from cffi>=1.0.1->argon2-cffi-bindings->argon2-cffi->minio->openml) (2.22)\\n\",\n      \"########################################\\n\"\n     ]\n    },\n    {\n     \"ename\": \"ImportError\",\n     \"evalue\": \"OpenML is required for this functionality. 
Please install it with: pip install openml\",\n     \"output_type\": \"error\",\n     \"traceback\": [\n      \"\\u001b[0;31m---------------------------------------------------------------------------\\u001b[0m\",\n      \"\\u001b[0;31mImportError\\u001b[0m                               Traceback (most recent call last)\",\n      \"Cell \\u001b[0;32mIn[7], line 3\\u001b[0m\\n\\u001b[1;32m      1\\u001b[0m \\u001b[38;5;66;03m# load a dataset and start vanilla supervised training\\u001b[39;00m\\n\\u001b[1;32m      2\\u001b[0m get_ipython()\\u001b[38;5;241m.\\u001b[39msystem(\\u001b[38;5;124m'\\u001b[39m\\u001b[38;5;124mpip install openml\\u001b[39m\\u001b[38;5;124m'\\u001b[39m)\\n\\u001b[0;32m----> 3\\u001b[0m allset, trainset, valset, testset, cat_cols, num_cols, bin_cols \\u001b[38;5;241m=\\u001b[39m \\u001b[43mtranstab\\u001b[49m\\u001b[38;5;241;43m.\\u001b[39;49m\\u001b[43mload_data\\u001b[49m\\u001b[43m(\\u001b[49m\\u001b[43m[\\u001b[49m\\u001b[38;5;124;43m'\\u001b[39;49m\\u001b[38;5;124;43mcredit-g\\u001b[39;49m\\u001b[38;5;124;43m'\\u001b[39;49m\\u001b[43m,\\u001b[49m\\u001b[43m \\u001b[49m\\u001b[38;5;124;43m'\\u001b[39;49m\\u001b[38;5;124;43mcredit-approval\\u001b[39;49m\\u001b[38;5;124;43m'\\u001b[39;49m\\u001b[43m]\\u001b[49m\\u001b[43m)\\u001b[49m\\n\",\n      \"File \\u001b[0;32m~/github/transtab/transtab/dataset.py:95\\u001b[0m, in \\u001b[0;36mload_data\\u001b[0;34m(dataname, dataset_config, encode_cat, data_cut, seed)\\u001b[0m\\n\\u001b[1;32m     92\\u001b[0m \\u001b[38;5;28;01mfor\\u001b[39;00m dataname_ \\u001b[38;5;129;01min\\u001b[39;00m dataname:\\n\\u001b[1;32m     93\\u001b[0m     data_config \\u001b[38;5;241m=\\u001b[39m dataset_config\\u001b[38;5;241m.\\u001b[39mget(dataname_, \\u001b[38;5;28;01mNone\\u001b[39;00m)\\n\\u001b[1;32m     94\\u001b[0m     allset, trainset, valset, testset, cat_cols, num_cols, bin_cols \\u001b[38;5;241m=\\u001b[39m \\\\\\n\\u001b[0;32m---> 95\\u001b[0m         \\u001b[43mload_single_data\\u001b[49m\\u001b[43m(\\u001b[49m\\u001b[43mdataname_\\u001b[49m\\u001b[43m,\\u001b[49m\\u001b[43m \\u001b[49m\\u001b[43mdataset_config\\u001b[49m\\u001b[38;5;241;43m=\\u001b[39;49m\\u001b[43mdata_config\\u001b[49m\\u001b[43m,\\u001b[49m\\u001b[43m \\u001b[49m\\u001b[43mencode_cat\\u001b[49m\\u001b[38;5;241;43m=\\u001b[39;49m\\u001b[43mencode_cat\\u001b[49m\\u001b[43m,\\u001b[49m\\u001b[43m \\u001b[49m\\u001b[43mdata_cut\\u001b[49m\\u001b[38;5;241;43m=\\u001b[39;49m\\u001b[43mdata_cut\\u001b[49m\\u001b[43m,\\u001b[49m\\u001b[43m \\u001b[49m\\u001b[43mseed\\u001b[49m\\u001b[38;5;241;43m=\\u001b[39;49m\\u001b[43mseed\\u001b[49m\\u001b[43m)\\u001b[49m\\n\\u001b[1;32m     96\\u001b[0m     num_col_list\\u001b[38;5;241m.\\u001b[39mextend(num_cols)\\n\\u001b[1;32m     97\\u001b[0m     cat_col_list\\u001b[38;5;241m.\\u001b[39mextend(cat_cols)\\n\",\n      \"File \\u001b[0;32m~/github/transtab/transtab/dataset.py:159\\u001b[0m, in \\u001b[0;36mload_single_data\\u001b[0;34m(dataname, dataset_config, encode_cat, data_cut, seed)\\u001b[0m\\n\\u001b[1;32m    157\\u001b[0m \\u001b[38;5;28;01melse\\u001b[39;00m:\\n\\u001b[1;32m    158\\u001b[0m     \\u001b[38;5;28;01mif\\u001b[39;00m \\u001b[38;5;129;01mnot\\u001b[39;00m _has_openml:\\n\\u001b[0;32m--> 159\\u001b[0m         \\u001b[38;5;28;01mraise\\u001b[39;00m \\u001b[38;5;167;01mImportError\\u001b[39;00m(\\n\\u001b[1;32m    160\\u001b[0m             \\u001b[38;5;124m\\\"\\u001b[39m\\u001b[38;5;124mOpenML is required for this functionality. 
\\u001b[39m\\u001b[38;5;124m\\\"\\u001b[39m\\n\\u001b[1;32m    161\\u001b[0m             \\u001b[38;5;124m\\\"\\u001b[39m\\u001b[38;5;124mPlease install it with: pip install openml\\u001b[39m\\u001b[38;5;124m\\\"\\u001b[39m\\n\\u001b[1;32m    162\\u001b[0m         )\\n\\u001b[1;32m    163\\u001b[0m     dataset \\u001b[38;5;241m=\\u001b[39m openml\\u001b[38;5;241m.\\u001b[39mdatasets\\u001b[38;5;241m.\\u001b[39mget_dataset(dataname)\\n\\u001b[1;32m    164\\u001b[0m     X,y,categorical_indicator, attribute_names \\u001b[38;5;241m=\\u001b[39m dataset\\u001b[38;5;241m.\\u001b[39mget_data(dataset_format\\u001b[38;5;241m=\\u001b[39m\\u001b[38;5;124m'\\u001b[39m\\u001b[38;5;124mdataframe\\u001b[39m\\u001b[38;5;124m'\\u001b[39m, target\\u001b[38;5;241m=\\u001b[39mdataset\\u001b[38;5;241m.\\u001b[39mdefault_target_attribute)\\n\",\n      \"\\u001b[0;31mImportError\\u001b[0m: OpenML is required for this functionality. Please install it with: pip install openml\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# load a dataset and start vanilla supervised training\\n\",\n    \"# !pip install openml\\n\",\n    \"allset, trainset, valset, testset, cat_cols, num_cols, bin_cols = transtab.load_data(['credit-g', 'credit-approval'])\\n\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 3,\n   \"id\": \"521fb369\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"ename\": \"NameError\",\n     \"evalue\": \"name 'trainset' is not defined\",\n     \"output_type\": \"error\",\n     \"traceback\": [\n      \"\\u001b[0;31m---------------------------------------------------------------------------\\u001b[0m\",\n      \"\\u001b[0;31mNameError\\u001b[0m                                 Traceback (most recent call last)\",\n      \"Cell \\u001b[0;32mIn[3], line 1\\u001b[0m\\n\\u001b[0;32m----> 1\\u001b[0m trainset_reg \\u001b[38;5;241m=\\u001b[39m [(\\u001b[43mtrainset\\u001b[49m[\\u001b[38;5;241m0\\u001b[39m][\\u001b[38;5;241m0\\u001b[39m], pd\\u001b[38;5;241m.\\u001b[39mSeries(np\\u001b[38;5;241m.\\u001b[39mrandom\\u001b[38;5;241m.\\u001b[39mrandn(trainset[\\u001b[38;5;241m0\\u001b[39m][\\u001b[38;5;241m0\\u001b[39m]\\u001b[38;5;241m.\\u001b[39mshape[\\u001b[38;5;241m0\\u001b[39m]))), (trainset[\\u001b[38;5;241m1\\u001b[39m][\\u001b[38;5;241m0\\u001b[39m], pd\\u001b[38;5;241m.\\u001b[39mSeries(np\\u001b[38;5;241m.\\u001b[39mrandom\\u001b[38;5;241m.\\u001b[39mrandn(trainset[\\u001b[38;5;241m1\\u001b[39m][\\u001b[38;5;241m0\\u001b[39m]\\u001b[38;5;241m.\\u001b[39mshape[\\u001b[38;5;241m0\\u001b[39m])))]\\n\",\n      \"\\u001b[0;31mNameError\\u001b[0m: name 'trainset' is not defined\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"trainset_reg = [(trainset[0][0], pd.Series(np.random.randn(trainset[0][0].shape[0]))), (trainset[1][0], pd.Series(np.random.randn(trainset[1][0].shape[0])))]\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 38,\n   \"id\": \"cadc940f\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"valset_reg = [(valset[0][0], pd.Series(np.random.randn(valset[0][0].shape[0]))), (valset[1][0], pd.Series(np.random.randn(valset[1][0].shape[0])))]\\n\",\n    \"testset_reg = [(testset[0][0], pd.Series(np.random.randn(testset[0][0].shape[0]))), (testset[1][0], pd.Series(np.random.randn(testset[1][0].shape[0])))]\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 39,\n   \"id\": \"42c60011\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n 
     \"Epoch:   2%|▏         | 1/50 [00:01<01:33,  1.91s/it]\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 0, test val_loss: 1.372377\\n\",\n      \"epoch: 0, train loss: 6.7940, lr: 0.000100, spent: 1.9 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"Epoch:   4%|▍         | 2/50 [00:03<01:22,  1.72s/it]\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 1, test val_loss: 1.184756\\n\",\n      \"epoch: 1, train loss: 6.0480, lr: 0.000100, spent: 3.5 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"Epoch:   6%|▌         | 3/50 [00:05<01:18,  1.66s/it]\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 2, test val_loss: 1.194661\\n\",\n      \"EarlyStopping counter: 1 out of 5\\n\",\n      \"epoch: 2, train loss: 6.1002, lr: 0.000100, spent: 5.1 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"Epoch:   8%|▊         | 4/50 [00:06<01:14,  1.63s/it]\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 3, test val_loss: 1.218926\\n\",\n      \"EarlyStopping counter: 2 out of 5\\n\",\n      \"epoch: 3, train loss: 5.8850, lr: 0.000100, spent: 6.7 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"Epoch:  10%|█         | 5/50 [00:08<01:12,  1.61s/it]\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 4, test val_loss: 1.198663\\n\",\n      \"EarlyStopping counter: 3 out of 5\\n\",\n      \"epoch: 4, train loss: 5.9642, lr: 0.000100, spent: 8.3 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"Epoch:  12%|█▏        | 6/50 [00:09<01:10,  1.61s/it]\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 5, test val_loss: 1.205427\\n\",\n      \"EarlyStopping counter: 4 out of 5\\n\",\n      \"epoch: 5, train loss: 5.8004, lr: 0.000100, spent: 9.9 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"Epoch:  12%|█▏        | 6/50 [00:11<01:24,  1.92s/it]\\n\",\n      \"\\u001b[32m2024-03-08 16:32:55.367\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mtranstab.trainer\\u001b[0m:\\u001b[36mtrain\\u001b[0m:\\u001b[36m136\\u001b[0m - \\u001b[1mload best at last from ./checkpoint\\u001b[0m\\n\",\n      \"\\u001b[32m2024-03-08 16:32:55.378\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mtranstab.trainer\\u001b[0m:\\u001b[36msave_model\\u001b[0m:\\u001b[36m247\\u001b[0m - \\u001b[1msaving model checkpoint to ./checkpoint\\u001b[0m\\n\",\n      \"\\u001b[32m2024-03-08 16:32:55.471\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mtranstab.trainer\\u001b[0m:\\u001b[36mtrain\\u001b[0m:\\u001b[36m141\\u001b[0m - \\u001b[1mtraining complete, cost 11.6 secs.\\u001b[0m\\n\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 6, test val_loss: 1.198403\\n\",\n      \"EarlyStopping counter: 
5 out of 5\\n\",\n      \"early stopped\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"\\n\",\n    \"# build transtab regressor model\\n\",\n    \"model = transtab.build_regressor(cat_cols, num_cols, bin_cols, device='cpu')\\n\",\n    \"\\n\",\n    \"# start training\\n\",\n    \"training_arguments = {\\n\",\n    \"    'num_epoch':50,\\n\",\n    \"    'eval_metric':'val_loss',\\n\",\n    \"    'eval_less_is_better':True,\\n\",\n    \"    'output_dir':'./checkpoint',\\n\",\n    \"    'batch_size':128,\\n\",\n    \"    'lr':1e-4,\\n\",\n    \"    'weight_decay':1e-4,\\n\",\n    \"    }\\n\",\n    \"transtab.train(model, trainset_reg[0], valset_reg[0], **training_arguments)\\n\",\n    \"\\n\",\n    \"# save model\\n\",\n    \"model.save('./ckpt/pretrained')\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 40,\n   \"id\": \"d6bdc971\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"\\u001b[32m2024-03-08 16:33:11.448\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mtranstab.modeling_transtab\\u001b[0m:\\u001b[36mload\\u001b[0m:\\u001b[36m787\\u001b[0m - \\u001b[1mmissing keys: []\\u001b[0m\\n\",\n      \"\\u001b[32m2024-03-08 16:33:11.448\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mtranstab.modeling_transtab\\u001b[0m:\\u001b[36mload\\u001b[0m:\\u001b[36m788\\u001b[0m - \\u001b[1munexpected keys: []\\u001b[0m\\n\",\n      \"\\u001b[32m2024-03-08 16:33:11.449\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mtranstab.modeling_transtab\\u001b[0m:\\u001b[36mload\\u001b[0m:\\u001b[36m789\\u001b[0m - \\u001b[1mload model from ./ckpt/pretrained\\u001b[0m\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"\\u001b[32m2024-03-08 16:33:11.468\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mtranstab.modeling_transtab\\u001b[0m:\\u001b[36mload\\u001b[0m:\\u001b[36m222\\u001b[0m - \\u001b[1mload feature extractor from ./ckpt/pretrained/extractor/extractor.json\\u001b[0m\\n\",\n      \"\\u001b[32m2024-03-08 16:33:11.470\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mtranstab.modeling_transtab\\u001b[0m:\\u001b[36m_adapt_to_new_num_class\\u001b[0m:\\u001b[36m886\\u001b[0m - \\u001b[1mBuild a new classifier with num 2 classes outputs, need further finetune to work.\\u001b[0m\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# now let's use another dataset and leverage the pretrained model for finetuning\\n\",\n    \"# we already loaded the required data `credit-approval` above, so there is no need to load it again.\\n\",\n    \"\\n\",\n    \"# load the pretrained model\\n\",\n    \"model.load('./ckpt/pretrained')\\n\",\n    \"\\n\",\n    \"# update model's categorical/numerical/binary column dict\\n\",\n    \"# need to specify the number of classes if the new dataset has different # of classes from the \\n\",\n    \"# pretrained one.\\n\",\n    \"model.update({'cat':cat_cols,'num':num_cols,'bin':bin_cols, 'num_class':2})\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 44,\n   \"id\": \"f399d02e\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"Epoch:   2%|▏         | 1/50 [00:00<00:37,  1.32it/s]\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 0, test mse: 0.814842\\n\",\n      \"epoch: 0, train loss: 2.9249, lr: 
0.000200, spent: 0.8 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"Epoch:   4%|▍         | 2/50 [00:01<00:31,  1.54it/s]\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 1, test mse: 0.803411\\n\",\n      \"EarlyStopping counter: 1 out of 5\\n\",\n      \"epoch: 1, train loss: 0.1003, lr: 0.000200, spent: 1.3 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"Epoch:   6%|▌         | 3/50 [00:01<00:29,  1.57it/s]\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 2, test mse: 0.802998\\n\",\n      \"EarlyStopping counter: 2 out of 5\\n\",\n      \"epoch: 2, train loss: -0.3084, lr: 0.000200, spent: 2.0 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"Epoch:   8%|▊         | 4/50 [00:02<00:28,  1.61it/s]\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 3, test mse: 0.802881\\n\",\n      \"EarlyStopping counter: 3 out of 5\\n\",\n      \"epoch: 3, train loss: -0.3803, lr: 0.000200, spent: 2.6 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"Epoch:  10%|█         | 5/50 [00:03<00:28,  1.60it/s]\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 4, test mse: 0.802826\\n\",\n      \"EarlyStopping counter: 4 out of 5\\n\",\n      \"epoch: 4, train loss: -0.2638, lr: 0.000200, spent: 3.2 secs\\n\"\n     ]\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"Epoch:  10%|█         | 5/50 [00:03<00:33,  1.34it/s]\\n\",\n      \"\\u001b[32m2024-03-08 16:37:52.614\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mtranstab.trainer\\u001b[0m:\\u001b[36mtrain\\u001b[0m:\\u001b[36m136\\u001b[0m - \\u001b[1mload best at last from ./checkpoint\\u001b[0m\\n\",\n      \"\\u001b[32m2024-03-08 16:37:52.621\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mtranstab.trainer\\u001b[0m:\\u001b[36msave_model\\u001b[0m:\\u001b[36m247\\u001b[0m - \\u001b[1msaving model checkpoint to ./checkpoint\\u001b[0m\\n\",\n      \"\\u001b[32m2024-03-08 16:37:52.718\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mtranstab.trainer\\u001b[0m:\\u001b[36mtrain\\u001b[0m:\\u001b[36m141\\u001b[0m - \\u001b[1mtraining complete, cost 3.9 secs.\\u001b[0m\\n\"\n     ]\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"epoch: 5, test mse: 0.802803\\n\",\n      \"EarlyStopping counter: 5 out of 5\\n\",\n      \"early stopped\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# start training\\n\",\n    \"training_arguments = {\\n\",\n    \"    'num_epoch':50,\\n\",\n    \"    'eval_metric':'mse',\\n\",\n    \"    'eval_less_is_better':True, # for mse, a smaller value is better\\n\",\n    \"    'output_dir':'./checkpoint',\\n\",\n    \"    'batch_size':128,\\n\",\n    \"    'lr':2e-4,\\n\",\n    \"    }\\n\",\n    \"\\n\",\n    \"transtab.train(model, trainset_reg[1], valset_reg[1], **training_arguments)\\n\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 46,\n   \"id\": \"3aa87021\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n   
  \"output_type\": \"stream\",\n     \"text\": [\n      \"0.9819256995837686\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# evaluation\\n\",\n    \"x_test, y_test = testset_reg[1]\\n\",\n    \"ypred = transtab.predict(model, x_test, y_test)\\n\",\n    \"transtab.evaluate(ypred, y_test, metric='mse')\\n\",\n    \"\\n\",\n    \"from sklearn.metrics import mean_squared_error\\n\",\n    \"print(mean_squared_error(y_test, ypred))\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"id\": \"d4bf1d31\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": []\n  }\n ],\n \"metadata\": {\n  \"kernelspec\": {\n   \"display_name\": \"digitaltwin\",\n   \"language\": \"python\",\n   \"name\": \"python3\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.10.16\"\n  }\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 5\n}\n"
  },
  {
    "path": "pypi_build_commands.txt",
    "content": "# This is a command list for building pypi packages\npython setup.py sdist bdist_wheel\n\ntwine check dist/*\n\n# upload to pypi-test\npython -m twine upload --repository-url https://test.pypi.org/legacy/ dist/*\n\n# install from test-pypi\npip install --index-url https://test.pypi.org/simple/ transtab==0.0.2c\n\n# upload to pypi\ntwine upload dist/*\n"
  },
  {
    "path": "requirements.txt",
    "content": "numpy\nscikit_learn\nsetuptools\ntransformers<=4.30.0\ntqdm\npandas>=1.3.0\nopenml>=0.10.0\n"
  },
  {
    "path": "setup.py",
    "content": "import os\nimport setuptools\n\nthis_directory = os.path.abspath(os.path.dirname(__file__))\n\nwith open(\"README.md\", \"r\") as f:\n    long_description = f.read()\n\n# read the contents of requirements.txt\nwith open(os.path.join(this_directory, 'requirements.txt'),\n          encoding='utf-8') as f:\n    requirements = f.read().splitlines()\n\nsetuptools.setup(\n    name = 'transtab',\n    version = '0.0.7',\n    author = 'Zifeng Wang',\n    author_email = 'zifengw2@illinois.edu',\n    description = 'A flexible tabular prediction model that handles variable-column input tables.',\n    url = 'https://github.com/RyanWangZf/transtab',\n    keywords=['tabular data', 'machine learning', 'data mining', 'data science'],\n    long_description=long_description,\n    long_description_content_type='text/markdown',\n    packages=setuptools.find_packages(exclude=['test']),\n    install_requires=requirements,\n    classifiers=[\n        \"Programming Language :: Python :: 3\",\n        \"Programming Language :: Python :: 3.7\",\n        \"Programming Language :: Python :: 3.8\",\n        \"Programming Language :: Python :: 3.9\",\n        \"License :: OSI Approved :: BSD License\",\n        \"Operating System :: OS Independent\",\n    ],\n)\n"
  },
  {
    "path": "transtab/__init__.py",
    "content": "name = 'transtab'\nversion = '0.0.6'\n\nfrom .transtab import *\n"
  },
  {
    "path": "transtab/constants.py",
    "content": "# Name of the files used for checkpointing\nTRAINING_ARGS_NAME = \"training_args.json\"\nTRAINER_STATE_NAME = \"trainer_state.json\"\nOPTIMIZER_NAME = \"optimizer.pt\"\nSCHEDULER_NAME = \"scheduler.pt\"\nWEIGHTS_NAME = \"pytorch_model.bin\"\nTOKENIZER_DIR = 'tokenizer'\nEXTRACTOR_STATE_DIR = 'extractor'\nEXTRACTOR_STATE_NAME = 'extractor.json'\nINPUT_ENCODER_NAME = 'input_encoder.bin'"
  },
  {
    "path": "transtab/dataset.py",
    "content": "import os\nimport pdb\n\nimport pandas as pd\nimport numpy as np\nfrom sklearn.preprocessing import LabelEncoder, OrdinalEncoder, MinMaxScaler\nfrom sklearn.model_selection import train_test_split\n\ntry:\n    import openml\n    _has_openml = True\nexcept ImportError:\n    _has_openml = False\n\nimport logging\nlogger = logging.getLogger(__name__)\n\nOPENML_DATACONFIG = {\n    'credit-g': {'bin': ['own_telephone', 'foreign_worker']},\n}\n\nEXAMPLE_DATACONFIG = {\n    \"example\": {\n        \"bin\": [\"bin1\", \"bin2\"],\n        \"cat\": [\"cat1\", \"cat2\"],\n        \"num\": [\"num1\", \"num2\"],\n        \"cols\": [\"bin1\", \"bin2\", \"cat1\", \"cat2\", \"num1\", \"num2\"],\n        \"binary_indicator\": [\"1\", \"yes\", \"true\", \"positive\", \"t\", \"y\"],\n        \"data_split_idx\": {\n            \"train\":[0, 1, 2, 3, 4, 5, 6, 7, 8, 9],\n            \"val\":[10, 11, 12, 13, 14, 15, 16, 17, 18, 19],\n            \"test\":[20, 21, 22, 23, 24, 25, 26, 27, 28, 29],\n        }\n    }\n}\n\ndef load_data(dataname, dataset_config=None, encode_cat=False, data_cut=None, seed=123):\n    '''Load datasets from the local device or from openml.datasets.\n\n    Parameters\n    ----------\n    dataname: str or int\n        the dataset name/index intended to be loaded from openml. or the directory to the local dataset.\n    \n    dataset_config: dict\n        the dataset configuration to specify for loading. Please note that this variable will\n        override the configuration loaded from the local files or from the openml.dataset.\n    \n    encode_cat: bool\n        whether encoder the categorical/binary columns to be discrete indices, keep False for TransTab models.\n    \n    data_cut: int\n        how many to split the raw tables into partitions equally; set None will not execute partition.\n\n    seed: int\n        the random seed set to ensure the fixed train/val/test split.\n\n    Returns\n    -------\n    all_list: list or tuple\n        the complete dataset, be (x,y) or [(x1,y1),(x2,y2),...].\n\n    train_list: list or tuple\n        the train dataset, be (x,y) or [(x1,y1),(x2,y2),...].\n\n    val_list: list or tuple\n        the validation dataset, be (x,y) or [(x1,y1),(x2,y2),...].\n\n    test_list: list\n        the test dataset, be (x,y) or [(x1,y1),(x2,y2),...].\n\n    cat_col_list: list\n        the list of categorical column names.\n\n    num_col_list: list\n        the list of numerical column names.\n\n    bin_col_list: list\n        the list of binary column names.\n\n    '''\n    if dataset_config is None: dataset_config = OPENML_DATACONFIG\n    if isinstance(dataname, str):\n        # load a single tabular data\n        return load_single_data(dataname=dataname, dataset_config=dataset_config, encode_cat=encode_cat, data_cut=data_cut, seed=seed)\n    \n    if isinstance(dataname, list):\n        # load a list of datasets, combine together and outputs\n        num_col_list, cat_col_list, bin_col_list = [], [], []\n        all_list = []\n        train_list, val_list, test_list = [], [], []\n        for dataname_ in dataname:\n            data_config = dataset_config.get(dataname_, None)\n            allset, trainset, valset, testset, cat_cols, num_cols, bin_cols = \\\n                load_single_data(dataname_, dataset_config=data_config, encode_cat=encode_cat, data_cut=data_cut, seed=seed)\n            num_col_list.extend(num_cols)\n            cat_col_list.extend(cat_cols)\n            bin_col_list.extend(bin_cols)\n            all_list.append(allset)\n    
        train_list.append(trainset)\n            val_list.append(valset)\n            test_list.append(testset)\n        return all_list, train_list, val_list, test_list, cat_col_list, num_col_list, bin_col_list\n\ndef load_single_data(dataname, dataset_config=None, encode_cat=False, data_cut=None, seed=123):\n    '''Load tabular dataset from local or from openml public database.\n    args:\n        dataname: Can either be the data directory on `./data/{dataname}` or the dataname which can be found from the openml database.\n        dataset_config: \n            A dict like {'dataname':{'bin': [col1,col2,...]}} to indicate the binary columns for the data obtained from openml.\n            Also can be used to {'dataname':{'cols':[col1,col2,..]}} to assign a new set of column names to the data\n        encode_cat:  Set `False` if we are using transtab, otherwise we set it True to encode categorical values into indexes.\n        data_cut: The number of cuts of the training set. Cut is performed on both rows and columns.\n    outputs:\n        allset: (X,y) that contains all samples of this dataset\n        trainset, valset, testset: the train/val/test split\n        num_cols, cat_cols, bin_cols: the list of numerical/categorical/binary column names\n    '''\n    print('####'*10)\n    if os.path.exists(dataname):\n        print(f'load from local data dir {dataname}')\n        filename = os.path.join(dataname, 'data_processed.csv')\n        df = pd.read_csv(filename, index_col=0)\n        y = df['target_label']\n        X = df.drop(['target_label'],axis=1)\n        all_cols = [col.lower() for col in X.columns.tolist()]\n\n        X.columns = all_cols\n        attribute_names = all_cols\n        ftfile = os.path.join(dataname, 'numerical_feature.txt')\n        if os.path.exists(ftfile):\n            with open(ftfile,'r') as f: num_cols = [x.strip().lower() for x in f.readlines()]\n        else:\n            num_cols = []\n        bnfile = os.path.join(dataname, 'binary_feature.txt')\n        if os.path.exists(bnfile):\n            with open(bnfile,'r') as f: bin_cols = [x.strip().lower() for x in f.readlines()]\n        else:\n            bin_cols = []\n        cat_cols = [col for col in all_cols if col not in num_cols and col not in bin_cols]\n\n        # update cols by loading dataset_config\n        if dataset_config is not None:\n            if 'columns' in dataset_config:\n                new_cols = dataset_config['columns']\n                X.columns = new_cols\n\n            if 'bin' in dataset_config:\n                bin_cols = dataset_config['bin']\n            \n            if 'cat' in dataset_config:\n                cat_cols = dataset_config['cat']\n\n            if 'num' in dataset_config:\n                num_cols = dataset_config['num']\n        \n    else:\n        if not _has_openml:\n            raise ImportError(\n                \"OpenML is required for this functionality. 
\"\n                \"Please install it with: pip install openml\"\n            )\n        dataset = openml.datasets.get_dataset(dataname)\n        X,y,categorical_indicator, attribute_names = dataset.get_data(dataset_format='dataframe', target=dataset.default_target_attribute)\n        \n        if isinstance(dataname, int):\n            openml_list = openml.datasets.list_datasets(output_format=\"dataframe\")  # returns a dict\n            dataname = openml_list.loc[openml_list.did == dataname].name.values[0]\n        else:\n            openml_list = openml.datasets.list_datasets(output_format=\"dataframe\")  # returns a dict\n            print(f'openml data index: {openml_list.loc[openml_list.name == dataname].index[0]}')\n        \n        print(f'load data from {dataname}')\n\n        # drop cols which only have one unique value\n        drop_cols = [col for col in attribute_names if X[col].nunique()<=1]\n\n        all_cols = np.array(attribute_names)\n        categorical_indicator = np.array(categorical_indicator)\n        cat_cols = [col for col in all_cols[categorical_indicator] if col not in drop_cols]\n        num_cols = [col for col in all_cols[~categorical_indicator] if col not in drop_cols]\n        all_cols = [col for col in all_cols if col not in drop_cols]\n        \n        if dataset_config is not None:\n            if 'bin' in dataset_config: bin_cols = [c for c in cat_cols if c in dataset_config['bin']]\n        else: bin_cols = []\n        cat_cols = [c for c in cat_cols if c not in bin_cols]\n\n        # encode target label\n        y = LabelEncoder().fit_transform(y.values)\n        y = pd.Series(y,index=X.index)\n\n    # start processing features\n    # process num\n    if len(num_cols) > 0:\n        for col in num_cols: X[col].fillna(X[col].mode()[0], inplace=True)\n        X[num_cols] = MinMaxScaler().fit_transform(X[num_cols])\n\n    if len(cat_cols) > 0:\n        for col in cat_cols: X[col].fillna(X[col].mode()[0], inplace=True)\n        # process cate\n        if encode_cat:\n            X[cat_cols] = OrdinalEncoder().fit_transform(X[cat_cols])\n        else:\n            X[cat_cols] = X[cat_cols].astype(str)\n\n    if len(bin_cols) > 0:\n        for col in bin_cols: X[col].fillna(X[col].mode()[0], inplace=True)\n        if 'binary_indicator' in dataset_config:\n            X[bin_cols] = X[bin_cols].astype(str).applymap(lambda x: 1 if x.lower() in dataset_config['binary_indicator'] else 0).values\n        else:\n            X[bin_cols] = X[bin_cols].astype(str).applymap(lambda x: 1 if x.lower() in ['yes','true','1','t'] else 0).values        \n        \n        # if no dataset_config given, keep its original format\n        # raise warning if there is not only 0/1 in the binary columns\n        if (~X[bin_cols].isin([0,1])).any().any():\n            raise ValueError(f'binary columns {bin_cols} contains values other than 0/1.')\n\n    \n    X = X[bin_cols + num_cols + cat_cols]\n\n    # rename column names if is given\n    if dataset_config is not None:\n        data_config = dataset_config\n        if 'columns' in data_config:\n            new_cols = data_config['columns']\n            X.columns = new_cols\n            attribute_names = new_cols\n\n        if 'bin' in data_config:\n            bin_cols = data_config['bin']\n        \n        if 'cat' in data_config:\n            cat_cols = data_config['cat']\n\n        if 'num' in data_config:\n            num_cols = data_config['num']\n\n\n    # split train/val/test\n    data_split_idx = None\n    if 
dataset_config is not None:\n        data_split_idx = dataset_config.get('data_split_idx', None)\n\n    if data_split_idx is not None:\n        train_idx = data_split_idx.get('train', None)\n        val_idx = data_split_idx.get('val', None)\n        test_idx = data_split_idx.get('test', None)\n\n        if train_idx is None or test_idx is None:\n            raise ValueError('train/test split indices must be provided together')\n    \n        else:\n            train_dataset = X.iloc[train_idx]\n            y_train = y[train_idx]\n            test_dataset = X.iloc[test_idx]\n            y_test = y[test_idx]\n            if val_idx is not None:\n                val_dataset = X.iloc[val_idx]\n                y_val = y[val_idx]\n            else:\n                val_dataset = None\n                y_val = None\n    else:\n        # split train/val/test\n        train_dataset, test_dataset, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=seed, stratify=y, shuffle=True)\n        val_size = int(len(y)*0.1)\n        val_dataset = train_dataset.iloc[-val_size:]\n        y_val = y_train[-val_size:]\n        train_dataset = train_dataset.iloc[:-val_size]\n        y_train = y_train[:-val_size]\n\n    if data_cut is not None:\n        np.random.shuffle(all_cols)\n        sp_size=int(len(all_cols)/data_cut)\n        col_splits = np.split(all_cols, range(0,len(all_cols),sp_size))[1:]\n        new_col_splits = []\n        for split in col_splits:\n            candidate_cols = np.random.choice(np.setdiff1d(all_cols, split), int(sp_size/2), replace=False)\n            new_col_splits.append(split.tolist() + candidate_cols.tolist())\n        if len(col_splits) > data_cut:\n            for i in range(len(col_splits[-1])):\n                new_col_splits[i] += [col_splits[-1][i]]\n                new_col_splits[i] = np.unique(new_col_splits[i]).tolist()\n            new_col_splits = new_col_splits[:-1]\n\n        # cut subset\n        trainset_splits = np.array_split(train_dataset, data_cut)\n        train_subset_list = []\n        for i in range(data_cut):\n            train_subset_list.append(\n                (trainset_splits[i][new_col_splits[i]], y_train.loc[trainset_splits[i].index])\n            )\n        print('# data: {}, # feat: {}, # cate: {},  # bin: {}, # numerical: {}, pos rate: {:.2f}'.format(len(X), len(attribute_names), len(cat_cols), len(bin_cols), len(num_cols), (y==1).sum()/len(y)))\n        return (X, y), train_subset_list, (val_dataset,y_val), (test_dataset, y_test), cat_cols, num_cols, bin_cols\n\n    else:\n        print('# data: {}, # feat: {}, # cate: {},  # bin: {}, # numerical: {}, pos rate: {:.2f}'.format(len(X), len(attribute_names), len(cat_cols), len(bin_cols), len(num_cols), (y==1).sum()/len(y)))\n        return (X,y), (train_dataset,y_train), (val_dataset,y_val), (test_dataset, y_test), cat_cols, num_cols, bin_cols"
  },
  {
    "path": "transtab/evaluator.py",
    "content": "from collections import defaultdict\nimport os\nimport pdb\n\nimport torch\nimport numpy as np\nfrom sklearn.metrics import roc_auc_score, accuracy_score, mean_squared_error\n\nfrom transtab import constants\n\ndef predict(clf, \n    x_test,\n    y_test=None,\n    return_loss=False,\n    eval_batch_size=256,\n    ):\n    '''Make predictions by TransTabClassifier.\n\n    Parameters\n    ----------\n    clf: TransTabClassifier\n        the classifier model to make predictions.\n\n    x_test: pd.DataFrame\n            input tabular data.\n\n    y_test: pd.Series\n        target labels for input x_test. will be ignored if ``return_loss=False``.\n    \n    return_loss: bool\n        set True will return the loss if y_test is given.\n    \n    eval_batch_size: int\n        the batch size for inference.\n\n    Returns\n    -------\n    pred_all: np.array\n        if ``return_loss=False``, return the predictions made by TransTabClassifier.\n\n    avg_loss: float\n        if ``return_loss=True``, return the mean loss of the predictions made by TransTabClassifier.\n\n    '''\n    clf.eval()\n    pred_list, loss_list = [], []\n    for i in range(0, len(x_test), eval_batch_size):\n        bs_x_test = x_test.iloc[i:i+eval_batch_size]\n        bs_y_test = y_test.iloc[i:i+eval_batch_size] if y_test is not None else None\n        with torch.no_grad():\n            logits, loss = clf(bs_x_test, bs_y_test)\n        \n        if loss is not None:\n            loss_list.append(loss.item())\n            \n        if logits.shape[-1] == 1: # binary classification\n            pred_list.append(logits.sigmoid().detach().cpu().numpy())\n        else: # multi-class classification\n            pred_list.append(torch.softmax(logits,-1).detach().cpu().numpy())\n            \n    pred_all = np.concatenate(pred_list, 0)\n        \n    if logits.shape[-1] == 1:\n        pred_all = pred_all.flatten()\n\n    if return_loss:\n        avg_loss = np.mean(loss_list)\n        return avg_loss\n    else:\n        return pred_all\n\ndef evaluate(ypred, y_test, metric='auc', seed=123, bootstrap=False):\n    np.random.seed(seed)\n    eval_fn = get_eval_metric_fn(metric)\n    res_list = []\n    stats_dict = defaultdict(list)\n    if bootstrap:\n        for i in range(10):\n            sub_idx = np.random.choice(np.arange(len(ypred)), len(ypred), replace=True)\n            sub_ypred = ypred[sub_idx]\n            sub_ytest = y_test.iloc[sub_idx]\n            try:\n                sub_res = eval_fn(sub_ytest, sub_ypred)\n            except ValueError:\n                print('evaluation went wrong!')\n            stats_dict[metric].append(sub_res)\n        for key in stats_dict.keys():\n            stats = stats_dict[key]\n            alpha = 0.95\n            p = ((1-alpha)/2) * 100\n            lower = max(0, np.percentile(stats, p))\n            p = (alpha+((1.0-alpha)/2.0)) * 100\n            upper = min(1.0, np.percentile(stats, p))\n            print('{} {:.2f} mean/interval {:.4f}({:.2f})'.format(key, alpha, (upper+lower)/2, (upper-lower)/2))\n            if key == metric: res_list.append((upper+lower)/2)\n    else:\n        res = eval_fn(y_test, ypred)\n        res_list.append(res)\n    return res_list\n\ndef get_eval_metric_fn(eval_metric):\n    fn_dict = {\n        'acc': acc_fn,\n        'auc': auc_fn,\n        'mse': mse_fn,\n        'val_loss': None,\n    }\n    return fn_dict[eval_metric]\n\ndef acc_fn(y, p):\n    y_p = np.argmax(p, -1)\n    return accuracy_score(y, y_p)\n\ndef auc_fn(y, p):\n    return 
roc_auc_score(y, p)\n\ndef mse_fn(y, p):\n    return mean_squared_error(y, p)\n\nclass EarlyStopping:\n    \"\"\"Early stops the training if validation loss doesn't improve after a given patience.\"\"\"\n    def __init__(self, patience=7, verbose=False, delta=0, output_dir='ckpt', trace_func=print, less_is_better=False):\n        \"\"\"\n        Args:\n            patience (int): How long to wait after last time validation loss improved.\n                            Default: 7\n            verbose (bool): If True, prints a message for each validation loss improvement. \n                            Default: False\n            delta (float): Minimum change in the monitored quantity to qualify as an improvement.\n                            Default: 0\n            path (str): Path for the checkpoint to be saved to.\n                            Default: 'checkpoint.pt'\n            trace_func (function): trace print function.\n                            Default: print     \n            less_is_better (bool): If True (e.g., val loss), the metric is less the better.       \n        \"\"\"\n        self.patience = patience\n        self.verbose = verbose\n        self.counter = 0\n        self.best_score = None\n        self.early_stop = False\n        self.val_loss_min = np.inf\n        self.delta = delta\n        self.path = output_dir\n        self.trace_func = trace_func\n        self.less_is_better = less_is_better\n\n    def __call__(self, val_loss, model):\n        if self.patience < 0: # no early stop\n            self.early_stop = False\n            return\n        \n        if self.less_is_better:\n            score = val_loss\n        else:    \n            score = -val_loss\n        if self.best_score is None:\n            self.best_score = score\n            self.save_checkpoint(val_loss, model)\n        elif score < self.best_score + self.delta:\n            self.counter += 1\n            self.trace_func(f'EarlyStopping counter: {self.counter} out of {self.patience}')\n            if self.counter >= self.patience:\n                self.early_stop = True\n        else:\n            self.best_score = score\n            self.save_checkpoint(val_loss, model)\n            self.counter = 0\n\n    def save_checkpoint(self, val_loss, model):\n        '''Saves model when validation loss decrease.'''\n        if self.verbose:\n            self.trace_func(f'Validation loss decreased ({self.val_loss_min:.6f} --> {val_loss:.6f}).  Saving model ...')\n        torch.save(model.state_dict(), os.path.join(self.path, constants.WEIGHTS_NAME))\n        self.val_loss_min = val_loss\n\n"
  },
  {
    "path": "transtab/modeling_transtab.py",
    "content": "import os, pdb\nimport math\nimport collections\nimport json\nfrom typing import Dict, Optional, Any, Union, Callable, List\n\nfrom transformers import BertTokenizer, BertTokenizerFast\nimport torch\nfrom torch import nn\nfrom torch import Tensor\nimport torch.nn.init as nn_init\nimport torch.nn.functional as F\nimport numpy as np\nimport pandas as pd\n\nfrom transtab import constants\n\nimport logging\nlogger = logging.getLogger(__name__)\n\nclass TransTabWordEmbedding(nn.Module):\n    r'''\n    Encode tokens drawn from column names, categorical and binary features.\n    '''\n    def __init__(self,\n        vocab_size,\n        hidden_dim,\n        padding_idx=0,\n        hidden_dropout_prob=0,\n        layer_norm_eps=1e-5,\n        ) -> None:\n        super().__init__()\n        self.word_embeddings = nn.Embedding(vocab_size, hidden_dim, padding_idx)\n        nn_init.kaiming_normal_(self.word_embeddings.weight)\n        self.norm = nn.LayerNorm(hidden_dim, eps=layer_norm_eps)\n        self.dropout = nn.Dropout(hidden_dropout_prob)\n\n    def forward(self, input_ids) -> Tensor:\n        embeddings = self.word_embeddings(input_ids)\n        embeddings = self.norm(embeddings)\n        embeddings =  self.dropout(embeddings)\n        return embeddings\n\nclass TransTabNumEmbedding(nn.Module):\n    r'''\n    Encode tokens drawn from column names and the corresponding numerical features.\n    '''\n    def __init__(self, hidden_dim) -> None:\n        super().__init__()\n        self.norm = nn.LayerNorm(hidden_dim)\n        self.num_bias = nn.Parameter(Tensor(1, 1, hidden_dim)) # add bias\n        nn_init.uniform_(self.num_bias, a=-1/math.sqrt(hidden_dim), b=1/math.sqrt(hidden_dim))\n\n    def forward(self, num_col_emb, x_num_ts, num_mask=None) -> Tensor:\n        '''args:\n        num_col_emb: numerical column embedding, (# numerical columns, emb_dim)\n        x_num_ts: numerical features, (bs, emb_dim)\n        num_mask: the mask for NaN numerical features, (bs, # numerical columns)\n        '''\n        num_col_emb = num_col_emb.unsqueeze(0).expand((x_num_ts.shape[0],-1,-1))\n        num_feat_emb = num_col_emb * x_num_ts.unsqueeze(-1).float() + self.num_bias\n        return num_feat_emb\n\nclass TransTabFeatureExtractor:\n    r'''\n    Process input dataframe to input indices towards transtab encoder,\n    usually used to build dataloader for paralleling loading.\n    '''\n    def __init__(self,\n        categorical_columns=None,\n        numerical_columns=None,\n        binary_columns=None,\n        disable_tokenizer_parallel=False,\n        ignore_duplicate_cols=False,\n        **kwargs,\n        ) -> None:\n        '''args:\n        categorical_columns: a list of categories feature names\n        numerical_columns: a list of numerical feature names\n        binary_columns: a list of yes or no feature names, accept binary indicators like\n            (yes,no); (true,false); (0,1).\n        disable_tokenizer_parallel: true if use extractor for collator function in torch.DataLoader\n        ignore_duplicate_cols: check if exists one col belongs to both cat/num or cat/bin or num/bin,\n            if set `true`, the duplicate cols will be deleted, else throws errors.\n        '''\n        if os.path.exists('./transtab/tokenizer'):\n            self.tokenizer = BertTokenizerFast.from_pretrained('./transtab/tokenizer')\n        else:\n            self.tokenizer = BertTokenizerFast.from_pretrained('bert-base-uncased')\n            
self.tokenizer.save_pretrained('./transtab/tokenizer')\n        self.tokenizer.__dict__['model_max_length'] = 512\n        if disable_tokenizer_parallel: # disable tokenizer parallel\n            os.environ[\"TOKENIZERS_PARALLELISM\"] = \"false\"\n        self.vocab_size = self.tokenizer.vocab_size\n        self.pad_token_id = self.tokenizer.pad_token_id\n\n        self.categorical_columns = categorical_columns\n        self.numerical_columns = numerical_columns\n        self.binary_columns = binary_columns\n        self.ignore_duplicate_cols = ignore_duplicate_cols\n\n        if categorical_columns is not None:\n            self.categorical_columns = list(set(categorical_columns))\n        if numerical_columns is not None:\n            self.numerical_columns = list(set(numerical_columns))\n        if binary_columns is not None:\n            self.binary_columns = list(set(binary_columns))\n\n        # check if column exists overlap\n        col_no_overlap, duplicate_cols = self._check_column_overlap(self.categorical_columns, self.numerical_columns, self.binary_columns)\n        if not self.ignore_duplicate_cols:\n            for col in duplicate_cols:\n                logger.error(f'Find duplicate cols named `{col}`, please process the raw data or set `ignore_duplicate_cols` to True!')\n            assert col_no_overlap, 'The assigned categorical_columns, numerical_columns, binary_columns should not have overlap! Please check your input.'\n        else:\n            self._solve_duplicate_cols(duplicate_cols)\n\n    def __call__(self, x, shuffle=False) -> Dict:\n        '''\n        Parameters\n        ----------\n        x: pd.DataFrame \n            with column names and features.\n\n        shuffle: bool\n            whether to shuffle the column order during training.\n\n        Returns\n        -------\n        encoded_inputs: a dict with {\n                'x_num': tensor contains numerical features,\n                'num_col_input_ids': tensor contains numerical column tokenized ids,\n                'x_cat_input_ids': tensor contains categorical column + feature ids,\n                'x_bin_input_ids': tensor contains binary column + feature ids,\n            }\n        '''\n        encoded_inputs = {\n            'x_num':None,\n            'num_col_input_ids':None,\n            'x_cat_input_ids':None,\n            'x_bin_input_ids':None,\n        }\n        col_names = x.columns.tolist()\n        cat_cols = [c for c in col_names if c in self.categorical_columns] if self.categorical_columns is not None else []\n        num_cols = [c for c in col_names if c in self.numerical_columns] if self.numerical_columns is not None else []\n        bin_cols = [c for c in col_names if c in self.binary_columns] if self.binary_columns is not None else []\n\n        if len(cat_cols+num_cols+bin_cols) == 0:\n            # take all columns as categorical columns!\n            cat_cols = col_names\n\n        if shuffle:\n            np.random.shuffle(cat_cols)\n            np.random.shuffle(num_cols)\n            np.random.shuffle(bin_cols)\n\n        # TODO:\n        # mask out NaN values like done in binary columns\n        if len(num_cols) > 0:\n            x_num = x[num_cols]\n            x_num = x_num.fillna(0) # fill NaN with zero\n            x_num_ts = torch.tensor(x_num.values, dtype=float)\n            num_col_ts = self.tokenizer(num_cols, padding=True, truncation=True, add_special_tokens=False, return_tensors='pt')\n            encoded_inputs['x_num'] = x_num_ts\n            
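# token ids of the numerical column names; their embeddings are later scaled by the feature values\n            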
encoded_inputs['num_col_input_ids'] = num_col_ts['input_ids']\n            encoded_inputs['num_att_mask'] = num_col_ts['attention_mask'] # mask out attention\n\n        if len(cat_cols) > 0:\n            x_cat = x[cat_cols].astype(str)\n            x_mask = (~pd.isna(x_cat)).astype(int)\n            x_cat = x_cat.fillna('')\n            x_cat = x_cat.apply(lambda x: x.name + ' '+ x) * x_mask # mask out nan features\n            x_cat_str = x_cat.agg(' '.join, axis=1).values.tolist()\n            x_cat_ts = self.tokenizer(x_cat_str, padding=True, truncation=True, add_special_tokens=False, return_tensors='pt')\n\n            encoded_inputs['x_cat_input_ids'] = x_cat_ts['input_ids']\n            encoded_inputs['cat_att_mask'] = x_cat_ts['attention_mask']\n\n        if len(bin_cols) > 0:\n            x_bin = x[bin_cols] # x_bin should already be integral (binary values in 0 & 1)\n            x_bin_str = x_bin.apply(lambda x: x.name + ' ') * x_bin\n            x_bin_str = x_bin_str.agg(' '.join, axis=1).values.tolist()\n            x_bin_ts = self.tokenizer(x_bin_str, padding=True, truncation=True, add_special_tokens=False, return_tensors='pt')\n            if x_bin_ts['input_ids'].shape[1] > 0: # not all false\n                encoded_inputs['x_bin_input_ids'] = x_bin_ts['input_ids']\n                encoded_inputs['bin_att_mask'] = x_bin_ts['attention_mask']\n\n        return encoded_inputs\n\n    def save(self, path):\n        '''save the feature extractor configuration to local dir.\n        '''\n        save_path = os.path.join(path, constants.EXTRACTOR_STATE_DIR)\n        if not os.path.exists(save_path):\n            os.makedirs(save_path)\n\n        # save tokenizer\n        tokenizer_path = os.path.join(save_path, constants.TOKENIZER_DIR)\n        self.tokenizer.save_pretrained(tokenizer_path)\n\n        # save other configurations\n        coltype_path = os.path.join(save_path, constants.EXTRACTOR_STATE_NAME)\n        col_type_dict = {\n            'categorical': self.categorical_columns,\n            'binary': self.binary_columns,\n            'numerical': self.numerical_columns,\n        }\n        with open(coltype_path, 'w', encoding='utf-8') as f:\n            f.write(json.dumps(col_type_dict))\n\n    def load(self, path):\n        '''load the feature extractor configuration from local dir.\n        '''\n        tokenizer_path = os.path.join(path, constants.TOKENIZER_DIR)\n        coltype_path = os.path.join(path, constants.EXTRACTOR_STATE_NAME)\n\n        self.tokenizer = BertTokenizerFast.from_pretrained(tokenizer_path)\n        with open(coltype_path, 'r', encoding='utf-8') as f:\n            col_type_dict = json.loads(f.read())\n\n        self.categorical_columns = col_type_dict['categorical']\n        self.numerical_columns = col_type_dict['numerical']\n        self.binary_columns = col_type_dict['binary']\n        logger.info(f'load feature extractor from {coltype_path}')\n\n    def update(self, cat=None, num=None, bin=None):\n        '''update cat/num/bin column maps.\n        '''\n        if cat is not None:\n            self.categorical_columns.extend(cat)\n            self.categorical_columns = list(set(self.categorical_columns))\n\n        if num is not None:\n            self.numerical_columns.extend(num)\n            self.numerical_columns = list(set(self.numerical_columns))\n\n        if bin is not None:\n            self.binary_columns.extend(bin)\n            self.binary_columns = list(set(self.binary_columns))\n\n        col_no_overlap, duplicate_cols = 
self._check_column_overlap(self.categorical_columns, self.numerical_columns, self.binary_columns)\n        if not self.ignore_duplicate_cols:\n            for col in duplicate_cols:\n                logger.error(f'Find duplicate cols named `{col}`, please process the raw data or set `ignore_duplicate_cols` to True!')\n            assert col_no_overlap, 'The assigned categorical_columns, numerical_columns, binary_columns should not have overlap! Please check your input.'\n        else:\n            self._solve_duplicate_cols(duplicate_cols)\n\n    def _check_column_overlap(self, cat_cols=None, num_cols=None, bin_cols=None):\n        all_cols = []\n        if cat_cols is not None: all_cols.extend(cat_cols)\n        if num_cols is not None: all_cols.extend(num_cols)\n        if bin_cols is not None: all_cols.extend(bin_cols)\n        org_length = len(all_cols)\n        if org_length == 0:\n            logger.warning('No cat/num/bin cols specified, will take ALL columns as categorical! Ignore this warning if you specify the `checkpoint` to load the model.')\n            return True, []\n        unq_length = len(list(set(all_cols)))\n        duplicate_cols = [item for item, count in collections.Counter(all_cols).items() if count > 1]\n        return org_length == unq_length, duplicate_cols\n\n    def _solve_duplicate_cols(self, duplicate_cols):\n        for col in duplicate_cols:\n            logger.warning(f'Find duplicate cols named `{col}`, will rename it with a type prefix during training!')\n            if col in self.categorical_columns:\n                self.categorical_columns.remove(col)\n                self.categorical_columns.append(f'[cat]{col}')\n            if col in self.numerical_columns:\n                self.numerical_columns.remove(col)\n                self.numerical_columns.append(f'[num]{col}')\n            if col in self.binary_columns:\n                self.binary_columns.remove(col)\n                self.binary_columns.append(f'[bin]{col}')\n\nclass TransTabFeatureProcessor(nn.Module):\n    r'''\n    Process inputs from feature extractor to map them to embeddings.\n    '''\n    def __init__(self,\n        vocab_size=None,\n        hidden_dim=128,\n        hidden_dropout_prob=0,\n        pad_token_id=0,\n        device='cuda:0',\n        ) -> None:\n        '''args:\n        vocab_size: the size of the tokenizer vocabulary\n        hidden_dim: the dimension of hidden embeddings\n        hidden_dropout_prob: the dropout ratio for the word embedding layer\n        pad_token_id: the token id used for padding\n        device: the device, ``\"cpu\"`` or ``\"cuda:0\"``\n        '''\n        super().__init__()\n        self.word_embedding = TransTabWordEmbedding(\n            vocab_size=vocab_size,\n            hidden_dim=hidden_dim,\n            hidden_dropout_prob=hidden_dropout_prob,\n            padding_idx=pad_token_id\n            )\n        self.num_embedding = TransTabNumEmbedding(hidden_dim)\n        self.align_layer = nn.Linear(hidden_dim, hidden_dim, bias=False)\n        self.device = device\n\n    def _avg_embedding_by_mask(self, embs, att_mask=None):\n        if att_mask is None:\n            return embs.mean(1)\n        else:\n            att_mask = att_mask.to(embs.device) # keep the mask on the same device as the embeddings\n            embs[att_mask==0] = 0\n            embs = embs.sum(1) / att_mask.sum(1,keepdim=True)\n            return embs\n\n    def forward(self,\n        x_num=None,\n        num_col_input_ids=None,\n        num_att_mask=None,\n        x_cat_input_ids=None,\n        cat_att_mask=None,\n        x_bin_input_ids=None,\n        bin_att_mask=None,\n        **kwargs,\n        ) -> 
Tensor:\n        '''args:\n        x_num: tensor of numerical features, (bs, # numerical columns)\n        num_col_input_ids: token ids of the numerical column names\n        num_att_mask: attention mask for the numerical column name tokens\n        x_cat_input_ids: token ids of categorical column names + values\n        cat_att_mask: attention mask for the categorical tokens\n        x_bin_input_ids: token ids of binary column names (for positive values)\n        bin_att_mask: attention mask for the binary tokens\n        '''\n        num_feat_embedding = None\n        cat_feat_embedding = None\n        bin_feat_embedding = None\n\n        if x_num is not None and num_col_input_ids is not None:\n            num_col_emb = self.word_embedding(num_col_input_ids.to(self.device)) # (# numerical columns, # tokens, embedding size)\n            x_num = x_num.to(self.device)\n            num_col_emb = self._avg_embedding_by_mask(num_col_emb, num_att_mask)\n            num_feat_embedding = self.num_embedding(num_col_emb, x_num)\n            num_feat_embedding = self.align_layer(num_feat_embedding)\n\n        if x_cat_input_ids is not None:\n            cat_feat_embedding = self.word_embedding(x_cat_input_ids.to(self.device))\n            cat_feat_embedding = self.align_layer(cat_feat_embedding)\n\n        if x_bin_input_ids is not None:\n            if x_bin_input_ids.shape[1] == 0: # all false, pad zero\n                x_bin_input_ids = torch.zeros(x_bin_input_ids.shape[0],dtype=int)[:,None]\n            bin_feat_embedding = self.word_embedding(x_bin_input_ids.to(self.device))\n            bin_feat_embedding = self.align_layer(bin_feat_embedding)\n\n        # concat all embeddings\n        emb_list = []\n        att_mask_list = []\n        if num_feat_embedding is not None:\n            emb_list += [num_feat_embedding]\n            att_mask_list += [torch.ones(num_feat_embedding.shape[0], num_feat_embedding.shape[1])]\n        if cat_feat_embedding is not None:\n            emb_list += [cat_feat_embedding]\n            att_mask_list += [cat_att_mask]\n        if bin_feat_embedding is not None:\n            emb_list += [bin_feat_embedding]\n            att_mask_list += [bin_att_mask]\n        if len(emb_list) == 0: raise Exception('no feature found belonging to numerical, categorical, or binary, check your data!')\n        all_feat_embedding = torch.cat(emb_list, 1).float()\n        attention_mask = torch.cat(att_mask_list, 1).to(all_feat_embedding.device)\n        return {'embedding': all_feat_embedding, 'attention_mask': attention_mask}\n\ndef _get_activation_fn(activation):\n    if activation == \"relu\":\n        return F.relu\n    elif activation == \"gelu\":\n        return F.gelu\n    elif activation == 'selu':\n        return F.selu\n    elif activation == 'leakyrelu':\n        return F.leaky_relu\n    raise RuntimeError(\"activation should be relu/gelu/selu/leakyrelu, not {}\".format(activation))\n\nclass TransTabTransformerLayer(nn.Module):\n    __constants__ = ['batch_first', 'norm_first']\n    def __init__(self, d_model, nhead, dim_feedforward=2048, dropout=0.1, activation=F.relu,\n                 layer_norm_eps=1e-5, batch_first=True, norm_first=False,\n                 device=None, dtype=None, use_layer_norm=True) -> None:\n        factory_kwargs = {'device': device, 'dtype': dtype}\n        super().__init__()\n        self.self_attn = nn.MultiheadAttention(d_model, nhead, batch_first=batch_first,\n                                            **factory_kwargs)\n        # Implementation of Feedforward model\n        self.linear1 = nn.Linear(d_model, dim_feedforward, **factory_kwargs)\n        self.dropout = nn.Dropout(dropout)\n        self.linear2 = nn.Linear(dim_feedforward, d_model, **factory_kwargs)\n\n        # Implementation of gates\n        self.gate_linear = 
\nclass TransTabTransformerLayer(nn.Module):\n    __constants__ = ['batch_first', 'norm_first']\n    def __init__(self, d_model, nhead, dim_feedforward=2048, dropout=0.1, activation=F.relu,\n                 layer_norm_eps=1e-5, batch_first=True, norm_first=False,\n                 device=None, dtype=None, use_layer_norm=True) -> None:\n        factory_kwargs = {'device': device, 'dtype': dtype}\n        super().__init__()\n        self.self_attn = nn.MultiheadAttention(d_model, nhead, batch_first=batch_first,\n                                            **factory_kwargs)\n        # Implementation of Feedforward model\n        self.linear1 = nn.Linear(d_model, dim_feedforward, **factory_kwargs)\n        self.dropout = nn.Dropout(dropout)\n        self.linear2 = nn.Linear(dim_feedforward, d_model, **factory_kwargs)\n\n        # Implementation of gates\n        self.gate_linear = nn.Linear(d_model, 1, bias=False)\n        self.gate_act = nn.Sigmoid()\n\n        self.norm_first = norm_first\n        self.use_layer_norm = use_layer_norm\n\n        if self.use_layer_norm:\n            self.norm1 = nn.LayerNorm(d_model, eps=layer_norm_eps, **factory_kwargs)\n            self.norm2 = nn.LayerNorm(d_model, eps=layer_norm_eps, **factory_kwargs)\n        self.dropout1 = nn.Dropout(dropout)\n        self.dropout2 = nn.Dropout(dropout)\n\n        # Legacy string support for activation function.\n        if isinstance(activation, str):\n            self.activation = _get_activation_fn(activation)\n        else:\n            self.activation = activation\n\n    # self-attention block\n    def _sa_block(self, x: Tensor,\n                  attn_mask: Optional[Tensor], key_padding_mask: Optional[Tensor]) -> Tensor:\n        src = x\n        key_padding_mask = ~key_padding_mask.bool()\n        x = self.self_attn(x, x, x,\n                           attn_mask=attn_mask,\n                           key_padding_mask=key_padding_mask,\n                           )[0]\n        return self.dropout1(x)\n\n    # feed forward block\n    def _ff_block(self, x: Tensor) -> Tensor:\n        g = self.gate_act(self.gate_linear(x))\n        h = self.linear1(x)\n        h = h * g # add gate\n        h = self.linear2(self.dropout(self.activation(h)))\n        return self.dropout2(h)\n\n    def __setstate__(self, state):\n        if 'activation' not in state:\n            state['activation'] = F.relu\n        super().__setstate__(state)\n\n    def forward(self, src, src_mask=None, src_key_padding_mask=None, is_causal=None, **kwargs) -> Tensor:\n        r\"\"\"Pass the input through the encoder layer.\n\n        Args:\n            src: the sequence to the encoder layer (required).\n            src_mask: the mask for the src sequence (optional).\n            src_key_padding_mask: the mask for the src keys per batch (optional).\n\n        Shape:\n            see the docs in Transformer class.\n        \"\"\"\n        # see Fig. 1 of https://arxiv.org/pdf/2002.04745v1.pdf\n        x = src\n        if self.use_layer_norm:\n            if self.norm_first:\n                x = x + self._sa_block(self.norm1(x), src_mask, src_key_padding_mask)\n                x = x + self._ff_block(self.norm2(x))\n            else:\n                x = self.norm1(x + self._sa_block(x, src_mask, src_key_padding_mask))\n                x = self.norm2(x + self._ff_block(x))\n\n        else: # do not use layer norm\n            x = x + self._sa_block(x, src_mask, src_key_padding_mask)\n            x = x + self._ff_block(x)\n        return x\n\nclass TransTabInputEncoder(nn.Module):\n    '''\n    Build a feature encoder that maps input tabular samples to embeddings.\n\n    Parameters:\n    -----------\n    feature_extractor: TransTabFeatureExtractor\n        a feature extractor that tokenizes the input tables.\n\n    feature_processor: TransTabFeatureProcessor\n        a feature processor that maps the tokenized inputs to embeddings.\n\n    device: str\n        the device, ``\"cpu\"`` or ``\"cuda:0\"``.\n\n    '''\n    def __init__(self,\n        feature_extractor,\n        feature_processor,\n        device='cuda:0',\n        ):\n        super().__init__()\n        self.feature_extractor = feature_extractor\n        self.feature_processor = feature_processor\n        self.device = device\n        self.to(device)\n\n    def forward(self, x):\n        '''\n        Encode input tabular samples into embeddings.\n\n        Parameters\n        ----------\n        x: pd.DataFrame\n            with column names and features.\n        '''\n        tokenized = self.feature_extractor(x)\n        embeds = self.feature_processor(**tokenized)\n        return embeds\n\n    def load(self, ckpt_dir):\n        # load feature extractor\n        self.feature_extractor.load(os.path.join(ckpt_dir, constants.EXTRACTOR_STATE_DIR))\n\n        # load embedding layer\n        model_name = os.path.join(ckpt_dir, constants.INPUT_ENCODER_NAME)\n        state_dict = torch.load(model_name, map_location='cpu')\n        missing_keys, unexpected_keys = self.load_state_dict(state_dict, strict=False)\n        logger.info(f'missing keys: {missing_keys}')\n        logger.info(f'unexpected keys: {unexpected_keys}')\n        logger.info(f'load model from {ckpt_dir}')\n\nclass TransTabEncoder(nn.Module):\n    def __init__(self,\n        hidden_dim=128,\n        num_layer=2,\n        num_attention_head=2,\n        hidden_dropout_prob=0,\n        ffn_dim=256,\n        activation='relu',\n        ):\n        super().__init__()\n        self.transformer_encoder = nn.ModuleList(\n            [\n            TransTabTransformerLayer(\n                d_model=hidden_dim,\n                nhead=num_attention_head,\n                dropout=hidden_dropout_prob,\n                dim_feedforward=ffn_dim,\n                batch_first=True,\n                layer_norm_eps=1e-5,\n                norm_first=False,\n                use_layer_norm=True,\n                activation=activation,)\n            ]\n            )\n        if num_layer > 1:\n            encoder_layer = TransTabTransformerLayer(d_model=hidden_dim,\n                nhead=num_attention_head,\n                dropout=hidden_dropout_prob,\n                dim_feedforward=ffn_dim,\n                batch_first=True,\n                layer_norm_eps=1e-5,\n                norm_first=False,\n                use_layer_norm=True,\n                activation=activation,\n                )\n            stacked_transformer = nn.TransformerEncoder(encoder_layer, num_layers=num_layer-1)\n            self.transformer_encoder.append(stacked_transformer)\n\n    def forward(self, embedding, attention_mask=None, **kwargs) -> Tensor:\n        '''args:\n        embedding: bs, num_token, hidden_dim\n        '''\n        outputs = embedding\n        for i, mod in enumerate(self.transformer_encoder):\n            outputs = mod(outputs, src_key_padding_mask=attention_mask)\n        return outputs\n
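\n# Shape note (descriptive): TransTabEncoder.forward maps an embedding of shape\n# (bs, num_token, hidden_dim) to an output of the same shape; attention_mask marks\n# valid tokens with 1 and padded positions with 0.\n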
\nclass TransTabLinearClassifier(nn.Module):\n    def __init__(self,\n        num_class,\n        hidden_dim=128) -> None:\n        super().__init__()\n        if num_class <= 2:\n            self.fc = nn.Linear(hidden_dim, 1)\n        else:\n            self.fc = nn.Linear(hidden_dim, num_class)\n        self.norm = nn.LayerNorm(hidden_dim)\n\n    def forward(self, x) -> Tensor:\n        x = x[:,0,:] # take the cls token embedding\n        x = self.norm(x)\n        logits = self.fc(x)\n        return logits\n\nclass TransTabLinearRegressor(nn.Module):\n    def __init__(self,\n        hidden_dim=128) -> None:\n        super().__init__()\n        self.fc = nn.Linear(hidden_dim, 1)\n        self.norm = nn.LayerNorm(hidden_dim)\n\n    def forward(self, x) -> Tensor:\n        x = x[:,0,:] # take the cls token embedding\n        x = self.norm(x)\n        output = self.fc(x)\n        return output\n\nclass TransTabProjectionHead(nn.Module):\n    def __init__(self,\n        hidden_dim=128,\n        projection_dim=128):\n        super().__init__()\n        self.dense = nn.Linear(hidden_dim, projection_dim, bias=False)\n\n    def forward(self, x) -> Tensor:\n        h = self.dense(x)\n        return h\n\nclass TransTabCLSToken(nn.Module):\n    '''add a learnable cls token embedding at the start of each sequence.\n    '''\n    def __init__(self, hidden_dim) -> None:\n        super().__init__()\n        self.weight = nn.Parameter(Tensor(hidden_dim))\n        nn_init.uniform_(self.weight, a=-1/math.sqrt(hidden_dim), b=1/math.sqrt(hidden_dim))\n        self.hidden_dim = hidden_dim\n\n    def expand(self, *leading_dimensions):\n        new_dims = (1,) * (len(leading_dimensions)-1)\n        return self.weight.view(*new_dims, -1).expand(*leading_dimensions, -1)\n\n    def forward(self, embedding, attention_mask=None, **kwargs) -> Tensor:\n        embedding = torch.cat([self.expand(len(embedding), 1), embedding], dim=1)\n        outputs = {'embedding': embedding}\n        if attention_mask is not None:\n            attention_mask = torch.cat([torch.ones(attention_mask.shape[0],1).to(attention_mask.device), attention_mask], 1)\n        outputs['attention_mask'] = attention_mask\n        return outputs\n\nclass TransTabModel(nn.Module):\n    '''The base transtab model for downstream tasks like contrastive learning, binary classification, etc.\n    All models subclass this basemodel and usually rewrite the ``forward`` function. Refer to the source code of\n    :class:`transtab.modeling_transtab.TransTabClassifier` or :class:`transtab.modeling_transtab.TransTabForCL` for the implementation details.\n\n    Parameters\n    ----------\n    categorical_columns: list\n        a list of categorical feature names.\n\n    numerical_columns: list\n        a list of numerical feature names.\n\n    binary_columns: list\n        a list of binary feature names, accept binary indicators like (yes,no); (true,false); (0,1).\n\n    feature_extractor: TransTabFeatureExtractor\n        a feature extractor to tokenize the input tables. 
if not passed the model will build itself.\n\n    hidden_dim: int\n        the dimension of hidden embeddings.\n\n    num_layer: int\n        the number of transformer layers used in the encoder.\n\n    num_attention_head: int\n        the number of heads of multihead self-attention layer in the transformers.\n\n    hidden_dropout_prob: float\n        the dropout ratio in the transformer encoder.\n\n    ffn_dim: int\n        the dimension of feed-forward layer in the transformer layer.\n\n    activation: str\n        the name of the activation function to use, supports ``\"relu\"``, ``\"gelu\"``, ``\"selu\"``, ``\"leakyrelu\"``.\n\n    device: str\n        the device, ``\"cpu\"`` or ``\"cuda:0\"``.\n\n    Returns\n    -------\n    A TransTabModel model.\n\n    '''\n    def __init__(self,\n        categorical_columns=None,\n        numerical_columns=None,\n        binary_columns=None,\n        feature_extractor=None,\n        hidden_dim=128,\n        num_layer=2,\n        num_attention_head=8,\n        hidden_dropout_prob=0.1,\n        ffn_dim=256,\n        activation='relu',\n        device='cuda:0',\n        **kwargs,\n        ) -> None:\n\n        super().__init__()\n        self.categorical_columns=categorical_columns\n        self.numerical_columns=numerical_columns\n        self.binary_columns=binary_columns\n        if categorical_columns is not None:\n            self.categorical_columns = list(set(categorical_columns))\n        if numerical_columns is not None:\n            self.numerical_columns = list(set(numerical_columns))\n        if binary_columns is not None:\n            self.binary_columns = list(set(binary_columns))\n\n        if feature_extractor is None:\n            feature_extractor = TransTabFeatureExtractor(\n                categorical_columns=self.categorical_columns,\n                numerical_columns=self.numerical_columns,\n                binary_columns=self.binary_columns,\n                **kwargs,\n            )\n\n        feature_processor = TransTabFeatureProcessor(\n            vocab_size=feature_extractor.vocab_size,\n            pad_token_id=feature_extractor.pad_token_id,\n            hidden_dim=hidden_dim,\n            hidden_dropout_prob=hidden_dropout_prob,\n            device=device,\n            )\n\n        self.input_encoder = TransTabInputEncoder(\n            feature_extractor=feature_extractor,\n            feature_processor=feature_processor,\n            device=device,\n            )\n\n        self.encoder = TransTabEncoder(\n            hidden_dim=hidden_dim,\n            num_layer=num_layer,\n            num_attention_head=num_attention_head,\n            hidden_dropout_prob=hidden_dropout_prob,\n            ffn_dim=ffn_dim,\n            activation=activation,\n            )\n\n        self.cls_token = TransTabCLSToken(hidden_dim=hidden_dim)\n        self.device = device\n        self.to(device)\n\n    def forward(self, x, y=None):\n        '''Extract the embeddings based on input tables.\n\n        Parameters\n        ----------\n        x: pd.DataFrame\n            a batch of samples stored in pd.DataFrame.\n\n        y: pd.Series\n            the corresponding labels for each sample in ``x``. 
ignored for the basemodel.\n\n        Returns\n        -------\n        final_cls_embedding: torch.Tensor\n            the [CLS] embedding at the end of transformer encoder.\n\n        '''\n        embeded = self.input_encoder(x)\n        embeded = self.cls_token(**embeded)\n\n        # go through transformers, get final cls embedding\n        encoder_output = self.encoder(**embeded)\n\n        # get cls token\n        final_cls_embedding = encoder_output[:,0,:]\n        return final_cls_embedding\n\n    def load(self, ckpt_dir):\n        '''Load the model state_dict and feature_extractor configuration\n        from the ``ckpt_dir``.\n\n        Parameters\n        ----------\n        ckpt_dir: str\n            the directory path to load.\n\n        Returns\n        -------\n        None\n\n        '''\n        # load model weight state dict\n        model_name = os.path.join(ckpt_dir, constants.WEIGHTS_NAME)\n        state_dict = torch.load(model_name, map_location='cpu')\n        missing_keys, unexpected_keys = self.load_state_dict(state_dict, strict=False)\n        logger.info(f'missing keys: {missing_keys}')\n        logger.info(f'unexpected keys: {unexpected_keys}')\n        logger.info(f'load model from {ckpt_dir}')\n\n        # load feature extractor\n        self.input_encoder.feature_extractor.load(os.path.join(ckpt_dir, constants.EXTRACTOR_STATE_DIR))\n        self.binary_columns = self.input_encoder.feature_extractor.binary_columns\n        self.categorical_columns = self.input_encoder.feature_extractor.categorical_columns\n        self.numerical_columns = self.input_encoder.feature_extractor.numerical_columns\n\n    def save(self, ckpt_dir):\n        '''Save the model state_dict and feature_extractor configuration\n        to the ``ckpt_dir``.\n\n        Parameters\n        ----------\n        ckpt_dir: str\n            the directory path to save.\n\n        Returns\n        -------\n        None\n\n        '''\n        # save model weight state dict\n        if not os.path.exists(ckpt_dir): os.makedirs(ckpt_dir, exist_ok=True)\n        state_dict = self.state_dict()\n        torch.save(state_dict, os.path.join(ckpt_dir, constants.WEIGHTS_NAME))\n        if self.input_encoder.feature_extractor is not None:\n            self.input_encoder.feature_extractor.save(ckpt_dir)\n\n        # save the input encoder separately\n        state_dict_input_encoder = self.input_encoder.state_dict()\n        torch.save(state_dict_input_encoder, os.path.join(ckpt_dir, constants.INPUT_ENCODER_NAME))\n        return None\n\n    def update(self, config):\n        '''Update the configuration of feature extractor's column map for cat, num, and bin cols.\n        Or update the number of classes for the output classifier layer.\n\n        Parameters\n        ----------\n        config: dict\n            a dict of configurations: keys cat:list, num:list, bin:list are to specify the new column names;\n            key num_class:int is to specify the number of classes for finetuning on a new dataset.\n\n        Returns\n        -------\n        None\n\n        '''\n\n        col_map = {}\n        for k,v in config.items():\n            if k in ['cat','num','bin']: col_map[k] = v\n\n        self.input_encoder.feature_extractor.update(**col_map)\n        self.binary_columns = self.input_encoder.feature_extractor.binary_columns\n        self.categorical_columns = self.input_encoder.feature_extractor.categorical_columns\n        self.numerical_columns = self.input_encoder.feature_extractor.numerical_columns\n\n 
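\n        # a new num_class in the config means the output head must be rebuilt;\n        # _adapt_to_new_num_class below handles that.\n 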
       if 'num_class' in config:\n            num_class = config['num_class']\n            self._adapt_to_new_num_class(num_class)\n\n        return None\n\n    def _check_column_overlap(self, cat_cols=None, num_cols=None, bin_cols=None):\n        all_cols = []\n        if cat_cols is not None: all_cols.extend(cat_cols)\n        if num_cols is not None: all_cols.extend(num_cols)\n        if bin_cols is not None: all_cols.extend(bin_cols)\n        org_length = len(all_cols)\n        unq_length = len(list(set(all_cols)))\n        duplicate_cols = [item for item, count in collections.Counter(all_cols).items() if count > 1]\n        return org_length == unq_length, duplicate_cols\n\n    def _solve_duplicate_cols(self, duplicate_cols):\n        for col in duplicate_cols:\n            logger.warning(f'Found duplicate column `{col}`, will ignore it during training!')\n            if col in self.categorical_columns:\n                self.categorical_columns.remove(col)\n                self.categorical_columns.append(f'[cat]{col}')\n            if col in self.numerical_columns:\n                self.numerical_columns.remove(col)\n                self.numerical_columns.append(f'[num]{col}')\n            if col in self.binary_columns:\n                self.binary_columns.remove(col)\n                self.binary_columns.append(f'[bin]{col}')\n\n    def _adapt_to_new_num_class(self, num_class):\n        # use getattr since the base model may not define num_class yet\n        if num_class != getattr(self, 'num_class', None):\n            self.num_class = num_class\n            self.clf = TransTabLinearClassifier(num_class, hidden_dim=self.cls_token.hidden_dim)\n            self.clf.to(self.device)\n            if self.num_class > 2:\n                self.loss_fn = nn.CrossEntropyLoss(reduction='none')\n            else:\n                self.loss_fn = nn.BCEWithLogitsLoss(reduction='none')\n            logger.info(f'Built a new classifier with {num_class} output classes; further finetuning is needed before use.')\n\n
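# Editor's sketch of adapting a trained model to a new table (hypothetical column names):\n#\n#   model.update({'cat': ['city'], 'num': ['age', 'income'], 'num_class': 3})\n#\n# remaps the column types and, via _adapt_to_new_num_class, rebuilds the output\n# head for 3 classes; the new head then requires further finetuning.\n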
\nclass TransTabClassifier(TransTabModel):\n    '''The classifier model subclass from :class:`transtab.modeling_transtab.TransTabModel`.\n\n    Parameters\n    ----------\n    categorical_columns: list\n        a list of categorical feature names.\n\n    numerical_columns: list\n        a list of numerical feature names.\n\n    binary_columns: list\n        a list of binary feature names, accept binary indicators like (yes,no); (true,false); (0,1).\n\n    feature_extractor: TransTabFeatureExtractor\n        a feature extractor to tokenize the input tables. if not passed the model will build itself.\n\n    num_class: int\n        number of output classes to be predicted.\n\n    hidden_dim: int\n        the dimension of hidden embeddings.\n\n    num_layer: int\n        the number of transformer layers used in the encoder.\n\n    num_attention_head: int\n        the number of heads of multihead self-attention layer in the transformers.\n\n    hidden_dropout_prob: float\n        the dropout ratio in the transformer encoder.\n\n    ffn_dim: int\n        the dimension of feed-forward layer in the transformer layer.\n\n    activation: str\n        the name of the activation function to use, supports ``\"relu\"``, ``\"gelu\"``, ``\"selu\"``, ``\"leakyrelu\"``.\n\n    device: str\n        the device, ``\"cpu\"`` or ``\"cuda:0\"``.\n\n    Returns\n    -------\n    A TransTabClassifier model.\n\n    '''\n    def __init__(self,\n        categorical_columns=None,\n        numerical_columns=None,\n        binary_columns=None,\n        feature_extractor=None,\n        num_class=2,\n        hidden_dim=128,\n        num_layer=2,\n        num_attention_head=8,\n        hidden_dropout_prob=0,\n        ffn_dim=256,\n        activation='relu',\n        device='cuda:0',\n        **kwargs,\n        ) -> None:\n        super().__init__(\n            categorical_columns=categorical_columns,\n            numerical_columns=numerical_columns,\n            binary_columns=binary_columns,\n            feature_extractor=feature_extractor,\n            hidden_dim=hidden_dim,\n            num_layer=num_layer,\n            num_attention_head=num_attention_head,\n            hidden_dropout_prob=hidden_dropout_prob,\n            ffn_dim=ffn_dim,\n            activation=activation,\n            device=device,\n            **kwargs,\n        )\n        self.num_class = num_class\n        self.clf = TransTabLinearClassifier(num_class=num_class, hidden_dim=hidden_dim)\n        if self.num_class > 2:\n            self.loss_fn = nn.CrossEntropyLoss(reduction='none')\n        else:\n            self.loss_fn = nn.BCEWithLogitsLoss(reduction='none')\n        self.to(device)\n\n    def forward(self, x, y=None):\n        '''Make forward pass given the input feature ``x`` and label ``y`` (optional).\n\n        Parameters\n        ----------\n        x: pd.DataFrame or dict\n            pd.DataFrame: a batch of raw tabular samples; dict: the output of TransTabFeatureExtractor.\n\n        y: pd.Series\n            the corresponding labels for each sample in ``x``. 
if label is given, the model will return\n            the classification loss by ``self.loss_fn``.\n\n        Returns\n        -------\n        logits: torch.Tensor\n            the classification logits computed from the [CLS] embedding.\n\n        loss: torch.Tensor or None\n            the classification loss.\n\n        '''\n        if isinstance(x, dict):\n            # input is the pre-tokenized encoded inputs\n            inputs = x\n        elif isinstance(x, pd.DataFrame):\n            # input is dataframe\n            inputs = self.input_encoder.feature_extractor(x)\n        else:\n            raise ValueError(f'TransTabClassifier takes inputs of dict or pd.DataFrame, got {type(x)}.')\n\n        outputs = self.input_encoder.feature_processor(**inputs)\n        outputs = self.cls_token(**outputs)\n\n        # go through transformers, get the first cls embedding\n        encoder_output = self.encoder(**outputs) # bs, seqlen+1, hidden_dim\n\n        # classifier\n        logits = self.clf(encoder_output)\n\n        if y is not None:\n            # compute classification loss\n            if self.num_class == 2:\n                y_ts = torch.tensor(y.values).to(self.device).float()\n                loss = self.loss_fn(logits.flatten(), y_ts)\n            else:\n                y_ts = torch.tensor(y.values).to(self.device).long()\n                loss = self.loss_fn(logits, y_ts)\n            loss = loss.mean()\n        else:\n            loss = None\n\n        return logits, loss\n
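\n# Editor's usage sketch for TransTabClassifier (illustrative; column names and df are hypothetical):\n#\n#   model = TransTabClassifier(categorical_columns=['workclass'], numerical_columns=['age'],\n#                              num_class=2, device='cpu')\n#   logits, loss = model(df.drop(columns=['label']), df['label'])\n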
    \nclass TransTabRegressor(TransTabModel):\n    '''The regression model subclass from :class:`transtab.modeling_transtab.TransTabModel`.\n\n    Parameters\n    ----------\n    categorical_columns: list\n        a list of categorical feature names.\n\n    numerical_columns: list\n        a list of numerical feature names.\n\n    binary_columns: list\n        a list of binary feature names, accept binary indicators like (yes,no); (true,false); (0,1).\n\n    feature_extractor: TransTabFeatureExtractor\n        a feature extractor to tokenize the input tables. if not passed the model will build itself.\n\n    num_class: int\n        number of outputs; kept for API compatibility, the regressor predicts a single value.\n\n    hidden_dim: int\n        the dimension of hidden embeddings.\n\n    num_layer: int\n        the number of transformer layers used in the encoder.\n\n    num_attention_head: int\n        the number of heads of multihead self-attention layer in the transformers.\n\n    hidden_dropout_prob: float\n        the dropout ratio in the transformer encoder.\n\n    ffn_dim: int\n        the dimension of feed-forward layer in the transformer layer.\n\n    activation: str\n        the name of the activation function to use, supports ``\"relu\"``, ``\"gelu\"``, ``\"selu\"``, ``\"leakyrelu\"``.\n\n    device: str\n        the device, ``\"cpu\"`` or ``\"cuda:0\"``.\n\n    Returns\n    -------\n    A TransTabRegressor model.\n\n    '''\n    def __init__(self,\n        categorical_columns=None,\n        numerical_columns=None,\n        binary_columns=None,\n        feature_extractor=None,\n        num_class=1,\n        hidden_dim=128,\n        num_layer=2,\n        num_attention_head=8,\n        hidden_dropout_prob=0,\n        ffn_dim=256,\n        activation='relu',\n        device='cuda:0',\n        **kwargs,\n        ) -> None:\n        super().__init__(\n            categorical_columns=categorical_columns,\n            numerical_columns=numerical_columns,\n            binary_columns=binary_columns,\n            feature_extractor=feature_extractor,\n            hidden_dim=hidden_dim,\n            num_layer=num_layer,\n            num_attention_head=num_attention_head,\n            hidden_dropout_prob=hidden_dropout_prob,\n            ffn_dim=ffn_dim,\n            activation=activation,\n            device=device,\n            **kwargs,\n        )\n        self.num_class = num_class\n        self.regressor = TransTabLinearRegressor(hidden_dim=hidden_dim)\n\n        self.loss_fn = nn.MSELoss()\n        self.to(device)\n\n    def forward(self, x, y=None):\n        '''Make forward pass given the input feature ``x`` and label ``y`` (optional).\n\n        Parameters\n        ----------\n        x: pd.DataFrame or dict\n            pd.DataFrame: a batch of raw tabular samples; dict: the output of TransTabFeatureExtractor.\n\n        y: pd.Series\n            the corresponding labels for each sample in ``x``. 
if label is given, the model will return\n            the regression loss by ``self.loss_fn``.\n\n        Returns\n        -------\n        output: torch.Tensor\n            the predicted continuous value for each sample.\n\n        loss: torch.Tensor or None\n            the regression loss.\n\n        '''\n        if isinstance(x, dict):\n            # input is the pre-tokenized encoded inputs\n            inputs = x\n        elif isinstance(x, pd.DataFrame):\n            # input is dataframe\n            inputs = self.input_encoder.feature_extractor(x)\n        else:\n            raise ValueError(f'TransTabRegressor takes inputs of dict or pd.DataFrame, got {type(x)}.')\n\n        outputs = self.input_encoder.feature_processor(**inputs)\n        outputs = self.cls_token(**outputs)\n\n        # go through transformers, get the first cls embedding\n        encoder_output = self.encoder(**outputs) # bs, seqlen+1, hidden_dim\n\n        # regression\n        output = self.regressor(encoder_output)\n\n        if y is not None:\n            # compute regression loss\n            y_ts = torch.tensor(y.values).to(self.device).float()\n            loss = self.loss_fn(output.flatten(), y_ts)\n            loss = loss.mean()\n        else:\n            loss = None\n\n        return output, loss\n
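\n# Editor's usage sketch for TransTabRegressor (illustrative; column names and df are hypothetical):\n#\n#   model = TransTabRegressor(numerical_columns=['age', 'income'], device='cpu')\n#   pred, loss = model(df.drop(columns=['target']), df['target'])\n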
\nclass TransTabForCL(TransTabModel):\n    '''The contrastive learning model subclass from :class:`transtab.modeling_transtab.TransTabModel`.\n\n    Parameters\n    ----------\n    categorical_columns: list\n        a list of categorical feature names.\n\n    numerical_columns: list\n        a list of numerical feature names.\n\n    binary_columns: list\n        a list of binary feature names, accept binary indicators like (yes,no); (true,false); (0,1).\n\n    feature_extractor: TransTabFeatureExtractor\n        a feature extractor to tokenize the input tables. if not passed the model will build itself.\n\n    hidden_dim: int\n        the dimension of hidden embeddings.\n\n    num_layer: int\n        the number of transformer layers used in the encoder.\n\n    num_attention_head: int\n        the number of heads of multihead self-attention layer in the transformers.\n\n    hidden_dropout_prob: float\n        the dropout ratio in the transformer encoder.\n\n    ffn_dim: int\n        the dimension of feed-forward layer in the transformer layer.\n\n    projection_dim: int\n        the dimension of projection head on the top of encoder.\n\n    overlap_ratio: float\n        the overlap ratio of columns of different partitions when doing subsetting.\n\n    num_partition: int\n        the number of partitions made for vertical-partition contrastive learning.\n\n    supervised: bool\n        whether to use supervised VPCL; otherwise self-supervised VPCL is used.\n\n    temperature: float\n        temperature used to compute logits for contrastive learning.\n\n    base_temperature: float\n        base temperature used to normalize the temperature.\n\n    activation: str\n        the name of the activation function to use, supports ``\"relu\"``, ``\"gelu\"``, ``\"selu\"``, ``\"leakyrelu\"``.\n\n    device: str\n        the device, ``\"cpu\"`` or ``\"cuda:0\"``.\n\n    Returns\n    -------\n    A TransTabForCL model.\n\n    '''\n    def __init__(self,\n        categorical_columns=None,\n        numerical_columns=None,\n        binary_columns=None,\n        feature_extractor=None,\n        hidden_dim=128,\n        num_layer=2,\n        num_attention_head=8,\n        hidden_dropout_prob=0,\n        ffn_dim=256,\n        projection_dim=128,\n        overlap_ratio=0.1,\n        num_partition=2,\n        supervised=True,\n        temperature=10,\n        base_temperature=10,\n        activation='relu',\n        device='cuda:0',\n        **kwargs,\n        ) -> None:\n        super().__init__(\n            categorical_columns=categorical_columns,\n            numerical_columns=numerical_columns,\n            binary_columns=binary_columns,\n            feature_extractor=feature_extractor,\n            hidden_dim=hidden_dim,\n            num_layer=num_layer,\n            num_attention_head=num_attention_head,\n            hidden_dropout_prob=hidden_dropout_prob,\n            ffn_dim=ffn_dim,\n            activation=activation,\n            device=device,\n            **kwargs,\n            )\n        assert num_partition > 0, f'number of contrastive subsets must be greater than 0, got {num_partition}'\n        assert isinstance(num_partition, int), f'number of contrastive subsets must be int, got {type(num_partition)}'\n        assert overlap_ratio >= 0 and overlap_ratio < 1, f'overlap_ratio must be in [0, 1), got {overlap_ratio}'\n        self.projection_head = TransTabProjectionHead(hidden_dim, projection_dim)\n        self.cross_entropy_loss = nn.CrossEntropyLoss()\n        self.temperature = temperature\n        self.base_temperature = base_temperature\n        self.num_partition = num_partition\n        self.overlap_ratio = overlap_ratio\n        self.supervised = supervised\n        self.device = device\n        self.to(device)\n\n    def forward(self, x, y=None):\n        '''Make forward pass given the input feature ``x`` and label ``y`` (optional).\n\n        Parameters\n        ----------\n        x: pd.DataFrame or dict\n            pd.DataFrame: a batch of raw tabular samples; dict: the output of TransTabFeatureExtractor.\n\n        y: pd.Series\n          
  the corresponding labels for each sample in ``x``. if label is given, the model will return\n            the classification loss by ``self.loss_fn``.\n\n        Returns\n        -------\n        logits: None\n            this CL model does NOT return logits.\n\n        loss: torch.Tensor\n            the supervised or self-supervised VPCL loss.\n\n        '''\n        # do positive sampling\n        feat_x_list = []\n        if isinstance(x, pd.DataFrame):\n            sub_x_list = self._build_positive_pairs(x, self.num_partition)\n            for sub_x in sub_x_list:\n                # encode two subset feature samples\n                feat_x = self.input_encoder(sub_x)\n                feat_x = self.cls_token(**feat_x)\n                feat_x = self.encoder(**feat_x)\n                feat_x_proj = feat_x[:,0,:] # take cls embedding\n                feat_x_proj = self.projection_head(feat_x_proj) # bs, projection_dim\n                feat_x_list.append(feat_x_proj)\n        elif isinstance(x, dict):\n            # pretokenized inputs\n            for input_x in x['input_sub_x']:\n                feat_x = self.input_encoder.feature_processor(**input_x)\n                feat_x = self.cls_token(**feat_x)\n                feat_x = self.encoder(**feat_x)\n                feat_x_proj = feat_x[:, 0, :]\n                feat_x_proj = self.projection_head(feat_x_proj)\n                feat_x_list.append(feat_x_proj)\n        else:\n            raise ValueError(f'expect input x to be pd.DataFrame or dict(pretokenized), get {type(x)} instead')\n\n        feat_x_multiview = torch.stack(feat_x_list, axis=1) # bs, n_view, emb_dim\n\n        if y is not None and self.supervised:\n            # take supervised loss\n            y = torch.tensor(y.values, device=feat_x_multiview.device)\n            loss = self.supervised_contrastive_loss(feat_x_multiview, y)\n        else:\n            # compute cl loss (multi-view InfoNCE loss)\n            loss = self.self_supervised_contrastive_loss(feat_x_multiview)\n        return None, loss\n\n    def _build_positive_pairs(self, x, n):\n        x_cols = x.columns.tolist()\n        sub_col_list = np.array_split(np.array(x_cols), n)\n        len_cols = len(sub_col_list[0])\n        overlap = int(np.ceil(len_cols * (self.overlap_ratio)))\n        sub_x_list = []\n        for i, sub_col in enumerate(sub_col_list):\n            if overlap > 0 and i < n-1:\n                sub_col = np.concatenate([sub_col, sub_col_list[i+1][:overlap]])\n            elif overlap >0 and i == n-1:\n                sub_col = np.concatenate([sub_col, sub_col_list[i-1][-overlap:]])\n            sub_x = x.copy()[sub_col]\n            sub_x_list.append(sub_x)\n        return sub_x_list\n\n    def cos_sim(self, a, b):\n        if not isinstance(a, torch.Tensor):\n            a = torch.tensor(a)\n\n        if not isinstance(b, torch.Tensor):\n            b = torch.tensor(b)\n\n        if len(a.shape) == 1:\n            a = a.unsqueeze(0)\n\n        if len(b.shape) == 1:\n            b = b.unsqueeze(0)\n\n        a_norm = torch.nn.functional.normalize(a, p=2, dim=1)\n        b_norm = torch.nn.functional.normalize(b, p=2, dim=1)\n        return torch.mm(a_norm, b_norm.transpose(0, 1))\n\n    def self_supervised_contrastive_loss(self, features):\n        '''Compute the self-supervised VPCL loss.\n\n        Parameters\n        ----------\n        features: torch.Tensor\n            the encoded features of multiple partitions of input tables, with shape ``(bs, n_partition, proj_dim)``.\n\n        
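Notes\n        -----\n        A descriptive sketch of the computation below: the ``n_partition`` views are flattened\n        to shape ``(bs * n_partition, proj_dim)``; pairwise dot products are scaled by\n        ``1 / temperature``, the other views of the same sample act as positives for each\n        anchor in an InfoNCE-style objective, and the result is rescaled by\n        ``temperature / base_temperature``.\n\n        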
Returns\n        -------\n        loss: torch.Tensor\n            the computed self-supervised VPCL loss.\n        '''\n        batch_size = features.shape[0]\n        labels = torch.arange(batch_size, dtype=torch.long, device=self.device).view(-1,1)\n        mask = torch.eq(labels, labels.T).float().to(labels.device)\n\n        contrast_count = features.shape[1]\n        # [[0,1],[2,3]] -> [0,2,1,3]\n        contrast_feature = torch.cat(torch.unbind(features,dim=1),dim=0)\n        anchor_feature = contrast_feature\n        anchor_count = contrast_count\n        anchor_dot_contrast = torch.div(torch.matmul(anchor_feature, contrast_feature.T), self.temperature)\n        logits_max, _ = torch.max(anchor_dot_contrast, dim=1, keepdim=True)\n        logits = anchor_dot_contrast - logits_max.detach()\n        mask = mask.repeat(anchor_count, contrast_count)\n        logits_mask = torch.scatter(torch.ones_like(mask), 1, torch.arange(batch_size * anchor_count).view(-1, 1).to(features.device), 0)\n        mask = mask * logits_mask\n        # compute log_prob\n        exp_logits = torch.exp(logits) * logits_mask\n        log_prob = logits - torch.log(exp_logits.sum(1, keepdim=True))\n        # compute mean of log-likelihood over positive\n        mean_log_prob_pos = (mask * log_prob).sum(1) / mask.sum(1)\n        loss = - (self.temperature / self.base_temperature) * mean_log_prob_pos\n        loss = loss.view(anchor_count, batch_size).mean()\n        return loss\n\n    def supervised_contrastive_loss(self, features, labels):\n        '''Compute the supervised VPCL loss.\n\n        Parameters\n        ----------\n        features: torch.Tensor\n            the encoded features of multiple partitions of input tables, with shape ``(bs, n_partition, proj_dim)``.\n\n        labels: torch.Tensor\n            the class labels to be used for building positive/negative pairs in VPCL.\n\n        Returns\n        -------\n        loss: torch.Tensor\n            the computed VPCL loss.\n\n        '''\n        labels = labels.contiguous().view(-1,1)\n        batch_size = features.shape[0]\n        mask = torch.eq(labels, labels.T).float().to(labels.device)\n\n        contrast_count = features.shape[1]\n        contrast_feature = torch.cat(torch.unbind(features,dim=1),dim=0)\n\n        # contrast_mode == 'all'\n        anchor_feature = contrast_feature\n        anchor_count = contrast_count\n\n        # compute logits\n        anchor_dot_contrast = torch.div(\n            torch.matmul(anchor_feature, contrast_feature.T),\n            self.temperature)\n        # for numerical stability\n        logits_max, _ = torch.max(anchor_dot_contrast, dim=1, keepdim=True)\n        logits = anchor_dot_contrast - logits_max.detach()\n        # tile mask\n        mask = mask.repeat(anchor_count, contrast_count)\n        # mask-out self-contrast cases\n        logits_mask = torch.scatter(\n            torch.ones_like(mask),\n            1,\n            torch.arange(batch_size * anchor_count).view(-1, 1).to(features.device),\n            0,\n        )\n        mask = mask * logits_mask\n        # compute log_prob\n        exp_logits = torch.exp(logits) * logits_mask\n        log_prob = logits - torch.log(exp_logits.sum(1, keepdim=True))\n        # compute mean of log-likelihood over positive\n        mean_log_prob_pos = (mask * log_prob).sum(1) / mask.sum(1)\n        loss = - (self.temperature / self.base_temperature) * mean_log_prob_pos\n        loss = loss.view(anchor_count, batch_size).mean()\n        return loss\n"
  },
  {
    "path": "transtab/tokenizer/special_tokens_map.json",
    "content": "{\"unk_token\": \"[UNK]\", \"sep_token\": \"[SEP]\", \"pad_token\": \"[PAD]\", \"cls_token\": \"[CLS]\", \"mask_token\": \"[MASK]\"}"
  },
  {
    "path": "transtab/tokenizer/tokenizer_config.json",
    "content": "{\"do_lower_case\": true, \"model_max_length\": 512}"
  },
  {
    "path": "transtab/tokenizer/vocab.txt",
    "content": "[PAD]\n[unused0]\n[unused1]\n[unused2]\n[unused3]\n[unused4]\n[unused5]\n[unused6]\n[unused7]\n[unused8]\n[unused9]\n[unused10]\n[unused11]\n[unused12]\n[unused13]\n[unused14]\n[unused15]\n[unused16]\n[unused17]\n[unused18]\n[unused19]\n[unused20]\n[unused21]\n[unused22]\n[unused23]\n[unused24]\n[unused25]\n[unused26]\n[unused27]\n[unused28]\n[unused29]\n[unused30]\n[unused31]\n[unused32]\n[unused33]\n[unused34]\n[unused35]\n[unused36]\n[unused37]\n[unused38]\n[unused39]\n[unused40]\n[unused41]\n[unused42]\n[unused43]\n[unused44]\n[unused45]\n[unused46]\n[unused47]\n[unused48]\n[unused49]\n[unused50]\n[unused51]\n[unused52]\n[unused53]\n[unused54]\n[unused55]\n[unused56]\n[unused57]\n[unused58]\n[unused59]\n[unused60]\n[unused61]\n[unused62]\n[unused63]\n[unused64]\n[unused65]\n[unused66]\n[unused67]\n[unused68]\n[unused69]\n[unused70]\n[unused71]\n[unused72]\n[unused73]\n[unused74]\n[unused75]\n[unused76]\n[unused77]\n[unused78]\n[unused79]\n[unused80]\n[unused81]\n[unused82]\n[unused83]\n[unused84]\n[unused85]\n[unused86]\n[unused87]\n[unused88]\n[unused89]\n[unused90]\n[unused91]\n[unused92]\n[unused93]\n[unused94]\n[unused95]\n[unused96]\n[unused97]\n[unused98]\n[UNK]\n[CLS]\n[SEP]\n[MASK]\n[unused99]\n[unused100]\n[unused101]\n[unused102]\n[unused103]\n[unused104]\n[unused105]\n[unused106]\n[unused107]\n[unused108]\n[unused109]\n[unused110]\n[unused111]\n[unused112]\n[unused113]\n[unused114]\n[unused115]\n[unused116]\n[unused117]\n[unused118]\n[unused119]\n[unused120]\n[unused121]\n[unused122]\n[unused123]\n[unused124]\n[unused125]\n[unused126]\n[unused127]\n[unused128]\n[unused129]\n[unused130]\n[unused131]\n[unused132]\n[unused133]\n[unused134]\n[unused135]\n[unused136]\n[unused137]\n[unused138]\n[unused139]\n[unused140]\n[unused141]\n[unused142]\n[unused143]\n[unused144]\n[unused145]\n[unused146]\n[unused147]\n[unused148]\n[unused149]\n[unused150]\n[unused151]\n[unused152]\n[unused153]\n[unused154]\n[unused155]\n[unused156]\n[unused157]\n[unused158]\n[unused159]\n[unused160]\n[unused161]\n[unused162]\n[unused163]\n[unused164]\n[unused165]\n[unused166]\n[unused167]\n[unused168]\n[unused169]\n[unused170]\n[unused171]\n[unused172]\n[unused173]\n[unused174]\n[unused175]\n[unused176]\n[unused177]\n[unused178]\n[unused179]\n[unused180]\n[unused181]\n[unused182]\n[unused183]\n[unused184]\n[unused185]\n[unused186]\n[unused187]\n[unused188]\n[unused189]\n[unused190]\n[unused191]\n[unused192]\n[unused193]\n[unused194]\n[unused195]\n[unused196]\n[unused197]\n[unused198]\n[unused199]\n[unused200]\n[unused201]\n[unused202]\n[unused203]\n[unused204]\n[unused205]\n[unused206]\n[unused207]\n[unused208]\n[unused209]\n[unused210]\n[unused211]\n[unused212]\n[unused213]\n[unused214]\n[unused215]\n[unused216]\n[unused217]\n[unused218]\n[unused219]\n[unused220]\n[unused221]\n[unused222]\n[unused223]\n[unused224]\n[unused225]\n[unused226]\n[unused227]\n[unused228]\n[unused229]\n[unused230]\n[unused231]\n[unused232]\n[unused233]\n[unused234]\n[unused235]\n[unused236]\n[unused237]\n[unused238]\n[unused239]\n[unused240]\n[unused241]\n[unused242]\n[unused243]\n[unused244]\n[unused245]\n[unused246]\n[unused247]\n[unused248]\n[unused249]\n[unused250]\n[unused251]\n[unused252]\n[unused253]\n[unused254]\n[unused255]\n[unused256]\n[unused257]\n[unused258]\n[unused259]\n[unused260]\n[unused261]\n[unused262]\n[unused263]\n[unused264]\n[unused265]\n[unused266]\n[unused267]\n[unused268]\n[unused269]\n[unused270]\n[unused271]\n[unused272]\n[unused273]\n[unused274]\n[unused275]\n[unused276]\n[unused277]\
n[unused278]\n[unused279]\n[unused280]\n[unused281]\n[unused282]\n[unused283]\n[unused284]\n[unused285]\n[unused286]\n[unused287]\n[unused288]\n[unused289]\n[unused290]\n[unused291]\n[unused292]\n[unused293]\n[unused294]\n[unused295]\n[unused296]\n[unused297]\n[unused298]\n[unused299]\n[unused300]\n[unused301]\n[unused302]\n[unused303]\n[unused304]\n[unused305]\n[unused306]\n[unused307]\n[unused308]\n[unused309]\n[unused310]\n[unused311]\n[unused312]\n[unused313]\n[unused314]\n[unused315]\n[unused316]\n[unused317]\n[unused318]\n[unused319]\n[unused320]\n[unused321]\n[unused322]\n[unused323]\n[unused324]\n[unused325]\n[unused326]\n[unused327]\n[unused328]\n[unused329]\n[unused330]\n[unused331]\n[unused332]\n[unused333]\n[unused334]\n[unused335]\n[unused336]\n[unused337]\n[unused338]\n[unused339]\n[unused340]\n[unused341]\n[unused342]\n[unused343]\n[unused344]\n[unused345]\n[unused346]\n[unused347]\n[unused348]\n[unused349]\n[unused350]\n[unused351]\n[unused352]\n[unused353]\n[unused354]\n[unused355]\n[unused356]\n[unused357]\n[unused358]\n[unused359]\n[unused360]\n[unused361]\n[unused362]\n[unused363]\n[unused364]\n[unused365]\n[unused366]\n[unused367]\n[unused368]\n[unused369]\n[unused370]\n[unused371]\n[unused372]\n[unused373]\n[unused374]\n[unused375]\n[unused376]\n[unused377]\n[unused378]\n[unused379]\n[unused380]\n[unused381]\n[unused382]\n[unused383]\n[unused384]\n[unused385]\n[unused386]\n[unused387]\n[unused388]\n[unused389]\n[unused390]\n[unused391]\n[unused392]\n[unused393]\n[unused394]\n[unused395]\n[unused396]\n[unused397]\n[unused398]\n[unused399]\n[unused400]\n[unused401]\n[unused402]\n[unused403]\n[unused404]\n[unused405]\n[unused406]\n[unused407]\n[unused408]\n[unused409]\n[unused410]\n[unused411]\n[unused412]\n[unused413]\n[unused414]\n[unused415]\n[unused416]\n[unused417]\n[unused418]\n[unused419]\n[unused420]\n[unused421]\n[unused422]\n[unused423]\n[unused424]\n[unused425]\n[unused426]\n[unused427]\n[unused428]\n[unused429]\n[unused430]\n[unused431]\n[unused432]\n[unused433]\n[unused434]\n[unused435]\n[unused436]\n[unused437]\n[unused438]\n[unused439]\n[unused440]\n[unused441]\n[unused442]\n[unused443]\n[unused444]\n[unused445]\n[unused446]\n[unused447]\n[unused448]\n[unused449]\n[unused450]\n[unused451]\n[unused452]\n[unused453]\n[unused454]\n[unused455]\n[unused456]\n[unused457]\n[unused458]\n[unused459]\n[unused460]\n[unused461]\n[unused462]\n[unused463]\n[unused464]\n[unused465]\n[unused466]\n[unused467]\n[unused468]\n[unused469]\n[unused470]\n[unused471]\n[unused472]\n[unused473]\n[unused474]\n[unused475]\n[unused476]\n[unused477]\n[unused478]\n[unused479]\n[unused480]\n[unused481]\n[unused482]\n[unused483]\n[unused484]\n[unused485]\n[unused486]\n[unused487]\n[unused488]\n[unused489]\n[unused490]\n[unused491]\n[unused492]\n[unused493]\n[unused494]\n[unused495]\n[unused496]\n[unused497]\n[unused498]\n[unused499]\n[unused500]\n[unused501]\n[unused502]\n[unused503]\n[unused504]\n[unused505]\n[unused506]\n[unused507]\n[unused508]\n[unused509]\n[unused510]\n[unused511]\n[unused512]\n[unused513]\n[unused514]\n[unused515]\n[unused516]\n[unused517]\n[unused518]\n[unused519]\n[unused520]\n[unused521]\n[unused522]\n[unused523]\n[unused524]\n[unused525]\n[unused526]\n[unused527]\n[unused528]\n[unused529]\n[unused530]\n[unused531]\n[unused532]\n[unused533]\n[unused534]\n[unused535]\n[unused536]\n[unused537]\n[unused538]\n[unused539]\n[unused540]\n[unused541]\n[unused542]\n[unused543]\n[unused544]\n[unused545]\n[unused546]\n[unused547]\n[unused548]\n[unused549]\n[unused550]\n[unus
ed551]\n[unused552]\n[unused553]\n[unused554]\n[unused555]\n[unused556]\n[unused557]\n[unused558]\n[unused559]\n[unused560]\n[unused561]\n[unused562]\n[unused563]\n[unused564]\n[unused565]\n[unused566]\n[unused567]\n[unused568]\n[unused569]\n[unused570]\n[unused571]\n[unused572]\n[unused573]\n[unused574]\n[unused575]\n[unused576]\n[unused577]\n[unused578]\n[unused579]\n[unused580]\n[unused581]\n[unused582]\n[unused583]\n[unused584]\n[unused585]\n[unused586]\n[unused587]\n[unused588]\n[unused589]\n[unused590]\n[unused591]\n[unused592]\n[unused593]\n[unused594]\n[unused595]\n[unused596]\n[unused597]\n[unused598]\n[unused599]\n[unused600]\n[unused601]\n[unused602]\n[unused603]\n[unused604]\n[unused605]\n[unused606]\n[unused607]\n[unused608]\n[unused609]\n[unused610]\n[unused611]\n[unused612]\n[unused613]\n[unused614]\n[unused615]\n[unused616]\n[unused617]\n[unused618]\n[unused619]\n[unused620]\n[unused621]\n[unused622]\n[unused623]\n[unused624]\n[unused625]\n[unused626]\n[unused627]\n[unused628]\n[unused629]\n[unused630]\n[unused631]\n[unused632]\n[unused633]\n[unused634]\n[unused635]\n[unused636]\n[unused637]\n[unused638]\n[unused639]\n[unused640]\n[unused641]\n[unused642]\n[unused643]\n[unused644]\n[unused645]\n[unused646]\n[unused647]\n[unused648]\n[unused649]\n[unused650]\n[unused651]\n[unused652]\n[unused653]\n[unused654]\n[unused655]\n[unused656]\n[unused657]\n[unused658]\n[unused659]\n[unused660]\n[unused661]\n[unused662]\n[unused663]\n[unused664]\n[unused665]\n[unused666]\n[unused667]\n[unused668]\n[unused669]\n[unused670]\n[unused671]\n[unused672]\n[unused673]\n[unused674]\n[unused675]\n[unused676]\n[unused677]\n[unused678]\n[unused679]\n[unused680]\n[unused681]\n[unused682]\n[unused683]\n[unused684]\n[unused685]\n[unused686]\n[unused687]\n[unused688]\n[unused689]\n[unused690]\n[unused691]\n[unused692]\n[unused693]\n[unused694]\n[unused695]\n[unused696]\n[unused697]\n[unused698]\n[unused699]\n[unused700]\n[unused701]\n[unused702]\n[unused703]\n[unused704]\n[unused705]\n[unused706]\n[unused707]\n[unused708]\n[unused709]\n[unused710]\n[unused711]\n[unused712]\n[unused713]\n[unused714]\n[unused715]\n[unused716]\n[unused717]\n[unused718]\n[unused719]\n[unused720]\n[unused721]\n[unused722]\n[unused723]\n[unused724]\n[unused725]\n[unused726]\n[unused727]\n[unused728]\n[unused729]\n[unused730]\n[unused731]\n[unused732]\n[unused733]\n[unused734]\n[unused735]\n[unused736]\n[unused737]\n[unused738]\n[unused739]\n[unused740]\n[unused741]\n[unused742]\n[unused743]\n[unused744]\n[unused745]\n[unused746]\n[unused747]\n[unused748]\n[unused749]\n[unused750]\n[unused751]\n[unused752]\n[unused753]\n[unused754]\n[unused755]\n[unused756]\n[unused757]\n[unused758]\n[unused759]\n[unused760]\n[unused761]\n[unused762]\n[unused763]\n[unused764]\n[unused765]\n[unused766]\n[unused767]\n[unused768]\n[unused769]\n[unused770]\n[unused771]\n[unused772]\n[unused773]\n[unused774]\n[unused775]\n[unused776]\n[unused777]\n[unused778]\n[unused779]\n[unused780]\n[unused781]\n[unused782]\n[unused783]\n[unused784]\n[unused785]\n[unused786]\n[unused787]\n[unused788]\n[unused789]\n[unused790]\n[unused791]\n[unused792]\n[unused793]\n[unused794]\n[unused795]\n[unused796]\n[unused797]\n[unused798]\n[unused799]\n[unused800]\n[unused801]\n[unused802]\n[unused803]\n[unused804]\n[unused805]\n[unused806]\n[unused807]\n[unused808]\n[unused809]\n[unused810]\n[unused811]\n[unused812]\n[unused813]\n[unused814]\n[unused815]\n[unused816]\n[unused817]\n[unused818]\n[unused819]\n[unused820]\n[unused821]\n[unused822]\n[unused823]\n[unused824]
\n[unused825]\n[unused826]\n[unused827]\n[unused828]\n[unused829]\n[unused830]\n[unused831]\n[unused832]\n[unused833]\n[unused834]\n[unused835]\n[unused836]\n[unused837]\n[unused838]\n[unused839]\n[unused840]\n[unused841]\n[unused842]\n[unused843]\n[unused844]\n[unused845]\n[unused846]\n[unused847]\n[unused848]\n[unused849]\n[unused850]\n[unused851]\n[unused852]\n[unused853]\n[unused854]\n[unused855]\n[unused856]\n[unused857]\n[unused858]\n[unused859]\n[unused860]\n[unused861]\n[unused862]\n[unused863]\n[unused864]\n[unused865]\n[unused866]\n[unused867]\n[unused868]\n[unused869]\n[unused870]\n[unused871]\n[unused872]\n[unused873]\n[unused874]\n[unused875]\n[unused876]\n[unused877]\n[unused878]\n[unused879]\n[unused880]\n[unused881]\n[unused882]\n[unused883]\n[unused884]\n[unused885]\n[unused886]\n[unused887]\n[unused888]\n[unused889]\n[unused890]\n[unused891]\n[unused892]\n[unused893]\n[unused894]\n[unused895]\n[unused896]\n[unused897]\n[unused898]\n[unused899]\n[unused900]\n[unused901]\n[unused902]\n[unused903]\n[unused904]\n[unused905]\n[unused906]\n[unused907]\n[unused908]\n[unused909]\n[unused910]\n[unused911]\n[unused912]\n[unused913]\n[unused914]\n[unused915]\n[unused916]\n[unused917]\n[unused918]\n[unused919]\n[unused920]\n[unused921]\n[unused922]\n[unused923]\n[unused924]\n[unused925]\n[unused926]\n[unused927]\n[unused928]\n[unused929]\n[unused930]\n[unused931]\n[unused932]\n[unused933]\n[unused934]\n[unused935]\n[unused936]\n[unused937]\n[unused938]\n[unused939]\n[unused940]\n[unused941]\n[unused942]\n[unused943]\n[unused944]\n[unused945]\n[unused946]\n[unused947]\n[unused948]\n[unused949]\n[unused950]\n[unused951]\n[unused952]\n[unused953]\n[unused954]\n[unused955]\n[unused956]\n[unused957]\n[unused958]\n[unused959]\n[unused960]\n[unused961]\n[unused962]\n[unused963]\n[unused964]\n[unused965]\n[unused966]\n[unused967]\n[unused968]\n[unused969]\n[unused970]\n[unused971]\n[unused972]\n[unused973]\n[unused974]\n[unused975]\n[unused976]\n[unused977]\n[unused978]\n[unused979]\n[unused980]\n[unused981]\n[unused982]\n[unused983]\n[unused984]\n[unused985]\n[unused986]\n[unused987]\n[unused988]\n[unused989]\n[unused990]\n[unused991]\n[unused992]\n[unused993]\n!\n\"\n#\n$\n%\n&\n'\n(\n)\n*\n+\n,\n-\n.\n/\n0\n1\n2\n3\n4\n5\n6\n7\n8\n9\n:\n;\n<\n=\n>\n?\n@\n[\n\\\n]\n^\n_\n`\na\nb\nc\nd\ne\nf\ng\nh\ni\nj\nk\nl\nm\nn\no\np\nq\nr\ns\nt\nu\nv\nw\nx\ny\nz\n{\n|\n}\n~\n¡\n¢\n£\n¤\n¥\n¦\n§\n¨\n©\nª\n«\n¬\n®\n°\n±\n²\n³\n´\nµ\n¶\n·\n¹\nº\n»\n¼\n½\n¾\n¿\n×\nß\næ\nð\n÷\nø\nþ\nđ\nħ\nı\nł\nŋ\nœ\nƒ\nɐ\nɑ\nɒ\nɔ\nɕ\nə\nɛ\nɡ\nɣ\nɨ\nɪ\nɫ\nɬ\nɯ\nɲ\nɴ\nɹ\nɾ\nʀ\nʁ\nʂ\nʃ\nʉ\nʊ\nʋ\nʌ\nʎ\nʐ\nʑ\nʒ\nʔ\nʰ\nʲ\nʳ\nʷ\nʸ\nʻ\nʼ\nʾ\nʿ\nˈ\nː\nˡ\nˢ\nˣ\nˤ\nα\nβ\nγ\nδ\nε\nζ\nη\nθ\nι\nκ\nλ\nμ\nν\nξ\nο\nπ\nρ\nς\nσ\nτ\nυ\nφ\nχ\nψ\nω\nа\nб\nв\nг\nд\nе\nж\nз\nи\nк\nл\nм\nн\nо\nп\nр\nс\nт\nу\nф\nх\nц\nч\nш\nщ\nъ\nы\nь\nэ\nю\nя\nђ\nє\nі\nј\nљ\nњ\nћ\nӏ\nա\nբ\nգ\nդ\nե\nթ\nի\nլ\nկ\nհ\nմ\nյ\nն\nո\nպ\nս\nվ\nտ\nր\nւ\nք\n־\nא\nב\nג\nד\nה\nו\nז\nח\nט\nי\nך\nכ\nל\nם\nמ\nן\nנ\nס\nע\nף\nפ\nץ\nצ\nק\nר\nש\nת\n،\nء\nا\nب\nة\nت\nث\nج\nح\nخ\nد\nذ\nر\nز\nس\nش\nص\nض\nط\nظ\nع\nغ\nـ\nف\nق\nك\nل\nم\nن\nه\nو\nى\nي\nٹ\nپ\nچ\nک\nگ\nں\nھ\nہ\nی\nے\nअ\nआ\nउ\nए\nक\nख\nग\nच\nज\nट\nड\nण\nत\nथ\nद\nध\nन\nप\nब\nभ\nम\nय\nर\nल\nव\nश\nष\nस\nह\nा\nि\nी\nो\n।\n॥\nং\nঅ\nআ\nই\nউ\nএ\nও\nক\nখ\nগ\nচ\nছ\nজ\nট\nড\nণ\nত\nথ\nদ\nধ\nন\nপ\nব\nভ\nম\nয\nর\nল\nশ\nষ\nস\nহ\nা\nি\nী\nে\nக\nச\nட\nத\nந\nன\nப\nம\nய\nர\nல\nள\nவ\nா\nி\nு\nே\nை\nನ\nರ\nಾ\nක\nය\nර\nල\nව\nා\nก\nง\nต\nท\nน\nพ\nม\nย\nร\nล\nว\nส\nอ\nา\nเ\n་\n།\nག\nང\nད\nན\nཔ\nབ\nམ\nའ\nར\nལ\nས\nမ\nა\nბ\nგ\nდ\nე\nვ\nთ\nი\nკ\nლ\nმ\nნ\nო
\nრ\nს\nტ\nუ\nᄀ\nᄂ\nᄃ\nᄅ\nᄆ\nᄇ\nᄉ\nᄊ\nᄋ\nᄌ\nᄎ\nᄏ\nᄐ\nᄑ\nᄒ\nᅡ\nᅢ\nᅥ\nᅦ\nᅧ\nᅩ\nᅪ\nᅭ\nᅮ\nᅯ\nᅲ\nᅳ\nᅴ\nᅵ\nᆨ\nᆫ\nᆯ\nᆷ\nᆸ\nᆼ\nᴬ\nᴮ\nᴰ\nᴵ\nᴺ\nᵀ\nᵃ\nᵇ\nᵈ\nᵉ\nᵍ\nᵏ\nᵐ\nᵒ\nᵖ\nᵗ\nᵘ\nᵢ\nᵣ\nᵤ\nᵥ\nᶜ\nᶠ\n‐\n‑\n‒\n–\n—\n―\n‖\n‘\n’\n‚\n“\n”\n„\n†\n‡\n•\n…\n‰\n′\n″\n›\n‿\n⁄\n⁰\nⁱ\n⁴\n⁵\n⁶\n⁷\n⁸\n⁹\n⁺\n⁻\nⁿ\n₀\n₁\n₂\n₃\n₄\n₅\n₆\n₇\n₈\n₉\n₊\n₍\n₎\nₐ\nₑ\nₒ\nₓ\nₕ\nₖ\nₗ\nₘ\nₙ\nₚ\nₛ\nₜ\n₤\n₩\n€\n₱\n₹\nℓ\n№\nℝ\n™\n⅓\n⅔\n←\n↑\n→\n↓\n↔\n↦\n⇄\n⇌\n⇒\n∂\n∅\n∆\n∇\n∈\n−\n∗\n∘\n√\n∞\n∧\n∨\n∩\n∪\n≈\n≡\n≤\n≥\n⊂\n⊆\n⊕\n⊗\n⋅\n─\n│\n■\n▪\n●\n★\n☆\n☉\n♠\n♣\n♥\n♦\n♭\n♯\n⟨\n⟩\nⱼ\n⺩\n⺼\n⽥\n、\n。\n〈\n〉\n《\n》\n「\n」\n『\n』\n〜\nあ\nい\nう\nえ\nお\nか\nき\nく\nけ\nこ\nさ\nし\nす\nせ\nそ\nた\nち\nっ\nつ\nて\nと\nな\nに\nぬ\nね\nの\nは\nひ\nふ\nへ\nほ\nま\nみ\nむ\nめ\nも\nや\nゆ\nよ\nら\nり\nる\nれ\nろ\nを\nん\nァ\nア\nィ\nイ\nウ\nェ\nエ\nオ\nカ\nキ\nク\nケ\nコ\nサ\nシ\nス\nセ\nタ\nチ\nッ\nツ\nテ\nト\nナ\nニ\nノ\nハ\nヒ\nフ\nヘ\nホ\nマ\nミ\nム\nメ\nモ\nャ\nュ\nョ\nラ\nリ\nル\nレ\nロ\nワ\nン\n・\nー\n一\n三\n上\n下\n不\n世\n中\n主\n久\n之\n也\n事\n二\n五\n井\n京\n人\n亻\n仁\n介\n代\n仮\n伊\n会\n佐\n侍\n保\n信\n健\n元\n光\n八\n公\n内\n出\n分\n前\n劉\n力\n加\n勝\n北\n区\n十\n千\n南\n博\n原\n口\n古\n史\n司\n合\n吉\n同\n名\n和\n囗\n四\n国\n國\n土\n地\n坂\n城\n堂\n場\n士\n夏\n外\n大\n天\n太\n夫\n奈\n女\n子\n学\n宀\n宇\n安\n宗\n定\n宣\n宮\n家\n宿\n寺\n將\n小\n尚\n山\n岡\n島\n崎\n川\n州\n巿\n帝\n平\n年\n幸\n广\n弘\n張\n彳\n後\n御\n德\n心\n忄\n志\n忠\n愛\n成\n我\n戦\n戸\n手\n扌\n政\n文\n新\n方\n日\n明\n星\n春\n昭\n智\n曲\n書\n月\n有\n朝\n木\n本\n李\n村\n東\n松\n林\n森\n楊\n樹\n橋\n歌\n止\n正\n武\n比\n氏\n民\n水\n氵\n氷\n永\n江\n沢\n河\n治\n法\n海\n清\n漢\n瀬\n火\n版\n犬\n王\n生\n田\n男\n疒\n発\n白\n的\n皇\n目\n相\n省\n真\n石\n示\n社\n神\n福\n禾\n秀\n秋\n空\n立\n章\n竹\n糹\n美\n義\n耳\n良\n艹\n花\n英\n華\n葉\n藤\n行\n街\n西\n見\n訁\n語\n谷\n貝\n貴\n車\n軍\n辶\n道\n郎\n郡\n部\n都\n里\n野\n金\n鈴\n镇\n長\n門\n間\n阝\n阿\n陳\n陽\n雄\n青\n面\n風\n食\n香\n馬\n高\n龍\n龸\nﬁ\nﬂ\n！\n（\n）\n，\n－\n．\n／\n：\n？\n～\nthe\nof\nand\nin\nto\nwas\nhe\nis\nas\nfor\non\nwith\nthat\nit\nhis\nby\nat\nfrom\nher\n##s\nshe\nyou\nhad\nan\nwere\nbut\nbe\nthis\nare\nnot\nmy\nthey\none\nwhich\nor\nhave\nhim\nme\nfirst\nall\nalso\ntheir\nhas\nup\nwho\nout\nbeen\nwhen\nafter\nthere\ninto\nnew\ntwo\nits\n##a\ntime\nwould\nno\nwhat\nabout\nsaid\nwe\nover\nthen\nother\nso\nmore\n##e\ncan\nif\nlike\nback\nthem\nonly\nsome\ncould\n##i\nwhere\njust\n##ing\nduring\nbefore\n##n\ndo\n##o\nmade\nschool\nthrough\nthan\nnow\nyears\nmost\nworld\nmay\nbetween\ndown\nwell\nthree\n##d\nyear\nwhile\nwill\n##ed\n##r\n##y\nlater\n##t\ncity\nunder\naround\ndid\nsuch\nbeing\nused\nstate\npeople\npart\nknow\nagainst\nyour\nmany\nsecond\nuniversity\nboth\nnational\n##er\nthese\ndon\nknown\noff\nway\nuntil\nre\nhow\neven\nget\nhead\n...\ndidn\n##ly\nteam\namerican\nbecause\nde\n##l\nborn\nunited\nfilm\nsince\nstill\nlong\nwork\nsouth\nus\nbecame\nany\nhigh\nagain\nday\nfamily\nsee\nright\nman\neyes\nhouse\nseason\nwar\nstates\nincluding\ntook\nlife\nnorth\nsame\neach\ncalled\nname\nmuch\nplace\nhowever\ngo\nfour\ngroup\nanother\nfound\nwon\narea\nhere\ngoing\n10\naway\nseries\nleft\nhome\nmusic\nbest\nmake\nhand\nnumber\ncompany\nseveral\nnever\nlast\njohn\n000\nvery\nalbum\ntake\nend\ngood\ntoo\nfollowing\nreleased\ngame\nplayed\nlittle\nbegan\ndistrict\n##m\nold\nwant\nthose\nside\nheld\nown\nearly\ncounty\nll\nleague\nuse\nwest\n##u\nface\nthink\n##es\n2010\ngovernment\n##h\nmarch\ncame\nsmall\ngeneral\ntown\njune\n##on\nline\nbased\nsomething\n##k\nseptember\nthought\nlooked\nalong\ninternational\n2011\nair\njuly\nclub\nwent\njanuary\noctober\nour\naugust\napril\nyork\n12\nfew\n2012\n2008\neast\nshow\nmember\ncollege\n2009\nfather\npublic\n##us\ncome\nmen\nfive\nset\nstation\nchurch\n##c\nnext\nformer\nnovember\nroom\nparty\nlocated\ndecember\n2013\nage\ngot\n2007\n##g\nsystem\nlet\nlove\n2006\nthough\nevery\n2014\nlook\nsong\nwater\n
century\nwithout\nbody\nblack\nnight\nwithin\ngreat\nwomen\nsingle\nve\nbuilding\nlarge\npopulation\nriver\nnamed\nband\nwhite\nstarted\n##an\nonce\n15\n20\nshould\n18\n2015\nservice\ntop\nbuilt\nbritish\nopen\ndeath\nking\nmoved\nlocal\ntimes\nchildren\nfebruary\nbook\nwhy\n11\ndoor\nneed\npresident\norder\nfinal\nroad\nwasn\nalthough\ndue\nmajor\ndied\nvillage\nthird\nknew\n2016\nasked\nturned\nst\nwanted\nsay\n##p\ntogether\nreceived\nmain\nson\nserved\ndifferent\n##en\nbehind\nhimself\nfelt\nmembers\npower\nfootball\nlaw\nvoice\nplay\n##in\nnear\npark\nhistory\n30\nhaving\n2005\n16\n##man\nsaw\nmother\n##al\narmy\npoint\nfront\nhelp\nenglish\nstreet\nart\nlate\nhands\ngames\naward\n##ia\nyoung\n14\nput\npublished\ncountry\ndivision\nacross\ntold\n13\noften\never\nfrench\nlondon\ncenter\nsix\nred\n2017\nled\ndays\ninclude\nlight\n25\nfind\ntell\namong\nspecies\nreally\naccording\ncentral\nhalf\n2004\nform\noriginal\ngave\noffice\nmaking\nenough\nlost\nfull\nopened\nmust\nincluded\nlive\ngiven\ngerman\nplayer\nrun\nbusiness\nwoman\ncommunity\ncup\nmight\nmillion\nland\n2000\ncourt\ndevelopment\n17\nshort\nround\nii\nkm\nseen\nclass\nstory\nalways\nbecome\nsure\nresearch\nalmost\ndirector\ncouncil\nla\n##2\ncareer\nthings\nusing\nisland\n##z\ncouldn\ncar\n##is\n24\nclose\nforce\n##1\nbetter\nfree\nsupport\ncontrol\nfield\nstudents\n2003\neducation\nmarried\n##b\nnothing\nworked\nothers\nrecord\nbig\ninside\nlevel\nanything\ncontinued\ngive\njames\n##3\nmilitary\nestablished\nnon\nreturned\nfeel\ndoes\ntitle\nwritten\nthing\nfeet\nwilliam\nfar\nco\nassociation\nhard\nalready\n2002\n##ra\nchampionship\nhuman\nwestern\n100\n##na\ndepartment\nhall\nrole\nvarious\nproduction\n21\n19\nheart\n2001\nliving\nfire\nversion\n##ers\n##f\ntelevision\nroyal\n##4\nproduced\nworking\nact\ncase\nsociety\nregion\npresent\nradio\nperiod\nlooking\nleast\ntotal\nkeep\nengland\nwife\nprogram\nper\nbrother\nmind\nspecial\n22\n##le\nam\nworks\nsoon\n##6\npolitical\ngeorge\nservices\ntaken\ncreated\n##7\nfurther\nable\nreached\ndavid\nunion\njoined\nupon\ndone\nimportant\nsocial\ninformation\neither\n##ic\n##x\nappeared\nposition\nground\nlead\nrock\ndark\nelection\n23\nboard\nfrance\nhair\ncourse\narms\nsite\npolice\ngirl\ninstead\nreal\nsound\n##v\nwords\nmoment\n##te\nsomeone\n##8\nsummer\nproject\nannounced\nsan\nless\nwrote\npast\nfollowed\n##5\nblue\nfounded\nal\nfinally\nindia\ntaking\nrecords\namerica\n##ne\n1999\ndesign\nconsidered\nnorthern\ngod\nstop\nbattle\ntoward\neuropean\noutside\ndescribed\ntrack\ntoday\nplaying\nlanguage\n28\ncall\n26\nheard\nprofessional\nlow\naustralia\nmiles\ncalifornia\nwin\nyet\ngreen\n##ie\ntrying\nblood\n##ton\nsouthern\nscience\nmaybe\neverything\nmatch\nsquare\n27\nmouth\nvideo\nrace\nrecorded\nleave\nabove\n##9\ndaughter\npoints\nspace\n1998\nmuseum\nchange\nmiddle\ncommon\n##0\nmove\ntv\npost\n##ta\nlake\nseven\ntried\nelected\nclosed\nten\npaul\nminister\n##th\nmonths\nstart\nchief\nreturn\ncanada\nperson\nsea\nrelease\nsimilar\nmodern\nbrought\nrest\nhit\nformed\nmr\n##la\n1997\nfloor\nevent\ndoing\nthomas\n1996\nrobert\ncare\nkilled\ntraining\nstar\nweek\nneeded\nturn\nfinished\nrailway\nrather\nnews\nhealth\nsent\nexample\nran\nterm\nmichael\ncoming\ncurrently\nyes\nforces\ndespite\ngold\nareas\n50\nstage\nfact\n29\ndead\nsays\npopular\n2018\noriginally\ngermany\nprobably\ndeveloped\nresult\npulled\nfriend\nstood\nmoney\nrunning\nmi\nsigned\nword\nsongs\nchild\neventually\nmet\ntour\naverage\nteams\nminutes\nfestival\ncurrent\ndeep\nkind\n1995\ndecided\nusually\nea
stern\nseemed\n##ness\nepisode\nbed\nadded\ntable\nindian\nprivate\ncharles\nroute\navailable\nidea\nthroughout\ncentre\naddition\nappointed\nstyle\n1994\nbooks\neight\nconstruction\npress\nmean\nwall\nfriends\nremained\nschools\nstudy\n##ch\n##um\ninstitute\noh\nchinese\nsometimes\nevents\npossible\n1992\naustralian\ntype\nbrown\nforward\ntalk\nprocess\nfood\ndebut\nseat\nperformance\ncommittee\nfeatures\ncharacter\narts\nherself\nelse\nlot\nstrong\nrussian\nrange\nhours\npeter\narm\n##da\nmorning\ndr\nsold\n##ry\nquickly\ndirected\n1993\nguitar\nchina\n##w\n31\nlist\n##ma\nperformed\nmedia\nuk\nplayers\nsmile\n##rs\nmyself\n40\nplaced\ncoach\nprovince\ntowards\nwouldn\nleading\nwhole\nboy\nofficial\ndesigned\ngrand\ncensus\n##el\neurope\nattack\njapanese\nhenry\n1991\n##re\n##os\ncross\ngetting\nalone\naction\nlower\nnetwork\nwide\nwashington\njapan\n1990\nhospital\nbelieve\nchanged\nsister\n##ar\nhold\ngone\nsir\nhadn\nship\n##ka\nstudies\nacademy\nshot\nrights\nbelow\nbase\nbad\ninvolved\nkept\nlargest\n##ist\nbank\nfuture\nespecially\nbeginning\nmark\nmovement\nsection\nfemale\nmagazine\nplan\nprofessor\nlord\nlonger\n##ian\nsat\nwalked\nhill\nactually\ncivil\nenergy\nmodel\nfamilies\nsize\nthus\naircraft\ncompleted\nincludes\ndata\ncaptain\n##or\nfight\nvocals\nfeatured\nrichard\nbridge\nfourth\n1989\nofficer\nstone\nhear\n##ism\nmeans\nmedical\ngroups\nmanagement\nself\nlips\ncompetition\nentire\nlived\ntechnology\nleaving\nfederal\ntournament\nbit\npassed\nhot\nindependent\nawards\nkingdom\nmary\nspent\nfine\ndoesn\nreported\n##ling\njack\nfall\nraised\nitself\nstay\ntrue\nstudio\n1988\nsports\nreplaced\nparis\nsystems\nsaint\nleader\ntheatre\nwhose\nmarket\ncapital\nparents\nspanish\ncanadian\nearth\n##ity\ncut\ndegree\nwriting\nbay\nchristian\nawarded\nnatural\nhigher\nbill\n##as\ncoast\nprovided\nprevious\nsenior\nft\nvalley\norganization\nstopped\nonto\ncountries\nparts\nconference\nqueen\nsecurity\ninterest\nsaying\nallowed\nmaster\nearlier\nphone\nmatter\nsmith\nwinning\ntry\nhappened\nmoving\ncampaign\nlos\n##ley\nbreath\nnearly\nmid\n1987\ncertain\ngirls\ndate\nitalian\nafrican\nstanding\nfell\nartist\n##ted\nshows\ndeal\nmine\nindustry\n1986\n##ng\neveryone\nrepublic\nprovide\ncollection\nlibrary\nstudent\n##ville\nprimary\nowned\nolder\nvia\nheavy\n1st\nmakes\n##able\nattention\nanyone\nafrica\n##ri\nstated\nlength\nended\nfingers\ncommand\nstaff\nskin\nforeign\nopening\ngovernor\nokay\nmedal\nkill\nsun\ncover\njob\n1985\nintroduced\nchest\nhell\nfeeling\n##ies\nsuccess\nmeet\nreason\nstandard\nmeeting\nnovel\n1984\ntrade\nsource\nbuildings\n##land\nrose\nguy\ngoal\n##ur\nchapter\nnative\nhusband\npreviously\nunit\nlimited\nentered\nweeks\nproducer\noperations\nmountain\ntakes\ncovered\nforced\nrelated\nroman\ncomplete\nsuccessful\nkey\ntexas\ncold\n##ya\nchannel\n1980\ntraditional\nfilms\ndance\nclear\napproximately\n500\nnine\nvan\nprince\nquestion\nactive\ntracks\nireland\nregional\nsilver\nauthor\npersonal\nsense\noperation\n##ine\neconomic\n1983\nholding\ntwenty\nisbn\nadditional\nspeed\nhour\nedition\nregular\nhistoric\nplaces\nwhom\nshook\nmovie\nkm²\nsecretary\nprior\nreport\nchicago\nread\nfoundation\nview\nengine\nscored\n1982\nunits\nask\nairport\nproperty\nready\nimmediately\nlady\nmonth\nlisted\ncontract\n##de\nmanager\nthemselves\nlines\n##ki\nnavy\nwriter\nmeant\n##ts\nruns\n##ro\npractice\nchampionships\nsinger\nglass\ncommission\nrequired\nforest\nstarting\nculture\ngenerally\ngiving\naccess\nattended\ntest\ncouple\nstand\ncatholic\nmartin\ncaught\nexecuti
ve\n##less\neye\n##ey\nthinking\nchair\nquite\nshoulder\n1979\nhope\ndecision\nplays\ndefeated\nmunicipality\nwhether\nstructure\noffered\nslowly\npain\nice\ndirection\n##ion\npaper\nmission\n1981\nmostly\n200\nnoted\nindividual\nmanaged\nnature\nlives\nplant\n##ha\nhelped\nexcept\nstudied\ncomputer\nfigure\nrelationship\nissue\nsignificant\nloss\ndie\nsmiled\ngun\nago\nhighest\n1972\n##am\nmale\nbring\ngoals\nmexico\nproblem\ndistance\ncommercial\ncompletely\nlocation\nannual\nfamous\ndrive\n1976\nneck\n1978\nsurface\ncaused\nitaly\nunderstand\ngreek\nhighway\nwrong\nhotel\ncomes\nappearance\njoseph\ndouble\nissues\nmusical\ncompanies\ncastle\nincome\nreview\nassembly\nbass\ninitially\nparliament\nartists\nexperience\n1974\nparticular\nwalk\nfoot\nengineering\ntalking\nwindow\ndropped\n##ter\nmiss\nbaby\nboys\nbreak\n1975\nstars\nedge\nremember\npolicy\ncarried\ntrain\nstadium\nbar\nsex\nangeles\nevidence\n##ge\nbecoming\nassistant\nsoviet\n1977\nupper\nstep\nwing\n1970\nyouth\nfinancial\nreach\n##ll\nactor\nnumerous\n##se\n##st\nnodded\narrived\n##ation\nminute\n##nt\nbelieved\nsorry\ncomplex\nbeautiful\nvictory\nassociated\ntemple\n1968\n1973\nchance\nperhaps\nmetal\n##son\n1945\nbishop\n##et\nlee\nlaunched\nparticularly\ntree\nle\nretired\nsubject\nprize\ncontains\nyeah\ntheory\nempire\n##ce\nsuddenly\nwaiting\ntrust\nrecording\n##to\nhappy\nterms\ncamp\nchampion\n1971\nreligious\npass\nzealand\nnames\n2nd\nport\nancient\ntom\ncorner\nrepresented\nwatch\nlegal\nanti\njustice\ncause\nwatched\nbrothers\n45\nmaterial\nchanges\nsimply\nresponse\nlouis\nfast\n##ting\nanswer\n60\nhistorical\n1969\nstories\nstraight\ncreate\nfeature\nincreased\nrate\nadministration\nvirginia\nel\nactivities\ncultural\noverall\nwinner\nprograms\nbasketball\nlegs\nguard\nbeyond\ncast\ndoctor\nmm\nflight\nresults\nremains\ncost\neffect\nwinter\n##ble\nlarger\nislands\nproblems\nchairman\ngrew\ncommander\nisn\n1967\npay\nfailed\nselected\nhurt\nfort\nbox\nregiment\nmajority\njournal\n35\nedward\nplans\n##ke\n##ni\nshown\npretty\nirish\ncharacters\ndirectly\nscene\nlikely\noperated\nallow\nspring\n##j\njunior\nmatches\nlooks\nmike\nhouses\nfellow\n##tion\nbeach\nmarriage\n##ham\n##ive\nrules\noil\n65\nflorida\nexpected\nnearby\ncongress\nsam\npeace\nrecent\niii\nwait\nsubsequently\ncell\n##do\nvariety\nserving\nagreed\nplease\npoor\njoe\npacific\nattempt\nwood\ndemocratic\npiece\nprime\n##ca\nrural\nmile\ntouch\nappears\ntownship\n1964\n1966\nsoldiers\n##men\n##ized\n1965\npennsylvania\ncloser\nfighting\nclaimed\nscore\njones\nphysical\neditor\n##ous\nfilled\ngenus\nspecific\nsitting\nsuper\nmom\n##va\ntherefore\nsupported\nstatus\nfear\ncases\nstore\nmeaning\nwales\nminor\nspain\ntower\nfocus\nvice\nfrank\nfollow\nparish\nseparate\ngolden\nhorse\nfifth\nremaining\nbranch\n32\npresented\nstared\n##id\nuses\nsecret\nforms\n##co\nbaseball\nexactly\n##ck\nchoice\nnote\ndiscovered\ntravel\ncomposed\ntruth\nrussia\nball\ncolor\nkiss\ndad\nwind\ncontinue\nring\nreferred\nnumbers\ndigital\ngreater\n##ns\nmetres\nslightly\ndirect\nincrease\n1960\nresponsible\ncrew\nrule\ntrees\ntroops\n##no\nbroke\ngoes\nindividuals\nhundred\nweight\ncreek\nsleep\nmemory\ndefense\nprovides\nordered\ncode\nvalue\njewish\nwindows\n1944\nsafe\njudge\nwhatever\ncorps\nrealized\ngrowing\npre\n##ga\ncities\nalexander\ngaze\nlies\nspread\nscott\nletter\nshowed\nsituation\nmayor\ntransport\nwatching\nworkers\nextended\n##li\nexpression\nnormal\n##ment\nchart\nmultiple\nborder\n##ba\nhost\n##ner\ndaily\nmrs\nwalls\npiano\n##ko\nheat\ncannot\n##ate\
nearned\nproducts\ndrama\nera\nauthority\nseasons\njoin\ngrade\n##io\nsign\ndifficult\nmachine\n1963\nterritory\nmainly\n##wood\nstations\nsquadron\n1962\nstepped\niron\n19th\n##led\nserve\nappear\nsky\nspeak\nbroken\ncharge\nknowledge\nkilometres\nremoved\nships\narticle\ncampus\nsimple\n##ty\npushed\nbritain\n##ve\nleaves\nrecently\ncd\nsoft\nboston\nlatter\neasy\nacquired\npoland\n##sa\nquality\nofficers\npresence\nplanned\nnations\nmass\nbroadcast\njean\nshare\nimage\ninfluence\nwild\noffer\nemperor\nelectric\nreading\nheaded\nability\npromoted\nyellow\nministry\n1942\nthroat\nsmaller\npolitician\n##by\nlatin\nspoke\ncars\nwilliams\nmales\nlack\npop\n80\n##ier\nacting\nseeing\nconsists\n##ti\nestate\n1961\npressure\njohnson\nnewspaper\njr\nchris\nolympics\nonline\nconditions\nbeat\nelements\nwalking\nvote\n##field\nneeds\ncarolina\ntext\nfeaturing\nglobal\nblock\nshirt\nlevels\nfrancisco\npurpose\nfemales\net\ndutch\nduke\nahead\ngas\ntwice\nsafety\nserious\nturning\nhighly\nlieutenant\nfirm\nmaria\namount\nmixed\ndaniel\nproposed\nperfect\nagreement\naffairs\n3rd\nseconds\ncontemporary\npaid\n1943\nprison\nsave\nkitchen\nlabel\nadministrative\nintended\nconstructed\nacademic\nnice\nteacher\nraces\n1956\nformerly\ncorporation\nben\nnation\nissued\nshut\n1958\ndrums\nhousing\nvictoria\nseems\nopera\n1959\ngraduated\nfunction\nvon\nmentioned\npicked\nbuild\nrecognized\nshortly\nprotection\npicture\nnotable\nexchange\nelections\n1980s\nloved\npercent\nracing\nfish\nelizabeth\ngarden\nvolume\nhockey\n1941\nbeside\nsettled\n##ford\n1940\ncompeted\nreplied\ndrew\n1948\nactress\nmarine\nscotland\nsteel\nglanced\nfarm\nsteve\n1957\nrisk\ntonight\npositive\nmagic\nsingles\neffects\ngray\nscreen\ndog\n##ja\nresidents\nbus\nsides\nnone\nsecondary\nliterature\npolish\ndestroyed\nflying\nfounder\nhouseholds\n1939\nlay\nreserve\nusa\ngallery\n##ler\n1946\nindustrial\nyounger\napproach\nappearances\nurban\nones\n1950\nfinish\navenue\npowerful\nfully\ngrowth\npage\nhonor\njersey\nprojects\nadvanced\nrevealed\nbasic\n90\ninfantry\npair\nequipment\nvisit\n33\nevening\nsearch\ngrant\neffort\nsolo\ntreatment\nburied\nrepublican\nprimarily\nbottom\nowner\n1970s\nisrael\ngives\njim\ndream\nbob\nremain\nspot\n70\nnotes\nproduce\nchampions\ncontact\ned\nsoul\naccepted\nways\ndel\n##ally\nlosing\nsplit\nprice\ncapacity\nbasis\ntrial\nquestions\n##ina\n1955\n20th\nguess\nofficially\nmemorial\nnaval\ninitial\n##ization\nwhispered\nmedian\nengineer\n##ful\nsydney\n##go\ncolumbia\nstrength\n300\n1952\ntears\nsenate\n00\ncard\nasian\nagent\n1947\nsoftware\n44\ndraw\nwarm\nsupposed\ncom\npro\n##il\ntransferred\nleaned\n##at\ncandidate\nescape\nmountains\nasia\npotential\nactivity\nentertainment\nseem\ntraffic\njackson\nmurder\n36\nslow\nproduct\norchestra\nhaven\nagency\nbbc\ntaught\nwebsite\ncomedy\nunable\nstorm\nplanning\nalbums\nrugby\nenvironment\nscientific\ngrabbed\nprotect\n##hi\nboat\ntypically\n1954\n1953\ndamage\nprincipal\ndivided\ndedicated\nmount\nohio\n##berg\npick\nfought\ndriver\n##der\nempty\nshoulders\nsort\nthank\nberlin\nprominent\naccount\nfreedom\nnecessary\nefforts\nalex\nheadquarters\nfollows\nalongside\ndes\nsimon\nandrew\nsuggested\noperating\nlearning\nsteps\n1949\nsweet\ntechnical\nbegin\neasily\n34\nteeth\nspeaking\nsettlement\nscale\n##sh\nrenamed\nray\nmax\nenemy\nsemi\njoint\ncompared\n##rd\nscottish\nleadership\nanalysis\noffers\ngeorgia\npieces\ncaptured\nanimal\ndeputy\nguest\norganized\n##lin\ntony\ncombined\nmethod\nchallenge\n1960s\nhuge\nwants\nbattalion\nsons\nrise\ncrime\ntyp
es\nfacilities\ntelling\npath\n1951\nplatform\nsit\n1990s\n##lo\ntells\nassigned\nrich\npull\n##ot\ncommonly\nalive\n##za\nletters\nconcept\nconducted\nwearing\nhappen\nbought\nbecomes\nholy\ngets\nocean\ndefeat\nlanguages\npurchased\ncoffee\noccurred\ntitled\n##q\ndeclared\napplied\nsciences\nconcert\nsounds\njazz\nbrain\n##me\npainting\nfleet\ntax\nnick\n##ius\nmichigan\ncount\nanimals\nleaders\nepisodes\n##line\ncontent\n##den\nbirth\n##it\nclubs\n64\npalace\ncritical\nrefused\nfair\nleg\nlaughed\nreturning\nsurrounding\nparticipated\nformation\nlifted\npointed\nconnected\nrome\nmedicine\nlaid\ntaylor\nsanta\npowers\nadam\ntall\nshared\nfocused\nknowing\nyards\nentrance\nfalls\n##wa\ncalling\n##ad\nsources\nchosen\nbeneath\nresources\nyard\n##ite\nnominated\nsilence\nzone\ndefined\n##que\ngained\nthirty\n38\nbodies\nmoon\n##ard\nadopted\nchristmas\nwidely\nregister\napart\niran\npremier\nserves\ndu\nunknown\nparties\n##les\ngeneration\n##ff\ncontinues\nquick\nfields\nbrigade\nquiet\nteaching\nclothes\nimpact\nweapons\npartner\nflat\ntheater\nsupreme\n1938\n37\nrelations\n##tor\nplants\nsuffered\n1936\nwilson\nkids\nbegins\n##age\n1918\nseats\narmed\ninternet\nmodels\nworth\nlaws\n400\ncommunities\nclasses\nbackground\nknows\nthanks\nquarter\nreaching\nhumans\ncarry\nkilling\nformat\nkong\nhong\nsetting\n75\narchitecture\ndisease\nrailroad\ninc\npossibly\nwish\narthur\nthoughts\nharry\ndoors\ndensity\n##di\ncrowd\nillinois\nstomach\ntone\nunique\nreports\nanyway\n##ir\nliberal\nder\nvehicle\nthick\ndry\ndrug\nfaced\nlargely\nfacility\ntheme\nholds\ncreation\nstrange\ncolonel\n##mi\nrevolution\nbell\npolitics\nturns\nsilent\nrail\nrelief\nindependence\ncombat\nshape\nwrite\ndetermined\nsales\nlearned\n4th\nfinger\noxford\nproviding\n1937\nheritage\nfiction\nsituated\ndesignated\nallowing\ndistribution\nhosted\n##est\nsight\ninterview\nestimated\nreduced\n##ria\ntoronto\nfootballer\nkeeping\nguys\ndamn\nclaim\nmotion\nsport\nsixth\nstayed\n##ze\nen\nrear\nreceive\nhanded\ntwelve\ndress\naudience\ngranted\nbrazil\n##well\nspirit\n##ated\nnoticed\netc\nolympic\nrepresentative\neric\ntight\ntrouble\nreviews\ndrink\nvampire\nmissing\nroles\nranked\nnewly\nhousehold\nfinals\nwave\ncritics\n##ee\nphase\nmassachusetts\npilot\nunlike\nphiladelphia\nbright\nguns\ncrown\norganizations\nroof\n42\nrespectively\nclearly\ntongue\nmarked\ncircle\nfox\nkorea\nbronze\nbrian\nexpanded\nsexual\nsupply\nyourself\ninspired\nlabour\nfc\n##ah\nreference\nvision\ndraft\nconnection\nbrand\nreasons\n1935\nclassic\ndriving\ntrip\njesus\ncells\nentry\n1920\nneither\ntrail\nclaims\natlantic\norders\nlabor\nnose\nafraid\nidentified\nintelligence\ncalls\ncancer\nattacked\npassing\nstephen\npositions\nimperial\ngrey\njason\n39\nsunday\n48\nswedish\navoid\nextra\nuncle\nmessage\ncovers\nallows\nsurprise\nmaterials\nfame\nhunter\n##ji\n1930\ncitizens\nfigures\ndavis\nenvironmental\nconfirmed\nshit\ntitles\ndi\nperforming\ndifference\nacts\nattacks\n##ov\nexisting\nvotes\nopportunity\nnor\nshop\nentirely\ntrains\nopposite\npakistan\n##pa\ndevelop\nresulted\nrepresentatives\nactions\nreality\npressed\n##ish\nbarely\nwine\nconversation\nfaculty\nnorthwest\nends\ndocumentary\nnuclear\nstock\ngrace\nsets\neat\nalternative\n##ps\nbag\nresulting\ncreating\nsurprised\ncemetery\n1919\ndrop\nfinding\nsarah\ncricket\nstreets\ntradition\nride\n1933\nexhibition\ntarget\near\nexplained\nrain\ncomposer\ninjury\napartment\nmunicipal\neducational\noccupied\nnetherlands\nclean\nbillion\nconstitution\nlearn\n1914\nmaximum\nclassical\nfrancis\
nlose\nopposition\njose\nontario\nbear\ncore\nhills\nrolled\nending\ndrawn\npermanent\nfun\n##tes\n##lla\nlewis\nsites\nchamber\nryan\n##way\nscoring\nheight\n1934\n##house\nlyrics\nstaring\n55\nofficials\n1917\nsnow\noldest\n##tic\norange\n##ger\nqualified\ninterior\napparently\nsucceeded\nthousand\ndinner\nlights\nexistence\nfans\nheavily\n41\ngreatest\nconservative\nsend\nbowl\nplus\nenter\ncatch\n##un\neconomy\nduty\n1929\nspeech\nauthorities\nprincess\nperformances\nversions\nshall\ngraduate\npictures\neffective\nremembered\npoetry\ndesk\ncrossed\nstarring\nstarts\npassenger\nsharp\n##ant\nacres\nass\nweather\nfalling\nrank\nfund\nsupporting\ncheck\nadult\npublishing\nheads\ncm\nsoutheast\nlane\n##burg\napplication\nbc\n##ura\nles\ncondition\ntransfer\nprevent\ndisplay\nex\nregions\nearl\nfederation\ncool\nrelatively\nanswered\nbesides\n1928\nobtained\nportion\n##town\nmix\n##ding\nreaction\nliked\ndean\nexpress\npeak\n1932\n##tte\ncounter\nreligion\nchain\nrare\nmiller\nconvention\naid\nlie\nvehicles\nmobile\nperform\nsquad\nwonder\nlying\ncrazy\nsword\n##ping\nattempted\ncenturies\nweren\nphilosophy\ncategory\n##ize\nanna\ninterested\n47\nsweden\nwolf\nfrequently\nabandoned\nkg\nliterary\nalliance\ntask\nentitled\n##ay\nthrew\npromotion\nfactory\ntiny\nsoccer\nvisited\nmatt\nfm\nachieved\n52\ndefence\ninternal\npersian\n43\nmethods\n##ging\narrested\notherwise\ncambridge\nprogramming\nvillages\nelementary\ndistricts\nrooms\ncriminal\nconflict\nworry\ntrained\n1931\nattempts\nwaited\nsignal\nbird\ntruck\nsubsequent\nprogramme\n##ol\nad\n49\ncommunist\ndetails\nfaith\nsector\npatrick\ncarrying\nlaugh\n##ss\ncontrolled\nkorean\nshowing\norigin\nfuel\nevil\n1927\n##ent\nbrief\nidentity\ndarkness\naddress\npool\nmissed\npublication\nweb\nplanet\nian\nanne\nwings\ninvited\n##tt\nbriefly\nstandards\nkissed\n##be\nideas\nclimate\ncausing\nwalter\nworse\nalbert\narticles\nwinners\ndesire\naged\nnortheast\ndangerous\ngate\ndoubt\n1922\nwooden\nmulti\n##ky\npoet\nrising\nfunding\n46\ncommunications\ncommunication\nviolence\ncopies\nprepared\nford\ninvestigation\nskills\n1924\npulling\nelectronic\n##ak\n##ial\n##han\ncontaining\nultimately\noffices\nsinging\nunderstanding\nrestaurant\ntomorrow\nfashion\nchrist\nward\nda\npope\nstands\n5th\nflow\nstudios\naired\ncommissioned\ncontained\nexist\nfresh\namericans\n##per\nwrestling\napproved\nkid\nemployed\nrespect\nsuit\n1925\nangel\nasking\nincreasing\nframe\nangry\nselling\n1950s\nthin\nfinds\n##nd\ntemperature\nstatement\nali\nexplain\ninhabitants\ntowns\nextensive\nnarrow\n51\njane\nflowers\nimages\npromise\nsomewhere\nobject\nfly\nclosely\n##ls\n1912\nbureau\ncape\n1926\nweekly\npresidential\nlegislative\n1921\n##ai\n##au\nlaunch\nfounding\n##ny\n978\n##ring\nartillery\nstrike\nun\ninstitutions\nroll\nwriters\nlanding\nchose\nkevin\nanymore\npp\n##ut\nattorney\nfit\ndan\nbillboard\nreceiving\nagricultural\nbreaking\nsought\ndave\nadmitted\nlands\nmexican\n##bury\ncharlie\nspecifically\nhole\niv\nhoward\ncredit\nmoscow\nroads\naccident\n1923\nproved\nwear\nstruck\nhey\nguards\nstuff\nslid\nexpansion\n1915\ncat\nanthony\n##kin\nmelbourne\nopposed\nsub\nsouthwest\narchitect\nfailure\nplane\n1916\n##ron\nmap\ncamera\ntank\nlisten\nregarding\nwet\nintroduction\nmetropolitan\nlink\nep\nfighter\ninch\ngrown\ngene\nanger\nfixed\nbuy\ndvd\nkhan\ndomestic\nworldwide\nchapel\nmill\nfunctions\nexamples\n##head\ndeveloping\n1910\nturkey\nhits\npocket\nantonio\npapers\ngrow\nunless\ncircuit\n18th\nconcerned\nattached\njournalist\nselection\njourney\nconverte
d\nprovincial\npainted\nhearing\naren\nbands\nnegative\naside\nwondered\nknight\nlap\nsurvey\nma\n##ow\nnoise\nbilly\n##ium\nshooting\nguide\nbedroom\npriest\nresistance\nmotor\nhomes\nsounded\ngiant\n##mer\n150\nscenes\nequal\ncomic\npatients\nhidden\nsolid\nactual\nbringing\nafternoon\ntouched\nfunds\nwedding\nconsisted\nmarie\ncanal\nsr\nkim\ntreaty\nturkish\nrecognition\nresidence\ncathedral\nbroad\nknees\nincident\nshaped\nfired\nnorwegian\nhandle\ncheek\ncontest\nrepresent\n##pe\nrepresenting\nbeauty\n##sen\nbirds\nadvantage\nemergency\nwrapped\ndrawing\nnotice\npink\nbroadcasting\n##ong\nsomehow\nbachelor\nseventh\ncollected\nregistered\nestablishment\nalan\nassumed\nchemical\npersonnel\nroger\nretirement\njeff\nportuguese\nwore\ntied\ndevice\nthreat\nprogress\nadvance\n##ised\nbanks\nhired\nmanchester\nnfl\nteachers\nstructures\nforever\n##bo\ntennis\nhelping\nsaturday\nsale\napplications\njunction\nhip\nincorporated\nneighborhood\ndressed\nceremony\n##ds\ninfluenced\nhers\nvisual\nstairs\ndecades\ninner\nkansas\nhung\nhoped\ngain\nscheduled\ndowntown\nengaged\naustria\nclock\nnorway\ncertainly\npale\nprotected\n1913\nvictor\nemployees\nplate\nputting\nsurrounded\n##ists\nfinishing\nblues\ntropical\n##ries\nminnesota\nconsider\nphilippines\naccept\n54\nretrieved\n1900\nconcern\nanderson\nproperties\ninstitution\ngordon\nsuccessfully\nvietnam\n##dy\nbacking\noutstanding\nmuslim\ncrossing\nfolk\nproducing\nusual\ndemand\noccurs\nobserved\nlawyer\neducated\n##ana\nkelly\nstring\npleasure\nbudget\nitems\nquietly\ncolorado\nphilip\ntypical\n##worth\nderived\n600\nsurvived\nasks\nmental\n##ide\n56\njake\njews\ndistinguished\nltd\n1911\nsri\nextremely\n53\nathletic\nloud\nthousands\nworried\nshadow\ntransportation\nhorses\nweapon\narena\nimportance\nusers\ntim\nobjects\ncontributed\ndragon\ndouglas\naware\nsenator\njohnny\njordan\nsisters\nengines\nflag\ninvestment\nsamuel\nshock\ncapable\nclark\nrow\nwheel\nrefers\nsession\nfamiliar\nbiggest\nwins\nhate\nmaintained\ndrove\nhamilton\nrequest\nexpressed\ninjured\nunderground\nchurches\nwalker\nwars\ntunnel\npasses\nstupid\nagriculture\nsoftly\ncabinet\nregarded\njoining\nindiana\n##ea\n##ms\npush\ndates\nspend\nbehavior\nwoods\nprotein\ngently\nchase\nmorgan\nmention\nburning\nwake\ncombination\noccur\nmirror\nleads\njimmy\nindeed\nimpossible\nsingapore\npaintings\ncovering\n##nes\nsoldier\nlocations\nattendance\nsell\nhistorian\nwisconsin\ninvasion\nargued\npainter\ndiego\nchanging\negypt\n##don\nexperienced\ninches\n##ku\nmissouri\nvol\ngrounds\nspoken\nswitzerland\n##gan\nreform\nrolling\nha\nforget\nmassive\nresigned\nburned\nallen\ntennessee\nlocked\nvalues\nimproved\n##mo\nwounded\nuniverse\nsick\ndating\nfacing\npack\npurchase\nuser\n##pur\nmoments\n##ul\nmerged\nanniversary\n1908\ncoal\nbrick\nunderstood\ncauses\ndynasty\nqueensland\nestablish\nstores\ncrisis\npromote\nhoping\nviews\ncards\nreferee\nextension\n##si\nraise\narizona\nimprove\ncolonial\nformal\ncharged\n##rt\npalm\nlucky\nhide\nrescue\nfaces\n95\nfeelings\ncandidates\njuan\n##ell\ngoods\n6th\ncourses\nweekend\n59\nluke\ncash\nfallen\n##om\ndelivered\naffected\ninstalled\ncarefully\ntries\nswiss\nhollywood\ncosts\nlincoln\nresponsibility\n##he\nshore\nfile\nproper\nnormally\nmaryland\nassistance\njump\nconstant\noffering\nfriendly\nwaters\npersons\nrealize\ncontain\ntrophy\n800\npartnership\nfactor\n58\nmusicians\ncry\nbound\noregon\nindicated\nhero\nhouston\nmedium\n##ure\nconsisting\nsomewhat\n##ara\n57\ncycle\n##che\nbeer\nmoore\nfrederick\ngotten\neleven\nworst\nwea
k\napproached\narranged\nchin\nloan\nuniversal\nbond\nfifteen\npattern\ndisappeared\n##ney\ntranslated\n##zed\nlip\narab\ncapture\ninterests\ninsurance\n##chi\nshifted\ncave\nprix\nwarning\nsections\ncourts\ncoat\nplot\nsmell\nfeed\ngolf\nfavorite\nmaintain\nknife\nvs\nvoted\ndegrees\nfinance\nquebec\nopinion\ntranslation\nmanner\nruled\noperate\nproductions\nchoose\nmusician\ndiscovery\nconfused\ntired\nseparated\nstream\ntechniques\ncommitted\nattend\nranking\nkings\nthrow\npassengers\nmeasure\nhorror\nfan\nmining\nsand\ndanger\nsalt\ncalm\ndecade\ndam\nrequire\nrunner\n##ik\nrush\nassociate\ngreece\n##ker\nrivers\nconsecutive\nmatthew\n##ski\nsighed\nsq\ndocuments\nsteam\nedited\nclosing\ntie\naccused\n1905\n##ini\nislamic\ndistributed\ndirectors\norganisation\nbruce\n7th\nbreathing\nmad\nlit\narrival\nconcrete\ntaste\n08\ncomposition\nshaking\nfaster\namateur\nadjacent\nstating\n1906\ntwin\nflew\n##ran\ntokyo\npublications\n##tone\nobviously\nridge\nstorage\n1907\ncarl\npages\nconcluded\ndesert\ndriven\nuniversities\nages\nterminal\nsequence\nborough\n250\nconstituency\ncreative\ncousin\neconomics\ndreams\nmargaret\nnotably\nreduce\nmontreal\nmode\n17th\nears\nsaved\njan\nvocal\n##ica\n1909\nandy\n##jo\nriding\nroughly\nthreatened\n##ise\nmeters\nmeanwhile\nlanded\ncompete\nrepeated\ngrass\nczech\nregularly\ncharges\ntea\nsudden\nappeal\n##ung\nsolution\ndescribes\npierre\nclassification\nglad\nparking\n##ning\nbelt\nphysics\n99\nrachel\nadd\nhungarian\nparticipate\nexpedition\ndamaged\ngift\nchildhood\n85\nfifty\n##red\nmathematics\njumped\nletting\ndefensive\nmph\n##ux\n##gh\ntesting\n##hip\nhundreds\nshoot\nowners\nmatters\nsmoke\nisraeli\nkentucky\ndancing\nmounted\ngrandfather\nemma\ndesigns\nprofit\nargentina\n##gs\ntruly\nli\nlawrence\ncole\nbegun\ndetroit\nwilling\nbranches\nsmiling\ndecide\nmiami\nenjoyed\nrecordings\n##dale\npoverty\nethnic\ngay\n##bi\ngary\narabic\n09\naccompanied\n##one\n##ons\nfishing\ndetermine\nresidential\nacid\n##ary\nalice\nreturns\nstarred\nmail\n##ang\njonathan\nstrategy\n##ue\nnet\nforty\ncook\nbusinesses\nequivalent\ncommonwealth\ndistinct\nill\n##cy\nseriously\n##ors\n##ped\nshift\nharris\nreplace\nrio\nimagine\nformula\nensure\n##ber\nadditionally\nscheme\nconservation\noccasionally\npurposes\nfeels\nfavor\n##and\n##ore\n1930s\ncontrast\nhanging\nhunt\nmovies\n1904\ninstruments\nvictims\ndanish\nchristopher\nbusy\ndemon\nsugar\nearliest\ncolony\nstudying\nbalance\nduties\n##ks\nbelgium\nslipped\ncarter\n05\nvisible\nstages\niraq\nfifa\n##im\ncommune\nforming\nzero\n07\ncontinuing\ntalked\ncounties\nlegend\nbathroom\noption\ntail\nclay\ndaughters\nafterwards\nsevere\njaw\nvisitors\n##ded\ndevices\naviation\nrussell\nkate\n##vi\nentering\nsubjects\n##ino\ntemporary\nswimming\nforth\nsmooth\nghost\naudio\nbush\noperates\nrocks\nmovements\nsigns\neddie\n##tz\nann\nvoices\nhonorary\n06\nmemories\ndallas\npure\nmeasures\nracial\npromised\n66\nharvard\nceo\n16th\nparliamentary\nindicate\nbenefit\nflesh\ndublin\nlouisiana\n1902\n1901\npatient\nsleeping\n1903\nmembership\ncoastal\nmedieval\nwanting\nelement\nscholars\nrice\n62\nlimit\nsurvive\nmakeup\nrating\ndefinitely\ncollaboration\nobvious\n##tan\nboss\nms\nbaron\nbirthday\nlinked\nsoil\ndiocese\n##lan\nncaa\n##mann\noffensive\nshell\nshouldn\nwaist\n##tus\nplain\nross\norgan\nresolution\nmanufacturing\nadding\nrelative\nkennedy\n98\nwhilst\nmoth\nmarketing\ngardens\ncrash\n72\nheading\npartners\ncredited\ncarlos\nmoves\ncable\n##zi\nmarshall\n##out\ndepending\nbottle\nrepresents\nrejected\nresponded\
nexisted\n04\njobs\ndenmark\nlock\n##ating\ntreated\ngraham\nroutes\ntalent\ncommissioner\ndrugs\nsecure\ntests\nreign\nrestored\nphotography\n##gi\ncontributions\noklahoma\ndesigner\ndisc\ngrin\nseattle\nrobin\npaused\natlanta\nunusual\n##gate\npraised\nlas\nlaughing\nsatellite\nhungary\nvisiting\n##sky\ninteresting\nfactors\ndeck\npoems\nnorman\n##water\nstuck\nspeaker\nrifle\ndomain\npremiered\n##her\ndc\ncomics\nactors\n01\nreputation\neliminated\n8th\nceiling\nprisoners\nscript\n##nce\nleather\naustin\nmississippi\nrapidly\nadmiral\nparallel\ncharlotte\nguilty\ntools\ngender\ndivisions\nfruit\n##bs\nlaboratory\nnelson\nfantasy\nmarry\nrapid\naunt\ntribe\nrequirements\naspects\nsuicide\namongst\nadams\nbone\nukraine\nabc\nkick\nsees\nedinburgh\nclothing\ncolumn\nrough\ngods\nhunting\nbroadway\ngathered\nconcerns\n##ek\nspending\nty\n12th\nsnapped\nrequires\nsolar\nbones\ncavalry\n##tta\niowa\ndrinking\nwaste\nindex\nfranklin\ncharity\nthompson\nstewart\ntip\nflash\nlandscape\nfriday\nenjoy\nsingh\npoem\nlistening\n##back\neighth\nfred\ndifferences\nadapted\nbomb\nukrainian\nsurgery\ncorporate\nmasters\nanywhere\n##more\nwaves\nodd\nsean\nportugal\norleans\ndick\ndebate\nkent\neating\npuerto\ncleared\n96\nexpect\ncinema\n97\nguitarist\nblocks\nelectrical\nagree\ninvolving\ndepth\ndying\npanel\nstruggle\n##ged\npeninsula\nadults\nnovels\nemerged\nvienna\nmetro\ndebuted\nshoes\ntamil\nsongwriter\nmeets\nprove\nbeating\ninstance\nheaven\nscared\nsending\nmarks\nartistic\npassage\nsuperior\n03\nsignificantly\nshopping\n##tive\nretained\n##izing\nmalaysia\ntechnique\ncheeks\n##ola\nwarren\nmaintenance\ndestroy\nextreme\nallied\n120\nappearing\n##yn\nfill\nadvice\nalabama\nqualifying\npolicies\ncleveland\nhat\nbattery\nsmart\nauthors\n10th\nsoundtrack\nacted\ndated\nlb\nglance\nequipped\ncoalition\nfunny\nouter\nambassador\nroy\npossibility\ncouples\ncampbell\ndna\nloose\nethan\nsupplies\n1898\ngonna\n88\nmonster\n##res\nshake\nagents\nfrequency\nsprings\ndogs\npractices\n61\ngang\nplastic\neasier\nsuggests\ngulf\nblade\nexposed\ncolors\nindustries\nmarkets\npan\nnervous\nelectoral\ncharts\nlegislation\nownership\n##idae\nmac\nappointment\nshield\ncopy\nassault\nsocialist\nabbey\nmonument\nlicense\nthrone\nemployment\njay\n93\nreplacement\ncharter\ncloud\npowered\nsuffering\naccounts\noak\nconnecticut\nstrongly\nwright\ncolour\ncrystal\n13th\ncontext\nwelsh\nnetworks\nvoiced\ngabriel\njerry\n##cing\nforehead\nmp\n##ens\nmanage\nschedule\ntotally\nremix\n##ii\nforests\noccupation\nprint\nnicholas\nbrazilian\nstrategic\nvampires\nengineers\n76\nroots\nseek\ncorrect\ninstrumental\nund\nalfred\nbacked\nhop\n##des\nstanley\nrobinson\ntraveled\nwayne\nwelcome\naustrian\nachieve\n67\nexit\nrates\n1899\nstrip\nwhereas\n##cs\nsing\ndeeply\nadventure\nbobby\nrick\njamie\ncareful\ncomponents\ncap\nuseful\npersonality\nknee\n##shi\npushing\nhosts\n02\nprotest\nca\nottoman\nsymphony\n##sis\n63\nboundary\n1890\nprocesses\nconsidering\nconsiderable\ntons\n##work\n##ft\n##nia\ncooper\ntrading\ndear\nconduct\n91\nillegal\napple\nrevolutionary\nholiday\ndefinition\nharder\n##van\njacob\ncircumstances\ndestruction\n##lle\npopularity\ngrip\nclassified\nliverpool\ndonald\nbaltimore\nflows\nseeking\nhonour\napproval\n92\nmechanical\ntill\nhappening\nstatue\ncritic\nincreasingly\nimmediate\ndescribe\ncommerce\nstare\n##ster\nindonesia\nmeat\nrounds\nboats\nbaker\northodox\ndepression\nformally\nworn\nnaked\nclaire\nmuttered\nsentence\n11th\nemily\ndocument\n77\ncriticism\nwished\nvessel\nspiritual\nbent\nvirgin\npar
ker\nminimum\nmurray\nlunch\ndanny\nprinted\ncompilation\nkeyboards\nfalse\nblow\nbelonged\n68\nraising\n78\ncutting\n##board\npittsburgh\n##up\n9th\nshadows\n81\nhated\nindigenous\njon\n15th\nbarry\nscholar\nah\n##zer\noliver\n##gy\nstick\nsusan\nmeetings\nattracted\nspell\nromantic\n##ver\nye\n1895\nphoto\ndemanded\ncustomers\n##ac\n1896\nlogan\nrevival\nkeys\nmodified\ncommanded\njeans\n##ious\nupset\nraw\nphil\ndetective\nhiding\nresident\nvincent\n##bly\nexperiences\ndiamond\ndefeating\ncoverage\nlucas\nexternal\nparks\nfranchise\nhelen\nbible\nsuccessor\npercussion\ncelebrated\nil\nlift\nprofile\nclan\nromania\n##ied\nmills\n##su\nnobody\nachievement\nshrugged\nfault\n1897\nrhythm\ninitiative\nbreakfast\ncarbon\n700\n69\nlasted\nviolent\n74\nwound\nken\nkiller\ngradually\nfilmed\n°c\ndollars\nprocessing\n94\nremove\ncriticized\nguests\nsang\nchemistry\n##vin\nlegislature\ndisney\n##bridge\nuniform\nescaped\nintegrated\nproposal\npurple\ndenied\nliquid\nkarl\ninfluential\nmorris\nnights\nstones\nintense\nexperimental\ntwisted\n71\n84\n##ld\npace\nnazi\nmitchell\nny\nblind\nreporter\nnewspapers\n14th\ncenters\nburn\nbasin\nforgotten\nsurviving\nfiled\ncollections\nmonastery\nlosses\nmanual\ncouch\ndescription\nappropriate\nmerely\ntag\nmissions\nsebastian\nrestoration\nreplacing\ntriple\n73\nelder\njulia\nwarriors\nbenjamin\njulian\nconvinced\nstronger\namazing\ndeclined\nversus\nmerchant\nhappens\noutput\nfinland\nbare\nbarbara\nabsence\nignored\ndawn\ninjuries\n##port\nproducers\n##ram\n82\nluis\n##ities\nkw\nadmit\nexpensive\nelectricity\nnba\nexception\nsymbol\n##ving\nladies\nshower\nsheriff\ncharacteristics\n##je\naimed\nbutton\nratio\neffectively\nsummit\nangle\njury\nbears\nfoster\nvessels\npants\nexecuted\nevans\ndozen\nadvertising\nkicked\npatrol\n1889\ncompetitions\nlifetime\nprinciples\nathletics\n##logy\nbirmingham\nsponsored\n89\nrob\nnomination\n1893\nacoustic\n##sm\ncreature\nlongest\n##tra\ncredits\nharbor\ndust\njosh\n##so\nterritories\nmilk\ninfrastructure\ncompletion\nthailand\nindians\nleon\narchbishop\n##sy\nassist\npitch\nblake\narrangement\ngirlfriend\nserbian\noperational\nhence\nsad\nscent\nfur\ndj\nsessions\nhp\nrefer\nrarely\n##ora\nexists\n1892\n##ten\nscientists\ndirty\npenalty\nburst\nportrait\nseed\n79\npole\nlimits\nrival\n1894\nstable\nalpha\ngrave\nconstitutional\nalcohol\narrest\nflower\nmystery\ndevil\narchitectural\nrelationships\ngreatly\nhabitat\n##istic\nlarry\nprogressive\nremote\ncotton\n##ics\n##ok\npreserved\nreaches\n##ming\ncited\n86\nvast\nscholarship\ndecisions\ncbs\njoy\nteach\n1885\neditions\nknocked\neve\nsearching\npartly\nparticipation\ngap\nanimated\nfate\nexcellent\n##ett\nna\n87\nalternate\nsaints\nyoungest\n##ily\nclimbed\n##ita\n##tors\nsuggest\n##ct\ndiscussion\nstaying\nchoir\nlakes\njacket\nrevenue\nnevertheless\npeaked\ninstrument\nwondering\nannually\nmanaging\nneil\n1891\nsigning\nterry\n##ice\napply\nclinical\nbrooklyn\naim\ncatherine\nfuck\nfarmers\nfigured\nninth\npride\nhugh\nevolution\nordinary\ninvolvement\ncomfortable\nshouted\ntech\nencouraged\ntaiwan\nrepresentation\nsharing\n##lia\n##em\npanic\nexact\ncargo\ncompeting\nfat\ncried\n83\n1920s\noccasions\npa\ncabin\nborders\nutah\nmarcus\n##isation\nbadly\nmuscles\n##ance\nvictorian\ntransition\nwarner\nbet\npermission\n##rin\nslave\nterrible\nsimilarly\nshares\nseth\nuefa\npossession\nmedals\nbenefits\ncolleges\nlowered\nperfectly\nmall\ntransit\n##ye\n##kar\npublisher\n##ened\nharrison\ndeaths\nelevation\n##ae\nasleep\nmachines\nsigh\nash\nhardly\nargument\noccasi
on\nparent\nleo\ndecline\n1888\ncontribution\n##ua\nconcentration\n1000\nopportunities\nhispanic\nguardian\nextent\nemotions\nhips\nmason\nvolumes\nbloody\ncontroversy\ndiameter\nsteady\nmistake\nphoenix\nidentify\nviolin\n##sk\ndeparture\nrichmond\nspin\nfuneral\nenemies\n1864\ngear\nliterally\nconnor\nrandom\nsergeant\ngrab\nconfusion\n1865\ntransmission\ninformed\nop\nleaning\nsacred\nsuspended\nthinks\ngates\nportland\nluck\nagencies\nyours\nhull\nexpert\nmuscle\nlayer\npractical\nsculpture\njerusalem\nlatest\nlloyd\nstatistics\ndeeper\nrecommended\nwarrior\narkansas\nmess\nsupports\ngreg\neagle\n1880\nrecovered\nrated\nconcerts\nrushed\n##ano\nstops\neggs\nfiles\npremiere\nkeith\n##vo\ndelhi\nturner\npit\naffair\nbelief\npaint\n##zing\nmate\n##ach\n##ev\nvictim\n##ology\nwithdrew\nbonus\nstyles\nfled\n##ud\nglasgow\ntechnologies\nfunded\nnbc\nadaptation\n##ata\nportrayed\ncooperation\nsupporters\njudges\nbernard\njustin\nhallway\nralph\n##ick\ngraduating\ncontroversial\ndistant\ncontinental\nspider\nbite\n##ho\nrecognize\nintention\nmixing\n##ese\negyptian\nbow\ntourism\nsuppose\nclaiming\ntiger\ndominated\nparticipants\nvi\n##ru\nnurse\npartially\ntape\n##rum\npsychology\n##rn\nessential\ntouring\nduo\nvoting\ncivilian\nemotional\nchannels\n##king\napparent\nhebrew\n1887\ntommy\ncarrier\nintersection\nbeast\nhudson\n##gar\n##zo\nlab\nnova\nbench\ndiscuss\ncosta\n##ered\ndetailed\nbehalf\ndrivers\nunfortunately\nobtain\n##lis\nrocky\n##dae\nsiege\nfriendship\nhoney\n##rian\n1861\namy\nhang\nposted\ngovernments\ncollins\nrespond\nwildlife\npreferred\noperator\n##po\nlaura\npregnant\nvideos\ndennis\nsuspected\nboots\ninstantly\nweird\nautomatic\nbusinessman\nalleged\nplacing\nthrowing\nph\nmood\n1862\nperry\nvenue\njet\nremainder\n##lli\n##ci\npassion\nbiological\nboyfriend\n1863\ndirt\nbuffalo\nron\nsegment\nfa\nabuse\n##era\ngenre\nthrown\nstroke\ncolored\nstress\nexercise\ndisplayed\n##gen\nstruggled\n##tti\nabroad\ndramatic\nwonderful\nthereafter\nmadrid\ncomponent\nwidespread\n##sed\ntale\ncitizen\ntodd\nmonday\n1886\nvancouver\noverseas\nforcing\ncrying\ndescent\n##ris\ndiscussed\nsubstantial\nranks\nregime\n1870\nprovinces\nswitch\ndrum\nzane\nted\ntribes\nproof\nlp\ncream\nresearchers\nvolunteer\nmanor\nsilk\nmilan\ndonated\nallies\nventure\nprinciple\ndelivery\nenterprise\n##ves\n##ans\nbars\ntraditionally\nwitch\nreminded\ncopper\n##uk\npete\ninter\nlinks\ncolin\ngrinned\nelsewhere\ncompetitive\nfrequent\n##oy\nscream\n##hu\ntension\ntexts\nsubmarine\nfinnish\ndefending\ndefend\npat\ndetail\n1884\naffiliated\nstuart\nthemes\nvilla\nperiods\ntool\nbelgian\nruling\ncrimes\nanswers\nfolded\nlicensed\nresort\ndemolished\nhans\nlucy\n1881\nlion\ntraded\nphotographs\nwrites\ncraig\n##fa\ntrials\ngenerated\nbeth\nnoble\ndebt\npercentage\nyorkshire\nerected\nss\nviewed\ngrades\nconfidence\nceased\nislam\ntelephone\nretail\n##ible\nchile\nm²\nroberts\nsixteen\n##ich\ncommented\nhampshire\ninnocent\ndual\npounds\nchecked\nregulations\nafghanistan\nsung\nrico\nliberty\nassets\nbigger\noptions\nangels\nrelegated\ntribute\nwells\nattending\nleaf\n##yan\nbutler\nromanian\nforum\nmonthly\nlisa\npatterns\ngmina\n##tory\nmadison\nhurricane\nrev\n##ians\nbristol\n##ula\nelite\nvaluable\ndisaster\ndemocracy\nawareness\ngermans\nfreyja\n##ins\nloop\nabsolutely\npaying\npopulations\nmaine\nsole\nprayer\nspencer\nreleases\ndoorway\nbull\n##ani\nlover\nmidnight\nconclusion\n##sson\nthirteen\nlily\nmediterranean\n##lt\nnhl\nproud\nsample\n##hill\ndrummer\nguinea\n##ova\nmurphy\nclimb\n##ston\ninstant\
nattributed\nhorn\nain\nrailways\nsteven\n##ao\nautumn\nferry\nopponent\nroot\ntraveling\nsecured\ncorridor\nstretched\ntales\nsheet\ntrinity\ncattle\nhelps\nindicates\nmanhattan\nmurdered\nfitted\n1882\ngentle\ngrandmother\nmines\nshocked\nvegas\nproduces\n##light\ncaribbean\n##ou\nbelong\ncontinuous\ndesperate\ndrunk\nhistorically\ntrio\nwaved\nraf\ndealing\nnathan\nbat\nmurmured\ninterrupted\nresiding\nscientist\npioneer\nharold\naaron\n##net\ndelta\nattempting\nminority\nmini\nbelieves\nchorus\ntend\nlots\neyed\nindoor\nload\nshots\nupdated\njail\n##llo\nconcerning\nconnecting\nwealth\n##ved\nslaves\narrive\nrangers\nsufficient\nrebuilt\n##wick\ncardinal\nflood\nmuhammad\nwhenever\nrelation\nrunners\nmoral\nrepair\nviewers\narriving\nrevenge\npunk\nassisted\nbath\nfairly\nbreathe\nlists\ninnings\nillustrated\nwhisper\nnearest\nvoters\nclinton\nties\nultimate\nscreamed\nbeijing\nlions\nandre\nfictional\ngathering\ncomfort\nradar\nsuitable\ndismissed\nhms\nban\npine\nwrist\natmosphere\nvoivodeship\nbid\ntimber\n##ned\n##nan\ngiants\n##ane\ncameron\nrecovery\nuss\nidentical\ncategories\nswitched\nserbia\nlaughter\nnoah\nensemble\ntherapy\npeoples\ntouching\n##off\nlocally\npearl\nplatforms\neverywhere\nballet\ntables\nlanka\nherbert\noutdoor\ntoured\nderek\n1883\nspaces\ncontested\nswept\n1878\nexclusive\nslight\nconnections\n##dra\nwinds\nprisoner\ncollective\nbangladesh\ntube\npublicly\nwealthy\nthai\n##ys\nisolated\nselect\n##ric\ninsisted\npen\nfortune\nticket\nspotted\nreportedly\nanimation\nenforcement\ntanks\n110\ndecides\nwider\nlowest\nowen\n##time\nnod\nhitting\n##hn\ngregory\nfurthermore\nmagazines\nfighters\nsolutions\n##ery\npointing\nrequested\nperu\nreed\nchancellor\nknights\nmask\nworker\neldest\nflames\nreduction\n1860\nvolunteers\n##tis\nreporting\n##hl\nwire\nadvisory\nendemic\norigins\nsettlers\npursue\nknock\nconsumer\n1876\neu\ncompound\ncreatures\nmansion\nsentenced\nivan\ndeployed\nguitars\nfrowned\ninvolves\nmechanism\nkilometers\nperspective\nshops\nmaps\nterminus\nduncan\nalien\nfist\nbridges\n##pers\nheroes\nfed\nderby\nswallowed\n##ros\npatent\nsara\nillness\ncharacterized\nadventures\nslide\nhawaii\njurisdiction\n##op\norganised\n##side\nadelaide\nwalks\nbiology\nse\n##ties\nrogers\nswing\ntightly\nboundaries\n##rie\nprepare\nimplementation\nstolen\n##sha\ncertified\ncolombia\nedwards\ngarage\n##mm\nrecalled\n##ball\nrage\nharm\nnigeria\nbreast\n##ren\nfurniture\npupils\nsettle\n##lus\ncuba\nballs\nclient\nalaska\n21st\nlinear\nthrust\ncelebration\nlatino\ngenetic\nterror\n##cia\n##ening\nlightning\nfee\nwitness\nlodge\nestablishing\nskull\n##ique\nearning\nhood\n##ei\nrebellion\nwang\nsporting\nwarned\nmissile\ndevoted\nactivist\nporch\nworship\nfourteen\npackage\n1871\ndecorated\n##shire\nhoused\n##ock\nchess\nsailed\ndoctors\noscar\njoan\ntreat\ngarcia\nharbour\njeremy\n##ire\ntraditions\ndominant\njacques\n##gon\n##wan\nrelocated\n1879\namendment\nsized\ncompanion\nsimultaneously\nvolleyball\nspun\nacre\nincreases\nstopping\nloves\nbelongs\naffect\ndrafted\ntossed\nscout\nbattles\n1875\nfilming\nshoved\nmunich\ntenure\nvertical\nromance\npc\n##cher\nargue\n##ical\ncraft\nranging\nwww\nopens\nhonest\ntyler\nyesterday\nvirtual\n##let\nmuslims\nreveal\nsnake\nimmigrants\nradical\nscreaming\nspeakers\nfiring\nsaving\nbelonging\nease\nlighting\nprefecture\nblame\nfarmer\nhungry\ngrows\nrubbed\nbeam\nsur\nsubsidiary\n##cha\narmenian\nsao\ndropping\nconventional\n##fer\nmicrosoft\nreply\nqualify\nspots\n1867\nsweat\nfestivals\n##ken\nimmigration\nphysician\ndisco
ver\nexposure\nsandy\nexplanation\nisaac\nimplemented\n##fish\nhart\ninitiated\nconnect\nstakes\npresents\nheights\nhouseholder\npleased\ntourist\nregardless\nslip\nclosest\n##ction\nsurely\nsultan\nbrings\nriley\npreparation\naboard\nslammed\nbaptist\nexperiment\nongoing\ninterstate\norganic\nplayoffs\n##ika\n1877\n130\n##tar\nhindu\nerror\ntours\ntier\nplenty\narrangements\ntalks\ntrapped\nexcited\nsank\nho\nathens\n1872\ndenver\nwelfare\nsuburb\nathletes\ntrick\ndiverse\nbelly\nexclusively\nyelled\n1868\n##med\nconversion\n##ette\n1874\ninternationally\ncomputers\nconductor\nabilities\nsensitive\nhello\ndispute\nmeasured\nglobe\nrocket\nprices\namsterdam\nflights\ntigers\ninn\nmunicipalities\nemotion\nreferences\n3d\n##mus\nexplains\nairlines\nmanufactured\npm\narchaeological\n1873\ninterpretation\ndevon\ncomment\n##ites\nsettlements\nkissing\nabsolute\nimprovement\nsuite\nimpressed\nbarcelona\nsullivan\njefferson\ntowers\njesse\njulie\n##tin\n##lu\ngrandson\nhi\ngauge\nregard\nrings\ninterviews\ntrace\nraymond\nthumb\ndepartments\nburns\nserial\nbulgarian\nscores\ndemonstrated\n##ix\n1866\nkyle\nalberta\nunderneath\nromanized\n##ward\nrelieved\nacquisition\nphrase\ncliff\nreveals\nhan\ncuts\nmerger\ncustom\n##dar\nnee\ngilbert\ngraduation\n##nts\nassessment\ncafe\ndifficulty\ndemands\nswung\ndemocrat\njennifer\ncommons\n1940s\ngrove\n##yo\ncompleting\nfocuses\nsum\nsubstitute\nbearing\nstretch\nreception\n##py\nreflected\nessentially\ndestination\npairs\n##ched\nsurvival\nresource\n##bach\npromoting\ndoubles\nmessages\ntear\n##down\n##fully\nparade\nflorence\nharvey\nincumbent\npartial\nframework\n900\npedro\nfrozen\nprocedure\nolivia\ncontrols\n##mic\nshelter\npersonally\ntemperatures\n##od\nbrisbane\ntested\nsits\nmarble\ncomprehensive\noxygen\nleonard\n##kov\ninaugural\niranian\nreferring\nquarters\nattitude\n##ivity\nmainstream\nlined\nmars\ndakota\nnorfolk\nunsuccessful\n##°\nexplosion\nhelicopter\ncongressional\n##sing\ninspector\nbitch\nseal\ndeparted\ndivine\n##ters\ncoaching\nexamination\npunishment\nmanufacturer\nsink\ncolumns\nunincorporated\nsignals\nnevada\nsqueezed\ndylan\ndining\nphotos\nmartial\nmanuel\neighteen\nelevator\nbrushed\nplates\nministers\nivy\ncongregation\n##len\nslept\nspecialized\ntaxes\ncurve\nrestricted\nnegotiations\nlikes\nstatistical\narnold\ninspiration\nexecution\nbold\nintermediate\nsignificance\nmargin\nruler\nwheels\ngothic\nintellectual\ndependent\nlistened\neligible\nbuses\nwidow\nsyria\nearn\ncincinnati\ncollapsed\nrecipient\nsecrets\naccessible\nphilippine\nmaritime\ngoddess\nclerk\nsurrender\nbreaks\nplayoff\ndatabase\n##ified\n##lon\nideal\nbeetle\naspect\nsoap\nregulation\nstrings\nexpand\nanglo\nshorter\ncrosses\nretreat\ntough\ncoins\nwallace\ndirections\npressing\n##oon\nshipping\nlocomotives\ncomparison\ntopics\nnephew\n##mes\ndistinction\nhonors\ntravelled\nsierra\nibn\n##over\nfortress\nsa\nrecognised\ncarved\n1869\nclients\n##dan\nintent\n##mar\ncoaches\ndescribing\nbread\n##ington\nbeaten\nnorthwestern\n##ona\nmerit\nyoutube\ncollapse\nchallenges\nem\nhistorians\nobjective\nsubmitted\nvirus\nattacking\ndrake\nassume\n##ere\ndiseases\nmarc\nstem\nleeds\n##cus\n##ab\nfarming\nglasses\n##lock\nvisits\nnowhere\nfellowship\nrelevant\ncarries\nrestaurants\nexperiments\n101\nconstantly\nbases\ntargets\nshah\ntenth\nopponents\nverse\nterritorial\n##ira\nwritings\ncorruption\n##hs\ninstruction\ninherited\nreverse\nemphasis\n##vic\nemployee\narch\nkeeps\nrabbi\nwatson\npayment\nuh\n##ala\nnancy\n##tre\nvenice\nfastest\nsexy\nbanned\nadrian\n
properly\nruth\ntouchdown\ndollar\nboards\nmetre\ncircles\nedges\nfavour\ncomments\nok\ntravels\nliberation\nscattered\nfirmly\n##ular\nholland\npermitted\ndiesel\nkenya\nden\noriginated\n##ral\ndemons\nresumed\ndragged\nrider\n##rus\nservant\nblinked\nextend\ntorn\n##ias\n##sey\ninput\nmeal\neverybody\ncylinder\nkinds\ncamps\n##fe\nbullet\nlogic\n##wn\ncroatian\nevolved\nhealthy\nfool\nchocolate\nwise\npreserve\npradesh\n##ess\nrespective\n1850\n##ew\nchicken\nartificial\ngross\ncorresponding\nconvicted\ncage\ncaroline\ndialogue\n##dor\nnarrative\nstranger\nmario\nbr\nchristianity\nfailing\ntrent\ncommanding\nbuddhist\n1848\nmaurice\nfocusing\nyale\nbike\naltitude\n##ering\nmouse\nrevised\n##sley\nveteran\n##ig\npulls\ntheology\ncrashed\ncampaigns\nlegion\n##ability\ndrag\nexcellence\ncustomer\ncancelled\nintensity\nexcuse\n##lar\nliga\nparticipating\ncontributing\nprinting\n##burn\nvariable\n##rk\ncurious\nbin\nlegacy\nrenaissance\n##my\nsymptoms\nbinding\nvocalist\ndancer\n##nie\ngrammar\ngospel\ndemocrats\nya\nenters\nsc\ndiplomatic\nhitler\n##ser\nclouds\nmathematical\nquit\ndefended\noriented\n##heim\nfundamental\nhardware\nimpressive\nequally\nconvince\nconfederate\nguilt\nchuck\nsliding\n##ware\nmagnetic\nnarrowed\npetersburg\nbulgaria\notto\nphd\nskill\n##ama\nreader\nhopes\npitcher\nreservoir\nhearts\nautomatically\nexpecting\nmysterious\nbennett\nextensively\nimagined\nseeds\nmonitor\nfix\n##ative\njournalism\nstruggling\nsignature\nranch\nencounter\nphotographer\nobservation\nprotests\n##pin\ninfluences\n##hr\ncalendar\n##all\ncruz\ncroatia\nlocomotive\nhughes\nnaturally\nshakespeare\nbasement\nhook\nuncredited\nfaded\ntheories\napproaches\ndare\nphillips\nfilling\nfury\nobama\n##ain\nefficient\narc\ndeliver\nmin\nraid\nbreeding\ninducted\nleagues\nefficiency\naxis\nmontana\neagles\n##ked\nsupplied\ninstructions\nkaren\npicking\nindicating\ntrap\nanchor\npractically\nchristians\ntomb\nvary\noccasional\nelectronics\nlords\nreaders\nnewcastle\nfaint\ninnovation\ncollect\nsituations\nengagement\n160\nclaude\nmixture\n##feld\npeer\ntissue\nlogo\nlean\n##ration\n°f\nfloors\n##ven\narchitects\nreducing\n##our\n##ments\nrope\n1859\nottawa\n##har\nsamples\nbanking\ndeclaration\nproteins\nresignation\nfrancois\nsaudi\nadvocate\nexhibited\narmor\ntwins\ndivorce\n##ras\nabraham\nreviewed\njo\ntemporarily\nmatrix\nphysically\npulse\ncurled\n##ena\ndifficulties\nbengal\nusage\n##ban\nannie\nriders\ncertificate\n##pi\nholes\nwarsaw\ndistinctive\njessica\n##mon\nmutual\n1857\ncustoms\ncircular\neugene\nremoval\nloaded\nmere\nvulnerable\ndepicted\ngenerations\ndame\nheir\nenormous\nlightly\nclimbing\npitched\nlessons\npilots\nnepal\nram\ngoogle\npreparing\nbrad\nlouise\nrenowned\n##₂\nliam\n##ably\nplaza\nshaw\nsophie\nbrilliant\nbills\n##bar\n##nik\nfucking\nmainland\nserver\npleasant\nseized\nveterans\njerked\nfail\nbeta\nbrush\nradiation\nstored\nwarmth\nsoutheastern\nnate\nsin\nraced\nberkeley\njoke\nathlete\ndesignation\ntrunk\n##low\nroland\nqualification\narchives\nheels\nartwork\nreceives\njudicial\nreserves\n##bed\nwoke\ninstallation\nabu\nfloating\nfake\nlesser\nexcitement\ninterface\nconcentrated\naddressed\ncharacteristic\namanda\nsaxophone\nmonk\nauto\n##bus\nreleasing\negg\ndies\ninteraction\ndefender\nce\noutbreak\nglory\nloving\n##bert\nsequel\nconsciousness\nhttp\nawake\nski\nenrolled\n##ress\nhandling\nrookie\nbrow\nsomebody\nbiography\nwarfare\namounts\ncontracts\npresentation\nfabric\ndissolved\nchallenged\nmeter\npsychological\nlt\nelevated\nrally\naccurate\n##tha\nhospital
s\nundergraduate\nspecialist\nvenezuela\nexhibit\nshed\nnursing\nprotestant\nfluid\nstructural\nfootage\njared\nconsistent\nprey\n##ska\nsuccession\nreflect\nexile\nlebanon\nwiped\nsuspect\nshanghai\nresting\nintegration\npreservation\nmarvel\nvariant\npirates\nsheep\nrounded\ncapita\nsailing\ncolonies\nmanuscript\ndeemed\nvariations\nclarke\nfunctional\nemerging\nboxing\nrelaxed\ncurse\nazerbaijan\nheavyweight\nnickname\neditorial\nrang\ngrid\ntightened\nearthquake\nflashed\nmiguel\nrushing\n##ches\nimprovements\nboxes\nbrooks\n180\nconsumption\nmolecular\nfelix\nsocieties\nrepeatedly\nvariation\naids\ncivic\ngraphics\nprofessionals\nrealm\nautonomous\nreceiver\ndelayed\nworkshop\nmilitia\nchairs\ntrump\ncanyon\n##point\nharsh\nextending\nlovely\nhappiness\n##jan\nstake\neyebrows\nembassy\nwellington\nhannah\n##ella\nsony\ncorners\nbishops\nswear\ncloth\ncontents\nxi\nnamely\ncommenced\n1854\nstanford\nnashville\ncourage\ngraphic\ncommitment\ngarrison\n##bin\nhamlet\nclearing\nrebels\nattraction\nliteracy\ncooking\nruins\ntemples\njenny\nhumanity\ncelebrate\nhasn\nfreight\nsixty\nrebel\nbastard\n##art\nnewton\n##ada\ndeer\n##ges\n##ching\nsmiles\ndelaware\nsingers\n##ets\napproaching\nassists\nflame\n##ph\nboulevard\nbarrel\nplanted\n##ome\npursuit\n##sia\nconsequences\nposts\nshallow\ninvitation\nrode\ndepot\nernest\nkane\nrod\nconcepts\npreston\ntopic\nchambers\nstriking\nblast\narrives\ndescendants\nmontgomery\nranges\nworlds\n##lay\n##ari\nspan\nchaos\npraise\n##ag\nfewer\n1855\nsanctuary\nmud\nfbi\n##ions\nprogrammes\nmaintaining\nunity\nharper\nbore\nhandsome\nclosure\ntournaments\nthunder\nnebraska\nlinda\nfacade\nputs\nsatisfied\nargentine\ndale\ncork\ndome\npanama\n##yl\n1858\ntasks\nexperts\n##ates\nfeeding\nequation\n##las\n##ida\n##tu\nengage\nbryan\n##ax\num\nquartet\nmelody\ndisbanded\nsheffield\nblocked\ngasped\ndelay\nkisses\nmaggie\nconnects\n##non\nsts\npoured\ncreator\npublishers\n##we\nguided\nellis\nextinct\nhug\ngaining\n##ord\ncomplicated\n##bility\npoll\nclenched\ninvestigate\n##use\nthereby\nquantum\nspine\ncdp\nhumor\nkills\nadministered\nsemifinals\n##du\nencountered\nignore\n##bu\ncommentary\n##maker\nbother\nroosevelt\n140\nplains\nhalfway\nflowing\ncultures\ncrack\nimprisoned\nneighboring\nairline\n##ses\n##view\n##mate\n##ec\ngather\nwolves\nmarathon\ntransformed\n##ill\ncruise\norganisations\ncarol\npunch\nexhibitions\nnumbered\nalarm\nratings\ndaddy\nsilently\n##stein\nqueens\ncolours\nimpression\nguidance\nliu\ntactical\n##rat\nmarshal\ndella\narrow\n##ings\nrested\nfeared\ntender\nowns\nbitter\nadvisor\nescort\n##ides\nspare\nfarms\ngrants\n##ene\ndragons\nencourage\ncolleagues\ncameras\n##und\nsucked\npile\nspirits\nprague\nstatements\nsuspension\nlandmark\nfence\ntorture\nrecreation\nbags\npermanently\nsurvivors\npond\nspy\npredecessor\nbombing\ncoup\n##og\nprotecting\ntransformation\nglow\n##lands\n##book\ndug\npriests\nandrea\nfeat\nbarn\njumping\n##chen\n##ologist\n##con\ncasualties\nstern\nauckland\npipe\nserie\nrevealing\nba\n##bel\ntrevor\nmercy\nspectrum\nyang\nconsist\ngoverning\ncollaborated\npossessed\nepic\ncomprises\nblew\nshane\n##ack\nlopez\nhonored\nmagical\nsacrifice\njudgment\nperceived\nhammer\nmtv\nbaronet\ntune\ndas\nmissionary\nsheets\n350\nneutral\noral\nthreatening\nattractive\nshade\naims\nseminary\n##master\nestates\n1856\nmichel\nwounds\nrefugees\nmanufacturers\n##nic\nmercury\nsyndrome\nporter\n##iya\n##din\nhamburg\nidentification\nupstairs\npurse\nwidened\npause\ncared\nbreathed\naffiliate\nsantiago\nprevented\nceltic\nfishe
r\n125\nrecruited\nbyzantine\nreconstruction\nfarther\n##mp\ndiet\nsake\nau\nspite\nsensation\n##ert\nblank\nseparation\n105\n##hon\nvladimir\narmies\nanime\n##lie\naccommodate\norbit\ncult\nsofia\narchive\n##ify\n##box\nfounders\nsustained\ndisorder\nhonours\nnortheastern\nmia\ncrops\nviolet\nthreats\nblanket\nfires\ncanton\nfollowers\nsouthwestern\nprototype\nvoyage\nassignment\naltered\nmoderate\nprotocol\npistol\n##eo\nquestioned\nbrass\nlifting\n1852\nmath\nauthored\n##ual\ndoug\ndimensional\ndynamic\n##san\n1851\npronounced\ngrateful\nquest\nuncomfortable\nboom\npresidency\nstevens\nrelating\npoliticians\nchen\nbarrier\nquinn\ndiana\nmosque\ntribal\ncheese\npalmer\nportions\nsometime\nchester\ntreasure\nwu\nbend\ndownload\nmillions\nreforms\nregistration\n##osa\nconsequently\nmonitoring\nate\npreliminary\nbrandon\ninvented\nps\neaten\nexterior\nintervention\nports\ndocumented\nlog\ndisplays\nlecture\nsally\nfavourite\n##itz\nvermont\nlo\ninvisible\nisle\nbreed\n##ator\njournalists\nrelay\nspeaks\nbackward\nexplore\nmidfielder\nactively\nstefan\nprocedures\ncannon\nblond\nkenneth\ncentered\nservants\nchains\nlibraries\nmalcolm\nessex\nhenri\nslavery\n##hal\nfacts\nfairy\ncoached\ncassie\ncats\nwashed\ncop\n##fi\nannouncement\nitem\n2000s\nvinyl\nactivated\nmarco\nfrontier\ngrowled\ncurriculum\n##das\nloyal\naccomplished\nleslie\nritual\nkenny\n##00\nvii\nnapoleon\nhollow\nhybrid\njungle\nstationed\nfriedrich\ncounted\n##ulated\nplatinum\ntheatrical\nseated\ncol\nrubber\nglen\n1840\ndiversity\nhealing\nextends\nid\nprovisions\nadministrator\ncolumbus\n##oe\ntributary\nte\nassured\norg\n##uous\nprestigious\nexamined\nlectures\ngrammy\nronald\nassociations\nbailey\nallan\nessays\nflute\nbelieving\nconsultant\nproceedings\ntravelling\n1853\nkit\nkerala\nyugoslavia\nbuddy\nmethodist\n##ith\nburial\ncentres\nbatman\n##nda\ndiscontinued\nbo\ndock\nstockholm\nlungs\nseverely\n##nk\nciting\nmanga\n##ugh\nsteal\nmumbai\niraqi\nrobot\ncelebrity\nbride\nbroadcasts\nabolished\npot\njoel\noverhead\nfranz\npacked\nreconnaissance\njohann\nacknowledged\nintroduce\nhandled\ndoctorate\ndevelopments\ndrinks\nalley\npalestine\n##nis\n##aki\nproceeded\nrecover\nbradley\ngrain\npatch\nafford\ninfection\nnationalist\nlegendary\n##ath\ninterchange\nvirtually\ngen\ngravity\nexploration\namber\nvital\nwishes\npowell\ndoctrine\nelbow\nscreenplay\n##bird\ncontribute\nindonesian\npet\ncreates\n##com\nenzyme\nkylie\ndiscipline\ndrops\nmanila\nhunger\n##ien\nlayers\nsuffer\nfever\nbits\nmonica\nkeyboard\nmanages\n##hood\nsearched\nappeals\n##bad\ntestament\ngrande\nreid\n##war\nbeliefs\ncongo\n##ification\n##dia\nsi\nrequiring\n##via\ncasey\n1849\nregret\nstreak\nrape\ndepends\nsyrian\nsprint\npound\ntourists\nupcoming\npub\n##xi\ntense\n##els\npracticed\necho\nnationwide\nguild\nmotorcycle\nliz\n##zar\nchiefs\ndesired\nelena\nbye\nprecious\nabsorbed\nrelatives\nbooth\npianist\n##mal\ncitizenship\nexhausted\nwilhelm\n##ceae\n##hed\nnoting\nquarterback\nurge\nhectares\n##gue\nace\nholly\n##tal\nblonde\ndavies\nparked\nsustainable\nstepping\ntwentieth\nairfield\ngalaxy\nnest\nchip\n##nell\ntan\nshaft\npaulo\nrequirement\n##zy\nparadise\ntobacco\ntrans\nrenewed\nvietnamese\n##cker\n##ju\nsuggesting\ncatching\nholmes\nenjoying\nmd\ntrips\ncolt\nholder\nbutterfly\nnerve\nreformed\ncherry\nbowling\ntrailer\ncarriage\ngoodbye\nappreciate\ntoy\njoshua\ninteractive\nenabled\ninvolve\n##kan\ncollar\ndetermination\nbunch\nfacebook\nrecall\nshorts\nsuperintendent\nepiscopal\nfrustration\ngiovanni\nnineteenth\nlaser\nprivately\na
rray\ncirculation\n##ovic\narmstrong\ndeals\npainful\npermit\ndiscrimination\n##wi\naires\nretiring\ncottage\nni\n##sta\nhorizon\nellen\njamaica\nripped\nfernando\nchapters\nplaystation\npatron\nlecturer\nnavigation\nbehaviour\ngenes\ngeorgian\nexport\nsolomon\nrivals\nswift\nseventeen\nrodriguez\nprinceton\nindependently\nsox\n1847\narguing\nentity\ncasting\nhank\ncriteria\noakland\ngeographic\nmilwaukee\nreflection\nexpanding\nconquest\ndubbed\n##tv\nhalt\nbrave\nbrunswick\ndoi\narched\ncurtis\ndivorced\npredominantly\nsomerset\nstreams\nugly\nzoo\nhorrible\ncurved\nbuenos\nfierce\ndictionary\nvector\ntheological\nunions\nhandful\nstability\nchan\npunjab\nsegments\n##lly\naltar\nignoring\ngesture\nmonsters\npastor\n##stone\nthighs\nunexpected\noperators\nabruptly\ncoin\ncompiled\nassociates\nimproving\nmigration\npin\n##ose\ncompact\ncollegiate\nreserved\n##urs\nquarterfinals\nroster\nrestore\nassembled\nhurry\noval\n##cies\n1846\nflags\nmartha\n##del\nvictories\nsharply\n##rated\nargues\ndeadly\nneo\ndrawings\nsymbols\nperformer\n##iel\ngriffin\nrestrictions\nediting\nandrews\njava\njournals\narabia\ncompositions\ndee\npierce\nremoving\nhindi\ncasino\nrunway\ncivilians\nminds\nnasa\nhotels\n##zation\nrefuge\nrent\nretain\npotentially\nconferences\nsuburban\nconducting\n##tto\n##tions\n##tle\ndescended\nmassacre\n##cal\nammunition\nterrain\nfork\nsouls\ncounts\nchelsea\ndurham\ndrives\ncab\n##bank\nperth\nrealizing\npalestinian\nfinn\nsimpson\n##dal\nbetty\n##ule\nmoreover\nparticles\ncardinals\ntent\nevaluation\nextraordinary\n##oid\ninscription\n##works\nwednesday\nchloe\nmaintains\npanels\nashley\ntrucks\n##nation\ncluster\nsunlight\nstrikes\nzhang\n##wing\ndialect\ncanon\n##ap\ntucked\n##ws\ncollecting\n##mas\n##can\n##sville\nmaker\nquoted\nevan\nfranco\naria\nbuying\ncleaning\neva\ncloset\nprovision\napollo\nclinic\nrat\n##ez\nnecessarily\nac\n##gle\n##ising\nvenues\nflipped\ncent\nspreading\ntrustees\nchecking\nauthorized\n##sco\ndisappointed\n##ado\nnotion\nduration\ntrumpet\nhesitated\ntopped\nbrussels\nrolls\ntheoretical\nhint\ndefine\naggressive\nrepeat\nwash\npeaceful\noptical\nwidth\nallegedly\nmcdonald\nstrict\ncopyright\n##illa\ninvestors\nmar\njam\nwitnesses\nsounding\nmiranda\nmichelle\nprivacy\nhugo\nharmony\n##pp\nvalid\nlynn\nglared\nnina\n102\nheadquartered\ndiving\nboarding\ngibson\n##ncy\nalbanian\nmarsh\nroutine\ndealt\nenhanced\ner\nintelligent\nsubstance\ntargeted\nenlisted\ndiscovers\nspinning\nobservations\npissed\nsmoking\nrebecca\ncapitol\nvisa\nvaried\ncostume\nseemingly\nindies\ncompensation\nsurgeon\nthursday\narsenal\nwestminster\nsuburbs\nrid\nanglican\n##ridge\nknots\nfoods\nalumni\nlighter\nfraser\nwhoever\nportal\nscandal\n##ray\ngavin\nadvised\ninstructor\nflooding\nterrorist\n##ale\nteenage\ninterim\nsenses\nduck\nteen\nthesis\nabby\neager\novercome\n##ile\nnewport\nglenn\nrises\nshame\n##cc\nprompted\npriority\nforgot\nbomber\nnicolas\nprotective\n360\ncartoon\nkatherine\nbreeze\nlonely\ntrusted\nhenderson\nrichardson\nrelax\nbanner\ncandy\npalms\nremarkable\n##rio\nlegends\ncricketer\nessay\nordained\nedmund\nrifles\ntrigger\n##uri\n##away\nsail\nalert\n1830\naudiences\npenn\nsussex\nsiblings\npursued\nindianapolis\nresist\nrosa\nconsequence\nsucceed\navoided\n1845\n##ulation\ninland\n##tie\n##nna\ncounsel\nprofession\nchronicle\nhurried\n##una\neyebrow\neventual\nbleeding\ninnovative\ncure\n##dom\ncommittees\naccounting\ncon\nscope\nhardy\nheather\ntenor\ngut\nherald\ncodes\ntore\nscales\nwagon\n##oo\nluxury\ntin\nprefer\nfountain\ntriangle\nbond
s\ndarling\nconvoy\ndried\ntraced\nbeings\ntroy\naccidentally\nslam\nfindings\nsmelled\njoey\nlawyers\noutcome\nsteep\nbosnia\nconfiguration\nshifting\ntoll\nbrook\nperformers\nlobby\nphilosophical\nconstruct\nshrine\naggregate\nboot\ncox\nphenomenon\nsavage\ninsane\nsolely\nreynolds\nlifestyle\n##ima\nnationally\nholdings\nconsideration\nenable\nedgar\nmo\nmama\n##tein\nfights\nrelegation\nchances\natomic\nhub\nconjunction\nawkward\nreactions\ncurrency\nfinale\nkumar\nunderwent\nsteering\nelaborate\ngifts\ncomprising\nmelissa\nveins\nreasonable\nsunshine\nchi\nsolve\ntrails\ninhabited\nelimination\nethics\nhuh\nana\nmolly\nconsent\napartments\nlayout\nmarines\n##ces\nhunters\nbulk\n##oma\nhometown\n##wall\n##mont\ncracked\nreads\nneighbouring\nwithdrawn\nadmission\nwingspan\ndamned\nanthology\nlancashire\nbrands\nbatting\nforgive\ncuban\nawful\n##lyn\n104\ndimensions\nimagination\n##ade\ndante\n##ship\ntracking\ndesperately\ngoalkeeper\n##yne\ngroaned\nworkshops\nconfident\nburton\ngerald\nmilton\ncircus\nuncertain\nslope\ncopenhagen\nsophia\nfog\nphilosopher\nportraits\naccent\ncycling\nvarying\ngripped\nlarvae\ngarrett\nspecified\nscotia\nmature\nluther\nkurt\nrap\n##kes\naerial\n750\nferdinand\nheated\nes\ntransported\n##shan\nsafely\nnonetheless\n##orn\n##gal\nmotors\ndemanding\n##sburg\nstartled\n##brook\nally\ngenerate\ncaps\nghana\nstained\ndemo\nmentions\nbeds\nap\nafterward\ndiary\n##bling\nutility\n##iro\nrichards\n1837\nconspiracy\nconscious\nshining\nfootsteps\nobserver\ncyprus\nurged\nloyalty\ndeveloper\nprobability\nolive\nupgraded\ngym\nmiracle\ninsects\ngraves\n1844\nourselves\nhydrogen\namazon\nkatie\ntickets\npoets\n##pm\nplanes\n##pan\nprevention\nwitnessed\ndense\njin\nrandy\ntang\nwarehouse\nmonroe\nbang\narchived\nelderly\ninvestigations\nalec\ngranite\nmineral\nconflicts\ncontrolling\naboriginal\ncarlo\n##zu\nmechanics\nstan\nstark\nrhode\nskirt\nest\n##berry\nbombs\nrespected\n##horn\nimposed\nlimestone\ndeny\nnominee\nmemphis\ngrabbing\ndisabled\n##als\namusement\naa\nfrankfurt\ncorn\nreferendum\nvaries\nslowed\ndisk\nfirms\nunconscious\nincredible\nclue\nsue\n##zhou\ntwist\n##cio\njoins\nidaho\nchad\ndevelopers\ncomputing\ndestroyer\n103\nmortal\ntucker\nkingston\nchoices\nyu\ncarson\n1800\nos\nwhitney\ngeneva\npretend\ndimension\nstaged\nplateau\nmaya\n##une\nfreestyle\n##bc\nrovers\nhiv\n##ids\ntristan\nclassroom\nprospect\n##hus\nhonestly\ndiploma\nlied\nthermal\nauxiliary\nfeast\nunlikely\niata\n##tel\nmorocco\npounding\ntreasury\nlithuania\nconsiderably\n1841\ndish\n1812\ngeological\nmatching\nstumbled\ndestroying\nmarched\nbrien\nadvances\ncake\nnicole\nbelle\nsettling\nmeasuring\ndirecting\n##mie\ntuesday\nbassist\ncapabilities\nstunned\nfraud\ntorpedo\n##list\n##phone\nanton\nwisdom\nsurveillance\nruined\n##ulate\nlawsuit\nhealthcare\ntheorem\nhalls\ntrend\naka\nhorizontal\ndozens\nacquire\nlasting\nswim\nhawk\ngorgeous\nfees\nvicinity\ndecrease\nadoption\ntactics\n##ography\npakistani\n##ole\ndraws\n##hall\nwillie\nburke\nheath\nalgorithm\nintegral\npowder\nelliott\nbrigadier\njackie\ntate\nvarieties\ndarker\n##cho\nlately\ncigarette\nspecimens\nadds\n##ree\n##ensis\n##inger\nexploded\nfinalist\ncia\nmurders\nwilderness\narguments\nnicknamed\nacceptance\nonwards\nmanufacture\nrobertson\njets\ntampa\nenterprises\nblog\nloudly\ncomposers\nnominations\n1838\nai\nmalta\ninquiry\nautomobile\nhosting\nviii\nrays\ntilted\ngrief\nmuseums\nstrategies\nfurious\neuro\nequality\ncohen\npoison\nsurrey\nwireless\ngoverned\nridiculous\nmoses\n##esh\n##room\nvanished\n##
ito\nbarnes\nattract\nmorrison\nistanbul\n##iness\nabsent\nrotation\npetition\njanet\n##logical\nsatisfaction\ncustody\ndeliberately\nobservatory\ncomedian\nsurfaces\npinyin\nnovelist\nstrictly\ncanterbury\noslo\nmonks\nembrace\nibm\njealous\nphotograph\ncontinent\ndorothy\nmarina\ndoc\nexcess\nholden\nallegations\nexplaining\nstack\navoiding\nlance\nstoryline\nmajesty\npoorly\nspike\ndos\nbradford\nraven\ntravis\nclassics\nproven\nvoltage\npillow\nfists\nbutt\n1842\ninterpreted\n##car\n1839\ngage\ntelegraph\nlens\npromising\nexpelled\ncasual\ncollector\nzones\n##min\nsilly\nnintendo\n##kh\n##bra\ndownstairs\nchef\nsuspicious\nafl\nflies\nvacant\nuganda\npregnancy\ncondemned\nlutheran\nestimates\ncheap\ndecree\nsaxon\nproximity\nstripped\nidiot\ndeposits\ncontrary\npresenter\nmagnus\nglacier\nim\noffense\nedwin\n##ori\nupright\n##long\nbolt\n##ois\ntoss\ngeographical\n##izes\nenvironments\ndelicate\nmarking\nabstract\nxavier\nnails\nwindsor\nplantation\noccurring\nequity\nsaskatchewan\nfears\ndrifted\nsequences\nvegetation\nrevolt\n##stic\n1843\nsooner\nfusion\nopposing\nnato\nskating\n1836\nsecretly\nruin\nlease\n##oc\nedit\n##nne\nflora\nanxiety\nruby\n##ological\n##mia\ntel\nbout\ntaxi\nemmy\nfrost\nrainbow\ncompounds\nfoundations\nrainfall\nassassination\nnightmare\ndominican\n##win\nachievements\ndeserve\norlando\nintact\narmenia\n##nte\ncalgary\nvalentine\n106\nmarion\nproclaimed\ntheodore\nbells\ncourtyard\nthigh\ngonzalez\nconsole\ntroop\nminimal\nmonte\neveryday\n##ence\n##if\nsupporter\nterrorism\nbuck\nopenly\npresbyterian\nactivists\ncarpet\n##iers\nrubbing\nuprising\n##yi\ncute\nconceived\nlegally\n##cht\nmillennium\ncello\nvelocity\nji\nrescued\ncardiff\n1835\nrex\nconcentrate\nsenators\nbeard\nrendered\nglowing\nbattalions\nscouts\ncompetitors\nsculptor\ncatalogue\narctic\nion\nraja\nbicycle\nwow\nglancing\nlawn\n##woman\ngentleman\nlighthouse\npublish\npredicted\ncalculated\n##val\nvariants\n##gne\nstrain\n##ui\nwinston\ndeceased\n##nus\ntouchdowns\nbrady\ncaleb\nsinking\nechoed\ncrush\nhon\nblessed\nprotagonist\nhayes\nendangered\nmagnitude\neditors\n##tine\nestimate\nresponsibilities\n##mel\nbackup\nlaying\nconsumed\nsealed\nzurich\nlovers\nfrustrated\n##eau\nahmed\nkicking\nmit\ntreasurer\n1832\nbiblical\nrefuse\nterrified\npump\nagrees\ngenuine\nimprisonment\nrefuses\nplymouth\n##hen\nlou\n##nen\ntara\ntrembling\nantarctic\nton\nlearns\n##tas\ncrap\ncrucial\nfaction\natop\n##borough\nwrap\nlancaster\nodds\nhopkins\nerik\nlyon\n##eon\nbros\n##ode\nsnap\nlocality\ntips\nempress\ncrowned\ncal\nacclaimed\nchuckled\n##ory\nclara\nsends\nmild\ntowel\n##fl\n##day\n##а\nwishing\nassuming\ninterviewed\n##bal\n##die\ninteractions\neden\ncups\nhelena\n##lf\nindie\nbeck\n##fire\nbatteries\nfilipino\nwizard\nparted\n##lam\ntraces\n##born\nrows\nidol\nalbany\ndelegates\n##ees\n##sar\ndiscussions\n##ex\nnotre\ninstructed\nbelgrade\nhighways\nsuggestion\nlauren\npossess\norientation\nalexandria\nabdul\nbeats\nsalary\nreunion\nludwig\nalright\nwagner\nintimate\npockets\nslovenia\nhugged\nbrighton\nmerchants\ncruel\nstole\ntrek\nslopes\nrepairs\nenrollment\npolitically\nunderlying\npromotional\ncounting\nboeing\n##bb\nisabella\nnaming\n##и\nkeen\nbacteria\nlisting\nseparately\nbelfast\nussr\n450\nlithuanian\nanybody\nribs\nsphere\nmartinez\ncock\nembarrassed\nproposals\nfragments\nnationals\n##fs\n##wski\npremises\nfin\n1500\nalpine\nmatched\nfreely\nbounded\njace\nsleeve\n##af\ngaming\npier\npopulated\nevident\n##like\nfrances\nflooded\n##dle\nfrightened\npour\ntrainer\nframed\nvisitor\n
challenging\npig\nwickets\n##fold\ninfected\nemail\n##pes\narose\n##aw\nreward\necuador\noblast\nvale\nch\nshuttle\n##usa\nbach\nrankings\nforbidden\ncornwall\naccordance\nsalem\nconsumers\nbruno\nfantastic\ntoes\nmachinery\nresolved\njulius\nremembering\npropaganda\niceland\nbombardment\ntide\ncontacts\nwives\n##rah\nconcerto\nmacdonald\nalbania\nimplement\ndaisy\ntapped\nsudan\nhelmet\nangela\nmistress\n##lic\ncrop\nsunk\nfinest\n##craft\nhostile\n##ute\n##tsu\nboxer\nfr\npaths\nadjusted\nhabit\nballot\nsupervision\nsoprano\n##zen\nbullets\nwicked\nsunset\nregiments\ndisappear\nlamp\nperforms\napp\n##gia\n##oa\nrabbit\ndigging\nincidents\nentries\n##cion\ndishes\n##oi\nintroducing\n##ati\n##fied\nfreshman\nslot\njill\ntackles\nbaroque\nbacks\n##iest\nlone\nsponsor\ndestiny\naltogether\nconvert\n##aro\nconsensus\nshapes\ndemonstration\nbasically\nfeminist\nauction\nartifacts\n##bing\nstrongest\ntwitter\nhalifax\n2019\nallmusic\nmighty\nsmallest\nprecise\nalexandra\nviola\n##los\n##ille\nmanuscripts\n##illo\ndancers\nari\nmanagers\nmonuments\nblades\nbarracks\nspringfield\nmaiden\nconsolidated\nelectron\n##end\nberry\nairing\nwheat\nnobel\ninclusion\nblair\npayments\ngeography\nbee\ncc\neleanor\nreact\n##hurst\nafc\nmanitoba\n##yu\nsu\nlineup\nfitness\nrecreational\ninvestments\nairborne\ndisappointment\n##dis\nedmonton\nviewing\n##row\nrenovation\n##cast\ninfant\nbankruptcy\nroses\naftermath\npavilion\n##yer\ncarpenter\nwithdrawal\nladder\n##hy\ndiscussing\npopped\nreliable\nagreements\nrochester\n##abad\ncurves\nbombers\n220\nrao\nreverend\ndecreased\nchoosing\n107\nstiff\nconsulting\nnaples\ncrawford\ntracy\nka\nribbon\ncops\n##lee\ncrushed\ndeciding\nunified\nteenager\naccepting\nflagship\nexplorer\npoles\nsanchez\ninspection\nrevived\nskilled\ninduced\nexchanged\nflee\nlocals\ntragedy\nswallow\nloading\nhanna\ndemonstrate\n##ela\nsalvador\nflown\ncontestants\ncivilization\n##ines\nwanna\nrhodes\nfletcher\nhector\nknocking\nconsiders\n##ough\nnash\nmechanisms\nsensed\nmentally\nwalt\nunclear\n##eus\nrenovated\nmadame\n##cks\ncrews\ngovernmental\n##hin\nundertaken\nmonkey\n##ben\n##ato\nfatal\narmored\ncopa\ncaves\ngovernance\ngrasp\nperception\ncertification\nfroze\ndamp\ntugged\nwyoming\n##rg\n##ero\nnewman\n##lor\nnerves\ncuriosity\ngraph\n115\n##ami\nwithdraw\ntunnels\ndull\nmeredith\nmoss\nexhibits\nneighbors\ncommunicate\naccuracy\nexplored\nraiders\nrepublicans\nsecular\nkat\nsuperman\npenny\ncriticised\n##tch\nfreed\nupdate\nconviction\nwade\nham\nlikewise\ndelegation\ngotta\ndoll\npromises\ntechnological\nmyth\nnationality\nresolve\nconvent\n##mark\nsharon\ndig\nsip\ncoordinator\nentrepreneur\nfold\n##dine\ncapability\ncouncillor\nsynonym\nblown\nswan\ncursed\n1815\njonas\nhaired\nsofa\ncanvas\nkeeper\nrivalry\n##hart\nrapper\nspeedway\nswords\npostal\nmaxwell\nestonia\npotter\nrecurring\n##nn\n##ave\nerrors\n##oni\ncognitive\n1834\n##²\nclaws\nnadu\nroberto\nbce\nwrestler\nellie\n##ations\ninfinite\nink\n##tia\npresumably\nfinite\nstaircase\n108\nnoel\npatricia\nnacional\n##cation\nchill\neternal\ntu\npreventing\nprussia\nfossil\nlimbs\n##logist\nernst\nfrog\nperez\nrene\n##ace\npizza\nprussian\n##ios\n##vy\nmolecules\nregulatory\nanswering\nopinions\nsworn\nlengths\nsupposedly\nhypothesis\nupward\nhabitats\nseating\nancestors\ndrank\nyield\nhd\nsynthesis\nresearcher\nmodest\n##var\nmothers\npeered\nvoluntary\nhomeland\n##the\nacclaim\n##igan\nstatic\nvalve\nluxembourg\nalto\ncarroll\nfe\nreceptor\nnorton\nambulance\n##tian\njohnston\ncatholics\ndepicting\njointly\nelephant\ngl
oria\nmentor\nbadge\nahmad\ndistinguish\nremarked\ncouncils\nprecisely\nallison\nadvancing\ndetection\ncrowded\n##10\ncooperative\nankle\nmercedes\ndagger\nsurrendered\npollution\ncommit\nsubway\njeffrey\nlesson\nsculptures\nprovider\n##fication\nmembrane\ntimothy\nrectangular\nfiscal\nheating\nteammate\nbasket\nparticle\nanonymous\ndeployment\n##ple\nmissiles\ncourthouse\nproportion\nshoe\nsec\n##ller\ncomplaints\nforbes\nblacks\nabandon\nremind\nsizes\noverwhelming\nautobiography\nnatalie\n##awa\nrisks\ncontestant\ncountryside\nbabies\nscorer\ninvaded\nenclosed\nproceed\nhurling\ndisorders\n##cu\nreflecting\ncontinuously\ncruiser\ngraduates\nfreeway\ninvestigated\nore\ndeserved\nmaid\nblocking\nphillip\njorge\nshakes\ndove\nmann\nvariables\nlacked\nburden\naccompanying\nque\nconsistently\norganizing\nprovisional\ncomplained\nendless\n##rm\ntubes\njuice\ngeorges\nkrishna\nmick\nlabels\nthriller\n##uch\nlaps\narcade\nsage\nsnail\n##table\nshannon\nfi\nlaurence\nseoul\nvacation\npresenting\nhire\nchurchill\nsurprisingly\nprohibited\nsavannah\ntechnically\n##oli\n170\n##lessly\ntestimony\nsuited\nspeeds\ntoys\nromans\nmlb\nflowering\nmeasurement\ntalented\nkay\nsettings\ncharleston\nexpectations\nshattered\nachieving\ntriumph\nceremonies\nportsmouth\nlanes\nmandatory\nloser\nstretching\ncologne\nrealizes\nseventy\ncornell\ncareers\nwebb\n##ulating\namericas\nbudapest\nava\nsuspicion\n##ison\nyo\nconrad\n##hai\nsterling\njessie\nrector\n##az\n1831\ntransform\norganize\nloans\nchristine\nvolcanic\nwarrant\nslender\nsummers\nsubfamily\nnewer\ndanced\ndynamics\nrhine\nproceeds\nheinrich\ngastropod\ncommands\nsings\nfacilitate\neaster\nra\npositioned\nresponses\nexpense\nfruits\nyanked\nimported\n25th\nvelvet\nvic\nprimitive\ntribune\nbaldwin\nneighbourhood\ndonna\nrip\nhay\npr\n##uro\n1814\nespn\nwelcomed\n##aria\nqualifier\nglare\nhighland\ntiming\n##cted\nshells\neased\ngeometry\nlouder\nexciting\nslovakia\n##sion\n##iz\n##lot\nsavings\nprairie\n##ques\nmarching\nrafael\ntonnes\n##lled\ncurtain\npreceding\nshy\nheal\ngreene\nworthy\n##pot\ndetachment\nbury\nsherman\n##eck\nreinforced\nseeks\nbottles\ncontracted\nduchess\noutfit\nwalsh\n##sc\nmickey\n##ase\ngeoffrey\narcher\nsqueeze\ndawson\neliminate\ninvention\n##enberg\nneal\n##eth\nstance\ndealer\ncoral\nmaple\nretire\npolo\nsimplified\n##ht\n1833\nhid\nwatts\nbackwards\njules\n##oke\ngenesis\nmt\nframes\nrebounds\nburma\nwoodland\nmoist\nsantos\nwhispers\ndrained\nsubspecies\n##aa\nstreaming\nulster\nburnt\ncorrespondence\nmaternal\ngerard\ndenis\nstealing\n##load\ngenius\nduchy\n##oria\ninaugurated\nmomentum\nsuits\nplacement\nsovereign\nclause\nthames\n##hara\nconfederation\nreservation\nsketch\nyankees\nlets\nrotten\ncharm\nhal\nverses\nultra\ncommercially\ndot\nsalon\ncitation\nadopt\nwinnipeg\nmist\nallocated\ncairo\n##boy\njenkins\ninterference\nobjectives\n##wind\n1820\nportfolio\narmoured\nsectors\n##eh\ninitiatives\n##world\nintegrity\nexercises\nrobe\ntap\nab\ngazed\n##tones\ndistracted\nrulers\n111\nfavorable\njerome\ntended\ncart\nfactories\n##eri\ndiplomat\nvalued\ngravel\ncharitable\n##try\ncalvin\nexploring\nchang\nshepherd\nterrace\npdf\npupil\n##ural\nreflects\nups\n##rch\ngovernors\nshelf\ndepths\n##nberg\ntrailed\ncrest\ntackle\n##nian\n##ats\nhatred\n##kai\nclare\nmakers\nethiopia\nlongtime\ndetected\nembedded\nlacking\nslapped\nrely\nthomson\nanticipation\niso\nmorton\nsuccessive\nagnes\nscreenwriter\nstraightened\nphilippe\nplaywright\nhaunted\nlicence\niris\nintentions\nsutton\n112\nlogical\ncorrectly\n##weight\nbrand
ed\nlicked\ntipped\nsilva\nricky\nnarrator\nrequests\n##ents\ngreeted\nsupernatural\ncow\n##wald\nlung\nrefusing\nemployer\nstrait\ngaelic\nliner\n##piece\nzoe\nsabha\n##mba\ndriveway\nharvest\nprints\nbates\nreluctantly\nthreshold\nalgebra\nira\nwherever\ncoupled\n240\nassumption\npicks\n##air\ndesigners\nraids\ngentlemen\n##ean\nroller\nblowing\nleipzig\nlocks\nscrew\ndressing\nstrand\n##lings\nscar\ndwarf\ndepicts\n##nu\nnods\n##mine\ndiffer\nboris\n##eur\nyuan\nflip\n##gie\nmob\ninvested\nquestioning\napplying\n##ture\nshout\n##sel\ngameplay\nblamed\nillustrations\nbothered\nweakness\nrehabilitation\n##of\n##zes\nenvelope\nrumors\nminers\nleicester\nsubtle\nkerry\n##ico\nferguson\n##fu\npremiership\nne\n##cat\nbengali\nprof\ncatches\nremnants\ndana\n##rily\nshouting\npresidents\nbaltic\nought\nghosts\ndances\nsailors\nshirley\nfancy\ndominic\n##bie\nmadonna\n##rick\nbark\nbuttons\ngymnasium\nashes\nliver\ntoby\noath\nprovidence\ndoyle\nevangelical\nnixon\ncement\ncarnegie\nembarked\nhatch\nsurroundings\nguarantee\nneeding\npirate\nessence\n##bee\nfilter\ncrane\nhammond\nprojected\nimmune\npercy\ntwelfth\n##ult\nregent\ndoctoral\ndamon\nmikhail\n##ichi\nlu\ncritically\nelect\nrealised\nabortion\nacute\nscreening\nmythology\nsteadily\n##fc\nfrown\nnottingham\nkirk\nwa\nminneapolis\n##rra\nmodule\nalgeria\nmc\nnautical\nencounters\nsurprising\nstatues\navailability\nshirts\npie\nalma\nbrows\nmunster\nmack\nsoup\ncrater\ntornado\nsanskrit\ncedar\nexplosive\nbordered\ndixon\nplanets\nstamp\nexam\nhappily\n##bble\ncarriers\nkidnapped\n##vis\naccommodation\nemigrated\n##met\nknockout\ncorrespondent\nviolation\nprofits\npeaks\nlang\nspecimen\nagenda\nancestry\npottery\nspelling\nequations\nobtaining\nki\nlinking\n1825\ndebris\nasylum\n##20\nbuddhism\nteddy\n##ants\ngazette\n##nger\n##sse\ndental\neligibility\nutc\nfathers\naveraged\nzimbabwe\nfrancesco\ncoloured\nhissed\ntranslator\nlynch\nmandate\nhumanities\nmackenzie\nuniforms\nlin\n##iana\n##gio\nasset\nmhz\nfitting\nsamantha\ngenera\nwei\nrim\nbeloved\nshark\nriot\nentities\nexpressions\nindo\ncarmen\nslipping\nowing\nabbot\nneighbor\nsidney\n##av\nrats\nrecommendations\nencouraging\nsquadrons\nanticipated\ncommanders\nconquered\n##oto\ndonations\ndiagnosed\n##mond\ndivide\n##iva\nguessed\ndecoration\nvernon\nauditorium\nrevelation\nconversations\n##kers\n##power\nherzegovina\ndash\nalike\nprotested\nlateral\nherman\naccredited\nmg\n##gent\nfreeman\nmel\nfiji\ncrow\ncrimson\n##rine\nlivestock\n##pped\nhumanitarian\nbored\noz\nwhip\n##lene\n##ali\nlegitimate\nalter\ngrinning\nspelled\nanxious\noriental\nwesley\n##nin\n##hole\ncarnival\ncontroller\ndetect\n##ssa\nbowed\neducator\nkosovo\nmacedonia\n##sin\noccupy\nmastering\nstephanie\njaneiro\npara\nunaware\nnurses\nnoon\n135\ncam\nhopefully\nranger\ncombine\nsociology\npolar\nrica\n##eer\nneill\n##sman\nholocaust\n##ip\ndoubled\nlust\n1828\n109\ndecent\ncooling\nunveiled\n##card\n1829\nnsw\nhomer\nchapman\nmeyer\n##gin\ndive\nmae\nreagan\nexpertise\n##gled\ndarwin\nbrooke\nsided\nprosecution\ninvestigating\ncomprised\npetroleum\ngenres\nreluctant\ndifferently\ntrilogy\njohns\nvegetables\ncorpse\nhighlighted\nlounge\npension\nunsuccessfully\nelegant\naided\nivory\nbeatles\namelia\ncain\ndubai\nsunny\nimmigrant\nbabe\nclick\n##nder\nunderwater\npepper\ncombining\nmumbled\natlas\nhorns\naccessed\nballad\nphysicians\nhomeless\ngestured\nrpm\nfreak\nlouisville\ncorporations\npatriots\nprizes\nrational\nwarn\nmodes\ndecorative\novernight\ndin\ntroubled\nphantom\n##ort\nmonarch\nsheer\n##dorf\ngen
erals\nguidelines\norgans\naddresses\n##zon\nenhance\ncurling\nparishes\ncord\n##kie\nlinux\ncaesar\ndeutsche\nbavaria\n##bia\ncoleman\ncyclone\n##eria\nbacon\npetty\n##yama\n##old\nhampton\ndiagnosis\n1824\nthrows\ncomplexity\nrita\ndisputed\n##₃\npablo\n##sch\nmarketed\ntrafficking\n##ulus\nexamine\nplague\nformats\n##oh\nvault\nfaithful\n##bourne\nwebster\n##ox\nhighlights\n##ient\n##ann\nphones\nvacuum\nsandwich\nmodeling\n##gated\nbolivia\nclergy\nqualities\nisabel\n##nas\n##ars\nwears\nscreams\nreunited\nannoyed\nbra\n##ancy\n##rate\ndifferential\ntransmitter\ntattoo\ncontainer\npoker\n##och\nexcessive\nresides\ncowboys\n##tum\naugustus\ntrash\nproviders\nstatute\nretreated\nbalcony\nreversed\nvoid\nstorey\npreceded\nmasses\nleap\nlaughs\nneighborhoods\nwards\nschemes\nfalcon\nsanto\nbattlefield\npad\nronnie\nthread\nlesbian\nvenus\n##dian\nbeg\nsandstone\ndaylight\npunched\ngwen\nanalog\nstroked\nwwe\nacceptable\nmeasurements\ndec\ntoxic\n##kel\nadequate\nsurgical\neconomist\nparameters\nvarsity\n##sberg\nquantity\nella\n##chy\n##rton\ncountess\ngenerating\nprecision\ndiamonds\nexpressway\nga\n##ı\n1821\nuruguay\ntalents\ngalleries\nexpenses\nscanned\ncolleague\noutlets\nryder\nlucien\n##ila\nparamount\n##bon\nsyracuse\ndim\nfangs\ngown\nsweep\n##sie\ntoyota\nmissionaries\nwebsites\n##nsis\nsentences\nadviser\nval\ntrademark\nspells\n##plane\npatience\nstarter\nslim\n##borg\ntoe\nincredibly\nshoots\nelliot\nnobility\n##wyn\ncowboy\nendorsed\ngardner\ntendency\npersuaded\norganisms\nemissions\nkazakhstan\namused\nboring\nchips\nthemed\n##hand\nllc\nconstantinople\nchasing\nsystematic\nguatemala\nborrowed\nerin\ncarey\n##hard\nhighlands\nstruggles\n1810\n##ifying\n##ced\nwong\nexceptions\ndevelops\nenlarged\nkindergarten\ncastro\n##ern\n##rina\nleigh\nzombie\njuvenile\n##most\nconsul\n##nar\nsailor\nhyde\nclarence\nintensive\npinned\nnasty\nuseless\njung\nclayton\nstuffed\nexceptional\nix\napostolic\n230\ntransactions\n##dge\nexempt\nswinging\ncove\nreligions\n##ash\nshields\ndairy\nbypass\n190\npursuing\nbug\njoyce\nbombay\nchassis\nsouthampton\nchat\ninteract\nredesignated\n##pen\nnascar\npray\nsalmon\nrigid\nregained\nmalaysian\ngrim\npublicity\nconstituted\ncapturing\ntoilet\ndelegate\npurely\ntray\ndrift\nloosely\nstriker\nweakened\ntrinidad\nmitch\nitv\ndefines\ntransmitted\nming\nscarlet\nnodding\nfitzgerald\nfu\nnarrowly\nsp\ntooth\nstandings\nvirtue\n##₁\n##wara\n##cting\nchateau\ngloves\nlid\n##nel\nhurting\nconservatory\n##pel\nsinclair\nreopened\nsympathy\nnigerian\nstrode\nadvocated\noptional\nchronic\ndischarge\n##rc\nsuck\ncompatible\nlaurel\nstella\nshi\nfails\nwage\ndodge\n128\ninformal\nsorts\nlevi\nbuddha\nvillagers\n##aka\nchronicles\nheavier\nsummoned\ngateway\n3000\neleventh\njewelry\ntranslations\naccordingly\nseas\n##ency\nfiber\npyramid\ncubic\ndragging\n##ista\ncaring\n##ops\nandroid\ncontacted\nlunar\n##dt\nkai\nlisbon\npatted\n1826\nsacramento\ntheft\nmadagascar\nsubtropical\ndisputes\nta\nholidays\npiper\nwillow\nmare\ncane\nitunes\nnewfoundland\nbenny\ncompanions\ndong\nraj\nobserve\nroar\ncharming\nplaque\ntibetan\nfossils\nenacted\nmanning\nbubble\ntina\ntanzania\n##eda\n##hir\nfunk\nswamp\ndeputies\ncloak\nufc\nscenario\npar\nscratch\nmetals\nanthem\nguru\nengaging\nspecially\n##boat\ndialects\nnineteen\ncecil\nduet\ndisability\nmessenger\nunofficial\n##lies\ndefunct\neds\nmoonlight\ndrainage\nsurname\npuzzle\nhonda\nswitching\nconservatives\nmammals\nknox\nbroadcaster\nsidewalk\ncope\n##ried\nbenson\nprinces\npeterson\n##sal\nbedford\nsharks\neli\nwrec
k\nalberto\ngasp\narchaeology\nlgbt\nteaches\nsecurities\nmadness\ncompromise\nwaving\ncoordination\ndavidson\nvisions\nleased\npossibilities\neighty\njun\nfernandez\nenthusiasm\nassassin\nsponsorship\nreviewer\nkingdoms\nestonian\nlaboratories\n##fy\n##nal\napplies\nverb\ncelebrations\n##zzo\nrowing\nlightweight\nsadness\nsubmit\nmvp\nbalanced\ndude\n##vas\nexplicitly\nmetric\nmagnificent\nmound\nbrett\nmohammad\nmistakes\nirregular\n##hing\n##ass\nsanders\nbetrayed\nshipped\nsurge\n##enburg\nreporters\ntermed\ngeorg\npity\nverbal\nbulls\nabbreviated\nenabling\nappealed\n##are\n##atic\nsicily\nsting\nheel\nsweetheart\nbart\nspacecraft\nbrutal\nmonarchy\n##tter\naberdeen\ncameo\ndiane\n##ub\nsurvivor\nclyde\n##aries\ncomplaint\n##makers\nclarinet\ndelicious\nchilean\nkarnataka\ncoordinates\n1818\npanties\n##rst\npretending\nar\ndramatically\nkiev\nbella\ntends\ndistances\n113\ncatalog\nlaunching\ninstances\ntelecommunications\nportable\nlindsay\nvatican\n##eim\nangles\naliens\nmarker\nstint\nscreens\nbolton\n##rne\njudy\nwool\nbenedict\nplasma\neuropa\nspark\nimaging\nfilmmaker\nswiftly\n##een\ncontributor\n##nor\nopted\nstamps\napologize\nfinancing\nbutter\ngideon\nsophisticated\nalignment\navery\nchemicals\nyearly\nspeculation\nprominence\nprofessionally\n##ils\nimmortal\ninstitutional\ninception\nwrists\nidentifying\ntribunal\nderives\ngains\n##wo\npapal\npreference\nlinguistic\nvince\noperative\nbrewery\n##ont\nunemployment\nboyd\n##ured\n##outs\nalbeit\nprophet\n1813\nbi\n##rr\n##face\n##rad\nquarterly\nasteroid\ncleaned\nradius\ntemper\n##llen\ntelugu\njerk\nviscount\nmenu\n##ote\nglimpse\n##aya\nyacht\nhawaiian\nbaden\n##rl\nlaptop\nreadily\n##gu\nmonetary\noffshore\nscots\nwatches\n##yang\n##arian\nupgrade\nneedle\nxbox\nlea\nencyclopedia\nflank\nfingertips\n##pus\ndelight\nteachings\nconfirm\nroth\nbeaches\nmidway\nwinters\n##iah\nteasing\ndaytime\nbeverly\ngambling\nbonnie\n##backs\nregulated\nclement\nhermann\ntricks\nknot\n##shing\n##uring\n##vre\ndetached\necological\nowed\nspecialty\nbyron\ninventor\nbats\nstays\nscreened\nunesco\nmidland\ntrim\naffection\n##ander\n##rry\njess\nthoroughly\nfeedback\n##uma\nchennai\nstrained\nheartbeat\nwrapping\novertime\npleaded\n##sworth\nmon\nleisure\noclc\n##tate\n##ele\nfeathers\nangelo\nthirds\nnuts\nsurveys\nclever\ngill\ncommentator\n##dos\ndarren\nrides\ngibraltar\n##nc\n##mu\ndissolution\ndedication\nshin\nmeals\nsaddle\nelvis\nreds\nchaired\ntaller\nappreciation\nfunctioning\nniece\nfavored\nadvocacy\nrobbie\ncriminals\nsuffolk\nyugoslav\npassport\nconstable\ncongressman\nhastings\nvera\n##rov\nconsecrated\nsparks\necclesiastical\nconfined\n##ovich\nmuller\nfloyd\nnora\n1822\npaved\n1827\ncumberland\nned\nsaga\nspiral\n##flow\nappreciated\nyi\ncollaborative\ntreating\nsimilarities\nfeminine\nfinishes\n##ib\njade\nimport\n##nse\n##hot\nchampagne\nmice\nsecuring\ncelebrities\nhelsinki\nattributes\n##gos\ncousins\nphases\nache\nlucia\ngandhi\nsubmission\nvicar\nspear\nshine\ntasmania\nbiting\ndetention\nconstitute\ntighter\nseasonal\n##gus\nterrestrial\nmatthews\n##oka\neffectiveness\nparody\nphilharmonic\n##onic\n1816\nstrangers\nencoded\nconsortium\nguaranteed\nregards\nshifts\ntortured\ncollision\nsupervisor\ninform\nbroader\ninsight\ntheaters\narmour\nemeritus\nblink\nincorporates\nmapping\n##50\n##ein\nhandball\nflexible\n##nta\nsubstantially\ngenerous\nthief\n##own\ncarr\nloses\n1793\nprose\nucla\nromeo\ngeneric\nmetallic\nrealization\ndamages\nmk\ncommissioners\nzach\ndefault\n##ther\nhelicopters\nlengthy\nstems\nspa\npartnered\
nspectators\nrogue\nindication\npenalties\nteresa\n1801\nsen\n##tric\ndalton\n##wich\nirving\nphotographic\n##vey\ndell\ndeaf\npeters\nexcluded\nunsure\n##vable\npatterson\ncrawled\n##zio\nresided\nwhipped\nlatvia\nslower\necole\npipes\nemployers\nmaharashtra\ncomparable\nva\ntextile\npageant\n##gel\nalphabet\nbinary\nirrigation\nchartered\nchoked\nantoine\noffs\nwaking\nsupplement\n##wen\nquantities\ndemolition\nregain\nlocate\nurdu\nfolks\nalt\n114\n##mc\nscary\nandreas\nwhites\n##ava\nclassrooms\nmw\naesthetic\npublishes\nvalleys\nguides\ncubs\njohannes\nbryant\nconventions\naffecting\n##itt\ndrain\nawesome\nisolation\nprosecutor\nambitious\napology\ncaptive\ndowns\natmospheric\nlorenzo\naisle\nbeef\nfoul\n##onia\nkidding\ncomposite\ndisturbed\nillusion\nnatives\n##ffer\nemi\nrockets\nriverside\nwartime\npainters\nadolf\nmelted\n##ail\nuncertainty\nsimulation\nhawks\nprogressed\nmeantime\nbuilder\nspray\nbreach\nunhappy\nregina\nrussians\n##urg\ndetermining\n##tation\ntram\n1806\n##quin\naging\n##12\n1823\ngarion\nrented\nmister\ndiaz\nterminated\nclip\n1817\ndepend\nnervously\ndisco\nowe\ndefenders\nshiva\nnotorious\ndisbelief\nshiny\nworcester\n##gation\n##yr\ntrailing\nundertook\nislander\nbelarus\nlimitations\nwatershed\nfuller\noverlooking\nutilized\nraphael\n1819\nsynthetic\nbreakdown\nklein\n##nate\nmoaned\nmemoir\nlamb\npracticing\n##erly\ncellular\narrows\nexotic\n##graphy\nwitches\n117\ncharted\nrey\nhut\nhierarchy\nsubdivision\nfreshwater\ngiuseppe\naloud\nreyes\nqatar\nmarty\nsideways\nutterly\nsexually\njude\nprayers\nmccarthy\nsoftball\nblend\ndamien\n##gging\n##metric\nwholly\nerupted\nlebanese\nnegro\nrevenues\ntasted\ncomparative\nteamed\ntransaction\nlabeled\nmaori\nsovereignty\nparkway\ntrauma\ngran\nmalay\n121\nadvancement\ndescendant\n2020\nbuzz\nsalvation\ninventory\nsymbolic\n##making\nantarctica\nmps\n##gas\n##bro\nmohammed\nmyanmar\nholt\nsubmarines\ntones\n##lman\nlocker\npatriarch\nbangkok\nemerson\nremarks\npredators\nkin\nafghan\nconfession\nnorwich\nrental\nemerge\nadvantages\n##zel\nrca\n##hold\nshortened\nstorms\naidan\n##matic\nautonomy\ncompliance\n##quet\ndudley\natp\n##osis\n1803\nmotto\ndocumentation\nsummary\nprofessors\nspectacular\nchristina\narchdiocese\nflashing\ninnocence\nremake\n##dell\npsychic\nreef\nscare\nemploy\nrs\nsticks\nmeg\ngus\nleans\n##ude\naccompany\nbergen\ntomas\n##iko\ndoom\nwages\npools\n##nch\n##bes\nbreasts\nscholarly\nalison\noutline\nbrittany\nbreakthrough\nwillis\nrealistic\n##cut\n##boro\ncompetitor\n##stan\npike\npicnic\nicon\ndesigning\ncommercials\nwashing\nvillain\nskiing\nmicro\ncostumes\nauburn\nhalted\nexecutives\n##hat\nlogistics\ncycles\nvowel\napplicable\nbarrett\nexclaimed\neurovision\neternity\nramon\n##umi\n##lls\nmodifications\nsweeping\ndisgust\n##uck\ntorch\naviv\nensuring\nrude\ndusty\nsonic\ndonovan\noutskirts\ncu\npathway\n##band\n##gun\n##lines\ndisciplines\nacids\ncadet\npaired\n##40\nsketches\n##sive\nmarriages\n##⁺\nfolding\npeers\nslovak\nimplies\nadmired\n##beck\n1880s\nleopold\ninstinct\nattained\nweston\nmegan\nhorace\n##ination\ndorsal\ningredients\nevolutionary\n##its\ncomplications\ndeity\nlethal\nbrushing\nlevy\ndeserted\ninstitutes\nposthumously\ndelivering\ntelescope\ncoronation\nmotivated\nrapids\nluc\nflicked\npays\nvolcano\ntanner\nweighed\n##nica\ncrowds\nfrankie\ngifted\naddressing\ngranddaughter\nwinding\n##rna\nconstantine\ngomez\n##front\nlandscapes\nrudolf\nanthropology\nslate\nwerewolf\n##lio\nastronomy\ncirca\nrouge\ndreaming\nsack\nknelt\ndrowned\nnaomi\nprolific\ntracked\nfree
zing\nherb\n##dium\nagony\nrandall\ntwisting\nwendy\ndeposit\ntouches\nvein\nwheeler\n##bbled\n##bor\nbatted\nretaining\ntire\npresently\ncompare\nspecification\ndaemon\nnigel\n##grave\nmerry\nrecommendation\nczechoslovakia\nsandra\nng\nroma\n##sts\nlambert\ninheritance\nsheikh\nwinchester\ncries\nexamining\n##yle\ncomeback\ncuisine\nnave\n##iv\nko\nretrieve\ntomatoes\nbarker\npolished\ndefining\nirene\nlantern\npersonalities\nbegging\ntract\nswore\n1809\n175\n##gic\nomaha\nbrotherhood\n##rley\nhaiti\n##ots\nexeter\n##ete\n##zia\nsteele\ndumb\npearson\n210\nsurveyed\nelisabeth\ntrends\n##ef\nfritz\n##rf\npremium\nbugs\nfraction\ncalmly\nviking\n##birds\ntug\ninserted\nunusually\n##ield\nconfronted\ndistress\ncrashing\nbrent\nturks\nresign\n##olo\ncambodia\ngabe\nsauce\n##kal\nevelyn\n116\nextant\nclusters\nquarry\nteenagers\nluna\n##lers\n##ister\naffiliation\ndrill\n##ashi\npanthers\nscenic\nlibya\nanita\nstrengthen\ninscriptions\n##cated\nlace\nsued\njudith\nriots\n##uted\nmint\n##eta\npreparations\nmidst\ndub\nchallenger\n##vich\nmock\ncf\ndisplaced\nwicket\nbreaths\nenables\nschmidt\nanalyst\n##lum\nag\nhighlight\nautomotive\naxe\njosef\nnewark\nsufficiently\nresembles\n50th\n##pal\nflushed\nmum\ntraits\n##ante\ncommodore\nincomplete\nwarming\ntitular\nceremonial\nethical\n118\ncelebrating\neighteenth\ncao\nlima\nmedalist\nmobility\nstrips\nsnakes\n##city\nminiature\nzagreb\nbarton\nescapes\numbrella\nautomated\ndoubted\ndiffers\ncooled\ngeorgetown\ndresden\ncooked\nfade\nwyatt\nrna\njacobs\ncarlton\nabundant\nstereo\nboost\nmadras\ninning\n##hia\nspur\nip\nmalayalam\nbegged\nosaka\ngroan\nescaping\ncharging\ndose\nvista\n##aj\nbud\npapa\ncommunists\nadvocates\nedged\ntri\n##cent\nresemble\npeaking\nnecklace\nfried\nmontenegro\nsaxony\ngoose\nglances\nstuttgart\ncurator\nrecruit\ngrocery\nsympathetic\n##tting\n##fort\n127\nlotus\nrandolph\nancestor\n##rand\nsucceeding\njupiter\n1798\nmacedonian\n##heads\nhiking\n1808\nhanding\nfischer\n##itive\ngarbage\nnode\n##pies\nprone\nsingular\npapua\ninclined\nattractions\nitalia\npouring\nmotioned\ngrandma\ngarnered\njacksonville\ncorp\nego\nringing\naluminum\n##hausen\nordering\n##foot\ndrawer\ntraders\nsynagogue\n##play\n##kawa\nresistant\nwandering\nfragile\nfiona\nteased\nvar\nhardcore\nsoaked\njubilee\ndecisive\nexposition\nmercer\nposter\nvalencia\nhale\nkuwait\n1811\n##ises\n##wr\n##eed\ntavern\ngamma\n122\njohan\n##uer\nairways\namino\ngil\n##ury\nvocational\ndomains\ntorres\n##sp\ngenerator\nfolklore\noutcomes\n##keeper\ncanberra\nshooter\nfl\nbeams\nconfrontation\n##lling\n##gram\nfeb\naligned\nforestry\npipeline\njax\nmotorway\nconception\ndecay\n##tos\ncoffin\n##cott\nstalin\n1805\nescorted\nminded\n##nam\nsitcom\npurchasing\ntwilight\nveronica\nadditions\npassive\ntensions\nstraw\n123\nfrequencies\n1804\nrefugee\ncultivation\n##iate\nchristie\nclary\nbulletin\ncrept\ndisposal\n##rich\n##zong\nprocessor\ncrescent\n##rol\nbmw\nemphasized\nwhale\nnazis\naurora\n##eng\ndwelling\nhauled\nsponsors\ntoledo\nmega\nideology\ntheatres\ntessa\ncerambycidae\nsaves\nturtle\ncone\nsuspects\nkara\nrusty\nyelling\ngreeks\nmozart\nshades\ncocked\nparticipant\n##tro\nshire\nspit\nfreeze\nnecessity\n##cos\ninmates\nnielsen\ncouncillors\nloaned\nuncommon\nomar\npeasants\nbotanical\noffspring\ndaniels\nformations\njokes\n1794\npioneers\nsigma\nlicensing\n##sus\nwheelchair\npolite\n1807\nliquor\npratt\ntrustee\n##uta\nforewings\nballoon\n##zz\nkilometre\ncamping\nexplicit\ncasually\nshawn\nfoolish\nteammates\nnm\nhassan\ncarrie\njudged\nsatisfy\nvanessa\
nknives\nselective\ncnn\nflowed\n##lice\neclipse\nstressed\neliza\nmathematician\ncease\ncultivated\n##roy\ncommissions\nbrowns\n##ania\ndestroyers\nsheridan\nmeadow\n##rius\nminerals\n##cial\ndownstream\nclash\ngram\nmemoirs\nventures\nbaha\nseymour\narchie\nmidlands\nedith\nfare\nflynn\ninvite\ncanceled\ntiles\nstabbed\nboulder\nincorporate\namended\ncamden\nfacial\nmollusk\nunreleased\ndescriptions\nyoga\ngrabs\n550\nraises\nramp\nshiver\n##rose\ncoined\npioneering\ntunes\nqing\nwarwick\ntops\n119\nmelanie\ngiles\n##rous\nwandered\n##inal\nannexed\nnov\n30th\nunnamed\n##ished\norganizational\nairplane\nnormandy\nstoke\nwhistle\nblessing\nviolations\nchased\nholders\nshotgun\n##ctic\noutlet\nreactor\n##vik\ntires\ntearing\nshores\nfortified\nmascot\nconstituencies\nnc\ncolumnist\nproductive\ntibet\n##rta\nlineage\nhooked\noct\ntapes\njudging\ncody\n##gger\nhansen\nkashmir\ntriggered\n##eva\nsolved\ncliffs\n##tree\nresisted\nanatomy\nprotesters\ntransparent\nimplied\n##iga\ninjection\nmattress\nexcluding\n##mbo\ndefenses\nhelpless\ndevotion\n##elli\ngrowl\nliberals\nweber\nphenomena\natoms\nplug\n##iff\nmortality\napprentice\nhowe\nconvincing\naaa\nswimmer\nbarber\nleone\npromptly\nsodium\ndef\nnowadays\narise\n##oning\ngloucester\ncorrected\ndignity\nnorm\nerie\n##ders\nelders\nevacuated\nsylvia\ncompression\n##yar\nhartford\npose\nbackpack\nreasoning\naccepts\n24th\nwipe\nmillimetres\nmarcel\n##oda\ndodgers\nalbion\n1790\noverwhelmed\naerospace\noaks\n1795\nshowcase\nacknowledge\nrecovering\nnolan\nashe\nhurts\ngeology\nfashioned\ndisappearance\nfarewell\nswollen\nshrug\nmarquis\nwimbledon\n124\nrue\n1792\ncommemorate\nreduces\nexperiencing\ninevitable\ncalcutta\nintel\n##court\nmurderer\nsticking\nfisheries\nimagery\nbloom\n280\nbrake\n##inus\ngustav\nhesitation\nmemorable\npo\nviral\nbeans\naccidents\ntunisia\nantenna\nspilled\nconsort\ntreatments\naye\nperimeter\n##gard\ndonation\nhostage\nmigrated\nbanker\naddiction\napex\nlil\ntrout\n##ously\nconscience\n##nova\nrams\nsands\ngenome\npassionate\ntroubles\n##lets\n##set\namid\n##ibility\n##ret\nhiggins\nexceed\nvikings\n##vie\npayne\n##zan\nmuscular\n##ste\ndefendant\nsucking\n##wal\nibrahim\nfuselage\nclaudia\nvfl\neuropeans\nsnails\ninterval\n##garh\npreparatory\nstatewide\ntasked\nlacrosse\nviktor\n##lation\nangola\n##hra\nflint\nimplications\nemploys\nteens\npatrons\nstall\nweekends\nbarriers\nscrambled\nnucleus\ntehran\njenna\nparsons\nlifelong\nrobots\ndisplacement\n5000\n##bles\nprecipitation\n##gt\nknuckles\nclutched\n1802\nmarrying\necology\nmarx\naccusations\ndeclare\nscars\nkolkata\nmat\nmeadows\nbermuda\nskeleton\nfinalists\nvintage\ncrawl\ncoordinate\naffects\nsubjected\norchestral\nmistaken\n##tc\nmirrors\ndipped\nrelied\n260\narches\ncandle\n##nick\nincorporating\nwildly\nfond\nbasilica\nowl\nfringe\nrituals\nwhispering\nstirred\nfeud\ntertiary\nslick\ngoat\nhonorable\nwhereby\nskip\nricardo\nstripes\nparachute\nadjoining\nsubmerged\nsynthesizer\n##gren\nintend\npositively\nninety\nphi\nbeaver\npartition\nfellows\nalexis\nprohibition\ncarlisle\nbizarre\nfraternity\n##bre\ndoubts\nicy\ncbc\naquatic\nsneak\nsonny\ncombines\nairports\ncrude\nsupervised\nspatial\nmerge\nalfonso\n##bic\ncorrupt\nscan\nundergo\n##ams\ndisabilities\ncolombian\ncomparing\ndolphins\nperkins\n##lish\nreprinted\nunanimous\nbounced\nhairs\nunderworld\nmidwest\nsemester\nbucket\npaperback\nminiseries\ncoventry\ndemise\n##leigh\ndemonstrations\nsensor\nrotating\nyan\n##hler\narrange\nsoils\n##idge\nhyderabad\nlabs\n##dr\nbrakes\ngrandchildren\n##nde\
nnegotiated\nrover\nferrari\ncontinuation\ndirectorate\naugusta\nstevenson\ncounterpart\ngore\n##rda\nnursery\nrican\nave\ncollectively\nbroadly\npastoral\nrepertoire\nasserted\ndiscovering\nnordic\nstyled\nfiba\ncunningham\nharley\nmiddlesex\nsurvives\ntumor\ntempo\nzack\naiming\nlok\nurgent\n##rade\n##nto\ndevils\n##ement\ncontractor\nturin\n##wl\n##ool\nbliss\nrepaired\nsimmons\nmoan\nastronomical\ncr\nnegotiate\nlyric\n1890s\nlara\nbred\nclad\nangus\npbs\n##ience\nengineered\nposed\n##lk\nhernandez\npossessions\nelbows\npsychiatric\nstrokes\nconfluence\nelectorate\nlifts\ncampuses\nlava\nalps\n##ep\n##ution\n##date\nphysicist\nwoody\n##page\n##ographic\n##itis\njuliet\nreformation\nsparhawk\n320\ncomplement\nsuppressed\njewel\n##½\nfloated\n##kas\ncontinuity\nsadly\n##ische\ninability\nmelting\nscanning\npaula\nflour\njudaism\nsafer\nvague\n##lm\nsolving\ncurb\n##stown\nfinancially\ngable\nbees\nexpired\nmiserable\ncassidy\ndominion\n1789\ncupped\n145\nrobbery\nfacto\namos\nwarden\nresume\ntallest\nmarvin\ning\npounded\nusd\ndeclaring\ngasoline\n##aux\ndarkened\n270\n650\nsophomore\n##mere\nerection\ngossip\ntelevised\nrisen\ndial\n##eu\npillars\n##link\npassages\nprofound\n##tina\narabian\nashton\nsilicon\nnail\n##ead\n##lated\n##wer\n##hardt\nfleming\nfirearms\nducked\ncircuits\nblows\nwaterloo\ntitans\n##lina\natom\nfireplace\ncheshire\nfinanced\nactivation\nalgorithms\n##zzi\nconstituent\ncatcher\ncherokee\npartnerships\nsexuality\nplatoon\ntragic\nvivian\nguarded\nwhiskey\nmeditation\npoetic\n##late\n##nga\n##ake\nporto\nlisteners\ndominance\nkendra\nmona\nchandler\nfactions\n22nd\nsalisbury\nattitudes\nderivative\n##ido\n##haus\nintake\npaced\njavier\nillustrator\nbarrels\nbias\ncockpit\nburnett\ndreamed\nensuing\n##anda\nreceptors\nsomeday\nhawkins\nmattered\n##lal\nslavic\n1799\njesuit\ncameroon\nwasted\ntai\nwax\nlowering\nvictorious\nfreaking\noutright\nhancock\nlibrarian\nsensing\nbald\ncalcium\nmyers\ntablet\nannouncing\nbarack\nshipyard\npharmaceutical\n##uan\ngreenwich\nflush\nmedley\npatches\nwolfgang\npt\nspeeches\nacquiring\nexams\nnikolai\n##gg\nhayden\nkannada\n##type\nreilly\n##pt\nwaitress\nabdomen\ndevastated\ncapped\npseudonym\npharmacy\nfulfill\nparaguay\n1796\nclicked\n##trom\narchipelago\nsyndicated\n##hman\nlumber\norgasm\nrejection\nclifford\nlorraine\nadvent\nmafia\nrodney\nbrock\n##ght\n##used\n##elia\ncassette\nchamberlain\ndespair\nmongolia\nsensors\ndevelopmental\nupstream\n##eg\n##alis\nspanning\n165\ntrombone\nbasque\nseeded\ninterred\nrenewable\nrhys\nleapt\nrevision\nmolecule\n##ages\nchord\nvicious\nnord\nshivered\n23rd\narlington\ndebts\ncorpus\nsunrise\nbays\nblackburn\ncentimetres\n##uded\nshuddered\ngm\nstrangely\ngripping\ncartoons\nisabelle\norbital\n##ppa\nseals\nproving\n##lton\nrefusal\nstrengthened\nbust\nassisting\nbaghdad\nbatsman\nportrayal\nmara\npushes\nspears\nog\n##cock\nreside\nnathaniel\nbrennan\n1776\nconfirmation\ncaucus\n##worthy\nmarkings\nyemen\nnobles\nku\nlazy\nviewer\ncatalan\nencompasses\nsawyer\n##fall\nsparked\nsubstances\npatents\nbraves\narranger\nevacuation\nsergio\npersuade\ndover\ntolerance\npenguin\ncum\njockey\ninsufficient\ntownships\noccupying\ndeclining\nplural\nprocessed\nprojection\npuppet\nflanders\nintroduces\nliability\n##yon\ngymnastics\nantwerp\ntaipei\nhobart\ncandles\njeep\nwes\nobservers\n126\nchaplain\nbundle\nglorious\n##hine\nhazel\nflung\nsol\nexcavations\ndumped\nstares\nsh\nbangalore\ntriangular\nicelandic\nintervals\nexpressing\nturbine\n##vers\nsongwriting\ncrafts\n##igo\njasmine\nditch\nrite
\n##ways\nentertaining\ncomply\nsorrow\nwrestlers\nbasel\nemirates\nmarian\nrivera\nhelpful\n##some\ncaution\ndownward\nnetworking\n##atory\n##tered\ndarted\ngenocide\nemergence\nreplies\nspecializing\nspokesman\nconvenient\nunlocked\nfading\naugustine\nconcentrations\nresemblance\nelijah\ninvestigator\nandhra\n##uda\npromotes\nbean\n##rrell\nfleeing\nwan\nsimone\nannouncer\n##ame\n##bby\nlydia\nweaver\n132\nresidency\nmodification\n##fest\nstretches\n##ast\nalternatively\nnat\nlowe\nlacks\n##ented\npam\ntile\nconcealed\ninferior\nabdullah\nresidences\ntissues\nvengeance\n##ided\nmoisture\npeculiar\ngroove\nzip\nbologna\njennings\nninja\noversaw\nzombies\npumping\nbatch\nlivingston\nemerald\ninstallations\n1797\npeel\nnitrogen\nrama\n##fying\n##star\nschooling\nstrands\nresponding\nwerner\n##ost\nlime\ncasa\naccurately\ntargeting\n##rod\nunderway\n##uru\nhemisphere\nlester\n##yard\noccupies\n2d\ngriffith\nangrily\nreorganized\n##owing\ncourtney\ndeposited\n##dd\n##30\nestadio\n##ifies\ndunn\nexiled\n##ying\nchecks\n##combe\n##о\n##fly\nsuccesses\nunexpectedly\nblu\nassessed\n##flower\n##ه\nobserving\nsacked\nspiders\nkn\n##tail\nmu\nnodes\nprosperity\naudrey\ndivisional\n155\nbroncos\ntangled\nadjust\nfeeds\nerosion\npaolo\nsurf\ndirectory\nsnatched\nhumid\nadmiralty\nscrewed\ngt\nreddish\n##nese\nmodules\ntrench\nlamps\nbind\nleah\nbucks\ncompetes\n##nz\n##form\ntranscription\n##uc\nisles\nviolently\nclutching\npga\ncyclist\ninflation\nflats\nragged\nunnecessary\n##hian\nstubborn\ncoordinated\nharriet\nbaba\ndisqualified\n330\ninsect\nwolfe\n##fies\nreinforcements\nrocked\nduel\nwinked\nembraced\nbricks\n##raj\nhiatus\ndefeats\npending\nbrightly\njealousy\n##xton\n##hm\n##uki\nlena\ngdp\ncolorful\n##dley\nstein\nkidney\n##shu\nunderwear\nwanderers\n##haw\n##icus\nguardians\nm³\nroared\nhabits\n##wise\npermits\ngp\nuranium\npunished\ndisguise\nbundesliga\nelise\ndundee\nerotic\npartisan\npi\ncollectors\nfloat\nindividually\nrendering\nbehavioral\nbucharest\nser\nhare\nvalerie\ncorporal\nnutrition\nproportional\n##isa\nimmense\n##kis\npavement\n##zie\n##eld\nsutherland\ncrouched\n1775\n##lp\nsuzuki\ntrades\nendurance\noperas\ncrosby\nprayed\npriory\nrory\nsocially\n##urn\ngujarat\n##pu\nwalton\ncube\npasha\nprivilege\nlennon\nfloods\nthorne\nwaterfall\nnipple\nscouting\napprove\n##lov\nminorities\nvoter\ndwight\nextensions\nassure\nballroom\nslap\ndripping\nprivileges\nrejoined\nconfessed\ndemonstrating\npatriotic\nyell\ninvestor\n##uth\npagan\nslumped\nsquares\n##cle\n##kins\nconfront\nbert\nembarrassment\n##aid\naston\nurging\nsweater\nstarr\nyuri\nbrains\nwilliamson\ncommuter\nmortar\nstructured\nselfish\nexports\n##jon\ncds\n##him\nunfinished\n##rre\nmortgage\ndestinations\n##nagar\ncanoe\nsolitary\nbuchanan\ndelays\nmagistrate\nfk\n##pling\nmotivation\n##lier\n##vier\nrecruiting\nassess\n##mouth\nmalik\nantique\n1791\npius\nrahman\nreich\ntub\nzhou\nsmashed\nairs\ngalway\nxii\nconditioning\nhonduras\ndischarged\ndexter\n##pf\nlionel\n129\ndebates\nlemon\ntiffany\nvolunteered\ndom\ndioxide\nprocession\ndevi\nsic\ntremendous\nadvertisements\ncolts\ntransferring\nverdict\nhanover\ndecommissioned\nutter\nrelate\npac\nracism\n##top\nbeacon\nlimp\nsimilarity\nterra\noccurrence\nant\n##how\nbecky\ncapt\nupdates\narmament\nrichie\npal\n##graph\nhalloween\nmayo\n##ssen\n##bone\ncara\nserena\nfcc\ndolls\nobligations\n##dling\nviolated\nlafayette\njakarta\nexploitation\n##ime\ninfamous\niconic\n##lah\n##park\nkitty\nmoody\nreginald\ndread\nspill\ncrystals\nolivier\nmodeled\nbluff\nequilibrium\nsep
arating\nnotices\nordnance\nextinction\nonset\ncosmic\nattachment\nsammy\nexpose\nprivy\nanchored\n##bil\nabbott\nadmits\nbending\nbaritone\nemmanuel\npoliceman\nvaughan\nwinged\nclimax\ndresses\ndenny\npolytechnic\nmohamed\nburmese\nauthentic\nnikki\ngenetics\ngrandparents\nhomestead\ngaza\npostponed\nmetacritic\nuna\n##sby\n##bat\nunstable\ndissertation\n##rial\n##cian\ncurls\nobscure\nuncovered\nbronx\npraying\ndisappearing\n##hoe\nprehistoric\ncoke\nturret\nmutations\nnonprofit\npits\nmonaco\n##ي\n##usion\nprominently\ndispatched\npodium\n##mir\nuci\n##uation\n133\nfortifications\nbirthplace\nkendall\n##lby\n##oll\npreacher\nrack\ngoodman\n##rman\npersistent\n##ott\ncountless\njaime\nrecorder\nlexington\npersecution\njumps\nrenewal\nwagons\n##11\ncrushing\n##holder\ndecorations\n##lake\nabundance\nwrath\nlaundry\n£1\ngarde\n##rp\njeanne\nbeetles\npeasant\n##sl\nsplitting\ncaste\nsergei\n##rer\n##ema\nscripts\n##ively\nrub\nsatellites\n##vor\ninscribed\nverlag\nscrapped\ngale\npackages\nchick\npotato\nslogan\nkathleen\narabs\n##culture\ncounterparts\nreminiscent\nchoral\n##tead\nrand\nretains\nbushes\ndane\naccomplish\ncourtesy\ncloses\n##oth\nslaughter\nhague\nkrakow\nlawson\ntailed\nelias\nginger\n##ttes\ncanopy\nbetrayal\nrebuilding\nturf\n##hof\nfrowning\nallegiance\nbrigades\nkicks\nrebuild\npolls\nalias\nnationalism\ntd\nrowan\naudition\nbowie\nfortunately\nrecognizes\nharp\ndillon\nhorrified\n##oro\nrenault\n##tics\nropes\n##α\npresumed\nrewarded\ninfrared\nwiping\naccelerated\nillustration\n##rid\npresses\npractitioners\nbadminton\n##iard\ndetained\n##tera\nrecognizing\nrelates\nmisery\n##sies\n##tly\nreproduction\npiercing\npotatoes\nthornton\nesther\nmanners\nhbo\n##aan\nours\nbullshit\nernie\nperennial\nsensitivity\nilluminated\nrupert\n##jin\n##iss\n##ear\nrfc\nnassau\n##dock\nstaggered\nsocialism\n##haven\nappointments\nnonsense\nprestige\nsharma\nhaul\n##tical\nsolidarity\ngps\n##ook\n##rata\nigor\npedestrian\n##uit\nbaxter\ntenants\nwires\nmedication\nunlimited\nguiding\nimpacts\ndiabetes\n##rama\nsasha\npas\nclive\nextraction\n131\ncontinually\nconstraints\n##bilities\nsonata\nhunted\nsixteenth\nchu\nplanting\nquote\nmayer\npretended\nabs\nspat\n##hua\nceramic\n##cci\ncurtains\npigs\npitching\n##dad\nlatvian\nsore\ndayton\n##sted\n##qi\npatrols\nslice\nplayground\n##nted\nshone\nstool\napparatus\ninadequate\nmates\ntreason\n##ija\ndesires\n##liga\n##croft\nsomalia\nlaurent\nmir\nleonardo\noracle\ngrape\nobliged\nchevrolet\nthirteenth\nstunning\nenthusiastic\n##ede\naccounted\nconcludes\ncurrents\nbasil\n##kovic\ndrought\n##rica\nmai\n##aire\nshove\nposting\n##shed\npilgrimage\nhumorous\npacking\nfry\npencil\nwines\nsmells\n144\nmarilyn\naching\nnewest\nclung\nbon\nneighbours\nsanctioned\n##pie\nmug\n##stock\ndrowning\n##mma\nhydraulic\n##vil\nhiring\nreminder\nlilly\ninvestigators\n##ncies\nsour\n##eous\ncompulsory\npacket\n##rion\n##graphic\n##elle\ncannes\n##inate\ndepressed\n##rit\nheroic\nimportantly\ntheresa\n##tled\nconway\nsaturn\nmarginal\nrae\n##xia\ncorresponds\nroyce\npact\njasper\nexplosives\npackaging\naluminium\n##ttered\ndenotes\nrhythmic\nspans\nassignments\nhereditary\noutlined\noriginating\nsundays\nlad\nreissued\ngreeting\nbeatrice\n##dic\npillar\nmarcos\nplots\nhandbook\nalcoholic\njudiciary\navant\nslides\nextract\nmasculine\nblur\n##eum\n##force\nhomage\ntrembled\nowens\nhymn\ntrey\nomega\nsignaling\nsocks\naccumulated\nreacted\nattic\ntheo\nlining\nangie\ndistraction\nprimera\ntalbot\n##key\n1200\nti\ncreativity\nbilled\n##hey\ndeacon\neduardo\niden
tifies\nproposition\ndizzy\ngunner\nhogan\n##yam\n##pping\n##hol\nja\n##chan\njensen\nreconstructed\n##berger\nclearance\ndarius\n##nier\nabe\nharlem\nplea\ndei\ncircled\nemotionally\nnotation\nfascist\nneville\nexceeded\nupwards\nviable\nducks\n##fo\nworkforce\nracer\nlimiting\nshri\n##lson\npossesses\n1600\nkerr\nmoths\ndevastating\nladen\ndisturbing\nlocking\n##cture\ngal\nfearing\naccreditation\nflavor\naide\n1870s\nmountainous\n##baum\nmelt\n##ures\nmotel\ntexture\nservers\nsoda\n##mb\nherd\n##nium\nerect\npuzzled\nhum\npeggy\nexaminations\ngould\ntestified\ngeoff\nren\ndevised\nsacks\n##law\ndenial\nposters\ngrunted\ncesar\ntutor\nec\ngerry\nofferings\nbyrne\nfalcons\ncombinations\nct\nincoming\npardon\nrocking\n26th\navengers\nflared\nmankind\nseller\nuttar\nloch\nnadia\nstroking\nexposing\n##hd\nfertile\nancestral\ninstituted\n##has\nnoises\nprophecy\ntaxation\neminent\nvivid\npol\n##bol\ndart\nindirect\nmultimedia\nnotebook\nupside\ndisplaying\nadrenaline\nreferenced\ngeometric\n##iving\nprogression\n##ddy\nblunt\nannounce\n##far\nimplementing\n##lav\naggression\nliaison\ncooler\ncares\nheadache\nplantations\ngorge\ndots\nimpulse\nthickness\nashamed\naveraging\nkathy\nobligation\nprecursor\n137\nfowler\nsymmetry\nthee\n225\nhears\n##rai\nundergoing\nads\nbutcher\nbowler\n##lip\ncigarettes\nsubscription\ngoodness\n##ically\nbrowne\n##hos\n##tech\nkyoto\ndonor\n##erty\ndamaging\nfriction\ndrifting\nexpeditions\nhardened\nprostitution\n152\nfauna\nblankets\nclaw\ntossing\nsnarled\nbutterflies\nrecruits\ninvestigative\ncoated\nhealed\n138\ncommunal\nhai\nxiii\nacademics\nboone\npsychologist\nrestless\nlahore\nstephens\nmba\nbrendan\nforeigners\nprinter\n##pc\nached\nexplode\n27th\ndeed\nscratched\ndared\n##pole\ncardiac\n1780\nokinawa\nproto\ncommando\ncompelled\noddly\nelectrons\n##base\nreplica\nthanksgiving\n##rist\nsheila\ndeliberate\nstafford\ntidal\nrepresentations\nhercules\nou\n##path\n##iated\nkidnapping\nlenses\n##tling\ndeficit\nsamoa\nmouths\nconsuming\ncomputational\nmaze\ngranting\nsmirk\nrazor\nfixture\nideals\ninviting\naiden\nnominal\n##vs\nissuing\njulio\npitt\nramsey\ndocks\n##oss\nexhaust\n##owed\nbavarian\ndraped\nanterior\nmating\nethiopian\nexplores\nnoticing\n##nton\ndiscarded\nconvenience\nhoffman\nendowment\nbeasts\ncartridge\nmormon\npaternal\nprobe\nsleeves\ninterfere\nlump\ndeadline\n##rail\njenks\nbulldogs\nscrap\nalternating\njustified\nreproductive\nnam\nseize\ndescending\nsecretariat\nkirby\ncoupe\ngrouped\nsmash\npanther\nsedan\ntapping\n##18\nlola\ncheer\ngermanic\nunfortunate\n##eter\nunrelated\n##fan\nsubordinate\n##sdale\nsuzanne\nadvertisement\n##ility\nhorsepower\n##lda\ncautiously\ndiscourse\nluigi\n##mans\n##fields\nnoun\nprevalent\nmao\nschneider\neverett\nsurround\ngovernorate\nkira\n##avia\nwestward\n##take\nmisty\nrails\nsustainability\n134\nunused\n##rating\npacks\ntoast\nunwilling\nregulate\nthy\nsuffrage\nnile\nawe\nassam\ndefinitions\ntravelers\naffordable\n##rb\nconferred\nsells\nundefeated\nbeneficial\ntorso\nbasal\nrepeating\nremixes\n##pass\nbahrain\ncables\nfang\n##itated\nexcavated\nnumbering\nstatutory\n##rey\ndeluxe\n##lian\nforested\nramirez\nderbyshire\nzeus\nslamming\ntransfers\nastronomer\nbanana\nlottery\nberg\nhistories\nbamboo\n##uchi\nresurrection\nposterior\nbowls\nvaguely\n##thi\nthou\npreserving\ntensed\noffence\n##inas\nmeyrick\ncallum\nridden\nwatt\nlangdon\ntying\nlowland\nsnorted\ndaring\ntruman\n##hale\n##girl\naura\noverly\nfiling\nweighing\ngoa\ninfections\nphilanthropist\nsaunders\neponymous\n##owski\nlatitude
\nperspectives\nreviewing\nmets\ncommandant\nradial\n##kha\nflashlight\nreliability\nkoch\nvowels\namazed\nada\nelaine\nsupper\n##rth\n##encies\npredator\ndebated\nsoviets\ncola\n##boards\n##nah\ncompartment\ncrooked\narbitrary\nfourteenth\n##ctive\nhavana\nmajors\nsteelers\nclips\nprofitable\nambush\nexited\npackers\n##tile\nnude\ncracks\nfungi\n##е\nlimb\ntrousers\njosie\nshelby\ntens\nfrederic\n##ος\ndefinite\nsmoothly\nconstellation\ninsult\nbaton\ndiscs\nlingering\n##nco\nconclusions\nlent\nstaging\nbecker\ngrandpa\nshaky\n##tron\neinstein\nobstacles\nsk\nadverse\nelle\neconomically\n##moto\nmccartney\nthor\ndismissal\nmotions\nreadings\nnostrils\ntreatise\n##pace\nsqueezing\nevidently\nprolonged\n1783\nvenezuelan\nje\nmarguerite\nbeirut\ntakeover\nshareholders\n##vent\ndenise\ndigit\nairplay\nnorse\n##bbling\nimaginary\npills\nhubert\nblaze\nvacated\neliminating\n##ello\nvine\nmansfield\n##tty\nretrospective\nbarrow\nborne\nclutch\nbail\nforensic\nweaving\n##nett\n##witz\ndesktop\ncitadel\npromotions\nworrying\ndorset\nieee\nsubdivided\n##iating\nmanned\nexpeditionary\npickup\nsynod\nchuckle\n185\nbarney\n##rz\n##ffin\nfunctionality\nkarachi\nlitigation\nmeanings\nuc\nlick\nturbo\nanders\n##ffed\nexecute\ncurl\noppose\nankles\ntyphoon\n##د\n##ache\n##asia\nlinguistics\ncompassion\npressures\ngrazing\nperfection\n##iting\nimmunity\nmonopoly\nmuddy\nbackgrounds\n136\nnamibia\nfrancesca\nmonitors\nattracting\nstunt\ntuition\n##ии\nvegetable\n##mates\n##quent\nmgm\njen\ncomplexes\nforts\n##ond\ncellar\nbites\nseventeenth\nroyals\nflemish\nfailures\nmast\ncharities\n##cular\nperuvian\ncapitals\nmacmillan\nipswich\noutward\nfrigate\npostgraduate\nfolds\nemploying\n##ouse\nconcurrently\nfiery\n##tai\ncontingent\nnightmares\nmonumental\nnicaragua\n##kowski\nlizard\nmal\nfielding\ngig\nreject\n##pad\nharding\n##ipe\ncoastline\n##cin\n##nos\nbeethoven\nhumphrey\ninnovations\n##tam\n##nge\nnorris\ndoris\nsolicitor\nhuang\nobey\n141\n##lc\nniagara\n##tton\nshelves\naug\nbourbon\ncurry\nnightclub\nspecifications\nhilton\n##ndo\ncentennial\ndispersed\nworm\nneglected\nbriggs\nsm\nfont\nkuala\nuneasy\nplc\n##nstein\n##bound\n##aking\n##burgh\nawaiting\npronunciation\n##bbed\n##quest\neh\noptimal\nzhu\nraped\ngreens\npresided\nbrenda\nworries\n##life\nvenetian\nmarxist\nturnout\n##lius\nrefined\nbraced\nsins\ngrasped\nsunderland\nnickel\nspeculated\nlowell\ncyrillic\ncommunism\nfundraising\nresembling\ncolonists\nmutant\nfreddie\nusc\n##mos\ngratitude\n##run\nmural\n##lous\nchemist\nwi\nreminds\n28th\nsteals\ntess\npietro\n##ingen\npromoter\nri\nmicrophone\nhonoured\nrai\nsant\n##qui\nfeather\n##nson\nburlington\nkurdish\nterrorists\ndeborah\nsickness\n##wed\n##eet\nhazard\nirritated\ndesperation\nveil\nclarity\n##rik\njewels\nxv\n##gged\n##ows\n##cup\nberkshire\nunfair\nmysteries\norchid\nwinced\nexhaustion\nrenovations\nstranded\nobe\ninfinity\n##nies\nadapt\nredevelopment\nthanked\nregistry\nolga\ndomingo\nnoir\ntudor\nole\n##atus\ncommenting\nbehaviors\n##ais\ncrisp\npauline\nprobable\nstirling\nwigan\n##bian\nparalympics\npanting\nsurpassed\n##rew\nluca\nbarred\npony\nfamed\n##sters\ncassandra\nwaiter\ncarolyn\nexported\n##orted\nandres\ndestructive\ndeeds\njonah\ncastles\nvacancy\nsuv\n##glass\n1788\norchard\nyep\nfamine\nbelarusian\nsprang\n##forth\nskinny\n##mis\nadministrators\nrotterdam\nzambia\nzhao\nboiler\ndiscoveries\n##ride\n##physics\nlucius\ndisappointing\noutreach\nspoon\n##frame\nqualifications\nunanimously\nenjoys\nregency\n##iidae\nstade\nrealism\nveterinary\nrodgers\ndump\nalain
\nchestnut\ncastile\ncensorship\nrumble\ngibbs\n##itor\ncommunion\nreggae\ninactivated\nlogs\nloads\n##houses\nhomosexual\n##iano\nale\ninforms\n##cas\nphrases\nplaster\nlinebacker\nambrose\nkaiser\nfascinated\n850\nlimerick\nrecruitment\nforge\nmastered\n##nding\nleinster\nrooted\nthreaten\n##strom\nborneo\n##hes\nsuggestions\nscholarships\npropeller\ndocumentaries\npatronage\ncoats\nconstructing\ninvest\nneurons\ncomet\nentirety\nshouts\nidentities\nannoying\nunchanged\nwary\n##antly\n##ogy\nneat\noversight\n##kos\nphillies\nreplay\nconstance\n##kka\nincarnation\nhumble\nskies\nminus\n##acy\nsmithsonian\n##chel\nguerrilla\njar\ncadets\n##plate\nsurplus\naudit\n##aru\ncracking\njoanna\nlouisa\npacing\n##lights\nintentionally\n##iri\ndiner\nnwa\nimprint\naustralians\ntong\nunprecedented\nbunker\nnaive\nspecialists\nark\nnichols\nrailing\nleaked\npedal\n##uka\nshrub\nlonging\nroofs\nv8\ncaptains\nneural\ntuned\n##ntal\n##jet\nemission\nmedina\nfrantic\ncodex\ndefinitive\nsid\nabolition\nintensified\nstocks\nenrique\nsustain\ngenoa\noxide\n##written\nclues\ncha\n##gers\ntributaries\nfragment\nvenom\n##rity\n##ente\n##sca\nmuffled\nvain\nsire\nlaos\n##ingly\n##hana\nhastily\nsnapping\nsurfaced\nsentiment\nmotive\n##oft\ncontests\napproximate\nmesa\nluckily\ndinosaur\nexchanges\npropelled\naccord\nbourne\nrelieve\ntow\nmasks\noffended\n##ues\ncynthia\n##mmer\nrains\nbartender\nzinc\nreviewers\nlois\n##sai\nlegged\narrogant\nrafe\nrosie\ncomprise\nhandicap\nblockade\ninlet\nlagoon\ncopied\ndrilling\nshelley\npetals\n##inian\nmandarin\nobsolete\n##inated\nonward\narguably\nproductivity\ncindy\npraising\nseldom\nbusch\ndiscusses\nraleigh\nshortage\nranged\nstanton\nencouragement\nfirstly\nconceded\novers\ntemporal\n##uke\ncbe\n##bos\nwoo\ncertainty\npumps\n##pton\nstalked\n##uli\nlizzie\nperiodic\nthieves\nweaker\n##night\ngases\nshoving\nchooses\nwc\n##chemical\nprompting\nweights\n##kill\nrobust\nflanked\nsticky\nhu\ntuberculosis\n##eb\n##eal\nchristchurch\nresembled\nwallet\nreese\ninappropriate\npictured\ndistract\nfixing\nfiddle\ngiggled\nburger\nheirs\nhairy\nmechanic\ntorque\napache\nobsessed\nchiefly\ncheng\nlogging\n##tag\nextracted\nmeaningful\nnumb\n##vsky\ngloucestershire\nreminding\n##bay\nunite\n##lit\nbreeds\ndiminished\nclown\nglove\n1860s\n##ن\n##ug\narchibald\nfocal\nfreelance\nsliced\ndepiction\n##yk\norganism\nswitches\nsights\nstray\ncrawling\n##ril\nlever\nleningrad\ninterpretations\nloops\nanytime\nreel\nalicia\ndelighted\n##ech\ninhaled\nxiv\nsuitcase\nbernie\nvega\nlicenses\nnorthampton\nexclusion\ninduction\nmonasteries\nracecourse\nhomosexuality\n##right\n##sfield\n##rky\ndimitri\nmichele\nalternatives\nions\ncommentators\ngenuinely\nobjected\npork\nhospitality\nfencing\nstephan\nwarships\nperipheral\nwit\ndrunken\nwrinkled\nquentin\nspends\ndeparting\nchung\nnumerical\nspokesperson\n##zone\njohannesburg\ncaliber\nkillers\n##udge\nassumes\nneatly\ndemographic\nabigail\nbloc\n##vel\nmounting\n##lain\nbentley\nslightest\nxu\nrecipients\n##jk\nmerlin\n##writer\nseniors\nprisons\nblinking\nhindwings\nflickered\nkappa\n##hel\n80s\nstrengthening\nappealing\nbrewing\ngypsy\nmali\nlashes\nhulk\nunpleasant\nharassment\nbio\ntreaties\npredict\ninstrumentation\npulp\ntroupe\nboiling\nmantle\n##ffe\nins\n##vn\ndividing\nhandles\nverbs\n##onal\ncoconut\nsenegal\n340\nthorough\ngum\nmomentarily\n##sto\ncocaine\npanicked\ndestined\n##turing\nteatro\ndenying\nweary\ncaptained\nmans\n##hawks\n##code\nwakefield\nbollywood\nthankfully\n##16\ncyril\n##wu\namendments\n##bahn\nconsultation\ns
tud\nreflections\nkindness\n1787\ninternally\n##ovo\ntex\nmosaic\ndistribute\npaddy\nseeming\n143\n##hic\npiers\n##15\n##mura\n##verse\npopularly\nwinger\nkang\nsentinel\nmccoy\n##anza\ncovenant\n##bag\nverge\nfireworks\nsuppress\nthrilled\ndominate\n##jar\nswansea\n##60\n142\nreconciliation\n##ndi\nstiffened\ncue\ndorian\n##uf\ndamascus\namor\nida\nforemost\n##aga\nporsche\nunseen\ndir\n##had\n##azi\nstony\nlexi\nmelodies\n##nko\nangular\ninteger\npodcast\nants\ninherent\njaws\njustify\npersona\n##olved\njosephine\n##nr\n##ressed\ncustomary\nflashes\ngala\ncyrus\nglaring\nbackyard\nariel\nphysiology\ngreenland\nhtml\nstir\navon\natletico\nfinch\nmethodology\nked\n##lent\nmas\ncatholicism\ntownsend\nbranding\nquincy\nfits\ncontainers\n1777\nashore\naragon\n##19\nforearm\npoisoning\n##sd\nadopting\nconquer\ngrinding\namnesty\nkeller\nfinances\nevaluate\nforged\nlankan\ninstincts\n##uto\nguam\nbosnian\nphotographed\nworkplace\ndesirable\nprotector\n##dog\nallocation\nintently\nencourages\nwilly\n##sten\nbodyguard\nelectro\nbrighter\n##ν\nbihar\n##chev\nlasts\nopener\namphibious\nsal\nverde\narte\n##cope\ncaptivity\nvocabulary\nyields\n##tted\nagreeing\ndesmond\npioneered\n##chus\nstrap\ncampaigned\nrailroads\n##ович\nemblem\n##dre\nstormed\n501\n##ulous\nmarijuana\nnorthumberland\n##gn\n##nath\nbowen\nlandmarks\nbeaumont\n##qua\ndanube\n##bler\nattorneys\nth\nge\nflyers\ncritique\nvillains\ncass\nmutation\nacc\n##0s\ncolombo\nmckay\nmotif\nsampling\nconcluding\nsyndicate\n##rell\nneon\nstables\nds\nwarnings\nclint\nmourning\nwilkinson\n##tated\nmerrill\nleopard\nevenings\nexhaled\nemil\nsonia\nezra\ndiscrete\nstove\nfarrell\nfifteenth\nprescribed\nsuperhero\n##rier\nworms\nhelm\nwren\n##duction\n##hc\nexpo\n##rator\nhq\nunfamiliar\nantony\nprevents\nacceleration\nfiercely\nmari\npainfully\ncalculations\ncheaper\nign\nclifton\nirvine\ndavenport\nmozambique\n##np\npierced\n##evich\nwonders\n##wig\n##cate\n##iling\ncrusade\nware\n##uel\nenzymes\nreasonably\nmls\n##coe\nmater\nambition\nbunny\neliot\nkernel\n##fin\nasphalt\nheadmaster\ntorah\naden\nlush\npins\nwaived\n##care\n##yas\njoao\nsubstrate\nenforce\n##grad\n##ules\nalvarez\nselections\nepidemic\ntempted\n##bit\nbremen\ntranslates\nensured\nwaterfront\n29th\nforrest\nmanny\nmalone\nkramer\nreigning\ncookies\nsimpler\nabsorption\n205\nengraved\n##ffy\nevaluated\n1778\nhaze\n146\ncomforting\ncrossover\n##abe\nthorn\n##rift\n##imo\n##pop\nsuppression\nfatigue\ncutter\n##tr\n201\nwurttemberg\n##orf\nenforced\nhovering\nproprietary\ngb\nsamurai\nsyllable\nascent\nlacey\ntick\nlars\ntractor\nmerchandise\nrep\nbouncing\ndefendants\n##yre\nhuntington\n##ground\n##oko\nstandardized\n##hor\n##hima\nassassinated\nnu\npredecessors\nrainy\nliar\nassurance\nlyrical\n##uga\nsecondly\nflattened\nios\nparameter\nundercover\n##mity\nbordeaux\npunish\nridges\nmarkers\nexodus\ninactive\nhesitate\ndebbie\nnyc\npledge\nsavoy\nnagar\noffset\norganist\n##tium\nhesse\nmarin\nconverting\n##iver\ndiagram\npropulsion\npu\nvalidity\nreverted\nsupportive\n##dc\nministries\nclans\nresponds\nproclamation\n##inae\n##ø\n##rea\nein\npleading\npatriot\nsf\nbirch\nislanders\nstrauss\nhates\n##dh\nbrandenburg\nconcession\nrd\n##ob\n1900s\nkillings\ntextbook\nantiquity\ncinematography\nwharf\nembarrassing\nsetup\ncreed\nfarmland\ninequality\ncentred\nsignatures\nfallon\n370\n##ingham\n##uts\nceylon\ngazing\ndirective\nlaurie\n##tern\nglobally\n##uated\n##dent\nallah\nexcavation\nthreads\n##cross\n148\nfrantically\nicc\nutilize\ndetermines\nrespiratory\nthoughtful\nreceptions\n
##dicate\nmerging\nchandra\nseine\n147\nbuilders\nbuilds\ndiagnostic\ndev\nvisibility\ngoddamn\nanalyses\ndhaka\ncho\nproves\nchancel\nconcurrent\ncuriously\ncanadians\npumped\nrestoring\n1850s\nturtles\njaguar\nsinister\nspinal\ntraction\ndeclan\nvows\n1784\nglowed\ncapitalism\nswirling\ninstall\nuniversidad\n##lder\n##oat\nsoloist\n##genic\n##oor\ncoincidence\nbeginnings\nnissan\ndip\nresorts\ncaucasus\ncombustion\ninfectious\n##eno\npigeon\nserpent\n##itating\nconclude\nmasked\nsalad\njew\n##gr\nsurreal\ntoni\n##wc\nharmonica\n151\n##gins\n##etic\n##coat\nfishermen\nintending\nbravery\n##wave\nklaus\ntitan\nwembley\ntaiwanese\nransom\n40th\nincorrect\nhussein\neyelids\njp\ncooke\ndramas\nutilities\n##etta\n##print\neisenhower\nprincipally\ngranada\nlana\n##rak\nopenings\nconcord\n##bl\nbethany\nconnie\nmorality\nsega\n##mons\n##nard\nearnings\n##kara\n##cine\nwii\ncommunes\n##rel\ncoma\ncomposing\nsoftened\nsevered\ngrapes\n##17\nnguyen\nanalyzed\nwarlord\nhubbard\nheavenly\nbehave\nslovenian\n##hit\n##ony\nhailed\nfilmmakers\ntrance\ncaldwell\nskye\nunrest\ncoward\nlikelihood\n##aging\nbern\nsci\ntaliban\nhonolulu\npropose\n##wang\n1700\nbrowser\nimagining\ncobra\ncontributes\ndukes\ninstinctively\nconan\nviolinist\n##ores\naccessories\ngradual\n##amp\nquotes\nsioux\n##dating\nundertake\nintercepted\nsparkling\ncompressed\n139\nfungus\ntombs\nhaley\nimposing\nrests\ndegradation\nlincolnshire\nretailers\nwetlands\ntulsa\ndistributor\ndungeon\nnun\ngreenhouse\nconvey\natlantis\naft\nexits\noman\ndresser\nlyons\n##sti\njoking\neddy\njudgement\nomitted\ndigits\n##cts\n##game\njuniors\n##rae\ncents\nstricken\nune\n##ngo\nwizards\nweir\nbreton\nnan\ntechnician\nfibers\nliking\nroyalty\n##cca\n154\npersia\nterribly\nmagician\n##rable\n##unt\nvance\ncafeteria\nbooker\ncamille\nwarmer\n##static\nconsume\ncavern\ngaps\ncompass\ncontemporaries\nfoyer\nsoothing\ngraveyard\nmaj\nplunged\nblush\n##wear\ncascade\ndemonstrates\nordinance\n##nov\nboyle\n##lana\nrockefeller\nshaken\nbanjo\nizzy\n##ense\nbreathless\nvines\n##32\n##eman\nalterations\nchromosome\ndwellings\nfeudal\nmole\n153\ncatalonia\nrelics\ntenant\nmandated\n##fm\nfridge\nhats\nhonesty\npatented\nraul\nheap\ncruisers\naccusing\nenlightenment\ninfants\nwherein\nchatham\ncontractors\nzen\naffinity\nhc\nosborne\npiston\n156\ntraps\nmaturity\n##rana\nlagos\n##zal\npeering\n##nay\nattendant\ndealers\nprotocols\nsubset\nprospects\nbiographical\n##cre\nartery\n##zers\ninsignia\nnuns\nendured\n##eration\nrecommend\nschwartz\nserbs\nberger\ncromwell\ncrossroads\n##ctor\nenduring\nclasped\ngrounded\n##bine\nmarseille\ntwitched\nabel\nchoke\nhttps\ncatalyst\nmoldova\nitalians\n##tist\ndisastrous\nwee\n##oured\n##nti\nwwf\nnope\n##piration\n##asa\nexpresses\nthumbs\n167\n##nza\ncoca\n1781\ncheating\n##ption\nskipped\nsensory\nheidelberg\nspies\nsatan\ndangers\nsemifinal\n202\nbohemia\nwhitish\nconfusing\nshipbuilding\nrelies\nsurgeons\nlandings\nravi\nbaku\nmoor\nsuffix\nalejandro\n##yana\nlitre\nupheld\n##unk\nrajasthan\n##rek\ncoaster\ninsists\nposture\nscenarios\netienne\nfavoured\nappoint\ntransgender\nelephants\npoked\ngreenwood\ndefences\nfulfilled\nmilitant\nsomali\n1758\nchalk\npotent\n##ucci\nmigrants\nwink\nassistants\nnos\nrestriction\nactivism\nniger\n##ario\ncolon\nshaun\n##sat\ndaphne\n##erated\nswam\ncongregations\nreprise\nconsiderations\nmagnet\nplayable\nxvi\n##р\noverthrow\ntobias\nknob\nchavez\ncoding\n##mers\npropped\nkatrina\norient\nnewcomer\n##suke\ntemperate\n##pool\nfarmhouse\ninterrogation\n##vd\ncommitting\n##vert\nforth
coming\nstrawberry\njoaquin\nmacau\nponds\nshocking\nsiberia\n##cellular\nchant\ncontributors\n##nant\n##ologists\nsped\nabsorb\nhail\n1782\nspared\n##hore\nbarbados\nkarate\nopus\noriginates\nsaul\n##xie\nevergreen\nleaped\n##rock\ncorrelation\nexaggerated\nweekday\nunification\nbump\ntracing\nbrig\nafb\npathways\nutilizing\n##ners\nmod\nmb\ndisturbance\nkneeling\n##stad\n##guchi\n100th\npune\n##thy\ndecreasing\n168\nmanipulation\nmiriam\nacademia\necosystem\noccupational\nrbi\n##lem\nrift\n##14\nrotary\nstacked\nincorporation\nawakening\ngenerators\nguerrero\nracist\n##omy\ncyber\nderivatives\nculminated\nallie\nannals\npanzer\nsainte\nwikipedia\npops\nzu\naustro\n##vate\nalgerian\npolitely\nnicholson\nmornings\neducate\ntastes\nthrill\ndartmouth\n##gating\ndb\n##jee\nregan\ndiffering\nconcentrating\nchoreography\ndivinity\n##media\npledged\nalexandre\nrouting\ngregor\nmadeline\n##idal\napocalypse\n##hora\ngunfire\nculminating\nelves\nfined\nliang\nlam\nprogrammed\ntar\nguessing\ntransparency\ngabrielle\n##gna\ncancellation\nflexibility\n##lining\naccession\nshea\nstronghold\nnets\nspecializes\n##rgan\nabused\nhasan\nsgt\nling\nexceeding\n##₄\nadmiration\nsupermarket\n##ark\nphotographers\nspecialised\ntilt\nresonance\nhmm\nperfume\n380\nsami\nthreatens\ngarland\nbotany\nguarding\nboiled\ngreet\npuppy\nrusso\nsupplier\nwilmington\nvibrant\nvijay\n##bius\nparalympic\ngrumbled\npaige\nfaa\nlicking\nmargins\nhurricanes\n##gong\nfest\ngrenade\nripping\n##uz\ncounseling\nweigh\n##sian\nneedles\nwiltshire\nedison\ncostly\n##not\nfulton\ntramway\nredesigned\nstaffordshire\ncache\ngasping\nwatkins\nsleepy\ncandidacy\n##group\nmonkeys\ntimeline\nthrobbing\n##bid\n##sos\nberth\nuzbekistan\nvanderbilt\nbothering\noverturned\nballots\ngem\n##iger\nsunglasses\nsubscribers\nhooker\ncompelling\nang\nexceptionally\nsaloon\nstab\n##rdi\ncarla\nterrifying\nrom\n##vision\ncoil\n##oids\nsatisfying\nvendors\n31st\nmackay\ndeities\noverlooked\nambient\nbahamas\nfelipe\nolympia\nwhirled\nbotanist\nadvertised\ntugging\n##dden\ndisciples\nmorales\nunionist\nrites\nfoley\nmorse\nmotives\ncreepy\n##₀\nsoo\n##sz\nbargain\nhighness\nfrightening\nturnpike\ntory\nreorganization\n##cer\ndepict\nbiographer\n##walk\nunopposed\nmanifesto\n##gles\ninstitut\nemile\naccidental\nkapoor\n##dam\nkilkenny\ncortex\nlively\n##13\nromanesque\njain\nshan\ncannons\n##ood\n##ske\npetrol\nechoing\namalgamated\ndisappears\ncautious\nproposes\nsanctions\ntrenton\n##ر\nflotilla\naus\ncontempt\ntor\ncanary\ncote\ntheirs\n##hun\nconceptual\ndeleted\nfascinating\npaso\nblazing\nelf\nhonourable\nhutchinson\n##eiro\n##outh\n##zin\nsurveyor\ntee\namidst\nwooded\nreissue\nintro\n##ono\ncobb\nshelters\nnewsletter\nhanson\nbrace\nencoding\nconfiscated\ndem\ncaravan\nmarino\nscroll\nmelodic\ncows\nimam\n##adi\n##aneous\nnorthward\nsearches\nbiodiversity\ncora\n310\nroaring\n##bers\nconnell\ntheologian\nhalo\ncompose\npathetic\nunmarried\ndynamo\n##oot\naz\ncalculation\ntoulouse\ndeserves\nhumour\nnr\nforgiveness\ntam\nundergone\nmartyr\npamela\nmyths\nwhore\ncounselor\nhicks\n290\nheavens\nbattleship\nelectromagnetic\n##bbs\nstellar\nestablishments\npresley\nhopped\n##chin\ntemptation\n90s\nwills\nnas\n##yuan\nnhs\n##nya\nseminars\n##yev\nadaptations\ngong\nasher\nlex\nindicator\nsikh\ntobago\ncites\ngoin\n##yte\nsatirical\n##gies\ncharacterised\ncorrespond\nbubbles\nlure\nparticipates\n##vid\neruption\nskate\ntherapeutic\n1785\ncanals\nwholesale\ndefaulted\nsac\n460\npetit\n##zzled\nvirgil\nleak\nravens\n256\nportraying\n##yx\nghetto\ncreators\ndam
s\nportray\nvicente\n##rington\nfae\nnamesake\nbounty\n##arium\njoachim\n##ota\n##iser\naforementioned\naxle\nsnout\ndepended\ndismantled\nreuben\n480\n##ibly\ngallagher\n##lau\n##pd\nearnest\n##ieu\n##iary\ninflicted\nobjections\n##llar\nasa\ngritted\n##athy\njericho\n##sea\n##was\nflick\nunderside\nceramics\nundead\nsubstituted\n195\neastward\nundoubtedly\nwheeled\nchimney\n##iche\nguinness\ncb\n##ager\nsiding\n##bell\ntraitor\nbaptiste\ndisguised\ninauguration\n149\ntipperary\nchoreographer\nperched\nwarmed\nstationary\neco\n##ike\n##ntes\nbacterial\n##aurus\nflores\nphosphate\n##core\nattacker\ninvaders\nalvin\nintersects\na1\nindirectly\nimmigrated\nbusinessmen\ncornelius\nvalves\nnarrated\npill\nsober\nul\nnationale\nmonastic\napplicants\nscenery\n##jack\n161\nmotifs\nconstitutes\ncpu\n##osh\njurisdictions\nsd\ntuning\nirritation\nwoven\n##uddin\nfertility\ngao\n##erie\nantagonist\nimpatient\nglacial\nhides\nboarded\ndenominations\ninterception\n##jas\ncookie\nnicola\n##tee\nalgebraic\nmarquess\nbahn\nparole\nbuyers\nbait\nturbines\npaperwork\nbestowed\nnatasha\nrenee\noceans\npurchases\n157\nvaccine\n215\n##tock\nfixtures\nplayhouse\nintegrate\njai\noswald\nintellectuals\n##cky\nbooked\nnests\nmortimer\n##isi\nobsession\nsept\n##gler\n##sum\n440\nscrutiny\nsimultaneous\nsquinted\n##shin\ncollects\noven\nshankar\npenned\nremarkably\n##я\nslips\nluggage\nspectral\n1786\ncollaborations\nlouie\nconsolidation\n##ailed\n##ivating\n420\nhoover\nblackpool\nharness\nignition\nvest\ntails\nbelmont\nmongol\nskinner\n##nae\nvisually\nmage\nderry\n##tism\n##unce\nstevie\ntransitional\n##rdy\nredskins\ndrying\nprep\nprospective\n##21\nannoyance\noversee\n##loaded\nfills\n##books\n##iki\nannounces\nfda\nscowled\nrespects\nprasad\nmystic\ntucson\n##vale\nrevue\nspringer\nbankrupt\n1772\naristotle\nsalvatore\nhabsburg\n##geny\ndal\nnatal\nnut\npod\nchewing\ndarts\nmoroccan\nwalkover\nrosario\nlenin\npunjabi\n##ße\ngrossed\nscattering\nwired\ninvasive\nhui\npolynomial\ncorridors\nwakes\ngina\nportrays\n##cratic\narid\nretreating\nerich\nirwin\nsniper\n##dha\nlinen\nlindsey\nmaneuver\nbutch\nshutting\nsocio\nbounce\ncommemorative\npostseason\njeremiah\npines\n275\nmystical\nbeads\nbp\nabbas\nfurnace\nbidding\nconsulted\nassaulted\nempirical\nrubble\nenclosure\nsob\nweakly\ncancel\npolly\nyielded\n##emann\ncurly\nprediction\nbattered\n70s\nvhs\njacqueline\nrender\nsails\nbarked\ndetailing\ngrayson\nriga\nsloane\nraging\n##yah\nherbs\nbravo\n##athlon\nalloy\ngiggle\nimminent\nsuffers\nassumptions\nwaltz\n##itate\naccomplishments\n##ited\nbathing\nremixed\ndeception\nprefix\n##emia\ndeepest\n##tier\n##eis\nbalkan\nfrogs\n##rong\nslab\n##pate\nphilosophers\npeterborough\ngrains\nimports\ndickinson\nrwanda\n##atics\n1774\ndirk\nlan\ntablets\n##rove\nclone\n##rice\ncaretaker\nhostilities\nmclean\n##gre\nregimental\ntreasures\nnorms\nimpose\ntsar\ntango\ndiplomacy\nvariously\ncomplain\n192\nrecognise\narrests\n1779\ncelestial\npulitzer\n##dus\nbing\nlibretto\n##moor\nadele\nsplash\n##rite\nexpectation\nlds\nconfronts\n##izer\nspontaneous\nharmful\nwedge\nentrepreneurs\nbuyer\n##ope\nbilingual\ntranslate\nrugged\nconner\ncirculated\nuae\neaton\n##gra\n##zzle\nlingered\nlockheed\nvishnu\nreelection\nalonso\n##oom\njoints\nyankee\nheadline\ncooperate\nheinz\nlaureate\ninvading\n##sford\nechoes\nscandinavian\n##dham\nhugging\nvitamin\nsalute\nmicah\nhind\ntrader\n##sper\nradioactive\n##ndra\nmilitants\npoisoned\nratified\nremark\ncampeonato\ndeprived\nwander\nprop\n##dong\noutlook\n##tani\n##rix\n##eye\nchiang\nd
arcy\n##oping\nmandolin\nspice\nstatesman\nbabylon\n182\nwalled\nforgetting\nafro\n##cap\n158\ngiorgio\nbuffer\n##polis\nplanetary\n##gis\noverlap\nterminals\nkinda\ncentenary\n##bir\narising\nmanipulate\nelm\nke\n1770\nak\n##tad\nchrysler\nmapped\nmoose\npomeranian\nquad\nmacarthur\nassemblies\nshoreline\nrecalls\nstratford\n##rted\nnoticeable\n##evic\nimp\n##rita\n##sque\naccustomed\nsupplying\ntents\ndisgusted\nvogue\nsipped\nfilters\nkhz\nreno\nselecting\nluftwaffe\nmcmahon\ntyne\nmasterpiece\ncarriages\ncollided\ndunes\nexercised\nflare\nremembers\nmuzzle\n##mobile\nheck\n##rson\nburgess\nlunged\nmiddleton\nboycott\nbilateral\n##sity\nhazardous\nlumpur\nmultiplayer\nspotlight\njackets\ngoldman\nliege\nporcelain\nrag\nwaterford\nbenz\nattracts\nhopeful\nbattling\nottomans\nkensington\nbaked\nhymns\ncheyenne\nlattice\nlevine\nborrow\npolymer\nclashes\nmichaels\nmonitored\ncommitments\ndenounced\n##25\n##von\ncavity\n##oney\nhobby\nakin\n##holders\nfutures\nintricate\ncornish\npatty\n##oned\nillegally\ndolphin\n##lag\nbarlow\nyellowish\nmaddie\napologized\nluton\nplagued\n##puram\nnana\n##rds\nsway\nfanny\nłodz\n##rino\npsi\nsuspicions\nhanged\n##eding\ninitiate\ncharlton\n##por\nnak\ncompetent\n235\nanalytical\nannex\nwardrobe\nreservations\n##rma\nsect\n162\nfairfax\nhedge\npiled\nbuckingham\nuneven\nbauer\nsimplicity\nsnyder\ninterpret\naccountability\ndonors\nmoderately\nbyrd\ncontinents\n##cite\n##max\ndisciple\nhr\njamaican\nping\nnominees\n##uss\nmongolian\ndiver\nattackers\neagerly\nideological\npillows\nmiracles\napartheid\nrevolver\nsulfur\nclinics\nmoran\n163\n##enko\nile\nkaty\nrhetoric\n##icated\nchronology\nrecycling\n##hrer\nelongated\nmughal\npascal\nprofiles\nvibration\ndatabases\ndomination\n##fare\n##rant\nmatthias\ndigest\nrehearsal\npolling\nweiss\ninitiation\nreeves\nclinging\nflourished\nimpress\nngo\n##hoff\n##ume\nbuckley\nsymposium\nrhythms\nweed\nemphasize\ntransforming\n##taking\n##gence\n##yman\naccountant\nanalyze\nflicker\nfoil\npriesthood\nvoluntarily\ndecreases\n##80\n##hya\nslater\nsv\ncharting\nmcgill\n##lde\nmoreno\n##iu\nbesieged\nzur\nrobes\n##phic\nadmitting\napi\ndeported\nturmoil\npeyton\nearthquakes\n##ares\nnationalists\nbeau\nclair\nbrethren\ninterrupt\nwelch\ncurated\ngalerie\nrequesting\n164\n##ested\nimpending\nsteward\nviper\n##vina\ncomplaining\nbeautifully\nbrandy\nfoam\nnl\n1660\n##cake\nalessandro\npunches\nlaced\nexplanations\n##lim\nattribute\nclit\nreggie\ndiscomfort\n##cards\nsmoothed\nwhales\n##cene\nadler\ncountered\nduffy\ndisciplinary\nwidening\nrecipe\nreliance\nconducts\ngoats\ngradient\npreaching\n##shaw\nmatilda\nquasi\nstriped\nmeridian\ncannabis\ncordoba\ncertificates\n##agh\n##tering\ngraffiti\nhangs\npilgrims\nrepeats\n##ych\nrevive\nurine\netat\n##hawk\nfueled\nbelts\nfuzzy\nsusceptible\n##hang\nmauritius\nsalle\nsincere\nbeers\nhooks\n##cki\narbitration\nentrusted\nadvise\nsniffed\nseminar\njunk\ndonnell\nprocessors\nprincipality\nstrapped\ncelia\nmendoza\neverton\nfortunes\nprejudice\nstarving\nreassigned\nsteamer\n##lund\ntuck\nevenly\nforeman\n##ffen\ndans\n375\nenvisioned\nslit\n##xy\nbaseman\nliberia\nrosemary\n##weed\nelectrified\nperiodically\npotassium\nstride\ncontexts\nsperm\nslade\nmariners\ninflux\nbianca\nsubcommittee\n##rane\nspilling\nicao\nestuary\n##nock\ndelivers\niphone\n##ulata\nisa\nmira\nbohemian\ndessert\n##sbury\nwelcoming\nproudly\nslowing\n##chs\nmusee\nascension\nruss\n##vian\nwaits\n##psy\nafricans\nexploit\n##morphic\ngov\neccentric\ncrab\npeck\n##ull\nentrances\nformidable\nmarketplace\ngro
om\nbolted\nmetabolism\npatton\nrobbins\ncourier\npayload\nendure\n##ifier\nandes\nrefrigerator\n##pr\nornate\n##uca\nruthless\nillegitimate\nmasonry\nstrasbourg\nbikes\nadobe\n##³\napples\nquintet\nwillingly\nniche\nbakery\ncorpses\nenergetic\n##cliffe\n##sser\n##ards\n177\ncentimeters\ncentro\nfuscous\ncretaceous\nrancho\n##yde\nandrei\ntelecom\ntottenham\noasis\nordination\nvulnerability\npresiding\ncorey\ncp\npenguins\nsims\n##pis\nmalawi\npiss\n##48\ncorrection\n##cked\n##ffle\n##ryn\ncountdown\ndetectives\npsychiatrist\npsychedelic\ndinosaurs\nblouse\n##get\nchoi\nvowed\n##oz\nrandomly\n##pol\n49ers\nscrub\nblanche\nbruins\ndusseldorf\n##using\nunwanted\n##ums\n212\ndominique\nelevations\nheadlights\nom\nlaguna\n##oga\n1750\nfamously\nignorance\nshrewsbury\n##aine\najax\nbreuning\nche\nconfederacy\ngreco\noverhaul\n##screen\npaz\nskirts\ndisagreement\ncruelty\njagged\nphoebe\nshifter\nhovered\nviruses\n##wes\nmandy\n##lined\n##gc\nlandlord\nsquirrel\ndashed\n##ι\nornamental\ngag\nwally\ngrange\nliteral\nspurs\nundisclosed\nproceeding\nyin\n##text\nbillie\norphan\nspanned\nhumidity\nindy\nweighted\npresentations\nexplosions\nlucian\n##tary\nvaughn\nhindus\n##anga\n##hell\npsycho\n171\ndaytona\nprotects\nefficiently\nrematch\nsly\ntandem\n##oya\nrebranded\nimpaired\nhee\nmetropolis\npeach\ngodfrey\ndiaspora\nethnicity\nprosperous\ngleaming\ndar\ngrossing\nplayback\n##rden\nstripe\npistols\n##tain\nbirths\nlabelled\n##cating\n172\nrudy\nalba\n##onne\naquarium\nhostility\n##gb\n##tase\nshudder\nsumatra\nhardest\nlakers\nconsonant\ncreeping\ndemos\nhomicide\ncapsule\nzeke\nliberties\nexpulsion\npueblo\n##comb\ntrait\ntransporting\n##ddin\n##neck\n##yna\ndepart\ngregg\nmold\nledge\nhangar\noldham\nplayboy\ntermination\nanalysts\ngmbh\nromero\n##itic\ninsist\ncradle\nfilthy\nbrightness\nslash\nshootout\ndeposed\nbordering\n##truct\nisis\nmicrowave\ntumbled\nsheltered\ncathy\nwerewolves\nmessy\nandersen\nconvex\nclapped\nclinched\nsatire\nwasting\nedo\nvc\nrufus\n##jak\nmont\n##etti\npoznan\n##keeping\nrestructuring\ntransverse\n##rland\nazerbaijani\nslovene\ngestures\nroommate\nchoking\nshear\n##quist\nvanguard\noblivious\n##hiro\ndisagreed\nbaptism\n##lich\ncoliseum\n##aceae\nsalvage\nsociete\ncory\nlocke\nrelocation\nrelying\nversailles\nahl\nswelling\n##elo\ncheerful\n##word\n##edes\ngin\nsarajevo\nobstacle\ndiverted\n##nac\nmessed\nthoroughbred\nfluttered\nutrecht\nchewed\nacquaintance\nassassins\ndispatch\nmirza\n##wart\nnike\nsalzburg\nswell\nyen\n##gee\nidle\nligue\nsamson\n##nds\n##igh\nplayful\nspawned\n##cise\ntease\n##case\nburgundy\n##bot\nstirring\nskeptical\ninterceptions\nmarathi\n##dies\nbedrooms\naroused\npinch\n##lik\npreferences\ntattoos\nbuster\ndigitally\nprojecting\nrust\n##ital\nkitten\npriorities\naddison\npseudo\n##guard\ndusk\nicons\nsermon\n##psis\n##iba\nbt\n##lift\n##xt\nju\ntruce\nrink\n##dah\n##wy\ndefects\npsychiatry\noffences\ncalculate\nglucose\n##iful\n##rized\n##unda\nfrancaise\n##hari\nrichest\nwarwickshire\ncarly\n1763\npurity\nredemption\nlending\n##cious\nmuse\nbruises\ncerebral\naero\ncarving\n##name\npreface\nterminology\ninvade\nmonty\n##int\nanarchist\nblurred\n##iled\nrossi\ntreats\nguts\nshu\nfoothills\nballads\nundertaking\npremise\ncecilia\naffiliates\nblasted\nconditional\nwilder\nminors\ndrone\nrudolph\nbuffy\nswallowing\nhorton\nattested\n##hop\nrutherford\nhowell\nprimetime\nlivery\npenal\n##bis\nminimize\nhydro\nwrecked\nwrought\npalazzo\n##gling\ncans\nvernacular\nfriedman\nnobleman\nshale\nwalnut\ndanielle\n##ection\n##tley\nsears\n##kum
ar\nchords\nlend\nflipping\nstreamed\npor\ndracula\ngallons\nsacrifices\ngamble\norphanage\n##iman\nmckenzie\n##gible\nboxers\ndaly\n##balls\n##ان\n208\n##ific\n##rative\n##iq\nexploited\nslated\n##uity\ncircling\nhillary\npinched\ngoldberg\nprovost\ncampaigning\nlim\npiles\nironically\njong\nmohan\nsuccessors\nusaf\n##tem\n##ught\nautobiographical\nhaute\npreserves\n##ending\nacquitted\ncomparisons\n203\nhydroelectric\ngangs\ncypriot\ntorpedoes\nrushes\nchrome\nderive\nbumps\ninstability\nfiat\npets\n##mbe\nsilas\ndye\nreckless\nsettler\n##itation\ninfo\nheats\n##writing\n176\ncanonical\nmaltese\nfins\nmushroom\nstacy\naspen\navid\n##kur\n##loading\nvickers\ngaston\nhillside\nstatutes\nwilde\ngail\nkung\nsabine\ncomfortably\nmotorcycles\n##rgo\n169\npneumonia\nfetch\n##sonic\naxel\nfaintly\nparallels\n##oop\nmclaren\nspouse\ncompton\ninterdisciplinary\nminer\n##eni\n181\nclamped\n##chal\n##llah\nseparates\nversa\n##mler\nscarborough\nlabrador\n##lity\n##osing\nrutgers\nhurdles\ncomo\n166\nburt\ndivers\n##100\nwichita\ncade\ncoincided\n##erson\nbruised\nmla\n##pper\nvineyard\n##ili\n##brush\nnotch\nmentioning\njase\nhearted\nkits\ndoe\n##acle\npomerania\n##ady\nronan\nseizure\npavel\nproblematic\n##zaki\ndomenico\n##ulin\ncatering\npenelope\ndependence\nparental\nemilio\nministerial\natkinson\n##bolic\nclarkson\nchargers\ncolby\ngrill\npeeked\narises\nsummon\n##aged\nfools\n##grapher\nfaculties\nqaeda\n##vial\ngarner\nrefurbished\n##hwa\ngeelong\ndisasters\nnudged\nbs\nshareholder\nlori\nalgae\nreinstated\nrot\n##ades\n##nous\ninvites\nstainless\n183\ninclusive\n##itude\ndiocesan\ntil\n##icz\ndenomination\n##xa\nbenton\nfloral\nregisters\n##ider\n##erman\n##kell\nabsurd\nbrunei\nguangzhou\nhitter\nretaliation\n##uled\n##eve\nblanc\nnh\nconsistency\ncontamination\n##eres\n##rner\ndire\npalermo\nbroadcasters\ndiaries\ninspire\nvols\nbrewer\ntightening\nky\nmixtape\nhormone\n##tok\nstokes\n##color\n##dly\n##ssi\npg\n##ometer\n##lington\nsanitation\n##tility\nintercontinental\napps\n##adt\n¹⁄₂\ncylinders\neconomies\nfavourable\nunison\ncroix\ngertrude\nodyssey\nvanity\ndangling\n##logists\nupgrades\ndice\nmiddleweight\npractitioner\n##ight\n206\nhenrik\nparlor\norion\nangered\nlac\npython\nblurted\n##rri\nsensual\nintends\nswings\nangled\n##phs\nhusky\nattain\npeerage\nprecinct\ntextiles\ncheltenham\nshuffled\ndai\nconfess\ntasting\nbhutan\n##riation\ntyrone\nsegregation\nabrupt\nruiz\n##rish\nsmirked\nblackwell\nconfidential\nbrowning\namounted\n##put\nvase\nscarce\nfabulous\nraided\nstaple\nguyana\nunemployed\nglider\nshay\n##tow\ncarmine\ntroll\nintervene\nsquash\nsuperstar\n##uce\ncylindrical\nlen\nroadway\nresearched\nhandy\n##rium\n##jana\nmeta\nlao\ndeclares\n##rring\n##tadt\n##elin\n##kova\nwillem\nshrubs\nnapoleonic\nrealms\nskater\nqi\nvolkswagen\n##ł\ntad\nhara\narchaeologist\nawkwardly\neerie\n##kind\nwiley\n##heimer\n##24\ntitus\norganizers\ncfl\ncrusaders\nlama\nusb\nvent\nenraged\nthankful\noccupants\nmaximilian\n##gaard\npossessing\ntextbooks\n##oran\ncollaborator\nquaker\n##ulo\navalanche\nmono\nsilky\nstraits\nisaiah\nmustang\nsurged\nresolutions\npotomac\ndescend\ncl\nkilograms\nplato\nstrains\nsaturdays\n##olin\nbernstein\n##ype\nholstein\nponytail\n##watch\nbelize\nconversely\nheroine\nperpetual\n##ylus\ncharcoal\npiedmont\nglee\nnegotiating\nbackdrop\nprologue\n##jah\n##mmy\npasadena\nclimbs\nramos\nsunni\n##holm\n##tner\n##tri\nanand\ndeficiency\nhertfordshire\nstout\n##avi\naperture\norioles\n##irs\ndoncaster\nintrigued\nbombed\ncoating\notis\n##mat\ncocktail\n##jit\n##e
to\namir\narousal\nsar\n##proof\n##act\n##ories\ndixie\npots\n##bow\nwhereabouts\n159\n##fted\ndrains\nbullying\ncottages\nscripture\ncoherent\nfore\npoe\nappetite\n##uration\nsampled\n##ators\n##dp\nderrick\nrotor\njays\npeacock\ninstallment\n##rro\nadvisors\n##coming\nrodeo\nscotch\n##mot\n##db\n##fen\n##vant\nensued\nrodrigo\ndictatorship\nmartyrs\ntwenties\n##н\ntowed\nincidence\nmarta\nrainforest\nsai\nscaled\n##cles\noceanic\nqualifiers\nsymphonic\nmcbride\ndislike\ngeneralized\naubrey\ncolonization\n##iation\n##lion\n##ssing\ndisliked\nlublin\nsalesman\n##ulates\nspherical\nwhatsoever\nsweating\navalon\ncontention\npunt\nseverity\nalderman\natari\n##dina\n##grant\n##rop\nscarf\nseville\nvertices\nannexation\nfairfield\nfascination\ninspiring\nlaunches\npalatinate\nregretted\n##rca\nferal\n##iom\nelk\nnap\nolsen\nreddy\nyong\n##leader\n##iae\ngarment\ntransports\nfeng\ngracie\noutrage\nviceroy\ninsides\n##esis\nbreakup\ngrady\norganizer\nsofter\ngrimaced\n222\nmurals\ngalicia\narranging\nvectors\n##rsten\nbas\n##sb\n##cens\nsloan\n##eka\nbitten\nara\nfender\nnausea\nbumped\nkris\nbanquet\ncomrades\ndetector\npersisted\n##llan\nadjustment\nendowed\ncinemas\n##shot\nsellers\n##uman\npeek\nepa\nkindly\nneglect\nsimpsons\ntalon\nmausoleum\nrunaway\nhangul\nlookout\n##cic\nrewards\ncoughed\nacquainted\nchloride\n##ald\nquicker\naccordion\nneolithic\n##qa\nartemis\ncoefficient\nlenny\npandora\ntx\n##xed\necstasy\nlitter\nsegunda\nchairperson\ngemma\nhiss\nrumor\nvow\nnasal\nantioch\ncompensate\npatiently\ntransformers\n##eded\njudo\nmorrow\npenis\nposthumous\nphilips\nbandits\nhusbands\ndenote\nflaming\n##any\n##phones\nlangley\nyorker\n1760\nwalters\n##uo\n##kle\ngubernatorial\nfatty\nsamsung\nleroy\noutlaw\n##nine\nunpublished\npoole\njakob\n##ᵢ\n##ₙ\ncrete\ndistorted\nsuperiority\n##dhi\nintercept\ncrust\nmig\nclaus\ncrashes\npositioning\n188\nstallion\n301\nfrontal\narmistice\n##estinal\nelton\naj\nencompassing\ncamel\ncommemorated\nmalaria\nwoodward\ncalf\ncigar\npenetrate\n##oso\nwillard\n##rno\n##uche\nillustrate\namusing\nconvergence\nnoteworthy\n##lma\n##rva\njourneys\nrealise\nmanfred\n##sable\n410\n##vocation\nhearings\nfiance\n##posed\neducators\nprovoked\nadjusting\n##cturing\nmodular\nstockton\npaterson\nvlad\nrejects\nelectors\nselena\nmaureen\n##tres\nuber\n##rce\nswirled\n##num\nproportions\nnanny\npawn\nnaturalist\nparma\napostles\nawoke\nethel\nwen\n##bey\nmonsoon\noverview\n##inating\nmccain\nrendition\nrisky\nadorned\n##ih\nequestrian\ngermain\nnj\nconspicuous\nconfirming\n##yoshi\nshivering\n##imeter\nmilestone\nrumours\nflinched\nbounds\nsmacked\ntoken\n##bei\nlectured\nautomobiles\n##shore\nimpacted\n##iable\nnouns\nnero\n##leaf\nismail\nprostitute\ntrams\n##lace\nbridget\nsud\nstimulus\nimpressions\nreins\nrevolves\n##oud\n##gned\ngiro\nhoneymoon\n##swell\ncriterion\n##sms\n##uil\nlibyan\nprefers\n##osition\n211\npreview\nsucks\naccusation\nbursts\nmetaphor\ndiffusion\ntolerate\nfaye\nbetting\ncinematographer\nliturgical\nspecials\nbitterly\nhumboldt\n##ckle\nflux\nrattled\n##itzer\narchaeologists\nodor\nauthorised\nmarshes\ndiscretion\n##ов\nalarmed\narchaic\ninverse\n##leton\nexplorers\n##pine\ndrummond\ntsunami\nwoodlands\n##minate\n##tland\nbooklet\ninsanity\nowning\ninsert\ncrafted\ncalculus\n##tore\nreceivers\n##bt\nstung\n##eca\n##nched\nprevailing\ntravellers\neyeing\nlila\ngraphs\n##borne\n178\njulien\n##won\nmorale\nadaptive\ntherapist\nerica\ncw\nlibertarian\nbowman\npitches\nvita\n##ional\ncrook\n##ads\n##entation\ncaledonia\nmutiny\n##sible\n1840s\nauto
mation\n##ß\nflock\n##pia\nironic\npathology\n##imus\nremarried\n##22\njoker\nwithstand\nenergies\n##att\nshropshire\nhostages\nmadeleine\ntentatively\nconflicting\nmateo\nrecipes\neuros\nol\nmercenaries\nnico\n##ndon\nalbuquerque\naugmented\nmythical\nbel\nfreud\n##child\ncough\n##lica\n365\nfreddy\nlillian\ngenetically\nnuremberg\ncalder\n209\nbonn\noutdoors\npaste\nsuns\nurgency\nvin\nrestraint\ntyson\n##cera\n##selle\nbarrage\nbethlehem\nkahn\n##par\nmounts\nnippon\nbarony\nhappier\nryu\nmakeshift\nsheldon\nblushed\ncastillo\nbarking\nlistener\ntaped\nbethel\nfluent\nheadlines\npornography\nrum\ndisclosure\nsighing\nmace\ndoubling\ngunther\nmanly\n##plex\nrt\ninterventions\nphysiological\nforwards\nemerges\n##tooth\n##gny\ncompliment\nrib\nrecession\nvisibly\nbarge\nfaults\nconnector\nexquisite\nprefect\n##rlin\npatio\n##cured\nelevators\nbrandt\nitalics\npena\n173\nwasp\nsatin\nea\nbotswana\ngraceful\nrespectable\n##jima\n##rter\n##oic\nfranciscan\ngenerates\n##dl\nalfredo\ndisgusting\n##olate\n##iously\nsherwood\nwarns\ncod\npromo\ncheryl\nsino\n##ة\n##escu\ntwitch\n##zhi\nbrownish\nthom\nortiz\n##dron\ndensely\n##beat\ncarmel\nreinforce\n##bana\n187\nanastasia\ndownhill\nvertex\ncontaminated\nremembrance\nharmonic\nhomework\n##sol\nfiancee\ngears\nolds\nangelica\nloft\nramsay\nquiz\ncolliery\nsevens\n##cape\nautism\n##hil\nwalkway\n##boats\nruben\nabnormal\nounce\nkhmer\n##bbe\nzachary\nbedside\nmorphology\npunching\n##olar\nsparrow\nconvinces\n##35\nhewitt\nqueer\nremastered\nrods\nmabel\nsolemn\nnotified\nlyricist\nsymmetric\n##xide\n174\nencore\npassports\nwildcats\n##uni\nbaja\n##pac\nmildly\n##ease\nbleed\ncommodity\nmounds\nglossy\norchestras\n##omo\ndamian\nprelude\nambitions\n##vet\nawhile\nremotely\n##aud\nasserts\nimply\n##iques\ndistinctly\nmodelling\nremedy\n##dded\nwindshield\ndani\nxiao\n##endra\naudible\npowerplant\n1300\ninvalid\nelemental\nacquisitions\n##hala\nimmaculate\nlibby\nplata\nsmuggling\nventilation\ndenoted\nminh\n##morphism\n430\ndiffered\ndion\nkelley\nlore\nmocking\nsabbath\nspikes\nhygiene\ndrown\nrunoff\nstylized\ntally\nliberated\naux\ninterpreter\nrighteous\naba\nsiren\nreaper\npearce\nmillie\n##cier\n##yra\ngaius\n##iso\ncaptures\n##ttering\ndorm\nclaudio\n##sic\nbenches\nknighted\nblackness\n##ored\ndiscount\nfumble\noxidation\nrouted\n##ς\nnovak\nperpendicular\nspoiled\nfracture\nsplits\n##urt\npads\ntopology\n##cats\naxes\nfortunate\noffenders\nprotestants\nesteem\n221\nbroadband\nconvened\nfrankly\nhound\nprototypes\nisil\nfacilitated\nkeel\n##sher\nsahara\nawaited\nbubba\norb\nprosecutors\n186\nhem\n520\n##xing\nrelaxing\nremnant\nromney\nsorted\nslalom\nstefano\nulrich\n##active\nexemption\nfolder\npauses\nfoliage\nhitchcock\nepithet\n204\ncriticisms\n##aca\nballistic\nbrody\nhinduism\nchaotic\nyouths\nequals\n##pala\npts\nthicker\nanalogous\ncapitalist\nimprovised\noverseeing\nsinatra\nascended\nbeverage\n##tl\nstraightforward\n##kon\ncurran\n##west\nbois\n325\ninduce\nsurveying\nemperors\nsax\nunpopular\n##kk\ncartoonist\nfused\n##mble\nunto\n##yuki\nlocalities\n##cko\n##ln\ndarlington\nslain\nacademie\nlobbying\nsediment\npuzzles\n##grass\ndefiance\ndickens\nmanifest\ntongues\nalumnus\narbor\ncoincide\n184\nappalachian\nmustafa\nexaminer\ncabaret\ntraumatic\nyves\nbracelet\ndraining\nheroin\nmagnum\nbaths\nodessa\nconsonants\nmitsubishi\n##gua\nkellan\nvaudeville\n##fr\njoked\nnull\nstraps\nprobation\n##ław\nceded\ninterfaces\n##pas\n##zawa\nblinding\nviet\n224\nrothschild\nmuseo\n640\nhuddersfield\n##vr\ntactic\n##storm\nbrackets\ndazed\n
incorrectly\n##vu\nreg\nglazed\nfearful\nmanifold\nbenefited\nirony\n##sun\nstumbling\n##rte\nwillingness\nbalkans\nmei\nwraps\n##aba\ninjected\n##lea\ngu\nsyed\nharmless\n##hammer\nbray\ntakeoff\npoppy\ntimor\ncardboard\nastronaut\npurdue\nweeping\nsouthbound\ncursing\nstalls\ndiagonal\n##neer\nlamar\nbryce\ncomte\nweekdays\nharrington\n##uba\nnegatively\n##see\nlays\ngrouping\n##cken\n##henko\naffirmed\nhalle\nmodernist\n##lai\nhodges\nsmelling\naristocratic\nbaptized\ndismiss\njustification\noilers\n##now\ncoupling\nqin\nsnack\nhealer\n##qing\ngardener\nlayla\nbattled\nformulated\nstephenson\ngravitational\n##gill\n##jun\n1768\ngranny\ncoordinating\nsuites\n##cd\n##ioned\nmonarchs\n##cote\n##hips\nsep\nblended\napr\nbarrister\ndeposition\nfia\nmina\npolicemen\nparanoid\n##pressed\nchurchyard\ncovert\ncrumpled\ncreep\nabandoning\ntr\ntransmit\nconceal\nbarr\nunderstands\nreadiness\nspire\n##cology\n##enia\n##erry\n610\nstartling\nunlock\nvida\nbowled\nslots\n##nat\n##islav\nspaced\ntrusting\nadmire\nrig\n##ink\nslack\n##70\nmv\n207\ncasualty\n##wei\nclassmates\n##odes\n##rar\n##rked\namherst\nfurnished\nevolve\nfoundry\nmenace\nmead\n##lein\nflu\nwesleyan\n##kled\nmonterey\nwebber\n##vos\nwil\n##mith\n##на\nbartholomew\njustices\nrestrained\n##cke\namenities\n191\nmediated\nsewage\ntrenches\nml\nmainz\n##thus\n1800s\n##cula\n##inski\ncaine\nbonding\n213\nconverts\nspheres\nsuperseded\nmarianne\ncrypt\nsweaty\nensign\nhistoria\n##br\nspruce\n##post\n##ask\nforks\nthoughtfully\nyukon\npamphlet\names\n##uter\nkarma\n##yya\nbryn\nnegotiation\nsighs\nincapable\n##mbre\n##ntial\nactresses\ntaft\n##mill\nluce\nprevailed\n##amine\n1773\nmotionless\nenvoy\ntestify\ninvesting\nsculpted\ninstructors\nprovence\nkali\ncullen\nhorseback\n##while\ngoodwin\n##jos\ngaa\nnorte\n##ldon\nmodify\nwavelength\nabd\n214\nskinned\nsprinter\nforecast\nscheduling\nmarries\nsquared\ntentative\n##chman\nboer\n##isch\nbolts\nswap\nfisherman\nassyrian\nimpatiently\nguthrie\nmartins\nmurdoch\n194\ntanya\nnicely\ndolly\nlacy\nmed\n##45\nsyn\ndecks\nfashionable\nmillionaire\n##ust\nsurfing\n##ml\n##ision\nheaved\ntammy\nconsulate\nattendees\nroutinely\n197\nfuse\nsaxophonist\nbackseat\nmalaya\n##lord\nscowl\ntau\n##ishly\n193\nsighted\nsteaming\n##rks\n303\n911\n##holes\n##hong\nching\n##wife\nbless\nconserved\njurassic\nstacey\nunix\nzion\nchunk\nrigorous\nblaine\n198\npeabody\nslayer\ndismay\nbrewers\nnz\n##jer\ndet\n##glia\nglover\npostwar\nint\npenetration\nsylvester\nimitation\nvertically\nairlift\nheiress\nknoxville\nviva\n##uin\n390\nmacon\n##rim\n##fighter\n##gonal\njanice\n##orescence\n##wari\nmarius\nbelongings\nleicestershire\n196\nblanco\ninverted\npreseason\nsanity\nsobbing\n##due\n##elt\n##dled\ncollingwood\nregeneration\nflickering\nshortest\n##mount\n##osi\nfeminism\n##lat\nsherlock\ncabinets\nfumbled\nnorthbound\nprecedent\nsnaps\n##mme\nresearching\n##akes\nguillaume\ninsights\nmanipulated\nvapor\nneighbour\nsap\ngangster\nfrey\nf1\nstalking\nscarcely\ncallie\nbarnett\ntendencies\naudi\ndoomed\nassessing\nslung\npanchayat\nambiguous\nbartlett\n##etto\ndistributing\nviolating\nwolverhampton\n##hetic\nswami\nhistoire\n##urus\nliable\npounder\ngroin\nhussain\nlarsen\npopping\nsurprises\n##atter\nvie\ncurt\n##station\nmute\nrelocate\nmusicals\nauthorization\nrichter\n##sef\nimmortality\ntna\nbombings\n##press\ndeteriorated\nyiddish\n##acious\nrobbed\ncolchester\ncs\npmid\nao\nverified\nbalancing\napostle\nswayed\nrecognizable\noxfordshire\nretention\nnottinghamshire\ncontender\njudd\ninvitational\nshrimp\nuh
f\n##icient\ncleaner\nlongitudinal\ntanker\n##mur\nacronym\nbroker\nkoppen\nsundance\nsuppliers\n##gil\n4000\nclipped\nfuels\npetite\n##anne\nlandslide\nhelene\ndiversion\npopulous\nlandowners\nauspices\nmelville\nquantitative\n##xes\nferries\nnicky\n##llus\ndoo\nhaunting\nroche\ncarver\ndowned\nunavailable\n##pathy\napproximation\nhiroshima\n##hue\ngarfield\nvalle\ncomparatively\nkeyboardist\ntraveler\n##eit\ncongestion\ncalculating\nsubsidiaries\n##bate\nserb\nmodernization\nfairies\ndeepened\nville\naverages\n##lore\ninflammatory\ntonga\n##itch\nco₂\nsquads\n##hea\ngigantic\nserum\nenjoyment\nretailer\nverona\n35th\ncis\n##phobic\nmagna\ntechnicians\n##vati\narithmetic\n##sport\nlevin\n##dation\namtrak\nchow\nsienna\n##eyer\nbackstage\nentrepreneurship\n##otic\nlearnt\ntao\n##udy\nworcestershire\nformulation\nbaggage\nhesitant\nbali\nsabotage\n##kari\nbarren\nenhancing\nmurmur\npl\nfreshly\nputnam\nsyntax\naces\nmedicines\nresentment\nbandwidth\n##sier\ngrins\nchili\nguido\n##sei\nframing\nimplying\ngareth\nlissa\ngenevieve\npertaining\nadmissions\ngeo\nthorpe\nproliferation\nsato\nbela\nanalyzing\nparting\n##gor\nawakened\n##isman\nhuddled\nsecrecy\n##kling\nhush\ngentry\n540\ndungeons\n##ego\ncoasts\n##utz\nsacrificed\n##chule\nlandowner\nmutually\nprevalence\nprogrammer\nadolescent\ndisrupted\nseaside\ngee\ntrusts\nvamp\ngeorgie\n##nesian\n##iol\nschedules\nsindh\n##market\netched\nhm\nsparse\nbey\nbeaux\nscratching\ngliding\nunidentified\n216\ncollaborating\ngems\njesuits\noro\naccumulation\nshaping\nmbe\nanal\n##xin\n231\nenthusiasts\nnewscast\n##egan\njanata\ndewey\nparkinson\n179\nankara\nbiennial\ntowering\ndd\ninconsistent\n950\n##chet\nthriving\nterminate\ncabins\nfuriously\neats\nadvocating\ndonkey\nmarley\nmuster\nphyllis\nleiden\n##user\ngrassland\nglittering\niucn\nloneliness\n217\nmemorandum\narmenians\n##ddle\npopularized\nrhodesia\n60s\nlame\n##illon\nsans\nbikini\nheader\norbits\n##xx\n##finger\n##ulator\nsharif\nspines\nbiotechnology\nstrolled\nnaughty\nyates\n##wire\nfremantle\nmilo\n##mour\nabducted\nremoves\n##atin\nhumming\nwonderland\n##chrome\n##ester\nhume\npivotal\n##rates\narmand\ngrams\nbelievers\nelector\nrte\napron\nbis\nscraped\n##yria\nendorsement\ninitials\n##llation\neps\ndotted\nhints\nbuzzing\nemigration\nnearer\n##tom\nindicators\n##ulu\ncoarse\nneutron\nprotectorate\n##uze\ndirectional\nexploits\npains\nloire\n1830s\nproponents\nguggenheim\nrabbits\nritchie\n305\nhectare\ninputs\nhutton\n##raz\nverify\n##ako\nboilers\nlongitude\n##lev\nskeletal\nyer\nemilia\ncitrus\ncompromised\n##gau\npokemon\nprescription\nparagraph\neduard\ncadillac\nattire\ncategorized\nkenyan\nweddings\ncharley\n##bourg\nentertain\nmonmouth\n##lles\nnutrients\ndavey\nmesh\nincentive\npractised\necosystems\nkemp\nsubdued\noverheard\n##rya\nbodily\nmaxim\n##nius\napprenticeship\nursula\n##fight\nlodged\nrug\nsilesian\nunconstitutional\npatel\ninspected\ncoyote\nunbeaten\n##hak\n34th\ndisruption\nconvict\nparcel\n##cl\n##nham\ncollier\nimplicated\nmallory\n##iac\n##lab\nsusannah\nwinkler\n##rber\nshia\nphelps\nsediments\ngraphical\nrobotic\n##sner\nadulthood\nmart\nsmoked\n##isto\nkathryn\nclarified\n##aran\ndivides\nconvictions\noppression\npausing\nburying\n##mt\nfederico\nmathias\neileen\n##tana\nkite\nhunched\n##acies\n189\n##atz\ndisadvantage\nliza\nkinetic\ngreedy\nparadox\nyokohama\ndowager\ntrunks\nventured\n##gement\ngupta\nvilnius\nolaf\n##thest\ncrimean\nhopper\n##ej\nprogressively\narturo\nmouthed\narrondissement\n##fusion\nrubin\nsimulcast\noceania\n##orum\n##stra\n##
rred\nbusiest\nintensely\nnavigator\ncary\n##vine\n##hini\n##bies\nfife\nrowe\nrowland\nposing\ninsurgents\nshafts\nlawsuits\nactivate\nconor\ninward\nculturally\ngarlic\n265\n##eering\neclectic\n##hui\n##kee\n##nl\nfurrowed\nvargas\nmeteorological\nrendezvous\n##aus\nculinary\ncommencement\n##dition\nquota\n##notes\nmommy\nsalaries\noverlapping\nmule\n##iology\n##mology\nsums\nwentworth\n##isk\n##zione\nmainline\nsubgroup\n##illy\nhack\nplaintiff\nverdi\nbulb\ndifferentiation\nengagements\nmultinational\nsupplemented\nbertrand\ncaller\nregis\n##naire\n##sler\n##arts\n##imated\nblossom\npropagation\nkilometer\nviaduct\nvineyards\n##uate\nbeckett\noptimization\ngolfer\nsongwriters\nseminal\nsemitic\nthud\nvolatile\nevolving\nridley\n##wley\ntrivial\ndistributions\nscandinavia\njiang\n##ject\nwrestled\ninsistence\n##dio\nemphasizes\nnapkin\n##ods\nadjunct\nrhyme\n##ricted\n##eti\nhopeless\nsurrounds\ntremble\n32nd\nsmoky\n##ntly\noils\nmedicinal\npadded\nsteer\nwilkes\n219\n255\nconcessions\nhue\nuniquely\nblinded\nlandon\nyahoo\n##lane\nhendrix\ncommemorating\ndex\nspecify\nchicks\n##ggio\nintercity\n1400\nmorley\n##torm\nhighlighting\n##oting\npang\noblique\nstalled\n##liner\nflirting\nnewborn\n1769\nbishopric\nshaved\n232\ncurrie\n##ush\ndharma\nspartan\n##ooped\nfavorites\nsmug\nnovella\nsirens\nabusive\ncreations\nespana\n##lage\nparadigm\nsemiconductor\nsheen\n##rdo\n##yen\n##zak\nnrl\nrenew\n##pose\n##tur\nadjutant\nmarches\nnorma\n##enity\nineffective\nweimar\ngrunt\n##gat\nlordship\nplotting\nexpenditure\ninfringement\nlbs\nrefrain\nav\nmimi\nmistakenly\npostmaster\n1771\n##bara\nras\nmotorsports\ntito\n199\nsubjective\n##zza\nbully\nstew\n##kaya\nprescott\n1a\n##raphic\n##zam\nbids\nstyling\nparanormal\nreeve\nsneaking\nexploding\nkatz\nakbar\nmigrant\nsyllables\nindefinitely\n##ogical\ndestroys\nreplaces\napplause\n##phine\npest\n##fide\n218\narticulated\nbertie\n##thing\n##cars\n##ptic\ncourtroom\ncrowley\naesthetics\ncummings\ntehsil\nhormones\ntitanic\ndangerously\n##ibe\nstadion\njaenelle\nauguste\nciudad\n##chu\nmysore\npartisans\n##sio\nlucan\nphilipp\n##aly\ndebating\nhenley\ninteriors\n##rano\n##tious\nhomecoming\nbeyonce\nusher\nhenrietta\nprepares\nweeds\n##oman\nely\nplucked\n##pire\n##dable\nluxurious\n##aq\nartifact\npassword\npasture\njuno\nmaddy\nminsk\n##dder\n##ologies\n##rone\nassessments\nmartian\nroyalist\n1765\nexamines\n##mani\n##rge\nnino\n223\nparry\nscooped\nrelativity\n##eli\n##uting\n##cao\ncongregational\nnoisy\ntraverse\n##agawa\nstrikeouts\nnickelodeon\nobituary\ntransylvania\nbinds\ndepictions\npolk\ntrolley\n##yed\n##lard\nbreeders\n##under\ndryly\nhokkaido\n1762\nstrengths\nstacks\nbonaparte\nconnectivity\nneared\nprostitutes\nstamped\nanaheim\ngutierrez\nsinai\n##zzling\nbram\nfresno\nmadhya\n##86\nproton\n##lena\n##llum\n##phon\nreelected\nwanda\n##anus\n##lb\nample\ndistinguishing\n##yler\ngrasping\nsermons\ntomato\nbland\nstimulation\navenues\n##eux\nspreads\nscarlett\nfern\npentagon\nassert\nbaird\nchesapeake\nir\ncalmed\ndistortion\nfatalities\n##olis\ncorrectional\npricing\n##astic\n##gina\nprom\ndammit\nying\ncollaborate\n##chia\nwelterweight\n33rd\npointer\nsubstitution\nbonded\numpire\ncommunicating\nmultitude\npaddle\n##obe\nfederally\nintimacy\n##insky\nbetray\nssr\n##lett\n##lean\n##lves\n##therapy\nairbus\n##tery\nfunctioned\nud\nbearer\nbiomedical\nnetflix\n##hire\n##nca\ncondom\nbrink\nik\n##nical\nmacy\n##bet\nflap\ngma\nexperimented\njelly\nlavender\n##icles\n##ulia\nmunro\n##mian\n##tial\nrye\n##rle\n60th\ngigs\nhottest\nrotated\n
predictions\nfuji\nbu\n##erence\n##omi\nbarangay\n##fulness\n##sas\nclocks\n##rwood\n##liness\ncereal\nroe\nwight\ndecker\nuttered\nbabu\nonion\nxml\nforcibly\n##df\npetra\nsarcasm\nhartley\npeeled\nstorytelling\n##42\n##xley\n##ysis\n##ffa\nfibre\nkiel\nauditor\nfig\nharald\ngreenville\n##berries\ngeographically\nnell\nquartz\n##athic\ncemeteries\n##lr\ncrossings\nnah\nholloway\nreptiles\nchun\nsichuan\nsnowy\n660\ncorrections\n##ivo\nzheng\nambassadors\nblacksmith\nfielded\nfluids\nhardcover\nturnover\nmedications\nmelvin\nacademies\n##erton\nro\nroach\nabsorbing\nspaniards\ncolton\n##founded\noutsider\nespionage\nkelsey\n245\nedible\n##ulf\ndora\nestablishes\n##sham\n##tries\ncontracting\n##tania\ncinematic\ncostello\nnesting\n##uron\nconnolly\nduff\n##nology\nmma\n##mata\nfergus\nsexes\ngi\noptics\nspectator\nwoodstock\nbanning\n##hee\n##fle\ndifferentiate\noutfielder\nrefinery\n226\n312\ngerhard\nhorde\nlair\ndrastically\n##udi\nlandfall\n##cheng\nmotorsport\nodi\n##achi\npredominant\nquay\nskins\n##ental\nedna\nharshly\ncomplementary\nmurdering\n##aves\nwreckage\n##90\nono\noutstretched\nlennox\nmunitions\ngalen\nreconcile\n470\nscalp\nbicycles\ngillespie\nquestionable\nrosenberg\nguillermo\nhostel\njarvis\nkabul\nvolvo\nopium\nyd\n##twined\nabuses\ndecca\noutpost\n##cino\nsensible\nneutrality\n##64\nponce\nanchorage\natkins\nturrets\ninadvertently\ndisagree\nlibre\nvodka\nreassuring\nweighs\n##yal\nglide\njumper\nceilings\nrepertory\nouts\nstain\n##bial\nenvy\n##ucible\nsmashing\nheightened\npolicing\nhyun\nmixes\nlai\nprima\n##ples\nceleste\n##bina\nlucrative\nintervened\nkc\nmanually\n##rned\nstature\nstaffed\nbun\nbastards\nnairobi\npriced\n##auer\nthatcher\n##kia\ntripped\ncomune\n##ogan\n##pled\nbrasil\nincentives\nemanuel\nhereford\nmusica\n##kim\nbenedictine\nbiennale\n##lani\neureka\ngardiner\nrb\nknocks\nsha\n##ael\n##elled\n##onate\nefficacy\nventura\nmasonic\nsanford\nmaize\nleverage\n##feit\ncapacities\nsantana\n##aur\nnovelty\nvanilla\n##cter\n##tour\nbenin\n##oir\n##rain\nneptune\ndrafting\ntallinn\n##cable\nhumiliation\n##boarding\nschleswig\nfabian\nbernardo\nliturgy\nspectacle\nsweeney\npont\nroutledge\n##tment\ncosmos\nut\nhilt\nsleek\nuniversally\n##eville\n##gawa\ntyped\n##dry\nfavors\nallegheny\nglaciers\n##rly\nrecalling\naziz\n##log\nparasite\nrequiem\nauf\n##berto\n##llin\nillumination\n##breaker\n##issa\nfestivities\nbows\ngovern\nvibe\nvp\n333\nsprawled\nlarson\npilgrim\nbwf\nleaping\n##rts\n##ssel\nalexei\ngreyhound\nhoarse\n##dler\n##oration\nseneca\n##cule\ngaping\n##ulously\n##pura\ncinnamon\n##gens\n##rricular\ncraven\nfantasies\nhoughton\nengined\nreigned\ndictator\nsupervising\n##oris\nbogota\ncommentaries\nunnatural\nfingernails\nspirituality\ntighten\n##tm\ncanadiens\nprotesting\nintentional\ncheers\nsparta\n##ytic\n##iere\n##zine\nwiden\nbelgarath\ncontrollers\ndodd\niaaf\nnavarre\n##ication\ndefect\nsquire\nsteiner\nwhisky\n##mins\n560\ninevitably\ntome\n##gold\nchew\n##uid\n##lid\nelastic\n##aby\nstreaked\nalliances\njailed\nregal\n##ined\n##phy\nczechoslovak\nnarration\nabsently\n##uld\nbluegrass\nguangdong\nquran\ncriticizing\nhose\nhari\n##liest\n##owa\nskier\nstreaks\ndeploy\n##lom\nraft\nbose\ndialed\nhuff\n##eira\nhaifa\nsimplest\nbursting\nendings\nib\nsultanate\n##titled\nfranks\nwhitman\nensures\nsven\n##ggs\ncollaborators\nforster\norganising\nui\nbanished\nnapier\ninjustice\nteller\nlayered\nthump\n##otti\nroc\nbattleships\nevidenced\nfugitive\nsadie\nrobotics\n##roud\nequatorial\ngeologist\n##iza\nyielding\n##bron\n##sr\ninternational
e\nmecca\n##diment\nsbs\nskyline\ntoad\nuploaded\nreflective\nundrafted\nlal\nleafs\nbayern\n##dai\nlakshmi\nshortlisted\n##stick\n##wicz\ncamouflage\ndonate\naf\nchristi\nlau\n##acio\ndisclosed\nnemesis\n1761\nassemble\nstraining\nnorthamptonshire\ntal\n##asi\nbernardino\npremature\nheidi\n42nd\ncoefficients\ngalactic\nreproduce\nbuzzed\nsensations\nzionist\nmonsieur\nmyrtle\n##eme\narchery\nstrangled\nmusically\nviewpoint\nantiquities\nbei\ntrailers\nseahawks\ncured\npee\npreferring\ntasmanian\nlange\nsul\n##mail\n##working\ncolder\noverland\nlucivar\nmassey\ngatherings\nhaitian\n##smith\ndisapproval\nflaws\n##cco\n##enbach\n1766\nnpr\n##icular\nboroughs\ncreole\nforums\ntechno\n1755\ndent\nabdominal\nstreetcar\n##eson\n##stream\nprocurement\ngemini\npredictable\n##tya\nacheron\nchristoph\nfeeder\nfronts\nvendor\nbernhard\njammu\ntumors\nslang\n##uber\ngoaltender\ntwists\ncurving\nmanson\nvuelta\nmer\npeanut\nconfessions\npouch\nunpredictable\nallowance\ntheodor\nvascular\n##factory\nbala\nauthenticity\nmetabolic\ncoughing\nnanjing\n##cea\npembroke\n##bard\nsplendid\n36th\nff\nhourly\n##ahu\nelmer\nhandel\n##ivate\nawarding\nthrusting\ndl\nexperimentation\n##hesion\n##46\ncaressed\nentertained\nsteak\n##rangle\nbiologist\norphans\nbaroness\noyster\nstepfather\n##dridge\nmirage\nreefs\nspeeding\n##31\nbarons\n1764\n227\ninhabit\npreached\nrepealed\n##tral\nhonoring\nboogie\ncaptives\nadminister\njohanna\n##imate\ngel\nsuspiciously\n1767\nsobs\n##dington\nbackbone\nhayward\ngarry\n##folding\n##nesia\nmaxi\n##oof\n##ppe\nellison\ngalileo\n##stand\ncrimea\nfrenzy\namour\nbumper\nmatrices\nnatalia\nbaking\ngarth\npalestinians\n##grove\nsmack\nconveyed\nensembles\ngardening\n##manship\n##rup\n##stituting\n1640\nharvesting\ntopography\njing\nshifters\ndormitory\n##carriage\n##lston\nist\nskulls\n##stadt\ndolores\njewellery\nsarawak\n##wai\n##zier\nfences\nchristy\nconfinement\ntumbling\ncredibility\nfir\nstench\n##bria\n##plication\n##nged\n##sam\nvirtues\n##belt\nmarjorie\npba\n##eem\n##made\ncelebrates\nschooner\nagitated\nbarley\nfulfilling\nanthropologist\n##pro\nrestrict\nnovi\nregulating\n##nent\npadres\n##rani\n##hesive\nloyola\ntabitha\nmilky\nolson\nproprietor\ncrambidae\nguarantees\nintercollegiate\nljubljana\nhilda\n##sko\nignorant\nhooded\n##lts\nsardinia\n##lidae\n##vation\nfrontman\nprivileged\nwitchcraft\n##gp\njammed\nlaude\npoking\n##than\nbracket\namazement\nyunnan\n##erus\nmaharaja\nlinnaeus\n264\ncommissioning\nmilano\npeacefully\n##logies\nakira\nrani\nregulator\n##36\ngrasses\n##rance\nluzon\ncrows\ncompiler\ngretchen\nseaman\nedouard\ntab\nbuccaneers\nellington\nhamlets\nwhig\nsocialists\n##anto\ndirectorial\neaston\nmythological\n##kr\n##vary\nrhineland\nsemantic\ntaut\ndune\ninventions\nsucceeds\n##iter\nreplication\nbranched\n##pired\njul\nprosecuted\nkangaroo\npenetrated\n##avian\nmiddlesbrough\ndoses\nbleak\nmadam\npredatory\nrelentless\n##vili\nreluctance\n##vir\nhailey\ncrore\nsilvery\n1759\nmonstrous\nswimmers\ntransmissions\nhawthorn\ninforming\n##eral\ntoilets\ncaracas\ncrouch\nkb\n##sett\n295\ncartel\nhadley\n##aling\nalexia\nyvonne\n##biology\ncinderella\neton\nsuperb\nblizzard\nstabbing\nindustrialist\nmaximus\n##gm\n##orus\ngroves\nmaud\nclade\noversized\ncomedic\n##bella\nrosen\nnomadic\nfulham\nmontane\nbeverages\ngalaxies\nredundant\nswarm\n##rot\n##folia\n##llis\nbuckinghamshire\nfen\nbearings\nbahadur\n##rom\ngilles\nphased\ndynamite\nfaber\nbenoit\nvip\n##ount\n##wd\nbooking\nfractured\ntailored\nanya\nspices\nwestwood\ncairns\nauditions\ninflammation\n
steamed\n##rocity\n##acion\n##urne\nskyla\nthereof\nwatford\ntorment\narchdeacon\ntransforms\nlulu\ndemeanor\nfucked\nserge\n##sor\nmckenna\nminas\nentertainer\n##icide\ncaress\noriginate\nresidue\n##sty\n1740\n##ilised\n##org\nbeech\n##wana\nsubsidies\n##ghton\nemptied\ngladstone\nru\nfirefighters\nvoodoo\n##rcle\nhet\nnightingale\ntamara\nedmond\ningredient\nweaknesses\nsilhouette\n285\ncompatibility\nwithdrawing\nhampson\n##mona\nanguish\ngiggling\n##mber\nbookstore\n##jiang\nsouthernmost\ntilting\n##vance\nbai\neconomical\nrf\nbriefcase\ndreadful\nhinted\nprojections\nshattering\ntotaling\n##rogate\nanalogue\nindicted\nperiodical\nfullback\n##dman\nhaynes\n##tenberg\n##ffs\n##ishment\n1745\nthirst\nstumble\npenang\nvigorous\n##ddling\n##kor\n##lium\noctave\n##ove\n##enstein\n##inen\n##ones\nsiberian\n##uti\ncbn\nrepeal\nswaying\n##vington\nkhalid\ntanaka\nunicorn\notago\nplastered\nlobe\nriddle\n##rella\nperch\n##ishing\ncroydon\nfiltered\ngraeme\ntripoli\n##ossa\ncrocodile\n##chers\nsufi\nmined\n##tung\ninferno\nlsu\n##phi\nswelled\nutilizes\n£2\ncale\nperiodicals\nstyx\nhike\ninformally\ncoop\nlund\n##tidae\nala\nhen\nqui\ntransformations\ndisposed\nsheath\nchickens\n##cade\nfitzroy\nsas\nsilesia\nunacceptable\nodisha\n1650\nsabrina\npe\nspokane\nratios\nathena\nmassage\nshen\ndilemma\n##drum\n##riz\n##hul\ncorona\ndoubtful\nniall\n##pha\n##bino\nfines\ncite\nacknowledging\nbangor\nballard\nbathurst\n##resh\nhuron\nmustered\nalzheimer\ngarments\nkinase\ntyre\nwarship\n##cp\nflashback\npulmonary\nbraun\ncheat\nkamal\ncyclists\nconstructions\ngrenades\nndp\ntraveller\nexcuses\nstomped\nsignalling\ntrimmed\nfutsal\nmosques\nrelevance\n##wine\nwta\n##23\n##vah\n##lter\nhoc\n##riding\noptimistic\n##´s\ndeco\nsim\ninteracting\nrejecting\nmoniker\nwaterways\n##ieri\n##oku\nmayors\ngdansk\noutnumbered\npearls\n##ended\n##hampton\nfairs\ntotals\ndominating\n262\nnotions\nstairway\ncompiling\npursed\ncommodities\ngrease\nyeast\n##jong\ncarthage\ngriffiths\nresidual\namc\ncontraction\nlaird\nsapphire\n##marine\n##ivated\namalgamation\ndissolve\ninclination\nlyle\npackaged\naltitudes\nsuez\ncanons\ngraded\nlurched\nnarrowing\nboasts\nguise\nwed\nenrico\n##ovsky\nrower\nscarred\nbree\ncub\niberian\nprotagonists\nbargaining\nproposing\ntrainers\nvoyages\nvans\nfishes\n##aea\n##ivist\n##verance\nencryption\nartworks\nkazan\nsabre\ncleopatra\nhepburn\nrotting\nsupremacy\nmecklenburg\n##brate\nburrows\nhazards\noutgoing\nflair\norganizes\n##ctions\nscorpion\n##usions\nboo\n234\nchevalier\ndunedin\nslapping\n##34\nineligible\npensions\n##38\n##omic\nmanufactures\nemails\nbismarck\n238\nweakening\nblackish\nding\nmcgee\nquo\n##rling\nnorthernmost\nxx\nmanpower\ngreed\nsampson\nclicking\n##ange\n##horpe\n##inations\n##roving\ntorre\n##eptive\n##moral\nsymbolism\n38th\nasshole\nmeritorious\noutfits\nsplashed\nbiographies\nsprung\nastros\n##tale\n302\n737\nfilly\nraoul\nnw\ntokugawa\nlinden\nclubhouse\n##apa\ntracts\nromano\n##pio\nputin\ntags\n##note\nchained\ndickson\ngunshot\nmoe\ngunn\nrashid\n##tails\nzipper\n##bas\n##nea\ncontrasted\n##ply\n##udes\nplum\npharaoh\n##pile\naw\ncomedies\ningrid\nsandwiches\nsubdivisions\n1100\nmariana\nnokia\nkamen\nhz\ndelaney\nveto\nherring\n##words\npossessive\noutlines\n##roup\nsiemens\nstairwell\nrc\ngallantry\nmessiah\npalais\nyells\n233\nzeppelin\n##dm\nbolivar\n##cede\nsmackdown\nmckinley\n##mora\n##yt\nmuted\ngeologic\nfinely\nunitary\navatar\nhamas\nmaynard\nrees\nbog\ncontrasting\n##rut\nliv\nchico\ndisposition\npixel\n##erate\nbecca\ndmitry\nyeshiva\nnarrati
ves\n##lva\n##ulton\nmercenary\nsharpe\ntempered\nnavigate\nstealth\namassed\nkeynes\n##lini\nuntouched\n##rrie\nhavoc\nlithium\n##fighting\nabyss\ngraf\nsouthward\nwolverine\nballoons\nimplements\nngos\ntransitions\n##icum\nambushed\nconcacaf\ndormant\neconomists\n##dim\ncosting\ncsi\nrana\nuniversite\nboulders\nverity\n##llon\ncollin\nmellon\nmisses\ncypress\nfluorescent\nlifeless\nspence\n##ulla\ncrewe\nshepard\npak\nrevelations\n##م\njolly\ngibbons\npaw\n##dro\n##quel\nfreeing\n##test\nshack\nfries\npalatine\n##51\n##hiko\naccompaniment\ncruising\nrecycled\n##aver\nerwin\nsorting\nsynthesizers\ndyke\nrealities\nsg\nstrides\nenslaved\nwetland\n##ghan\ncompetence\ngunpowder\ngrassy\nmaroon\nreactors\nobjection\n##oms\ncarlson\ngearbox\nmacintosh\nradios\nshelton\n##sho\nclergyman\nprakash\n254\nmongols\ntrophies\noricon\n228\nstimuli\ntwenty20\ncantonese\ncortes\nmirrored\n##saurus\nbhp\ncristina\nmelancholy\n##lating\nenjoyable\nnuevo\n##wny\ndownfall\nschumacher\n##ind\nbanging\nlausanne\nrumbled\nparamilitary\nreflex\nax\namplitude\nmigratory\n##gall\n##ups\nmidi\nbarnard\nlastly\nsherry\n##hp\n##nall\nkeystone\n##kra\ncarleton\nslippery\n##53\ncoloring\nfoe\nsocket\notter\n##rgos\nmats\n##tose\nconsultants\nbafta\nbison\ntopping\n##km\n490\nprimal\nabandonment\ntransplant\natoll\nhideous\nmort\npained\nreproduced\ntae\nhowling\n##turn\nunlawful\nbillionaire\nhotter\npoised\nlansing\n##chang\ndinamo\nretro\nmessing\nnfc\ndomesday\n##mina\nblitz\ntimed\n##athing\n##kley\nascending\ngesturing\n##izations\nsignaled\ntis\nchinatown\nmermaid\nsavanna\njameson\n##aint\ncatalina\n##pet\n##hers\ncochrane\ncy\nchatting\n##kus\nalerted\ncomputation\nmused\nnoelle\nmajestic\nmohawk\ncampo\noctagonal\n##sant\n##hend\n241\naspiring\n##mart\ncomprehend\niona\nparalyzed\nshimmering\nswindon\nrhone\n##eley\nreputed\nconfigurations\npitchfork\nagitation\nfrancais\ngillian\nlipstick\n##ilo\noutsiders\npontifical\nresisting\nbitterness\nsewer\nrockies\n##edd\n##ucher\nmisleading\n1756\nexiting\ngalloway\n##nging\nrisked\n##heart\n246\ncommemoration\nschultz\n##rka\nintegrating\n##rsa\nposes\nshrieked\n##weiler\nguineas\ngladys\njerking\nowls\ngoldsmith\nnightly\npenetrating\n##unced\nlia\n##33\nignited\nbetsy\n##aring\n##thorpe\nfollower\nvigorously\n##rave\ncoded\nkiran\nknit\nzoology\ntbilisi\n##28\n##bered\nrepository\ngovt\ndeciduous\ndino\ngrowling\n##bba\nenhancement\nunleashed\nchanting\npussy\nbiochemistry\n##eric\nkettle\nrepression\ntoxicity\nnrhp\n##arth\n##kko\n##bush\nernesto\ncommended\noutspoken\n242\nmca\nparchment\nsms\nkristen\n##aton\nbisexual\nraked\nglamour\nnavajo\na2\nconditioned\nshowcased\n##hma\nspacious\nyouthful\n##esa\nusl\nappliances\njunta\nbrest\nlayne\nconglomerate\nenchanted\nchao\nloosened\npicasso\ncirculating\ninspect\nmontevideo\n##centric\n##kti\npiazza\nspurred\n##aith\nbari\nfreedoms\npoultry\nstamford\nlieu\n##ect\nindigo\nsarcastic\nbahia\nstump\nattach\ndvds\nfrankenstein\nlille\napprox\nscriptures\npollen\n##script\nnmi\noverseen\n##ivism\ntides\nproponent\nnewmarket\ninherit\nmilling\n##erland\ncentralized\n##rou\ndistributors\ncredentials\ndrawers\nabbreviation\n##lco\n##xon\ndowning\nuncomfortably\nripe\n##oes\nerase\nfranchises\n##ever\npopulace\n##bery\n##khar\ndecomposition\npleas\n##tet\ndaryl\nsabah\n##stle\n##wide\nfearless\ngenie\nlesions\nannette\n##ogist\noboe\nappendix\nnair\ndripped\npetitioned\nmaclean\nmosquito\nparrot\nrpg\nhampered\n1648\noperatic\nreservoirs\n##tham\nirrelevant\njolt\nsummarized\n##fp\nmedallion\n##taff\n##−\nclawed\nharlow\
nnarrower\ngoddard\nmarcia\nbodied\nfremont\nsuarez\naltering\ntempest\nmussolini\nporn\n##isms\nsweetly\noversees\nwalkers\nsolitude\ngrimly\nshrines\nhk\nich\nsupervisors\nhostess\ndietrich\nlegitimacy\nbrushes\nexpressive\n##yp\ndissipated\n##rse\nlocalized\nsystemic\n##nikov\ngettysburg\n##js\n##uaries\ndialogues\nmuttering\n251\nhousekeeper\nsicilian\ndiscouraged\n##frey\nbeamed\nkaladin\nhalftime\nkidnap\n##amo\n##llet\n1754\nsynonymous\ndepleted\ninstituto\ninsulin\nreprised\n##opsis\nclashed\n##ctric\ninterrupting\nradcliffe\ninsisting\nmedici\n1715\nejected\nplayfully\nturbulent\n##47\nstarvation\n##rini\nshipment\nrebellious\npetersen\nverification\nmerits\n##rified\ncakes\n##charged\n1757\nmilford\nshortages\nspying\nfidelity\n##aker\nemitted\nstorylines\nharvested\nseismic\n##iform\ncheung\nkilda\ntheoretically\nbarbie\nlynx\n##rgy\n##tius\ngoblin\nmata\npoisonous\n##nburg\nreactive\nresidues\nobedience\n##евич\nconjecture\n##rac\n401\nhating\nsixties\nkicker\nmoaning\nmotown\n##bha\nemancipation\nneoclassical\n##hering\nconsoles\nebert\nprofessorship\n##tures\nsustaining\nassaults\nobeyed\naffluent\nincurred\ntornadoes\n##eber\n##zow\nemphasizing\nhighlanders\ncheated\nhelmets\n##ctus\ninternship\nterence\nbony\nexecutions\nlegislators\nberries\npeninsular\ntinged\n##aco\n1689\namplifier\ncorvette\nribbons\nlavish\npennant\n##lander\nworthless\n##chfield\n##forms\nmariano\npyrenees\nexpenditures\n##icides\nchesterfield\nmandir\ntailor\n39th\nsergey\nnestled\nwilled\naristocracy\ndevotees\ngoodnight\nraaf\nrumored\nweaponry\nremy\nappropriations\nharcourt\nburr\nriaa\n##lence\nlimitation\nunnoticed\nguo\nsoaking\nswamps\n##tica\ncollapsing\ntatiana\ndescriptive\nbrigham\npsalm\n##chment\nmaddox\n##lization\npatti\ncaliph\n##aja\nakron\ninjuring\nserra\n##ganj\nbasins\n##sari\nastonished\nlauncher\n##church\nhilary\nwilkins\nsewing\n##sf\nstinging\n##fia\n##ncia\nunderwood\nstartup\n##ition\ncompilations\nvibrations\nembankment\njurist\n##nity\nbard\njuventus\ngroundwater\nkern\npalaces\nhelium\nboca\ncramped\nmarissa\nsoto\n##worm\njae\nprincely\n##ggy\nfaso\nbazaar\nwarmly\n##voking\n229\npairing\n##lite\n##grate\n##nets\nwien\nfreaked\nulysses\nrebirth\n##alia\n##rent\nmummy\nguzman\njimenez\nstilled\n##nitz\ntrajectory\ntha\nwoken\narchival\nprofessions\n##pts\n##pta\nhilly\nshadowy\nshrink\n##bolt\nnorwood\nglued\nmigrate\nstereotypes\ndevoid\n##pheus\n625\nevacuate\nhorrors\ninfancy\ngotham\nknowles\noptic\ndownloaded\nsachs\nkingsley\nparramatta\ndarryl\nmor\n##onale\nshady\ncommence\nconfesses\nkan\n##meter\n##placed\nmarlborough\nroundabout\nregents\nfrigates\nio\n##imating\ngothenburg\nrevoked\ncarvings\nclockwise\nconvertible\nintruder\n##sche\nbanged\n##ogo\nvicky\nbourgeois\n##mony\ndupont\nfooting\n##gum\npd\n##real\nbuckle\nyun\npenthouse\nsane\n720\nserviced\nstakeholders\nneumann\nbb\n##eers\ncomb\n##gam\ncatchment\npinning\nrallies\ntyping\n##elles\nforefront\nfreiburg\nsweetie\ngiacomo\nwidowed\ngoodwill\nworshipped\naspirations\nmidday\n##vat\nfishery\n##trick\nbournemouth\nturk\n243\nhearth\nethanol\nguadalajara\nmurmurs\nsl\n##uge\nafforded\nscripted\n##hta\nwah\n##jn\ncoroner\ntranslucent\n252\nmemorials\npuck\nprogresses\nclumsy\n##race\n315\ncandace\nrecounted\n##27\n##slin\n##uve\nfiltering\n##mac\nhowl\nstrata\nheron\nleveled\n##ays\ndubious\n##oja\n##т\n##wheel\ncitations\nexhibiting\n##laya\n##mics\n##pods\nturkic\n##lberg\ninjunction\n##ennial\n##mit\nantibodies\n##44\norganise\n##rigues\ncardiovascular\ncushion\ninverness\n##zquez\ndia\ncocoa\nsibli
ng\n##tman\n##roid\nexpanse\nfeasible\ntunisian\nalgiers\n##relli\nrus\nbloomberg\ndso\nwestphalia\nbro\ntacoma\n281\ndownloads\n##ours\nkonrad\nduran\n##hdi\ncontinuum\njett\ncompares\nlegislator\nsecession\n##nable\n##gues\n##zuka\ntranslating\nreacher\n##gley\n##ła\naleppo\n##agi\ntc\norchards\ntrapping\nlinguist\nversatile\ndrumming\npostage\ncalhoun\nsuperiors\n##mx\nbarefoot\nleary\n##cis\nignacio\nalfa\nkaplan\n##rogen\nbratislava\nmori\n##vot\ndisturb\nhaas\n313\ncartridges\ngilmore\nradiated\nsalford\ntunic\nhades\n##ulsive\narcheological\ndelilah\nmagistrates\nauditioned\nbrewster\ncharters\nempowerment\nblogs\ncappella\ndynasties\niroquois\nwhipping\n##krishna\nraceway\ntruths\nmyra\nweaken\njudah\nmcgregor\n##horse\nmic\nrefueling\n37th\nburnley\nbosses\nmarkus\npremio\nquery\n##gga\ndunbar\n##economic\ndarkest\nlyndon\nsealing\ncommendation\nreappeared\n##mun\naddicted\nezio\nslaughtered\nsatisfactory\nshuffle\n##eves\n##thic\n##uj\nfortification\nwarrington\n##otto\nresurrected\nfargo\nmane\n##utable\n##lei\n##space\nforeword\nox\n##aris\n##vern\nabrams\nhua\n##mento\nsakura\n##alo\nuv\nsentimental\n##skaya\nmidfield\n##eses\nsturdy\nscrolls\nmacleod\n##kyu\nentropy\n##lance\nmitochondrial\ncicero\nexcelled\nthinner\nconvoys\nperceive\n##oslav\n##urable\nsystematically\ngrind\nburkina\n287\n##tagram\nops\n##aman\nguantanamo\n##cloth\n##tite\nforcefully\nwavy\n##jou\npointless\n##linger\n##tze\nlayton\nportico\nsuperficial\nclerical\noutlaws\n##hism\nburials\nmuir\n##inn\ncreditors\nhauling\nrattle\n##leg\ncalais\nmonde\narchers\nreclaimed\ndwell\nwexford\nhellenic\nfalsely\nremorse\n##tek\ndough\nfurnishings\n##uttered\ngabon\nneurological\nnovice\n##igraphy\ncontemplated\npulpit\nnightstand\nsaratoga\n##istan\ndocumenting\npulsing\ntaluk\n##firmed\nbusted\nmarital\n##rien\ndisagreements\nwasps\n##yes\nhodge\nmcdonnell\nmimic\nfran\npendant\ndhabi\nmusa\n##nington\ncongratulations\nargent\ndarrell\nconcussion\nlosers\nregrets\nthessaloniki\nreversal\ndonaldson\nhardwood\nthence\nachilles\nritter\n##eran\ndemonic\njurgen\nprophets\ngoethe\neki\nclassmate\nbuff\n##cking\nyank\nirrational\n##inging\nperished\nseductive\nqur\nsourced\n##crat\n##typic\nmustard\nravine\nbarre\nhorizontally\ncharacterization\nphylogenetic\nboise\n##dit\n##runner\n##tower\nbrutally\nintercourse\nseduce\n##bbing\nfay\nferris\nogden\namar\nnik\nunarmed\n##inator\nevaluating\nkyrgyzstan\nsweetness\n##lford\n##oki\nmccormick\nmeiji\nnotoriety\nstimulate\ndisrupt\nfiguring\ninstructional\nmcgrath\n##zoo\ngroundbreaking\n##lto\nflinch\nkhorasan\nagrarian\nbengals\nmixer\nradiating\n##sov\ningram\npitchers\nnad\ntariff\n##cript\ntata\n##codes\n##emi\n##ungen\nappellate\nlehigh\n##bled\n##giri\nbrawl\nduct\ntexans\n##ciation\n##ropolis\nskipper\nspeculative\nvomit\ndoctrines\nstresses\n253\ndavy\ngraders\nwhitehead\njozef\ntimely\ncumulative\nharyana\npaints\nappropriately\nboon\ncactus\n##ales\n##pid\ndow\nlegions\n##pit\nperceptions\n1730\npicturesque\n##yse\nperiphery\nrune\nwr\n##aha\nceltics\nsentencing\nwhoa\n##erin\nconfirms\nvariance\n425\nmoines\nmathews\nspade\nrave\nm1\nfronted\nfx\nblending\nalleging\nreared\n##gl\n237\n##paper\ngrassroots\neroded\n##free\n##physical\ndirects\nordeal\n##sław\naccelerate\nhacker\nrooftop\n##inia\nlev\nbuys\ncebu\ndevote\n##lce\nspecialising\n##ulsion\nchoreographed\nrepetition\nwarehouses\n##ryl\npaisley\ntuscany\nanalogy\nsorcerer\nhash\nhuts\nshards\ndescends\nexclude\nnix\nchaplin\ngaga\nito\nvane\n##drich\ncauseway\nmisconduct\nlimo\norchestrated\nglands\njana\
n##kot\nu2\n##mple\n##sons\nbranching\ncontrasts\nscoop\nlonged\n##virus\nchattanooga\n##75\nsyrup\ncornerstone\n##tized\n##mind\n##iaceae\ncareless\nprecedence\nfrescoes\n##uet\nchilled\nconsult\nmodelled\nsnatch\npeat\n##thermal\ncaucasian\nhumane\nrelaxation\nspins\ntemperance\n##lbert\noccupations\nlambda\nhybrids\nmoons\nmp3\n##oese\n247\nrolf\nsocietal\nyerevan\nness\n##ssler\nbefriended\nmechanized\nnominate\ntrough\nboasted\ncues\nseater\n##hom\nbends\n##tangle\nconductors\nemptiness\n##lmer\neurasian\nadriatic\ntian\n##cie\nanxiously\nlark\npropellers\nchichester\njock\nev\n2a\n##holding\ncredible\nrecounts\ntori\nloyalist\nabduction\n##hoot\n##redo\nnepali\n##mite\nventral\ntempting\n##ango\n##crats\nsteered\n##wice\njavelin\ndipping\nlaborers\nprentice\nlooming\ntitanium\n##ː\nbadges\nemir\ntensor\n##ntation\negyptians\nrash\ndenies\nhawthorne\nlombard\nshowers\nwehrmacht\ndietary\ntrojan\n##reus\nwelles\nexecuting\nhorseshoe\nlifeboat\n##lak\nelsa\ninfirmary\nnearing\nroberta\nboyer\nmutter\ntrillion\njoanne\n##fine\n##oked\nsinks\nvortex\nuruguayan\nclasp\nsirius\n##block\naccelerator\nprohibit\nsunken\nbyu\nchronological\ndiplomats\nochreous\n510\nsymmetrical\n1644\nmaia\n##tology\nsalts\nreigns\natrocities\n##ия\nhess\nbared\nissn\n##vyn\ncater\nsaturated\n##cycle\n##isse\nsable\nvoyager\ndyer\nyusuf\n##inge\nfountains\nwolff\n##39\n##nni\nengraving\nrollins\natheist\nominous\n##ault\nherr\nchariot\nmartina\nstrung\n##fell\n##farlane\nhorrific\nsahib\ngazes\nsaetan\nerased\nptolemy\n##olic\nflushing\nlauderdale\nanalytic\n##ices\n530\nnavarro\nbeak\ngorilla\nherrera\nbroom\nguadalupe\nraiding\nsykes\n311\nbsc\ndeliveries\n1720\ninvasions\ncarmichael\ntajikistan\nthematic\necumenical\nsentiments\nonstage\n##rians\n##brand\n##sume\ncatastrophic\nflanks\nmolten\n##arns\nwaller\naimee\nterminating\n##icing\nalternately\n##oche\nnehru\nprinters\noutraged\n##eving\nempires\ntemplate\nbanners\nrepetitive\nza\n##oise\nvegetarian\n##tell\nguiana\nopt\ncavendish\nlucknow\nsynthesized\n##hani\n##mada\nfinalized\n##ctable\nfictitious\nmayoral\nunreliable\n##enham\nembracing\npeppers\nrbis\n##chio\n##neo\ninhibition\nslashed\ntogo\norderly\nembroidered\nsafari\nsalty\n236\nbarron\nbenito\ntotaled\n##dak\npubs\nsimulated\ncaden\ndevin\ntolkien\nmomma\nwelding\nsesame\n##ept\ngottingen\nhardness\n630\nshaman\ntemeraire\n620\nadequately\npediatric\n##kit\nck\nassertion\nradicals\ncomposure\ncadence\nseafood\nbeaufort\nlazarus\nmani\nwarily\ncunning\nkurdistan\n249\ncantata\n##kir\nares\n##41\n##clusive\nnape\ntownland\ngeared\ninsulted\nflutter\nboating\nviolate\ndraper\ndumping\nmalmo\n##hh\n##romatic\nfirearm\nalta\nbono\nobscured\n##clave\nexceeds\npanorama\nunbelievable\n##train\npreschool\n##essed\ndisconnected\ninstalling\nrescuing\nsecretaries\naccessibility\n##castle\n##drive\n##ifice\n##film\nbouts\nslug\nwaterway\nmindanao\n##buro\n##ratic\nhalves\n##ل\ncalming\nliter\nmaternity\nadorable\nbragg\nelectrification\nmcc\n##dote\nroxy\nschizophrenia\n##body\nmunoz\nkaye\nwhaling\n239\nmil\ntingling\ntolerant\n##ago\nunconventional\nvolcanoes\n##finder\ndeportivo\n##llie\nrobson\nkaufman\nneuroscience\nwai\ndeportation\nmasovian\nscraping\nconverse\n##bh\nhacking\nbulge\n##oun\nadministratively\nyao\n580\namp\nmammoth\nbooster\nclaremont\nhooper\nnomenclature\npursuits\nmclaughlin\nmelinda\n##sul\ncatfish\nbarclay\nsubstrates\ntaxa\nzee\noriginals\nkimberly\npackets\npadma\n##ality\nborrowing\nostensibly\nsolvent\n##bri\n##genesis\n##mist\nlukas\nshreveport\nveracruz\n##ь\n##lou\n##wive
s\ncheney\ntt\nanatolia\nhobbs\n##zyn\ncyclic\nradiant\nalistair\ngreenish\nsiena\ndat\nindependents\n##bation\nconform\npieter\nhyper\napplicant\nbradshaw\nspores\ntelangana\nvinci\ninexpensive\nnuclei\n322\njang\nnme\nsoho\nspd\n##ign\ncradled\nreceptionist\npow\n##43\n##rika\nfascism\n##ifer\nexperimenting\n##ading\n##iec\n##region\n345\njocelyn\nmaris\nstair\nnocturnal\ntoro\nconstabulary\nelgin\n##kker\nmsc\n##giving\n##schen\n##rase\ndoherty\ndoping\nsarcastically\nbatter\nmaneuvers\n##cano\n##apple\n##gai\n##git\nintrinsic\n##nst\n##stor\n1753\nshowtime\ncafes\ngasps\nlviv\nushered\n##thed\nfours\nrestart\nastonishment\ntransmitting\nflyer\nshrugs\n##sau\nintriguing\ncones\ndictated\nmushrooms\nmedial\n##kovsky\n##elman\nescorting\ngaped\n##26\ngodfather\n##door\n##sell\ndjs\nrecaptured\ntimetable\nvila\n1710\n3a\naerodrome\nmortals\nscientology\n##orne\nangelina\nmag\nconvection\nunpaid\ninsertion\nintermittent\nlego\n##nated\nendeavor\nkota\npereira\n##lz\n304\nbwv\nglamorgan\ninsults\nagatha\nfey\n##cend\nfleetwood\nmahogany\nprotruding\nsteamship\nzeta\n##arty\nmcguire\nsuspense\n##sphere\nadvising\nurges\n##wala\nhurriedly\nmeteor\ngilded\ninline\narroyo\nstalker\n##oge\nexcitedly\nrevered\n##cure\nearle\nintroductory\n##break\n##ilde\nmutants\npuff\npulses\nreinforcement\n##haling\ncurses\nlizards\nstalk\ncorrelated\n##fixed\nfallout\nmacquarie\n##unas\nbearded\ndenton\nheaving\n802\n##ocation\nwinery\nassign\ndortmund\n##lkirk\neverest\ninvariant\ncharismatic\nsusie\n##elling\nbled\nlesley\ntelegram\nsumner\nbk\n##ogen\n##к\nwilcox\nneedy\ncolbert\nduval\n##iferous\n##mbled\nallotted\nattends\nimperative\n##hita\nreplacements\nhawker\n##inda\ninsurgency\n##zee\n##eke\ncasts\n##yla\n680\nives\ntransitioned\n##pack\n##powering\nauthoritative\nbaylor\nflex\ncringed\nplaintiffs\nwoodrow\n##skie\ndrastic\nape\naroma\nunfolded\ncommotion\nnt\npreoccupied\ntheta\nroutines\nlasers\nprivatization\nwand\ndomino\nek\nclenching\nnsa\nstrategically\nshowered\nbile\nhandkerchief\npere\nstoring\nchristophe\ninsulting\n316\nnakamura\nromani\nasiatic\nmagdalena\npalma\ncruises\nstripping\n405\nkonstantin\nsoaring\n##berman\ncolloquially\nforerunner\nhavilland\nincarcerated\nparasites\nsincerity\n##utus\ndisks\nplank\nsaigon\n##ining\ncorbin\nhomo\nornaments\npowerhouse\n##tlement\nchong\nfastened\nfeasibility\nidf\nmorphological\nusable\n##nish\n##zuki\naqueduct\njaguars\nkeepers\n##flies\naleksandr\nfaust\nassigns\newing\nbacterium\nhurled\ntricky\nhungarians\nintegers\nwallis\n321\nyamaha\n##isha\nhushed\noblivion\naviator\nevangelist\nfriars\n##eller\nmonograph\node\n##nary\nairplanes\nlabourers\ncharms\n##nee\n1661\nhagen\ntnt\nrudder\nfiesta\ntranscript\ndorothea\nska\ninhibitor\nmaccabi\nretorted\nraining\nencompassed\nclauses\nmenacing\n1642\nlineman\n##gist\nvamps\n##ape\n##dick\ngloom\n##rera\ndealings\neasing\nseekers\n##nut\n##pment\nhelens\nunmanned\n##anu\n##isson\nbasics\n##amy\n##ckman\nadjustments\n1688\nbrutality\nhorne\n##zell\nsui\n##55\n##mable\naggregator\n##thal\nrhino\n##drick\n##vira\ncounters\nzoom\n##01\n##rting\nmn\nmontenegrin\npackard\n##unciation\n##♭\n##kki\nreclaim\nscholastic\nthugs\npulsed\n##icia\nsyriac\nquan\nsaddam\nbanda\nkobe\nblaming\nbuddies\ndissent\n##lusion\n##usia\ncorbett\njaya\ndelle\nerratic\nlexie\n##hesis\n435\namiga\nhermes\n##pressing\n##leen\nchapels\ngospels\njamal\n##uating\ncompute\nrevolving\nwarp\n##sso\n##thes\narmory\n##eras\n##gol\nantrim\nloki\n##kow\n##asian\n##good\n##zano\nbraid\nhandwriting\nsubdistrict\nfunky\npantheon\n##icul
ate\nconcurrency\nestimation\nimproper\njuliana\n##his\nnewcomers\njohnstone\nstaten\ncommunicated\n##oco\n##alle\nsausage\nstormy\n##stered\n##tters\nsuperfamily\n##grade\nacidic\ncollateral\ntabloid\n##oped\n##rza\nbladder\nausten\n##ellant\nmcgraw\n##hay\nhannibal\nmein\naquino\nlucifer\nwo\nbadger\nboar\ncher\nchristensen\ngreenberg\ninterruption\n##kken\njem\n244\nmocked\nbottoms\ncambridgeshire\n##lide\nsprawling\n##bbly\neastwood\nghent\nsynth\n##buck\nadvisers\n##bah\nnominally\nhapoel\nqu\ndaggers\nestranged\nfabricated\ntowels\nvinnie\nwcw\nmisunderstanding\nanglia\nnothin\nunmistakable\n##dust\n##lova\nchilly\nmarquette\ntruss\n##edge\n##erine\nreece\n##lty\n##chemist\n##connected\n272\n308\n41st\nbash\nraion\nwaterfalls\n##ump\n##main\nlabyrinth\nqueue\ntheorist\n##istle\nbharatiya\nflexed\nsoundtracks\nrooney\nleftist\npatrolling\nwharton\nplainly\nalleviate\neastman\nschuster\ntopographic\nengages\nimmensely\nunbearable\nfairchild\n1620\ndona\nlurking\nparisian\noliveira\nia\nindictment\nhahn\nbangladeshi\n##aster\nvivo\n##uming\n##ential\nantonia\nexpects\nindoors\nkildare\nharlan\n##logue\n##ogenic\n##sities\nforgiven\n##wat\nchildish\ntavi\n##mide\n##orra\nplausible\ngrimm\nsuccessively\nscooted\n##bola\n##dget\n##rith\nspartans\nemery\nflatly\nazure\nepilogue\n##wark\nflourish\n##iny\n##tracted\n##overs\n##oshi\nbestseller\ndistressed\nreceipt\nspitting\nhermit\ntopological\n##cot\ndrilled\nsubunit\nfrancs\n##layer\neel\n##fk\n##itas\noctopus\nfootprint\npetitions\nufo\n##say\n##foil\ninterfering\nleaking\npalo\n##metry\nthistle\nvaliant\n##pic\nnarayan\nmcpherson\n##fast\ngonzales\n##ym\n##enne\ndustin\nnovgorod\nsolos\n##zman\ndoin\n##raph\n##patient\n##meyer\nsoluble\nashland\ncuffs\ncarole\npendleton\nwhistling\nvassal\n##river\ndeviation\nrevisited\nconstituents\nrallied\nrotate\nloomed\n##eil\n##nting\namateurs\naugsburg\nauschwitz\ncrowns\nskeletons\n##cona\nbonnet\n257\ndummy\nglobalization\nsimeon\nsleeper\nmandal\ndifferentiated\n##crow\n##mare\nmilne\nbundled\nexasperated\ntalmud\nowes\nsegregated\n##feng\n##uary\ndentist\npiracy\nprops\n##rang\ndevlin\n##torium\nmalicious\npaws\n##laid\ndependency\n##ergy\n##fers\n##enna\n258\npistons\nrourke\njed\ngrammatical\ntres\nmaha\nwig\n512\nghostly\njayne\n##achal\n##creen\n##ilis\n##lins\n##rence\ndesignate\n##with\narrogance\ncambodian\nclones\nshowdown\nthrottle\ntwain\n##ception\nlobes\nmetz\nnagoya\n335\nbraking\n##furt\n385\nroaming\n##minster\namin\ncrippled\n##37\n##llary\nindifferent\nhoffmann\nidols\nintimidating\n1751\n261\ninfluenza\nmemo\nonions\n1748\nbandage\nconsciously\n##landa\n##rage\nclandestine\nobserves\nswiped\ntangle\n##ener\n##jected\n##trum\n##bill\n##lta\nhugs\ncongresses\njosiah\nspirited\n##dek\nhumanist\nmanagerial\nfilmmaking\ninmate\nrhymes\ndebuting\ngrimsby\nur\n##laze\nduplicate\nvigor\n##tf\nrepublished\nbolshevik\nrefurbishment\nantibiotics\nmartini\nmethane\nnewscasts\nroyale\nhorizons\nlevant\niain\nvisas\n##ischen\npaler\n##around\nmanifestation\nsnuck\nalf\nchop\nfutile\npedestal\nrehab\n##kat\nbmg\nkerman\nres\nfairbanks\njarrett\nabstraction\nsaharan\n##zek\n1746\nprocedural\nclearer\nkincaid\nsash\nluciano\n##ffey\ncrunch\nhelmut\n##vara\nrevolutionaries\n##tute\ncreamy\nleach\n##mmon\n1747\npermitting\nnes\nplight\nwendell\n##lese\ncontra\nts\nclancy\nipa\nmach\nstaples\nautopsy\ndisturbances\nnueva\nkarin\npontiac\n##uding\nproxy\nvenerable\nhaunt\nleto\nbergman\nexpands\n##helm\nwal\n##pipe\ncanning\nceline\ncords\nobesity\n##enary\nintrusion\nplanner\n##phate\nreasoned\ns
equencing\n307\nharrow\n##chon\n##dora\nmarred\nmcintyre\nrepay\ntarzan\ndarting\n248\nharrisburg\nmargarita\nrepulsed\n##hur\n##lding\nbelinda\nhamburger\nnovo\ncompliant\nrunways\nbingham\nregistrar\nskyscraper\nic\ncuthbert\nimprovisation\nlivelihood\n##corp\n##elial\nadmiring\n##dened\nsporadic\nbeliever\ncasablanca\npopcorn\n##29\nasha\nshovel\n##bek\n##dice\ncoiled\ntangible\n##dez\ncasper\nelsie\nresin\ntenderness\nrectory\n##ivision\navail\nsonar\n##mori\nboutique\n##dier\nguerre\nbathed\nupbringing\nvaulted\nsandals\nblessings\n##naut\n##utnant\n1680\n306\nfoxes\npia\ncorrosion\nhesitantly\nconfederates\ncrystalline\nfootprints\nshapiro\ntirana\nvalentin\ndrones\n45th\nmicroscope\nshipments\ntexted\ninquisition\nwry\nguernsey\nunauthorized\nresigning\n760\nripple\nschubert\nstu\nreassure\nfelony\n##ardo\nbrittle\nkoreans\n##havan\n##ives\ndun\nimplicit\ntyres\n##aldi\n##lth\nmagnolia\n##ehan\n##puri\n##poulos\naggressively\nfei\ngr\nfamiliarity\n##poo\nindicative\n##trust\nfundamentally\njimmie\noverrun\n395\nanchors\nmoans\n##opus\nbritannia\narmagh\n##ggle\npurposely\nseizing\n##vao\nbewildered\nmundane\navoidance\ncosmopolitan\ngeometridae\nquartermaster\ncaf\n415\nchatter\nengulfed\ngleam\npurge\n##icate\njuliette\njurisprudence\nguerra\nrevisions\n##bn\ncasimir\nbrew\n##jm\n1749\nclapton\ncloudy\nconde\nhermitage\n278\nsimulations\ntorches\nvincenzo\nmatteo\n##rill\nhidalgo\nbooming\nwestbound\naccomplishment\ntentacles\nunaffected\n##sius\nannabelle\nflopped\nsloping\n##litz\ndreamer\ninterceptor\nvu\n##loh\nconsecration\ncopying\nmessaging\nbreaker\nclimates\nhospitalized\n1752\ntorino\nafternoons\nwinfield\nwitnessing\n##teacher\nbreakers\nchoirs\nsawmill\ncoldly\n##ege\nsipping\nhaste\nuninhabited\nconical\nbibliography\npamphlets\nsevern\nedict\n##oca\ndeux\nillnesses\ngrips\n##pl\nrehearsals\nsis\nthinkers\ntame\n##keepers\n1690\nacacia\nreformer\n##osed\n##rys\nshuffling\n##iring\n##shima\neastbound\nionic\nrhea\nflees\nlittered\n##oum\nrocker\nvomiting\ngroaning\nchamp\noverwhelmingly\ncivilizations\npaces\nsloop\nadoptive\n##tish\nskaters\n##vres\naiding\nmango\n##joy\nnikola\nshriek\n##ignon\npharmaceuticals\n##mg\ntuna\ncalvert\ngustavo\nstocked\nyearbook\n##urai\n##mana\ncomputed\nsubsp\nriff\nhanoi\nkelvin\nhamid\nmoors\npastures\nsummons\njihad\nnectar\n##ctors\nbayou\nuntitled\npleasing\nvastly\nrepublics\nintellect\n##η\n##ulio\n##tou\ncrumbling\nstylistic\nsb\n##ی\nconsolation\nfrequented\nh₂o\nwalden\nwidows\n##iens\n404\n##ignment\nchunks\nimproves\n288\ngrit\nrecited\n##dev\nsnarl\nsociological\n##arte\n##gul\ninquired\n##held\nbruise\nclube\nconsultancy\nhomogeneous\nhornets\nmultiplication\npasta\nprick\nsavior\n##grin\n##kou\n##phile\nyoon\n##gara\ngrimes\nvanishing\ncheering\nreacting\nbn\ndistillery\n##quisite\n##vity\ncoe\ndockyard\nmassif\n##jord\nescorts\nvoss\n##valent\nbyte\nchopped\nhawke\nillusions\nworkings\nfloats\n##koto\n##vac\nkv\nannapolis\nmadden\n##onus\nalvaro\nnoctuidae\n##cum\n##scopic\navenge\nsteamboat\nforte\nillustrates\nerika\n##trip\n570\ndew\nnationalities\nbran\nmanifested\nthirsty\ndiversified\nmuscled\nreborn\n##standing\narson\n##lessness\n##dran\n##logram\n##boys\n##kushima\n##vious\nwilloughby\n##phobia\n286\nalsace\ndashboard\nyuki\n##chai\ngranville\nmyspace\npublicized\ntricked\n##gang\nadjective\n##ater\nrelic\nreorganisation\nenthusiastically\nindications\nsaxe\n##lassified\nconsolidate\niec\npadua\nhelplessly\nramps\nrenaming\nregulars\npedestrians\naccents\nconvicts\ninaccurate\nlowers\nmana\n##pati\nbarrie\nbjp\no
utta\nsomeplace\nberwick\nflanking\ninvoked\nmarrow\nsparsely\nexcerpts\nclothed\nrei\n##ginal\nwept\n##straße\n##vish\nalexa\nexcel\n##ptive\nmembranes\naquitaine\ncreeks\ncutler\nsheppard\nimplementations\nns\n##dur\nfragrance\nbudge\nconcordia\nmagnesium\nmarcelo\n##antes\ngladly\nvibrating\n##rral\n##ggles\nmontrose\n##omba\nlew\nseamus\n1630\ncocky\n##ament\n##uen\nbjorn\n##rrick\nfielder\nfluttering\n##lase\nmethyl\nkimberley\nmcdowell\nreductions\nbarbed\n##jic\n##tonic\naeronautical\ncondensed\ndistracting\n##promising\nhuffed\n##cala\n##sle\nclaudius\ninvincible\nmissy\npious\nbalthazar\nci\n##lang\nbutte\ncombo\norson\n##dication\nmyriad\n1707\nsilenced\n##fed\n##rh\ncoco\nnetball\nyourselves\n##oza\nclarify\nheller\npeg\ndurban\netudes\noffender\nroast\nblackmail\ncurvature\n##woods\nvile\n309\nillicit\nsuriname\n##linson\noverture\n1685\nbubbling\ngymnast\ntucking\n##mming\n##ouin\nmaldives\n##bala\ngurney\n##dda\n##eased\n##oides\nbackside\npinto\njars\nracehorse\ntending\n##rdial\nbaronetcy\nwiener\nduly\n##rke\nbarbarian\ncupping\nflawed\n##thesis\nbertha\npleistocene\npuddle\nswearing\n##nob\n##tically\nfleeting\nprostate\namulet\neducating\n##mined\n##iti\n##tler\n75th\njens\nrespondents\nanalytics\ncavaliers\npapacy\nraju\n##iente\n##ulum\n##tip\nfunnel\n271\ndisneyland\n##lley\nsociologist\n##iam\n2500\nfaulkner\nlouvre\nmenon\n##dson\n276\n##ower\nafterlife\nmannheim\npeptide\nreferees\ncomedians\nmeaningless\n##anger\n##laise\nfabrics\nhurley\nrenal\nsleeps\n##bour\n##icle\nbreakout\nkristin\nroadside\nanimator\nclover\ndisdain\nunsafe\nredesign\n##urity\nfirth\nbarnsley\nportage\nreset\nnarrows\n268\ncommandos\nexpansive\nspeechless\ntubular\n##lux\nessendon\neyelashes\nsmashwords\n##yad\n##bang\n##claim\ncraved\nsprinted\nchet\nsomme\nastor\nwrocław\norton\n266\nbane\n##erving\n##uing\nmischief\n##amps\n##sund\nscaling\nterre\n##xious\nimpairment\noffenses\nundermine\nmoi\nsoy\ncontiguous\narcadia\ninuit\nseam\n##tops\nmacbeth\nrebelled\n##icative\n##iot\n590\nelaborated\nfrs\nuniformed\n##dberg\n259\npowerless\npriscilla\nstimulated\n980\nqc\narboretum\nfrustrating\ntrieste\nbullock\n##nified\nenriched\nglistening\nintern\n##adia\nlocus\nnouvelle\nollie\nike\nlash\nstarboard\nee\ntapestry\nheadlined\nhove\nrigged\n##vite\npollock\n##yme\nthrive\nclustered\ncas\nroi\ngleamed\nolympiad\n##lino\npressured\nregimes\n##hosis\n##lick\nripley\n##ophone\nkickoff\ngallon\nrockwell\n##arable\ncrusader\nglue\nrevolutions\nscrambling\n1714\ngrover\n##jure\nenglishman\naztec\n263\ncontemplating\ncoven\nipad\npreach\ntriumphant\ntufts\n##esian\nrotational\n##phus\n328\nfalkland\n##brates\nstrewn\nclarissa\nrejoin\nenvironmentally\nglint\nbanded\ndrenched\nmoat\nalbanians\njohor\nrr\nmaestro\nmalley\nnouveau\nshaded\ntaxonomy\nv6\nadhere\nbunk\nairfields\n##ritan\n1741\nencompass\nremington\ntran\n##erative\namelie\nmazda\nfriar\nmorals\npassions\n##zai\nbreadth\nvis\n##hae\nargus\nburnham\ncaressing\ninsider\nrudd\n##imov\n##mini\n##rso\nitalianate\nmurderous\ntextual\nwainwright\narmada\nbam\nweave\ntimer\n##taken\n##nh\nfra\n##crest\nardent\nsalazar\ntaps\ntunis\n##ntino\nallegro\ngland\nphilanthropic\n##chester\nimplication\n##optera\nesq\njudas\nnoticeably\nwynn\n##dara\ninched\nindexed\ncrises\nvilliers\nbandit\nroyalties\npatterned\ncupboard\ninterspersed\naccessory\nisla\nkendrick\nentourage\nstitches\n##esthesia\nheadwaters\n##ior\ninterlude\ndistraught\ndraught\n1727\n##basket\nbiased\nsy\ntransient\ntriad\nsubgenus\nadapting\nkidd\nshortstop\n##umatic\ndimly\nspiked\nmcl
eod\nreprint\nnellie\npretoria\nwindmill\n##cek\nsingled\n##mps\n273\nreunite\n##orous\n747\nbankers\noutlying\n##omp\n##ports\n##tream\napologies\ncosmetics\npatsy\n##deh\n##ocks\n##yson\nbender\nnantes\nserene\n##nad\nlucha\nmmm\n323\n##cius\n##gli\ncmll\ncoinage\nnestor\njuarez\n##rook\nsmeared\nsprayed\ntwitching\nsterile\nirina\nembodied\njuveniles\nenveloped\nmiscellaneous\ncancers\ndq\ngulped\nluisa\ncrested\nswat\ndonegal\nref\n##anov\n##acker\nhearst\nmercantile\n##lika\ndoorbell\nua\nvicki\n##alla\n##som\nbilbao\npsychologists\nstryker\nsw\nhorsemen\nturkmenistan\nwits\n##national\nanson\nmathew\nscreenings\n##umb\nrihanna\n##agne\n##nessy\naisles\n##iani\n##osphere\nhines\nkenton\nsaskatoon\ntasha\ntruncated\n##champ\n##itan\nmildred\nadvises\nfredrik\ninterpreting\ninhibitors\n##athi\nspectroscopy\n##hab\n##kong\nkarim\npanda\n##oia\n##nail\n##vc\nconqueror\nkgb\nleukemia\n##dity\narrivals\ncheered\npisa\nphosphorus\nshielded\n##riated\nmammal\nunitarian\nurgently\nchopin\nsanitary\n##mission\nspicy\ndrugged\nhinges\n##tort\ntipping\ntrier\nimpoverished\nwestchester\n##caster\n267\nepoch\nnonstop\n##gman\n##khov\naromatic\ncentrally\ncerro\n##tively\n##vio\nbillions\nmodulation\nsedimentary\n283\nfacilitating\noutrageous\ngoldstein\n##eak\n##kt\nld\nmaitland\npenultimate\npollard\n##dance\nfleets\nspaceship\nvertebrae\n##nig\nalcoholism\nals\nrecital\n##bham\n##ference\n##omics\nm2\n##bm\ntrois\n##tropical\n##в\ncommemorates\n##meric\nmarge\n##raction\n1643\n670\ncosmetic\nravaged\n##ige\ncatastrophe\neng\n##shida\nalbrecht\narterial\nbellamy\ndecor\nharmon\n##rde\nbulbs\nsynchronized\nvito\neasiest\nshetland\nshielding\nwnba\n##glers\n##ssar\n##riam\nbrianna\ncumbria\n##aceous\n##rard\ncores\nthayer\n##nsk\nbrood\nhilltop\nluminous\ncarts\nkeynote\nlarkin\nlogos\n##cta\n##ا\n##mund\n##quay\nlilith\ntinted\n277\nwrestle\nmobilization\n##uses\nsequential\nsiam\nbloomfield\ntakahashi\n274\n##ieving\npresenters\nringo\nblazed\nwitty\n##oven\n##ignant\ndevastation\nhaydn\nharmed\nnewt\ntherese\n##peed\ngershwin\nmolina\nrabbis\nsudanese\n001\ninnate\nrestarted\n##sack\n##fus\nslices\nwb\n##shah\nenroll\nhypothetical\nhysterical\n1743\nfabio\nindefinite\nwarped\n##hg\nexchanging\n525\nunsuitable\n##sboro\ngallo\n1603\nbret\ncobalt\nhomemade\n##hunter\nmx\noperatives\n##dhar\nterraces\ndurable\nlatch\npens\nwhorls\n##ctuated\n##eaux\nbilling\nligament\nsuccumbed\n##gly\nregulators\nspawn\n##brick\n##stead\nfilmfare\nrochelle\n##nzo\n1725\ncircumstance\nsaber\nsupplements\n##nsky\n##tson\ncrowe\nwellesley\ncarrot\n##9th\n##movable\nprimate\ndrury\nsincerely\ntopical\n##mad\n##rao\ncallahan\nkyiv\nsmarter\ntits\nundo\n##yeh\nannouncements\nanthologies\nbarrio\nnebula\n##islaus\n##shaft\n##tyn\nbodyguards\n2021\nassassinate\nbarns\nemmett\nscully\n##mah\n##yd\n##eland\n##tino\n##itarian\ndemoted\ngorman\nlashed\nprized\nadventist\nwrit\n##gui\nalla\ninvertebrates\n##ausen\n1641\namman\n1742\nalign\nhealy\nredistribution\n##gf\n##rize\ninsulation\n##drop\nadherents\nhezbollah\nvitro\nferns\nyanking\n269\nphp\nregistering\nuppsala\ncheerleading\nconfines\nmischievous\ntully\n##ross\n49th\ndocked\nroam\nstipulated\npumpkin\n##bry\nprompt\n##ezer\nblindly\nshuddering\ncraftsmen\nfrail\nscented\nkatharine\nscramble\nshaggy\nsponge\nhelix\nzaragoza\n279\n##52\n43rd\nbacklash\nfontaine\nseizures\nposse\ncowan\nnonfiction\ntelenovela\nwwii\nhammered\nundone\n##gpur\nencircled\nirs\n##ivation\nartefacts\noneself\nsearing\nsmallpox\n##belle\n##osaurus\nshandong\nbreached\nupland\nblushing\nrankin\
ninfinitely\npsyche\ntolerated\ndocking\nevicted\n##col\nunmarked\n##lving\ngnome\nlettering\nlitres\nmusique\n##oint\nbenevolent\n##jal\nblackened\n##anna\nmccall\nracers\ntingle\n##ocene\n##orestation\nintroductions\nradically\n292\n##hiff\n##باد\n1610\n1739\nmunchen\nplead\n##nka\ncondo\nscissors\n##sight\n##tens\napprehension\n##cey\n##yin\nhallmark\nwatering\nformulas\nsequels\n##llas\naggravated\nbae\ncommencing\n##building\nenfield\nprohibits\nmarne\nvedic\ncivilized\neuclidean\njagger\nbeforehand\nblasts\ndumont\n##arney\n##nem\n740\nconversions\nhierarchical\nrios\nsimulator\n##dya\n##lellan\nhedges\noleg\nthrusts\nshadowed\ndarby\nmaximize\n1744\ngregorian\n##nded\n##routed\nsham\nunspecified\n##hog\nemory\nfactual\n##smo\n##tp\nfooled\n##rger\nortega\nwellness\nmarlon\n##oton\n##urance\ncasket\nkeating\nley\nenclave\n##ayan\nchar\ninfluencing\njia\n##chenko\n412\nammonia\nerebidae\nincompatible\nviolins\ncornered\n##arat\ngrooves\nastronauts\ncolumbian\nrampant\nfabrication\nkyushu\nmahmud\nvanish\n##dern\nmesopotamia\n##lete\nict\n##rgen\ncaspian\nkenji\npitted\n##vered\n999\ngrimace\nroanoke\ntchaikovsky\ntwinned\n##analysis\n##awan\nxinjiang\narias\nclemson\nkazakh\nsizable\n1662\n##khand\n##vard\nplunge\ntatum\nvittorio\n##nden\ncholera\n##dana\n##oper\nbracing\nindifference\nprojectile\nsuperliga\n##chee\nrealises\nupgrading\n299\nporte\nretribution\n##vies\nnk\nstil\n##resses\nama\nbureaucracy\nblackberry\nbosch\ntestosterone\ncollapses\ngreer\n##pathic\nioc\nfifties\nmalls\n##erved\nbao\nbaskets\nadolescents\nsiegfried\n##osity\n##tosis\nmantra\ndetecting\nexistent\nfledgling\n##cchi\ndissatisfied\ngan\ntelecommunication\nmingled\nsobbed\n6000\ncontroversies\noutdated\ntaxis\n##raus\nfright\nslams\n##lham\n##fect\n##tten\ndetectors\nfetal\ntanned\n##uw\nfray\ngoth\nolympian\nskipping\nmandates\nscratches\nsheng\nunspoken\nhyundai\ntracey\nhotspur\nrestrictive\n##buch\namericana\nmundo\n##bari\nburroughs\ndiva\nvulcan\n##6th\ndistinctions\nthumping\n##ngen\nmikey\nsheds\nfide\nrescues\nspringsteen\nvested\nvaluation\n##ece\n##ely\npinnacle\nrake\nsylvie\n##edo\nalmond\nquivering\n##irus\nalteration\nfaltered\n##wad\n51st\nhydra\nticked\n##kato\nrecommends\n##dicated\nantigua\narjun\nstagecoach\nwilfred\ntrickle\npronouns\n##pon\naryan\nnighttime\n##anian\ngall\npea\nstitch\n##hei\nleung\nmilos\n##dini\neritrea\nnexus\nstarved\nsnowfall\nkant\nparasitic\ncot\ndiscus\nhana\nstrikers\nappleton\nkitchens\n##erina\n##partisan\n##itha\n##vius\ndisclose\nmetis\n##channel\n1701\ntesla\n##vera\nfitch\n1735\nblooded\n##tila\ndecimal\n##tang\n##bai\ncyclones\neun\nbottled\npeas\npensacola\nbasha\nbolivian\ncrabs\nboil\nlanterns\npartridge\nroofed\n1645\nnecks\n##phila\nopined\npatting\n##kla\n##lland\nchuckles\nvolta\nwhereupon\n##nche\ndevout\neuroleague\nsuicidal\n##dee\ninherently\ninvoluntary\nknitting\nnasser\n##hide\npuppets\ncolourful\ncourageous\nsouthend\nstills\nmiraculous\nhodgson\nricher\nrochdale\nethernet\ngreta\nuniting\nprism\numm\n##haya\n##itical\n##utation\ndeterioration\npointe\nprowess\n##ropriation\nlids\nscranton\nbillings\nsubcontinent\n##koff\n##scope\nbrute\nkellogg\npsalms\ndegraded\n##vez\nstanisław\n##ructured\nferreira\npun\nastonishing\ngunnar\n##yat\narya\nprc\ngottfried\n##tight\nexcursion\n##ographer\ndina\n##quil\n##nare\nhuffington\nillustrious\nwilbur\ngundam\nverandah\n##zard\nnaacp\n##odle\nconstructive\nfjord\nkade\n##naud\ngenerosity\nthrilling\nbaseline\ncayman\nfrankish\nplastics\naccommodations\nzoological\n##fting\ncedric\nqb\nmotorized\n##
dome\n##otted\nsquealed\ntackled\ncanucks\nbudgets\nsitu\nasthma\ndail\ngabled\ngrasslands\nwhimpered\nwrithing\njudgments\n##65\nminnie\npv\n##carbon\nbananas\ngrille\ndomes\nmonique\nodin\nmaguire\nmarkham\ntierney\n##estra\n##chua\nlibel\npoke\nspeedy\natrium\nlaval\nnotwithstanding\n##edly\nfai\nkala\n##sur\nrobb\n##sma\nlistings\nluz\nsupplementary\ntianjin\n##acing\nenzo\njd\nric\nscanner\ncroats\ntranscribed\n##49\narden\ncv\n##hair\n##raphy\n##lver\n##uy\n357\nseventies\nstaggering\nalam\nhorticultural\nhs\nregression\ntimbers\nblasting\n##ounded\nmontagu\nmanipulating\n##cit\ncatalytic\n1550\ntroopers\n##meo\ncondemnation\nfitzpatrick\n##oire\n##roved\ninexperienced\n1670\ncastes\n##lative\nouting\n314\ndubois\nflicking\nquarrel\nste\nlearners\n1625\niq\nwhistled\n##class\n282\nclassify\ntariffs\ntemperament\n355\nfolly\nliszt\n##yles\nimmersed\njordanian\nceasefire\napparel\nextras\nmaru\nfished\n##bio\nharta\nstockport\nassortment\ncraftsman\nparalysis\ntransmitters\n##cola\nblindness\n##wk\nfatally\nproficiency\nsolemnly\n##orno\nrepairing\namore\ngroceries\nultraviolet\n##chase\nschoolhouse\n##tua\nresurgence\nnailed\n##otype\n##×\nruse\nsaliva\ndiagrams\n##tructing\nalbans\nrann\nthirties\n1b\nantennas\nhilarious\ncougars\npaddington\nstats\n##eger\nbreakaway\nipod\nreza\nauthorship\nprohibiting\nscoffed\n##etz\n##ttle\nconscription\ndefected\ntrondheim\n##fires\nivanov\nkeenan\n##adan\n##ciful\n##fb\n##slow\nlocating\n##ials\n##tford\ncadiz\nbasalt\nblankly\ninterned\nrags\nrattling\n##tick\ncarpathian\nreassured\nsync\nbum\nguildford\niss\nstaunch\n##onga\nastronomers\nsera\nsofie\nemergencies\nsusquehanna\n##heard\nduc\nmastery\nvh1\nwilliamsburg\nbayer\nbuckled\ncraving\n##khan\n##rdes\nbloomington\n##write\nalton\nbarbecue\n##bians\njustine\n##hri\n##ndt\ndelightful\nsmartphone\nnewtown\nphoton\nretrieval\npeugeot\nhissing\n##monium\n##orough\nflavors\nlighted\nrelaunched\ntainted\n##games\n##lysis\nanarchy\nmicroscopic\nhopping\nadept\nevade\nevie\n##beau\ninhibit\nsinn\nadjustable\nhurst\nintuition\nwilton\ncisco\n44th\nlawful\nlowlands\nstockings\nthierry\n##dalen\n##hila\n##nai\nfates\nprank\ntb\nmaison\nlobbied\nprovocative\n1724\n4a\nutopia\n##qual\ncarbonate\ngujarati\npurcell\n##rford\ncurtiss\n##mei\novergrown\narenas\nmediation\nswallows\n##rnik\nrespectful\nturnbull\n##hedron\n##hope\nalyssa\nozone\n##ʻi\nami\ngestapo\njohansson\nsnooker\ncanteen\ncuff\ndeclines\nempathy\nstigma\n##ags\n##iner\n##raine\ntaxpayers\ngui\nvolga\n##wright\n##copic\nlifespan\novercame\ntattooed\nenactment\ngiggles\n##ador\n##camp\nbarrington\nbribe\nobligatory\norbiting\npeng\n##enas\nelusive\nsucker\n##vating\ncong\nhardship\nempowered\nanticipating\nestrada\ncryptic\ngreasy\ndetainees\nplanck\nsudbury\nplaid\ndod\nmarriott\nkayla\n##ears\n##vb\n##zd\nmortally\n##hein\ncognition\nradha\n319\nliechtenstein\nmeade\nrichly\nargyle\nharpsichord\nliberalism\ntrumpets\nlauded\ntyrant\nsalsa\ntiled\nlear\npromoters\nreused\nslicing\ntrident\n##chuk\n##gami\n##lka\ncantor\ncheckpoint\n##points\ngaul\nleger\nmammalian\n##tov\n##aar\n##schaft\ndoha\nfrenchman\nnirvana\n##vino\ndelgado\nheadlining\n##eron\n##iography\njug\ntko\n1649\nnaga\nintersections\n##jia\nbenfica\nnawab\n##suka\nashford\ngulp\n##deck\n##vill\n##rug\nbrentford\nfrazier\npleasures\ndunne\npotsdam\nshenzhen\ndentistry\n##tec\nflanagan\n##dorff\n##hear\nchorale\ndinah\nprem\nquezon\n##rogated\nrelinquished\nsutra\nterri\n##pani\nflaps\n##rissa\npoly\n##rnet\nhomme\naback\n##eki\nlinger\nwomb\n##kson\n##lewood\ndoorstep\nort
hodoxy\nthreaded\nwestfield\n##rval\ndioceses\nfridays\nsubsided\n##gata\nloyalists\n##biotic\n##ettes\nletterman\nlunatic\nprelate\ntenderly\ninvariably\nsouza\nthug\nwinslow\n##otide\nfurlongs\ngogh\njeopardy\n##runa\npegasus\n##umble\nhumiliated\nstandalone\ntagged\n##roller\nfreshmen\nklan\n##bright\nattaining\ninitiating\ntransatlantic\nlogged\nviz\n##uance\n1723\ncombatants\nintervening\nstephane\nchieftain\ndespised\ngrazed\n317\ncdc\ngalveston\ngodzilla\nmacro\nsimulate\n##planes\nparades\n##esses\n960\n##ductive\n##unes\nequator\noverdose\n##cans\n##hosh\n##lifting\njoshi\nepstein\nsonora\ntreacherous\naquatics\nmanchu\nresponsive\n##sation\nsupervisory\n##christ\n##llins\n##ibar\n##balance\n##uso\nkimball\nkarlsruhe\nmab\n##emy\nignores\nphonetic\nreuters\nspaghetti\n820\nalmighty\ndanzig\nrumbling\ntombstone\ndesignations\nlured\noutset\n##felt\nsupermarkets\n##wt\ngrupo\nkei\nkraft\nsusanna\n##blood\ncomprehension\ngenealogy\n##aghan\n##verted\nredding\n##ythe\n1722\nbowing\n##pore\n##roi\nlest\nsharpened\nfulbright\nvalkyrie\nsikhs\n##unds\nswans\nbouquet\nmerritt\n##tage\n##venting\ncommuted\nredhead\nclerks\nleasing\ncesare\ndea\nhazy\n##vances\nfledged\ngreenfield\nservicemen\n##gical\narmando\nblackout\ndt\nsagged\ndownloadable\nintra\npotion\npods\n##4th\n##mism\nxp\nattendants\ngambia\nstale\n##ntine\nplump\nasteroids\nrediscovered\nbuds\nflea\nhive\n##neas\n1737\nclassifications\ndebuts\n##eles\nolympus\nscala\n##eurs\n##gno\n##mute\nhummed\nsigismund\nvisuals\nwiggled\nawait\npilasters\nclench\nsulfate\n##ances\nbellevue\nenigma\ntrainee\nsnort\n##sw\nclouded\ndenim\n##rank\n##rder\nchurning\nhartman\nlodges\nriches\nsima\n##missible\naccountable\nsocrates\nregulates\nmueller\n##cr\n1702\navoids\nsolids\nhimalayas\nnutrient\npup\n##jevic\nsquat\nfades\nnec\n##lates\n##pina\n##rona\n##ου\nprivateer\ntequila\n##gative\n##mpton\napt\nhornet\nimmortals\n##dou\nasturias\ncleansing\ndario\n##rries\n##anta\netymology\nservicing\nzhejiang\n##venor\n##nx\nhorned\nerasmus\nrayon\nrelocating\n£10\n##bags\nescalated\npromenade\nstubble\n2010s\nartisans\naxial\nliquids\nmora\nsho\nyoo\n##tsky\nbundles\noldies\n##nally\nnotification\nbastion\n##ths\nsparkle\n##lved\n1728\nleash\npathogen\nhighs\n##hmi\nimmature\n880\ngonzaga\nignatius\nmansions\nmonterrey\nsweets\nbryson\n##loe\npolled\nregatta\nbrightest\npei\nrosy\nsquid\nhatfield\npayroll\naddict\nmeath\ncornerback\nheaviest\nlodging\n##mage\ncapcom\nrippled\n##sily\nbarnet\nmayhem\nymca\nsnuggled\nrousseau\n##cute\nblanchard\n284\nfragmented\nleighton\nchromosomes\nrisking\n##md\n##strel\n##utter\ncorinne\ncoyotes\ncynical\nhiroshi\nyeomanry\n##ractive\nebook\ngrading\nmandela\nplume\nagustin\nmagdalene\n##rkin\nbea\nfemme\ntrafford\n##coll\n##lun\n##tance\n52nd\nfourier\nupton\n##mental\ncamilla\ngust\niihf\nislamabad\nlongevity\n##kala\nfeldman\nnetting\n##rization\nendeavour\nforaging\nmfa\norr\n##open\ngreyish\ncontradiction\ngraz\n##ruff\nhandicapped\nmarlene\ntweed\noaxaca\nspp\ncampos\nmiocene\npri\nconfigured\ncooks\npluto\ncozy\npornographic\n##entes\n70th\nfairness\nglided\njonny\nlynne\nrounding\nsired\n##emon\n##nist\nremade\nuncover\n##mack\ncomplied\nlei\nnewsweek\n##jured\n##parts\n##enting\n##pg\n293\nfiner\nguerrillas\nathenian\ndeng\ndisused\nstepmother\naccuse\ngingerly\nseduction\n521\nconfronting\n##walker\n##going\ngora\nnostalgia\nsabres\nvirginity\nwrenched\n##minated\nsyndication\nwielding\neyre\n##56\n##gnon\n##igny\nbehaved\ntaxpayer\nsweeps\n##growth\nchildless\ngallant\n##ywood\namplified\ngeraldine\n
scrape\n##ffi\nbabylonian\nfresco\n##rdan\n##kney\n##position\n1718\nrestricting\ntack\nfukuoka\nosborn\nselector\npartnering\n##dlow\n318\ngnu\nkia\ntak\nwhitley\ngables\n##54\n##mania\nmri\nsoftness\nimmersion\n##bots\n##evsky\n1713\nchilling\ninsignificant\npcs\n##uis\nelites\nlina\npurported\nsupplemental\nteaming\n##americana\n##dding\n##inton\nproficient\nrouen\n##nage\n##rret\nniccolo\nselects\n##bread\nfluffy\n1621\ngruff\nknotted\nmukherjee\npolgara\nthrash\nnicholls\nsecluded\nsmoothing\nthru\ncorsica\nloaf\nwhitaker\ninquiries\n##rrier\n##kam\nindochina\n289\nmarlins\nmyles\npeking\n##tea\nextracts\npastry\nsuperhuman\nconnacht\nvogel\n##ditional\n##het\n##udged\n##lash\ngloss\nquarries\nrefit\nteaser\n##alic\n##gaon\n20s\nmaterialized\nsling\ncamped\npickering\ntung\ntracker\npursuant\n##cide\ncranes\nsoc\n##cini\n##typical\n##viere\nanhalt\noverboard\nworkout\nchores\nfares\norphaned\nstains\n##logie\nfenton\nsurpassing\njoyah\ntriggers\n##itte\ngrandmaster\n##lass\n##lists\nclapping\nfraudulent\nledger\nnagasaki\n##cor\n##nosis\n##tsa\neucalyptus\ntun\n##icio\n##rney\n##tara\ndax\nheroism\nina\nwrexham\nonboard\nunsigned\n##dates\nmoshe\ngalley\nwinnie\ndroplets\nexiles\npraises\nwatered\nnoodles\n##aia\nfein\nadi\nleland\nmulticultural\nstink\nbingo\ncomets\nerskine\nmodernized\ncanned\nconstraint\ndomestically\nchemotherapy\nfeatherweight\nstifled\n##mum\ndarkly\nirresistible\nrefreshing\nhasty\nisolate\n##oys\nkitchener\nplanners\n##wehr\ncages\nyarn\nimplant\ntoulon\nelects\nchildbirth\nyue\n##lind\n##lone\ncn\nrightful\nsportsman\njunctions\nremodeled\nspecifies\n##rgh\n291\n##oons\ncomplimented\n##urgent\nlister\not\n##logic\nbequeathed\ncheekbones\nfontana\ngabby\n##dial\namadeus\ncorrugated\nmaverick\nresented\ntriangles\n##hered\n##usly\nnazareth\ntyrol\n1675\nassent\npoorer\nsectional\naegean\n##cous\n296\nnylon\nghanaian\n##egorical\n##weig\ncushions\nforbid\nfusiliers\nobstruction\nsomerville\n##scia\ndime\nearrings\nelliptical\nleyte\noder\npolymers\ntimmy\natm\nmidtown\npiloted\nsettles\ncontinual\nexternally\nmayfield\n##uh\nenrichment\nhenson\nkeane\npersians\n1733\nbenji\nbraden\npep\n324\n##efe\ncontenders\npepsi\nvalet\n##isches\n298\n##asse\n##earing\ngoofy\nstroll\n##amen\nauthoritarian\noccurrences\nadversary\nahmedabad\ntangent\ntoppled\ndorchester\n1672\nmodernism\nmarxism\nislamist\ncharlemagne\nexponential\nracks\nunicode\nbrunette\nmbc\npic\nskirmish\n##bund\n##lad\n##powered\n##yst\nhoisted\nmessina\nshatter\n##ctum\njedi\nvantage\n##music\n##neil\nclemens\nmahmoud\ncorrupted\nauthentication\nlowry\nnils\n##washed\nomnibus\nwounding\njillian\n##itors\n##opped\nserialized\nnarcotics\nhandheld\n##arm\n##plicity\nintersecting\nstimulating\n##onis\ncrate\nfellowships\nhemingway\ncasinos\nclimatic\nfordham\ncopeland\ndrip\nbeatty\nleaflets\nrobber\nbrothel\nmadeira\n##hedral\nsphinx\nultrasound\n##vana\nvalor\nforbade\nleonid\nvillas\n##aldo\nduane\nmarquez\n##cytes\ndisadvantaged\nforearms\nkawasaki\nreacts\nconsular\nlax\nuncles\nuphold\n##hopper\nconcepcion\ndorsey\nlass\n##izan\narching\npassageway\n1708\nresearches\ntia\ninternationals\n##graphs\n##opers\ndistinguishes\njavanese\ndivert\n##uven\nplotted\n##listic\n##rwin\n##erik\n##tify\naffirmative\nsignifies\nvalidation\n##bson\nkari\nfelicity\ngeorgina\nzulu\n##eros\n##rained\n##rath\novercoming\n##dot\nargyll\n##rbin\n1734\nchiba\nratification\nwindy\nearls\nparapet\n##marks\nhunan\npristine\nastrid\npunta\n##gart\nbrodie\n##kota\n##oder\nmalaga\nminerva\nrouse\n##phonic\nbellowed\npagoda\nporta
ls\nreclamation\n##gur\n##odies\n##⁄₄\nparentheses\nquoting\nallergic\npalette\nshowcases\nbenefactor\nheartland\nnonlinear\n##tness\nbladed\ncheerfully\nscans\n##ety\n##hone\n1666\ngirlfriends\npedersen\nhiram\nsous\n##liche\n##nator\n1683\n##nery\n##orio\n##umen\nbobo\nprimaries\nsmiley\n##cb\nunearthed\nuniformly\nfis\nmetadata\n1635\nind\n##oted\nrecoil\n##titles\n##tura\n##ια\n406\nhilbert\njamestown\nmcmillan\ntulane\nseychelles\n##frid\nantics\ncoli\nfated\nstucco\n##grants\n1654\nbulky\naccolades\narrays\ncaledonian\ncarnage\noptimism\npuebla\n##tative\n##cave\nenforcing\nrotherham\nseo\ndunlop\naeronautics\nchimed\nincline\nzoning\narchduke\nhellenistic\n##oses\n##sions\ncandi\nthong\n##ople\nmagnate\nrustic\n##rsk\nprojective\nslant\n##offs\ndanes\nhollis\nvocalists\n##ammed\ncongenital\ncontend\ngesellschaft\n##ocating\n##pressive\ndouglass\nquieter\n##cm\n##kshi\nhowled\nsalim\nspontaneously\ntownsville\nbuena\nsouthport\n##bold\nkato\n1638\nfaerie\nstiffly\n##vus\n##rled\n297\nflawless\nrealising\ntaboo\n##7th\nbytes\nstraightening\n356\njena\n##hid\n##rmin\ncartwright\nberber\nbertram\nsoloists\n411\nnoses\n417\ncoping\nfission\nhardin\ninca\n##cen\n1717\nmobilized\nvhf\n##raf\nbiscuits\ncurate\n##85\n##anial\n331\ngaunt\nneighbourhoods\n1540\n##abas\nblanca\nbypassed\nsockets\nbehold\ncoincidentally\n##bane\nnara\nshave\nsplinter\nterrific\n##arion\n##erian\ncommonplace\njuris\nredwood\nwaistband\nboxed\ncaitlin\nfingerprints\njennie\nnaturalized\n##ired\nbalfour\ncraters\njody\nbungalow\nhugely\nquilt\nglitter\npigeons\nundertaker\nbulging\nconstrained\ngoo\n##sil\n##akh\nassimilation\nreworked\n##person\npersuasion\n##pants\nfelicia\n##cliff\n##ulent\n1732\nexplodes\n##dun\n##inium\n##zic\nlyman\nvulture\nhog\noverlook\nbegs\nnorthwards\now\nspoil\n##urer\nfatima\nfavorably\naccumulate\nsargent\nsorority\ncorresponded\ndispersal\nkochi\ntoned\n##imi\n##lita\ninternacional\nnewfound\n##agger\n##lynn\n##rigue\nbooths\npeanuts\n##eborg\nmedicare\nmuriel\nnur\n##uram\ncrates\nmillennia\npajamas\nworsened\n##breakers\njimi\nvanuatu\nyawned\n##udeau\ncarousel\n##hony\nhurdle\n##ccus\n##mounted\n##pod\nrv\n##eche\nairship\nambiguity\ncompulsion\nrecapture\n##claiming\narthritis\n##osomal\n1667\nasserting\nngc\nsniffing\ndade\ndiscontent\nglendale\nported\n##amina\ndefamation\nrammed\n##scent\nfling\nlivingstone\n##fleet\n875\n##ppy\napocalyptic\ncomrade\nlcd\n##lowe\ncessna\neine\npersecuted\nsubsistence\ndemi\nhoop\nreliefs\n710\ncoptic\nprogressing\nstemmed\nperpetrators\n1665\npriestess\n##nio\ndobson\nebony\nrooster\nitf\ntortricidae\n##bbon\n##jian\ncleanup\n##jean\n##øy\n1721\neighties\ntaxonomic\nholiness\n##hearted\n##spar\nantilles\nshowcasing\nstabilized\n##nb\ngia\nmascara\nmichelangelo\ndawned\n##uria\n##vinsky\nextinguished\nfitz\ngrotesque\n£100\n##fera\n##loid\n##mous\nbarges\nneue\nthrobbed\ncipher\njohnnie\n##a1\n##mpt\noutburst\n##swick\nspearheaded\nadministrations\nc1\nheartbreak\npixels\npleasantly\n##enay\nlombardy\nplush\n##nsed\nbobbie\n##hly\nreapers\ntremor\nxiang\nminogue\nsubstantive\nhitch\nbarak\n##wyl\nkwan\n##encia\n910\nobscene\nelegance\nindus\nsurfer\nbribery\nconserve\n##hyllum\n##masters\nhoratio\n##fat\napes\nrebound\npsychotic\n##pour\niteration\n##mium\n##vani\nbotanic\nhorribly\nantiques\ndispose\npaxton\n##hli\n##wg\ntimeless\n1704\ndisregard\nengraver\nhounds\n##bau\n##version\nlooted\nuno\nfacilitates\ngroans\nmasjid\nrutland\nantibody\ndisqualification\ndecatur\nfootballers\nquake\nslacks\n48th\nrein\nscribe\nstabilize\ncommits\nexempla
ry\ntho\n##hort\n##chison\npantry\ntraversed\n##hiti\ndisrepair\nidentifiable\nvibrated\nbaccalaureate\n##nnis\ncsa\ninterviewing\n##iensis\n##raße\ngreaves\nwealthiest\n343\nclassed\njogged\n£5\n##58\n##atal\nilluminating\nknicks\nrespecting\n##uno\nscrubbed\n##iji\n##dles\nkruger\nmoods\ngrowls\nraider\nsilvia\nchefs\nkam\nvr\ncree\npercival\n##terol\ngunter\ncounterattack\ndefiant\nhenan\nze\n##rasia\n##riety\nequivalence\nsubmissions\n##fra\n##thor\nbautista\nmechanically\n##heater\ncornice\nherbal\ntemplar\n##mering\noutputs\nruining\nligand\nrenumbered\nextravagant\nmika\nblockbuster\neta\ninsurrection\n##ilia\ndarkening\nferocious\npianos\nstrife\nkinship\n##aer\nmelee\n##anor\n##iste\n##may\n##oue\ndecidedly\nweep\n##jad\n##missive\n##ppel\n354\npuget\nunease\n##gnant\n1629\nhammering\nkassel\nob\nwessex\n##lga\nbromwich\negan\nparanoia\nutilization\n##atable\n##idad\ncontradictory\nprovoke\n##ols\n##ouring\n##tangled\nknesset\n##very\n##lette\nplumbing\n##sden\n##¹\ngreensboro\noccult\nsniff\n338\nzev\nbeaming\ngamer\nhaggard\nmahal\n##olt\n##pins\nmendes\nutmost\nbriefing\ngunnery\n##gut\n##pher\n##zh\n##rok\n1679\nkhalifa\nsonya\n##boot\nprincipals\nurbana\nwiring\n##liffe\n##minating\n##rrado\ndahl\nnyu\nskepticism\nnp\ntownspeople\nithaca\nlobster\nsomethin\n##fur\n##arina\n##−1\nfreighter\nzimmerman\nbiceps\ncontractual\n##herton\namend\nhurrying\nsubconscious\n##anal\n336\nmeng\nclermont\nspawning\n##eia\n##lub\ndignitaries\nimpetus\nsnacks\nspotting\ntwigs\n##bilis\n##cz\n##ouk\nlibertadores\nnic\nskylar\n##aina\n##firm\ngustave\nasean\n##anum\ndieter\nlegislatures\nflirt\nbromley\ntrolls\numar\n##bbies\n##tyle\nblah\nparc\nbridgeport\ncrank\nnegligence\n##nction\n46th\nconstantin\nmolded\nbandages\nseriousness\n00pm\nsiegel\ncarpets\ncompartments\nupbeat\nstatehood\n##dner\n##edging\nmarko\n730\nplatt\n##hane\npaving\n##iy\n1738\nabbess\nimpatience\nlimousine\nnbl\n##talk\n441\nlucille\nmojo\nnightfall\nrobbers\n##nais\nkarel\nbrisk\ncalves\nreplicate\nascribed\ntelescopes\n##olf\nintimidated\n##reen\nballast\nspecialization\n##sit\naerodynamic\ncaliphate\nrainer\nvisionary\n##arded\nepsilon\n##aday\n##onte\naggregation\nauditory\nboosted\nreunification\nkathmandu\nloco\nrobyn\n402\nacknowledges\nappointing\nhumanoid\nnewell\nredeveloped\nrestraints\n##tained\nbarbarians\nchopper\n1609\nitaliana\n##lez\n##lho\ninvestigates\nwrestlemania\n##anies\n##bib\n690\n##falls\ncreaked\ndragoons\ngravely\nminions\nstupidity\nvolley\n##harat\n##week\nmusik\n##eries\n##uously\nfungal\nmassimo\nsemantics\nmalvern\n##ahl\n##pee\ndiscourage\nembryo\nimperialism\n1910s\nprofoundly\n##ddled\njiangsu\nsparkled\nstat\n##holz\nsweatshirt\ntobin\n##iction\nsneered\n##cheon\n##oit\nbrit\ncausal\nsmyth\n##neuve\ndiffuse\nperrin\nsilvio\n##ipes\n##recht\ndetonated\niqbal\nselma\n##nism\n##zumi\nroasted\n##riders\ntay\n##ados\n##mament\n##mut\n##rud\n840\ncompletes\nnipples\ncfa\nflavour\nhirsch\n##laus\ncalderon\nsneakers\nmoravian\n##ksha\n1622\nrq\n294\n##imeters\nbodo\n##isance\n##pre\n##ronia\nanatomical\nexcerpt\n##lke\ndh\nkunst\n##tablished\n##scoe\nbiomass\npanted\nunharmed\ngael\nhousemates\nmontpellier\n##59\ncoa\nrodents\ntonic\nhickory\nsingleton\n##taro\n451\n1719\naldo\nbreaststroke\ndempsey\noch\nrocco\n##cuit\nmerton\ndissemination\nmidsummer\nserials\n##idi\nhaji\npolynomials\n##rdon\ngs\nenoch\nprematurely\nshutter\ntaunton\n£3\n##grating\n##inates\narchangel\nharassed\n##asco\n326\narchway\ndazzling\n##ecin\n1736\nsumo\nwat\n##kovich\n1086\nhonneur\n##ently\n##nostic\n##ttal\n##i
don\n1605\n403\n1716\nblogger\nrents\n##gnan\nhires\n##ikh\n##dant\nhowie\n##rons\nhandler\nretracted\nshocks\n1632\narun\nduluth\nkepler\ntrumpeter\n##lary\npeeking\nseasoned\ntrooper\n##mara\nlaszlo\n##iciencies\n##rti\nheterosexual\n##inatory\n##ssion\nindira\njogging\n##inga\n##lism\nbeit\ndissatisfaction\nmalice\n##ately\nnedra\npeeling\n##rgeon\n47th\nstadiums\n475\nvertigo\n##ains\niced\nrestroom\n##plify\n##tub\nillustrating\npear\n##chner\n##sibility\ninorganic\nrappers\nreceipts\nwatery\n##kura\nlucinda\n##oulos\nreintroduced\n##8th\n##tched\ngracefully\nsaxons\nnutritional\nwastewater\nrained\nfavourites\nbedrock\nfisted\nhallways\nlikeness\nupscale\n##lateral\n1580\nblinds\nprequel\n##pps\n##tama\ndeter\nhumiliating\nrestraining\ntn\nvents\n1659\nlaundering\nrecess\nrosary\ntractors\ncoulter\nfederer\n##ifiers\n##plin\npersistence\n##quitable\ngeschichte\npendulum\nquakers\n##beam\nbassett\npictorial\nbuffet\nkoln\n##sitor\ndrills\nreciprocal\nshooters\n##57\n##cton\n##tees\nconverge\npip\ndmitri\ndonnelly\nyamamoto\naqua\nazores\ndemographics\nhypnotic\nspitfire\nsuspend\nwryly\nroderick\n##rran\nsebastien\n##asurable\nmavericks\n##fles\n##200\nhimalayan\nprodigy\n##iance\ntransvaal\ndemonstrators\nhandcuffs\ndodged\nmcnamara\nsublime\n1726\ncrazed\n##efined\n##till\nivo\npondered\nreconciled\nshrill\nsava\n##duk\nbal\ncad\nheresy\njaipur\ngoran\n##nished\n341\nlux\nshelly\nwhitehall\n##hre\nisraelis\npeacekeeping\n##wled\n1703\ndemetrius\nousted\n##arians\n##zos\nbeale\nanwar\nbackstroke\nraged\nshrinking\ncremated\n##yck\nbenign\ntowing\nwadi\ndarmstadt\nlandfill\nparana\nsoothe\ncolleen\nsidewalks\nmayfair\ntumble\nhepatitis\nferrer\nsuperstructure\n##gingly\n##urse\n##wee\nanthropological\ntranslators\n##mies\ncloseness\nhooves\n##pw\nmondays\n##roll\n##vita\nlandscaping\n##urized\npurification\nsock\nthorns\nthwarted\njalan\ntiberius\n##taka\nsaline\n##rito\nconfidently\nkhyber\nsculptors\n##ij\nbrahms\nhammersmith\ninspectors\nbattista\nfivb\nfragmentation\nhackney\n##uls\narresting\nexercising\nantoinette\nbedfordshire\n##zily\ndyed\n##hema\n1656\nracetrack\nvariability\n##tique\n1655\naustrians\ndeteriorating\nmadman\ntheorists\naix\nlehman\nweathered\n1731\ndecreed\neruptions\n1729\nflaw\nquinlan\nsorbonne\nflutes\nnunez\n1711\nadored\ndownwards\nfable\nrasped\n1712\nmoritz\nmouthful\nrenegade\nshivers\nstunts\ndysfunction\nrestrain\ntranslit\n327\npancakes\n##avio\n##cision\n##tray\n351\nvial\n##lden\nbain\n##maid\n##oxide\nchihuahua\nmalacca\nvimes\n##rba\n##rnier\n1664\ndonnie\nplaques\n##ually\n337\nbangs\nfloppy\nhuntsville\nloretta\nnikolay\n##otte\neater\nhandgun\nubiquitous\n##hett\neras\nzodiac\n1634\n##omorphic\n1820s\n##zog\ncochran\n##bula\n##lithic\nwarring\n##rada\ndalai\nexcused\nblazers\nmcconnell\nreeling\nbot\neste\n##abi\ngeese\nhoax\ntaxon\n##bla\nguitarists\n##icon\ncondemning\nhunts\ninversion\nmoffat\ntaekwondo\n##lvis\n1624\nstammered\n##rest\n##rzy\nsousa\nfundraiser\nmarylebone\nnavigable\nuptown\ncabbage\ndaniela\nsalman\nshitty\nwhimper\n##kian\n##utive\nprogrammers\nprotections\nrm\n##rmi\n##rued\nforceful\n##enes\nfuss\n##tao\n##wash\nbrat\noppressive\nreykjavik\nspartak\nticking\n##inkles\n##kiewicz\nadolph\nhorst\nmaui\nprotege\nstraighten\ncpc\nlandau\nconcourse\nclements\nresultant\n##ando\nimaginative\njoo\nreactivated\n##rem\n##ffled\n##uising\nconsultative\n##guide\nflop\nkaitlyn\nmergers\nparenting\nsomber\n##vron\nsupervise\nvidhan\n##imum\ncourtship\nexemplified\nharmonies\nmedallist\nrefining\n##rrow\n##ка\namara\n##hum\n780\ngo
alscorer\nsited\novershadowed\nrohan\ndispleasure\nsecretive\nmultiplied\nosman\n##orth\nengravings\npadre\n##kali\n##veda\nminiatures\nmis\n##yala\nclap\npali\nrook\n##cana\n1692\n57th\nantennae\nastro\noskar\n1628\nbulldog\ncrotch\nhackett\nyucatan\n##sure\namplifiers\nbrno\nferrara\nmigrating\n##gree\nthanking\nturing\n##eza\nmccann\nting\nandersson\nonslaught\ngaines\nganga\nincense\nstandardization\n##mation\nsentai\nscuba\nstuffing\nturquoise\nwaivers\nalloys\n##vitt\nregaining\nvaults\n##clops\n##gizing\ndigger\nfurry\nmemorabilia\nprobing\n##iad\npayton\nrec\ndeutschland\nfilippo\nopaque\nseamen\nzenith\nafrikaans\n##filtration\ndisciplined\ninspirational\n##merie\nbanco\nconfuse\ngrafton\ntod\n##dgets\nchampioned\nsimi\nanomaly\nbiplane\n##ceptive\nelectrode\n##para\n1697\ncleavage\ncrossbow\nswirl\ninformant\n##lars\n##osta\nafi\nbonfire\nspec\n##oux\nlakeside\nslump\n##culus\n##lais\n##qvist\n##rrigan\n1016\nfacades\nborg\ninwardly\ncervical\nxl\npointedly\n050\nstabilization\n##odon\nchests\n1699\nhacked\nctv\northogonal\nsuzy\n##lastic\ngaulle\njacobite\nrearview\n##cam\n##erted\nashby\n##drik\n##igate\n##mise\n##zbek\naffectionately\ncanine\ndisperse\nlatham\n##istles\n##ivar\nspielberg\n##orin\n##idium\nezekiel\ncid\n##sg\ndurga\nmiddletown\n##cina\ncustomized\nfrontiers\nharden\n##etano\n##zzy\n1604\nbolsheviks\n##66\ncoloration\nyoko\n##bedo\nbriefs\nslabs\ndebra\nliquidation\nplumage\n##oin\nblossoms\ndementia\nsubsidy\n1611\nproctor\nrelational\njerseys\nparochial\nter\n##ici\nesa\npeshawar\ncavalier\nloren\ncpi\nidiots\nshamrock\n1646\ndutton\nmalabar\nmustache\n##endez\n##ocytes\nreferencing\nterminates\nmarche\nyarmouth\n##sop\nacton\nmated\nseton\nsubtly\nbaptised\nbeige\nextremes\njolted\nkristina\ntelecast\n##actic\nsafeguard\nwaldo\n##baldi\n##bular\nendeavors\nsloppy\nsubterranean\n##ensburg\n##itung\ndelicately\npigment\ntq\n##scu\n1626\n##ound\ncollisions\ncoveted\nherds\n##personal\n##meister\n##nberger\nchopra\n##ricting\nabnormalities\ndefective\ngalician\nlucie\n##dilly\nalligator\nlikened\n##genase\nburundi\nclears\ncomplexion\nderelict\ndeafening\ndiablo\nfingered\nchampaign\ndogg\nenlist\nisotope\nlabeling\nmrna\n##erre\nbrilliance\nmarvelous\n##ayo\n1652\ncrawley\nether\nfooted\ndwellers\ndeserts\nhamish\nrubs\nwarlock\nskimmed\n##lizer\n870\nbuick\nembark\nheraldic\nirregularities\n##ajan\nkiara\n##kulam\n##ieg\nantigen\nkowalski\n##lge\noakley\nvisitation\n##mbit\nvt\n##suit\n1570\nmurderers\n##miento\n##rites\nchimneys\n##sling\ncondemn\ncuster\nexchequer\nhavre\n##ghi\nfluctuations\n##rations\ndfb\nhendricks\nvaccines\n##tarian\nnietzsche\nbiking\njuicy\n##duced\nbrooding\nscrolling\nselangor\n##ragan\n352\nannum\nboomed\nseminole\nsugarcane\n##dna\ndepartmental\ndismissing\ninnsbruck\narteries\nashok\nbatavia\ndaze\nkun\novertook\n##rga\n##tlan\nbeheaded\ngaddafi\nholm\nelectronically\nfaulty\ngalilee\nfractures\nkobayashi\n##lized\ngunmen\nmagma\naramaic\nmala\neastenders\ninference\nmessengers\nbf\n##qu\n407\nbathrooms\n##vere\n1658\nflashbacks\nideally\nmisunderstood\n##jali\n##weather\nmendez\n##grounds\n505\nuncanny\n##iii\n1709\nfriendships\n##nbc\nsacrament\naccommodated\nreiterated\nlogistical\npebbles\nthumped\n##escence\nadministering\ndecrees\ndrafts\n##flight\n##cased\n##tula\nfuturistic\npicket\nintimidation\nwinthrop\n##fahan\ninterfered\n339\nafar\nfrancoise\nmorally\nuta\ncochin\ncroft\ndwarfs\n##bruck\n##dents\n##nami\nbiker\n##hner\n##meral\nnano\n##isen\n##ometric\n##pres\n##ан\nbrightened\nmeek\nparcels\nsecurely\ngunners\n##jhl\
n##zko\nagile\nhysteria\n##lten\n##rcus\nbukit\nchamps\nchevy\ncuckoo\nleith\nsadler\ntheologians\nwelded\n##section\n1663\njj\nplurality\nxander\n##rooms\n##formed\nshredded\ntemps\nintimately\npau\ntormented\n##lok\n##stellar\n1618\ncharred\nems\nessen\n##mmel\nalarms\nspraying\nascot\nblooms\ntwinkle\n##abia\n##apes\ninternment\nobsidian\n##chaft\nsnoop\n##dav\n##ooping\nmalibu\n##tension\nquiver\n##itia\nhays\nmcintosh\ntravers\nwalsall\n##ffie\n1623\nbeverley\nschwarz\nplunging\nstructurally\nm3\nrosenthal\nvikram\n##tsk\n770\nghz\n##onda\n##tiv\nchalmers\ngroningen\npew\nreckon\nunicef\n##rvis\n55th\n##gni\n1651\nsulawesi\navila\ncai\nmetaphysical\nscrewing\nturbulence\n##mberg\naugusto\nsamba\n56th\nbaffled\nmomentary\ntoxin\n##urian\n##wani\naachen\ncondoms\ndali\nsteppe\n##3d\n##app\n##oed\n##year\nadolescence\ndauphin\nelectrically\ninaccessible\nmicroscopy\nnikita\n##ega\natv\n##cel\n##enter\n##oles\n##oteric\n##ы\naccountants\npunishments\nwrongly\nbribes\nadventurous\nclinch\nflinders\nsouthland\n##hem\n##kata\ngough\n##ciency\nlads\nsoared\n##ה\nundergoes\ndeformation\noutlawed\nrubbish\n##arus\n##mussen\n##nidae\n##rzburg\narcs\n##ingdon\n##tituted\n1695\nwheelbase\nwheeling\nbombardier\ncampground\nzebra\n##lices\n##oj\n##bain\nlullaby\n##ecure\ndonetsk\nwylie\ngrenada\n##arding\n##ης\nsquinting\neireann\nopposes\n##andra\nmaximal\nrunes\n##broken\n##cuting\n##iface\n##ror\n##rosis\nadditive\nbritney\nadultery\ntriggering\n##drome\ndetrimental\naarhus\ncontainment\njc\nswapped\nvichy\n##ioms\nmadly\n##oric\n##rag\nbrant\n##ckey\n##trix\n1560\n1612\nbroughton\nrustling\n##stems\n##uder\nasbestos\nmentoring\n##nivorous\nfinley\nleaps\n##isan\napical\npry\nslits\nsubstitutes\n##dict\nintuitive\nfantasia\ninsistent\nunreasonable\n##igen\n##vna\ndomed\nhannover\nmargot\nponder\n##zziness\nimpromptu\njian\nlc\nrampage\nstemming\n##eft\nandrey\ngerais\nwhichever\namnesia\nappropriated\nanzac\nclicks\nmodifying\nultimatum\ncambrian\nmaids\nverve\nyellowstone\n##mbs\nconservatoire\n##scribe\nadherence\ndinners\nspectra\nimperfect\nmysteriously\nsidekick\ntatar\ntuba\n##aks\n##ifolia\ndistrust\n##athan\n##zle\nc2\nronin\nzac\n##pse\ncelaena\ninstrumentalist\nscents\nskopje\n##mbling\ncomical\ncompensated\nvidal\ncondor\nintersect\njingle\nwavelengths\n##urrent\nmcqueen\n##izzly\ncarp\nweasel\n422\nkanye\nmilitias\npostdoctoral\neugen\ngunslinger\n##ɛ\nfaux\nhospice\n##for\nappalled\nderivation\ndwarves\n##elis\ndilapidated\n##folk\nastoria\nphilology\n##lwyn\n##otho\n##saka\ninducing\nphilanthropy\n##bf\n##itative\ngeek\nmarkedly\nsql\n##yce\nbessie\nindices\nrn\n##flict\n495\nfrowns\nresolving\nweightlifting\ntugs\ncleric\ncontentious\n1653\nmania\nrms\n##miya\n##reate\n##ruck\n##tucket\nbien\neels\nmarek\n##ayton\n##cence\ndiscreet\nunofficially\n##ife\nleaks\n##bber\n1705\n332\ndung\ncompressor\nhillsborough\npandit\nshillings\ndistal\n##skin\n381\n##tat\n##you\nnosed\n##nir\nmangrove\nundeveloped\n##idia\ntextures\n##inho\n##500\n##rise\nae\nirritating\nnay\namazingly\nbancroft\napologetic\ncompassionate\nkata\nsymphonies\n##lovic\nairspace\n##lch\n930\ngifford\nprecautions\nfulfillment\nsevilla\nvulgar\nmartinique\n##urities\nlooting\npiccolo\ntidy\n##dermott\nquadrant\narmchair\nincomes\nmathematicians\nstampede\nnilsson\n##inking\n##scan\nfoo\nquarterfinal\n##ostal\nshang\nshouldered\nsquirrels\n##owe\n344\nvinegar\n##bner\n##rchy\n##systems\ndelaying\n##trics\nars\ndwyer\nrhapsody\nsponsoring\n##gration\nbipolar\ncinder\nstarters\n##olio\n##urst\n421\nsignage\n##nty\naground\n
figurative\nmons\nacquaintances\nduets\nerroneously\nsoyuz\nelliptic\nrecreated\n##cultural\n##quette\n##ssed\n##tma\n##zcz\nmoderator\nscares\n##itaire\n##stones\n##udence\njuniper\nsighting\n##just\n##nsen\nbritten\ncalabria\nry\nbop\ncramer\nforsyth\nstillness\n##л\nairmen\ngathers\nunfit\n##umber\n##upt\ntaunting\n##rip\nseeker\nstreamlined\n##bution\nholster\nschumann\ntread\nvox\n##gano\n##onzo\nstrive\ndil\nreforming\ncovent\nnewbury\npredicting\n##orro\ndecorate\ntre\n##puted\nandover\nie\nasahi\ndept\ndunkirk\ngills\n##tori\nburen\nhuskies\n##stis\n##stov\nabstracts\nbets\nloosen\n##opa\n1682\nyearning\n##glio\n##sir\nberman\neffortlessly\nenamel\nnapoli\npersist\n##peration\n##uez\nattache\nelisa\nb1\ninvitations\n##kic\naccelerating\nreindeer\nboardwalk\nclutches\nnelly\npolka\nstarbucks\n##kei\nadamant\nhuey\nlough\nunbroken\nadventurer\nembroidery\ninspecting\nstanza\n##ducted\nnaia\ntaluka\n##pone\n##roids\nchases\ndeprivation\nflorian\n##jing\n##ppet\nearthly\n##lib\n##ssee\ncolossal\nforeigner\nvet\nfreaks\npatrice\nrosewood\ntriassic\nupstate\n##pkins\ndominates\nata\nchants\nks\nvo\n##400\n##bley\n##raya\n##rmed\n555\nagra\ninfiltrate\n##ailing\n##ilation\n##tzer\n##uppe\n##werk\nbinoculars\nenthusiast\nfujian\nsqueak\n##avs\nabolitionist\nalmeida\nboredom\nhampstead\nmarsden\nrations\n##ands\ninflated\n334\nbonuses\nrosalie\npatna\n##rco\n329\ndetachments\npenitentiary\n54th\nflourishing\nwoolf\n##dion\n##etched\npapyrus\n##lster\n##nsor\n##toy\nbobbed\ndismounted\nendelle\ninhuman\nmotorola\ntbs\nwince\nwreath\n##ticus\nhideout\ninspections\nsanjay\ndisgrace\ninfused\npudding\nstalks\n##urbed\narsenic\nleases\n##hyl\n##rrard\ncollarbone\n##waite\n##wil\ndowry\n##bant\n##edance\ngenealogical\nnitrate\nsalamanca\nscandals\nthyroid\nnecessitated\n##!\n##\"\n###\n##$\n##%\n##&\n##'\n##(\n##)\n##*\n##+\n##,\n##-\n##.\n##/\n##:\n##;\n##<\n##=\n##>\n##?\n##@\n##[\n##\\\n##]\n##^\n##_\n##`\n##{\n##|\n##}\n##~\n##¡\n##¢\n##£\n##¤\n##¥\n##¦\n##§\n##¨\n##©\n##ª\n##«\n##¬\n##®\n##±\n##´\n##µ\n##¶\n##·\n##º\n##»\n##¼\n##¾\n##¿\n##æ\n##ð\n##÷\n##þ\n##đ\n##ħ\n##ŋ\n##œ\n##ƒ\n##ɐ\n##ɑ\n##ɒ\n##ɔ\n##ɕ\n##ə\n##ɡ\n##ɣ\n##ɨ\n##ɪ\n##ɫ\n##ɬ\n##ɯ\n##ɲ\n##ɴ\n##ɹ\n##ɾ\n##ʀ\n##ʁ\n##ʂ\n##ʃ\n##ʉ\n##ʊ\n##ʋ\n##ʌ\n##ʎ\n##ʐ\n##ʑ\n##ʒ\n##ʔ\n##ʰ\n##ʲ\n##ʳ\n##ʷ\n##ʸ\n##ʻ\n##ʼ\n##ʾ\n##ʿ\n##ˈ\n##ˡ\n##ˢ\n##ˣ\n##ˤ\n##β\n##γ\n##δ\n##ε\n##ζ\n##θ\n##κ\n##λ\n##μ\n##ξ\n##ο\n##π\n##ρ\n##σ\n##τ\n##υ\n##φ\n##χ\n##ψ\n##ω\n##б\n##г\n##д\n##ж\n##з\n##м\n##п\n##с\n##у\n##ф\n##х\n##ц\n##ч\n##ш\n##щ\n##ъ\n##э\n##ю\n##ђ\n##є\n##і\n##ј\n##љ\n##њ\n##ћ\n##ӏ\n##ա\n##բ\n##գ\n##դ\n##ե\n##թ\n##ի\n##լ\n##կ\n##հ\n##մ\n##յ\n##ն\n##ո\n##պ\n##ս\n##վ\n##տ\n##ր\n##ւ\n##ք\n##־\n##א\n##ב\n##ג\n##ד\n##ו\n##ז\n##ח\n##ט\n##י\n##ך\n##כ\n##ל\n##ם\n##מ\n##ן\n##נ\n##ס\n##ע\n##ף\n##פ\n##ץ\n##צ\n##ק\n##ר\n##ש\n##ת\n##،\n##ء\n##ب\n##ت\n##ث\n##ج\n##ح\n##خ\n##ذ\n##ز\n##س\n##ش\n##ص\n##ض\n##ط\n##ظ\n##ع\n##غ\n##ـ\n##ف\n##ق\n##ك\n##و\n##ى\n##ٹ\n##پ\n##چ\n##ک\n##گ\n##ں\n##ھ\n##ہ\n##ے\n##अ\n##आ\n##उ\n##ए\n##क\n##ख\n##ग\n##च\n##ज\n##ट\n##ड\n##ण\n##त\n##थ\n##द\n##ध\n##न\n##प\n##ब\n##भ\n##म\n##य\n##र\n##ल\n##व\n##श\n##ष\n##स\n##ह\n##ा\n##ि\n##ी\n##ो\n##।\n##॥\n##ং\n##অ\n##আ\n##ই\n##উ\n##এ\n##ও\n##ক\n##খ\n##গ\n##চ\n##ছ\n##জ\n##ট\n##ড\n##ণ\n##ত\n##থ\n##দ\n##ধ\n##ন\n##প\n##ব\n##ভ\n##ম\n##য\n##র\n##ল\n##শ\n##ষ\n##স\n##হ\n##া\n##ি\n##ী\n##ে\n##க\n##ச\n##ட\n##த\n##ந\n##ன\n##ப\n##ம\n##ய\n##ர\n##ல\n##ள\n##வ\n##ா\n##ி\n##ு\n##ே\n##ை\n##ನ\n##ರ\n##ಾ\n##ක\n##ය\n##ර\n##ල\n##ව\n##ා\n##ก\n##ง\n##ต\n##ท\n##น\n##พ\n##ม\n##ย\n##ร\n##ล\n##ว\n##ส\n##อ\n##า\n##เ\n##་\n##།\n##ག\n##ང\
n##ད\n##ན\n##པ\n##བ\n##མ\n##འ\n##ར\n##ལ\n##ས\n##မ\n##ა\n##ბ\n##გ\n##დ\n##ე\n##ვ\n##თ\n##ი\n##კ\n##ლ\n##მ\n##ნ\n##ო\n##რ\n##ს\n##ტ\n##უ\n##ᄀ\n##ᄂ\n##ᄃ\n##ᄅ\n##ᄆ\n##ᄇ\n##ᄉ\n##ᄊ\n##ᄋ\n##ᄌ\n##ᄎ\n##ᄏ\n##ᄐ\n##ᄑ\n##ᄒ\n##ᅡ\n##ᅢ\n##ᅥ\n##ᅦ\n##ᅧ\n##ᅩ\n##ᅪ\n##ᅭ\n##ᅮ\n##ᅯ\n##ᅲ\n##ᅳ\n##ᅴ\n##ᅵ\n##ᆨ\n##ᆫ\n##ᆯ\n##ᆷ\n##ᆸ\n##ᆼ\n##ᴬ\n##ᴮ\n##ᴰ\n##ᴵ\n##ᴺ\n##ᵀ\n##ᵃ\n##ᵇ\n##ᵈ\n##ᵉ\n##ᵍ\n##ᵏ\n##ᵐ\n##ᵒ\n##ᵖ\n##ᵗ\n##ᵘ\n##ᵣ\n##ᵤ\n##ᵥ\n##ᶜ\n##ᶠ\n##‐\n##‑\n##‒\n##–\n##—\n##―\n##‖\n##‘\n##’\n##‚\n##“\n##”\n##„\n##†\n##‡\n##•\n##…\n##‰\n##′\n##″\n##›\n##‿\n##⁄\n##⁰\n##ⁱ\n##⁴\n##⁵\n##⁶\n##⁷\n##⁸\n##⁹\n##⁻\n##ⁿ\n##₅\n##₆\n##₇\n##₈\n##₉\n##₊\n##₍\n##₎\n##ₐ\n##ₑ\n##ₒ\n##ₓ\n##ₕ\n##ₖ\n##ₗ\n##ₘ\n##ₚ\n##ₛ\n##ₜ\n##₤\n##₩\n##€\n##₱\n##₹\n##ℓ\n##№\n##ℝ\n##™\n##⅓\n##⅔\n##←\n##↑\n##→\n##↓\n##↔\n##↦\n##⇄\n##⇌\n##⇒\n##∂\n##∅\n##∆\n##∇\n##∈\n##∗\n##∘\n##√\n##∞\n##∧\n##∨\n##∩\n##∪\n##≈\n##≡\n##≤\n##≥\n##⊂\n##⊆\n##⊕\n##⊗\n##⋅\n##─\n##│\n##■\n##▪\n##●\n##★\n##☆\n##☉\n##♠\n##♣\n##♥\n##♦\n##♯\n##⟨\n##⟩\n##ⱼ\n##⺩\n##⺼\n##⽥\n##、\n##。\n##〈\n##〉\n##《\n##》\n##「\n##」\n##『\n##』\n##〜\n##あ\n##い\n##う\n##え\n##お\n##か\n##き\n##く\n##け\n##こ\n##さ\n##し\n##す\n##せ\n##そ\n##た\n##ち\n##っ\n##つ\n##て\n##と\n##な\n##に\n##ぬ\n##ね\n##の\n##は\n##ひ\n##ふ\n##へ\n##ほ\n##ま\n##み\n##む\n##め\n##も\n##や\n##ゆ\n##よ\n##ら\n##り\n##る\n##れ\n##ろ\n##を\n##ん\n##ァ\n##ア\n##ィ\n##イ\n##ウ\n##ェ\n##エ\n##オ\n##カ\n##キ\n##ク\n##ケ\n##コ\n##サ\n##シ\n##ス\n##セ\n##タ\n##チ\n##ッ\n##ツ\n##テ\n##ト\n##ナ\n##ニ\n##ノ\n##ハ\n##ヒ\n##フ\n##ヘ\n##ホ\n##マ\n##ミ\n##ム\n##メ\n##モ\n##ャ\n##ュ\n##ョ\n##ラ\n##リ\n##ル\n##レ\n##ロ\n##ワ\n##ン\n##・\n##ー\n##一\n##三\n##上\n##下\n##不\n##世\n##中\n##主\n##久\n##之\n##也\n##事\n##二\n##五\n##井\n##京\n##人\n##亻\n##仁\n##介\n##代\n##仮\n##伊\n##会\n##佐\n##侍\n##保\n##信\n##健\n##元\n##光\n##八\n##公\n##内\n##出\n##分\n##前\n##劉\n##力\n##加\n##勝\n##北\n##区\n##十\n##千\n##南\n##博\n##原\n##口\n##古\n##史\n##司\n##合\n##吉\n##同\n##名\n##和\n##囗\n##四\n##国\n##國\n##土\n##地\n##坂\n##城\n##堂\n##場\n##士\n##夏\n##外\n##大\n##天\n##太\n##夫\n##奈\n##女\n##子\n##学\n##宀\n##宇\n##安\n##宗\n##定\n##宣\n##宮\n##家\n##宿\n##寺\n##將\n##小\n##尚\n##山\n##岡\n##島\n##崎\n##川\n##州\n##巿\n##帝\n##平\n##年\n##幸\n##广\n##弘\n##張\n##彳\n##後\n##御\n##德\n##心\n##忄\n##志\n##忠\n##愛\n##成\n##我\n##戦\n##戸\n##手\n##扌\n##政\n##文\n##新\n##方\n##日\n##明\n##星\n##春\n##昭\n##智\n##曲\n##書\n##月\n##有\n##朝\n##木\n##本\n##李\n##村\n##東\n##松\n##林\n##森\n##楊\n##樹\n##橋\n##歌\n##止\n##正\n##武\n##比\n##氏\n##民\n##水\n##氵\n##氷\n##永\n##江\n##沢\n##河\n##治\n##法\n##海\n##清\n##漢\n##瀬\n##火\n##版\n##犬\n##王\n##生\n##田\n##男\n##疒\n##発\n##白\n##的\n##皇\n##目\n##相\n##省\n##真\n##石\n##示\n##社\n##神\n##福\n##禾\n##秀\n##秋\n##空\n##立\n##章\n##竹\n##糹\n##美\n##義\n##耳\n##良\n##艹\n##花\n##英\n##華\n##葉\n##藤\n##行\n##街\n##西\n##見\n##訁\n##語\n##谷\n##貝\n##貴\n##車\n##軍\n##辶\n##道\n##郎\n##郡\n##部\n##都\n##里\n##野\n##金\n##鈴\n##镇\n##長\n##門\n##間\n##阝\n##阿\n##陳\n##陽\n##雄\n##青\n##面\n##風\n##食\n##香\n##馬\n##高\n##龍\n##龸\n##ﬁ\n##ﬂ\n##！\n##（\n##）\n##，\n##－\n##．\n##／\n##：\n##？\n##～\n"
  },
  {
    "path": "transtab/trainer.py",
    "content": "import os\nimport pdb\nimport math\nimport time\nimport json\n\nimport torch\nfrom torch import nn\nfrom torch.utils.data import Dataset, DataLoader\nimport numpy as np\nimport pandas as pd\nfrom tqdm.autonotebook import trange\n\nfrom transtab import constants\nfrom transtab.evaluator import predict, get_eval_metric_fn, EarlyStopping\nfrom transtab.modeling_transtab import TransTabFeatureExtractor\nfrom transtab.trainer_utils import SupervisedTrainCollator, TrainDataset\nfrom transtab.trainer_utils import get_parameter_names\nfrom transtab.trainer_utils import get_scheduler\n\nimport logging\nlogger = logging.getLogger(__name__)\n\nclass Trainer:\n    def __init__(self,\n        model,\n        train_set_list,\n        test_set_list=None,\n        collate_fn=None,\n        output_dir='./ckpt',\n        num_epoch=10,\n        batch_size=64,\n        lr=1e-4,\n        weight_decay=0,\n        patience=5,\n        eval_batch_size=256,\n        warmup_ratio=None,\n        warmup_steps=None,\n        balance_sample=False,\n        load_best_at_last=True,\n        ignore_duplicate_cols=False,\n        eval_metric='auc',\n        eval_less_is_better=False,\n        num_workers=0,\n        **kwargs,\n        ):\n        '''args:\n        train_set_list: a list of training sets [(x_1,y_1),(x_2,y_2),...]\n        test_set_list: a list of tuples of test set (x, y), same as train_set_list. if set None, do not do evaluation and early stopping\n        patience: the max number of early stop patience\n        num_workers: how many workers used to process dataloader. recommend to be 0 if training data smaller than 10000.\n        eval_less_is_better: if the set eval_metric is the less the better. For val_loss, it should be set True.\n        '''\n        self.model = model\n        if isinstance(train_set_list, tuple): train_set_list = [train_set_list]\n        if isinstance(test_set_list, tuple): test_set_list = [test_set_list]\n\n        self.train_set_list = train_set_list\n        self.test_set_list = test_set_list\n        self.collate_fn = collate_fn\n        if collate_fn is None:\n            self.collate_fn = SupervisedTrainCollator(\n                categorical_columns=model.categorical_columns,\n                numerical_columns=model.numerical_columns,\n                binary_columns=model.binary_columns,\n                ignore_duplicate_cols=ignore_duplicate_cols,\n            )\n        self.trainloader_list = [\n            self._build_dataloader(trainset, batch_size, collator=self.collate_fn, num_workers=num_workers) for trainset in train_set_list\n        ]\n        if test_set_list is not None:\n            self.testloader_list = [\n                self._build_dataloader(testset, eval_batch_size, collator=self.collate_fn, num_workers=num_workers, shuffle=False) for testset in test_set_list\n            ]\n        else:\n            self.testloader_list = None\n            \n        self.test_set_list = test_set_list\n        self.output_dir = output_dir\n        self.early_stopping = EarlyStopping(output_dir=output_dir, patience=patience, verbose=False, less_is_better=eval_less_is_better)\n        self.args = {\n            'lr':lr,\n            'weight_decay':weight_decay,\n            'batch_size':batch_size,\n            'num_epoch':num_epoch,\n            'eval_batch_size':eval_batch_size,\n            'warmup_ratio': warmup_ratio,\n            'warmup_steps': warmup_steps,\n            'num_training_steps': self.get_num_train_steps(train_set_list, num_epoch, 
        '''\n        self.model = model\n        if isinstance(train_set_list, tuple): train_set_list = [train_set_list]\n        if isinstance(test_set_list, tuple): test_set_list = [test_set_list]\n\n        self.train_set_list = train_set_list\n        self.test_set_list = test_set_list\n        self.collate_fn = collate_fn\n        if collate_fn is None:\n            self.collate_fn = SupervisedTrainCollator(\n                categorical_columns=model.categorical_columns,\n                numerical_columns=model.numerical_columns,\n                binary_columns=model.binary_columns,\n                ignore_duplicate_cols=ignore_duplicate_cols,\n            )\n        self.trainloader_list = [\n            self._build_dataloader(trainset, batch_size, collator=self.collate_fn, num_workers=num_workers) for trainset in train_set_list\n        ]\n        if test_set_list is not None:\n            self.testloader_list = [\n                self._build_dataloader(testset, eval_batch_size, collator=self.collate_fn, num_workers=num_workers, shuffle=False) for testset in test_set_list\n            ]\n        else:\n            self.testloader_list = None\n\n        self.output_dir = output_dir\n        self.early_stopping = EarlyStopping(output_dir=output_dir, patience=patience, verbose=False, less_is_better=eval_less_is_better)\n        self.args = {\n            'lr':lr,\n            'weight_decay':weight_decay,\n            'batch_size':batch_size,\n            'num_epoch':num_epoch,\n            'eval_batch_size':eval_batch_size,\n            'warmup_ratio': warmup_ratio,\n            'warmup_steps': warmup_steps,\n            'num_training_steps': self.get_num_train_steps(train_set_list, num_epoch, batch_size),\n            'eval_metric': get_eval_metric_fn(eval_metric),\n            'eval_metric_name': eval_metric,\n            }\n        self.args['steps_per_epoch'] = int(self.args['num_training_steps'] / (num_epoch*len(self.train_set_list)))\n        if not os.path.exists(output_dir):\n            os.makedirs(output_dir)\n        self.optimizer = None\n        self.lr_scheduler = None\n        self.balance_sample = balance_sample\n        self.load_best_at_last = load_best_at_last\n\n    def train(self):\n        args = self.args\n        self.create_optimizer()\n        if args['warmup_ratio'] is not None or args['warmup_steps'] is not None:\n            num_train_steps = args['num_training_steps']\n            logger.info(f'warmup for {self.get_warmup_steps(num_train_steps)} of {num_train_steps} training steps')\n            self.create_scheduler(num_train_steps, self.optimizer)\n\n        start_time = time.time()\n        for epoch in trange(args['num_epoch'], desc='Epoch'):\n            ite = 0\n            train_loss_all = 0\n            for dataindex in range(len(self.trainloader_list)):\n                for data in self.trainloader_list[dataindex]:\n                    self.model.train() # restore train mode, evaluate() switches the model to eval mode\n                    self.optimizer.zero_grad()\n                    logits, loss = self.model(data[0], data[1])\n                    loss.backward()\n                    self.optimizer.step()\n                    train_loss_all += loss.item()\n                    ite += 1\n                    if self.lr_scheduler is not None:\n                        self.lr_scheduler.step()\n\n            if self.test_set_list is not None:\n                eval_res_list = self.evaluate()\n                eval_res = np.mean(eval_res_list)\n                print('epoch: {}, test {}: {:.6f}'.format(epoch, self.args['eval_metric_name'], eval_res))\n                self.early_stopping(-eval_res, self.model)\n                if self.early_stopping.early_stop:\n                    print('early stopped')\n                    break\n            print('epoch: {}, train loss: {:.4f}, lr: {:.6f}, spent: {:.1f} secs'.format(epoch, train_loss_all, self.optimizer.param_groups[0]['lr'], time.time()-start_time))\n\n        if os.path.exists(self.output_dir):\n            if self.test_set_list is not None and self.load_best_at_last:\n                # load the best checkpoint saved by early stopping\n                logger.info(f'load best at last from {self.output_dir}')\n                state_dict = torch.load(os.path.join(self.output_dir, constants.WEIGHTS_NAME), map_location='cpu')\n                self.model.load_state_dict(state_dict)\n            self.save_model(self.output_dir)\n\n        logger.info('training complete, cost {:.1f} secs.'.format(time.time()-start_time))\n\n    def evaluate(self):\n        # evaluate on each test set after every epoch\n        self.model.eval()\n        eval_res_list = []\n        for dataindex in range(len(self.testloader_list)):\n            y_test, pred_list, loss_list = [], [], []\n            for data in self.testloader_list[dataindex]:\n                if data[1] is not None:\n                    label = data[1]\n                    if isinstance(label, pd.Series):\n                        label = label.values\n                    y_test.append(label)\n                with torch.no_grad():\n                    logits, loss = self.model(data[0], data[1])\n                if loss is not None:\n                    loss_list.append(loss.item())\n                if logits is not None:\n                    if logits.shape[-1] == 1: # binary classification\n
                        pred_list.append(logits.sigmoid().detach().cpu().numpy())\n                    else: # multi-class classification\n                        pred_list.append(torch.softmax(logits, -1).detach().cpu().numpy())\n\n            if len(pred_list) > 0:\n                pred_all = np.concatenate(pred_list, 0)\n                if pred_all.shape[-1] == 1: # binary: flatten the (N, 1) score array to (N,)\n                    pred_all = pred_all.flatten()\n\n            if self.args['eval_metric_name'] == 'val_loss':\n                eval_res = np.mean(loss_list)\n            else:\n                y_test = np.concatenate(y_test, 0)\n                eval_res = self.args['eval_metric'](y_test, pred_all)\n\n            eval_res_list.append(eval_res)\n\n        return eval_res_list\n\n    def train_no_dataloader(self,\n        resume_from_checkpoint=None,\n        ):\n        args = self.args\n        self.create_optimizer()\n        if args['warmup_ratio'] is not None or args['warmup_steps'] is not None:\n            print('set warmup training.')\n            self.create_scheduler(args['num_training_steps'], self.optimizer)\n\n        for epoch in range(args['num_epoch']):\n            ite = 0\n            # go through all train sets\n            for train_set in self.train_set_list:\n                x_train, y_train = train_set\n                train_loss_all = 0\n                for i in range(0, len(x_train), args['batch_size']):\n                    self.model.train()\n                    if self.balance_sample:\n                        bs_x_train_pos = x_train.loc[y_train==1].sample(int(args['batch_size']/2))\n                        bs_y_train_pos = y_train.loc[bs_x_train_pos.index]\n                        bs_x_train_neg = x_train.loc[y_train==0].sample(int(args['batch_size']/2))\n                        bs_y_train_neg = y_train.loc[bs_x_train_neg.index]\n                        bs_x_train = pd.concat([bs_x_train_pos, bs_x_train_neg], axis=0)\n                        bs_y_train = pd.concat([bs_y_train_pos, bs_y_train_neg], axis=0)\n                    else:\n                        bs_x_train = x_train.iloc[i:i+args['batch_size']]\n                        bs_y_train = y_train.loc[bs_x_train.index]\n\n                    self.optimizer.zero_grad()\n                    logits, loss = self.model(bs_x_train, bs_y_train)\n                    loss.backward()\n\n                    self.optimizer.step()\n                    train_loss_all += loss.item()\n                    ite += 1\n                    if self.lr_scheduler is not None:\n                        self.lr_scheduler.step()\n\n            if self.test_set_list is not None:\n                # evaluate after each epoch; this legacy path only uses the first test set\n                self.model.eval()\n                x_test, y_test = self.test_set_list[0]\n                pred_all = predict(self.model, x_test, self.args['eval_batch_size'])\n                eval_res = self.args['eval_metric'](y_test, pred_all)\n                print('epoch: {}, test {}: {}'.format(epoch, self.args['eval_metric_name'], eval_res))\n                self.early_stopping(-eval_res, self.model)\n                if self.early_stopping.early_stop:\n                    print('early stopped')\n                    break\n\n            print('epoch: {}, train loss: {}, lr: {:.6f}'.format(epoch, train_loss_all, self.optimizer.param_groups[0]['lr']))\n\n        if os.path.exists(self.output_dir):\n            if self.test_set_list is not None:\n                # load the best checkpoint saved by early stopping\n
                print('load best at last from', self.output_dir)\n                state_dict = torch.load(os.path.join(self.output_dir, constants.WEIGHTS_NAME), map_location='cpu')\n                self.model.load_state_dict(state_dict)\n            self.save_model(self.output_dir)\n\n    def save_model(self, output_dir=None):\n        if output_dir is None:\n            output_dir = self.output_dir\n            print('no path assigned for saving the model, default to {}'.format(output_dir))\n\n        if not os.path.exists(output_dir): os.makedirs(output_dir, exist_ok=True)\n        logger.info(f'saving model checkpoint to {output_dir}')\n        self.model.save(output_dir)\n        self.collate_fn.save(output_dir)\n\n        if self.optimizer is not None:\n            torch.save(self.optimizer.state_dict(), os.path.join(output_dir, constants.OPTIMIZER_NAME))\n        if self.lr_scheduler is not None:\n            torch.save(self.lr_scheduler.state_dict(), os.path.join(output_dir, constants.SCHEDULER_NAME))\n        if self.args is not None:\n            train_args = {}\n            for k, v in self.args.items():\n                if isinstance(v, (int, str, float)):\n                    train_args[k] = v\n            with open(os.path.join(output_dir, constants.TRAINING_ARGS_NAME), 'w', encoding='utf-8') as f:\n                f.write(json.dumps(train_args))\n\n    def create_optimizer(self):\n        if self.optimizer is None:\n            decay_parameters = get_parameter_names(self.model, [nn.LayerNorm])\n            decay_parameters = [name for name in decay_parameters if \"bias\" not in name]\n            optimizer_grouped_parameters = [\n                {\n                    \"params\": [p for n, p in self.model.named_parameters() if n in decay_parameters],\n                    \"weight_decay\": self.args['weight_decay'],\n                },\n                {\n                    \"params\": [p for n, p in self.model.named_parameters() if n not in decay_parameters],\n                    \"weight_decay\": 0.0,\n                },\n            ]\n            self.optimizer = torch.optim.Adam(optimizer_grouped_parameters, lr=self.args['lr'])\n\n    def create_scheduler(self, num_training_steps, optimizer):\n        self.lr_scheduler = get_scheduler(\n            'cosine',\n            optimizer=optimizer,\n            num_warmup_steps=self.get_warmup_steps(num_training_steps),\n            num_training_steps=num_training_steps,\n        )\n        return self.lr_scheduler\n\n    def get_num_train_steps(self, train_set_list, num_epoch, batch_size):\n        total_step = 0\n        for trainset in train_set_list:\n            x_train, _ = trainset\n            total_step += np.ceil(len(x_train) / batch_size)\n        total_step *= num_epoch\n        return int(total_step)\n\n    def get_warmup_steps(self, num_training_steps):\n        \"\"\"\n        Get number of steps used for a linear warmup.\n        \"\"\"\n        warmup_steps = (\n            self.args['warmup_steps'] if self.args['warmup_steps'] is not None else math.ceil(num_training_steps * self.args['warmup_ratio'])\n        )\n        return warmup_steps\n\n    def _build_dataloader(self, trainset, batch_size, collator, num_workers=8, shuffle=True):\n        trainloader = DataLoader(\n            TrainDataset(trainset),\n            collate_fn=collator,\n            batch_size=batch_size,\n            shuffle=shuffle,\n            num_workers=num_workers,\n            pin_memory=True,\n            drop_last=False,\n
        )\n        return trainloader\n"
  },
  {
    "path": "transtab/trainer_utils.py",
    "content": "import pdb\nimport os\nimport random\nimport math\n\nimport numpy as np\nimport pandas as pd\nimport torch\nfrom torch.utils.data import Dataset, DataLoader\nfrom transformers.optimization import (\n    get_linear_schedule_with_warmup,\n    get_cosine_schedule_with_warmup,\n    get_cosine_with_hard_restarts_schedule_with_warmup,\n    get_polynomial_decay_schedule_with_warmup,\n    get_constant_schedule,\n    get_constant_schedule_with_warmup\n)\n\nfrom transtab.modeling_transtab import TransTabFeatureExtractor\n\nTYPE_TO_SCHEDULER_FUNCTION = {\n    'linear': get_linear_schedule_with_warmup,\n    'cosine': get_cosine_schedule_with_warmup,\n    'cosine_with_restarts': get_cosine_with_hard_restarts_schedule_with_warmup,\n    'polynomial': get_polynomial_decay_schedule_with_warmup,\n    'constant': get_constant_schedule,\n    'constant_with_warmup': get_constant_schedule_with_warmup,\n}\n\nclass TrainDataset(Dataset):\n    def __init__(self, trainset):\n        self.x, self.y = trainset\n\n    def __len__(self):\n        return len(self.x)\n    \n    def __getitem__(self, index):\n        x = self.x.iloc[index-1:index]\n        if self.y is not None:\n            y = self.y.iloc[index-1:index]\n        else:\n            y = None\n        return x, y\n\nclass TrainCollator:\n    '''A base class for all collate function used for TransTab training.\n    '''\n    def __init__(self,\n        categorical_columns=None,\n        numerical_columns=None,\n        binary_columns=None,\n        ignore_duplicate_cols=False,\n        **kwargs,\n        ):\n        self.feature_extractor=TransTabFeatureExtractor(\n            categorical_columns=categorical_columns,\n            numerical_columns=numerical_columns,\n            binary_columns=binary_columns,\n            disable_tokenizer_parallel=True,\n            ignore_duplicate_cols=ignore_duplicate_cols,\n        )\n    \n    def save(self, path):\n        self.feature_extractor.save(path)\n    \n    def __call__(self, data):\n        raise NotImplementedError\n\nclass SupervisedTrainCollator(TrainCollator):\n    def __init__(self,\n        categorical_columns=None,\n        numerical_columns=None,\n        binary_columns=None,\n        ignore_duplicate_cols=False,\n        **kwargs,\n        ):\n        super().__init__(\n        categorical_columns=categorical_columns,\n        numerical_columns=numerical_columns,\n        binary_columns=binary_columns,\n        ignore_duplicate_cols=ignore_duplicate_cols,\n        )\n    \n    def __call__(self, data):\n        x = pd.concat([row[0] for row in data])\n        y = pd.concat([row[1] for row in data])\n        inputs = self.feature_extractor(x)\n        return inputs, y\n\nclass TransTabCollatorForCL(TrainCollator):\n    '''support positive pair sampling for contrastive learning of transtab model.\n    '''\n    def __init__(self, \n        categorical_columns=None,\n        numerical_columns=None,\n        binary_columns=None,\n        overlap_ratio=0.5, \n        num_partition=3,\n        ignore_duplicate_cols=False,\n        **kwargs) -> None:\n        super().__init__(\n            categorical_columns=categorical_columns,\n            numerical_columns=numerical_columns,\n            binary_columns=binary_columns,\n            ignore_duplicate_cols=ignore_duplicate_cols,\n        )\n        assert num_partition > 0, f'number of contrastive subsets must be greater than 0, got {num_partition}'\n        assert isinstance(num_partition,int), f'number of constrative subsets must be int, 
    '''\n    def __init__(self,\n        categorical_columns=None,\n        numerical_columns=None,\n        binary_columns=None,\n        overlap_ratio=0.5,\n        num_partition=3,\n        ignore_duplicate_cols=False,\n        **kwargs) -> None:\n        super().__init__(\n            categorical_columns=categorical_columns,\n            numerical_columns=numerical_columns,\n            binary_columns=binary_columns,\n            ignore_duplicate_cols=ignore_duplicate_cols,\n        )\n        assert num_partition > 0, f'number of contrastive subsets must be greater than 0, got {num_partition}'\n        assert isinstance(num_partition, int), f'number of contrastive subsets must be int, got {type(num_partition)}'\n        assert overlap_ratio >= 0 and overlap_ratio < 1, f'overlap_ratio must be in [0, 1), got {overlap_ratio}'\n        self.overlap_ratio = overlap_ratio\n        self.num_partition = num_partition\n\n    def __call__(self, data):\n        '''\n        Build a list of column subsets (views) of the original table and encode each view.\n        '''\n        # 1. build positive pairs\n        # 2. encode each pair using the feature extractor\n        df_x = pd.concat([row[0] for row in data])\n        df_y = pd.concat([row[1] for row in data])\n        if self.num_partition > 1:\n            sub_x_list = self._build_positive_pairs(df_x, self.num_partition)\n        else:\n            sub_x_list = self._build_positive_pairs_single_view(df_x)\n        input_x_list = []\n        for sub_x in sub_x_list:\n            inputs = self.feature_extractor(sub_x)\n            input_x_list.append(inputs)\n        res = {'input_sub_x': input_x_list}\n        return res, df_y\n\n    def _build_positive_pairs(self, x, n):\n        '''build multiple views of each sample by splitting its columns into n partitions\n        '''\n        x_cols = x.columns.tolist()\n        sub_col_list = np.array_split(np.array(x_cols), n)\n        len_cols = len(sub_col_list[0])\n        overlap = int(math.ceil(len_cols * (self.overlap_ratio)))\n        sub_x_list = []\n        for i, sub_col in enumerate(sub_col_list):\n            # extend each partition with a slice of its neighbor so adjacent views share `overlap` columns\n            if overlap > 0 and i < n-1:\n                sub_col = np.concatenate([sub_col, sub_col_list[i+1][:overlap]])\n            elif overlap > 0 and i == n-1:\n                sub_col = np.concatenate([sub_col, sub_col_list[i-1][-overlap:]])\n            # np.random.shuffle(sub_col)\n            sub_x = x.copy()[sub_col]\n            sub_x_list.append(sub_x)\n        return sub_x_list\n\n    def _build_positive_pairs_single_view(self, x):\n        x_cols = x.columns.tolist()\n        sub_x_list = [x]\n        # corrupt half of the columns by shuffling their values across rows\n        n_corrupt = int(len(x_cols)*0.5)\n        corrupt_cols = x_cols[:n_corrupt]\n        x_corrupt = x.copy()[corrupt_cols]\n        np.random.shuffle(x_corrupt.values)\n        sub_x_list.append(pd.concat([x.copy().drop(corrupt_cols, axis=1), x_corrupt], axis=1))\n        return sub_x_list\n\ndef get_parameter_names(model, forbidden_layer_types):\n    \"\"\"\n    Returns the names of the model parameters that are not inside a forbidden layer.\n    \"\"\"\n    result = []\n    for name, child in model.named_children():\n        result += [\n            f\"{name}.{n}\"\n            for n in get_parameter_names(child, forbidden_layer_types)\n            if not isinstance(child, tuple(forbidden_layer_types))\n        ]\n    # Add model specific parameters (defined with nn.Parameter) since they are not in any child.\n    result += list(model._parameters.keys())\n    return result\n\ndef random_seed(seed):\n    os.environ['PYTHONHASHSEED'] = str(seed)\n    random.seed(seed)\n    np.random.seed(seed)\n    torch.manual_seed(seed)\n    torch.cuda.manual_seed_all(seed) # also seed all CUDA devices for reproducibility\n\ndef get_scheduler(\n    name,\n    optimizer,\n    num_warmup_steps=None,\n    num_training_steps=None,\n    ):\n    '''\n    Unified API to get any scheduler from its name.\n\n    Parameters\n    ----------\n    name: str\n        The name of the scheduler to use.\n\n    optimizer: torch.optim.Optimizer\n        The optimizer that will be used during training.\n\n    num_warmup_steps: int\n        The number of warmup steps to do. This is not required by all schedulers (hence the argument\n        being optional); the function will raise an error if it is unset while the scheduler type requires it.\n\n    num_training_steps: int\n        The number of training steps to do. This is not required by all schedulers (hence the argument\n        being optional); the function will raise an error if it is unset while the scheduler type requires it.\n    '''\n    name = name.lower()\n    schedule_func = TYPE_TO_SCHEDULER_FUNCTION[name]\n\n    if name == 'constant':\n        return schedule_func(optimizer)\n\n    if num_warmup_steps is None:\n        raise ValueError(f\"{name} requires \`num_warmup_steps\`, please provide that argument.\")\n\n    if name == 'constant_with_warmup':\n        return schedule_func(optimizer, num_warmup_steps=num_warmup_steps)\n\n    if num_training_steps is None:\n        raise ValueError(f\"{name} requires \`num_training_steps\`, please provide that argument.\")\n\n    return schedule_func(optimizer, num_warmup_steps=num_warmup_steps, num_training_steps=num_training_steps)\n"
  },
  {
    "path": "transtab/transtab.py",
    "content": "import pdb\nimport os\n\nfrom transtab import constants\nfrom transtab.modeling_transtab import TransTabClassifier, TransTabRegressor, TransTabFeatureExtractor, TransTabFeatureProcessor\nfrom transtab.modeling_transtab import TransTabForCL\nfrom transtab.modeling_transtab import TransTabInputEncoder, TransTabModel\nfrom transtab.dataset import load_data\nfrom transtab.evaluator import predict, evaluate\nfrom transtab.trainer import Trainer\nfrom transtab.trainer_utils import TransTabCollatorForCL\nfrom transtab.trainer_utils import random_seed\n\ndef build_classifier(\n    categorical_columns=None,\n    numerical_columns=None,\n    binary_columns=None,\n    feature_extractor=None,\n    num_class=2,\n    hidden_dim=128,\n    num_layer=2,\n    num_attention_head=8,\n    hidden_dropout_prob=0,\n    ffn_dim=256,\n    activation='relu',\n    device='cuda:0',\n    checkpoint=None,\n    **kwargs) -> TransTabClassifier:\n    '''Build a :class:`transtab.modeling_transtab.TransTabClassifier`.\n\n    Parameters\n    ----------\n    categorical_columns: list \n        a list of categorical feature names.\n\n    numerical_columns: list\n        a list of numerical feature names.\n\n    binary_columns: list\n        a list of binary feature names, accept binary indicators like (yes,no); (true,false); (0,1).\n    \n    feature_extractor: TransTabFeatureExtractor\n        a feature extractor to tokenize the input tables. if not passed the model will build itself.\n\n    num_class: int\n        number of output classes to be predicted.\n\n    hidden_dim: int\n        the dimension of hidden embeddings.\n    \n    num_layer: int\n        the number of transformer layers used in the encoder.\n    \n    num_attention_head: int\n        the numebr of heads of multihead self-attention layer in the transformers.\n\n    hidden_dropout_prob: float\n        the dropout ratio in the transformer encoder.\n\n    ffn_dim: int\n        the dimension of feed-forward layer in the transformer layer.\n    \n    activation: str\n        the name of used activation functions, support ``\"relu\"``, ``\"gelu\"``, ``\"selu\"``, ``\"leakyrelu\"``.\n    \n    device: str\n        the device, ``\"cpu\"`` or ``\"cuda:0\"``.\n    \n    checkpoint: str\n        the directory to load the pretrained TransTab model.\n\n    Returns\n    -------\n    A TransTabClassifier model.\n\n    '''\n    model = TransTabClassifier(\n        categorical_columns = categorical_columns,\n        numerical_columns = numerical_columns,\n        binary_columns = binary_columns,\n        feature_extractor = feature_extractor,\n        num_class=num_class,\n        hidden_dim=hidden_dim,\n        num_layer=num_layer,\n        num_attention_head=num_attention_head,\n        hidden_dropout_prob=hidden_dropout_prob,\n        ffn_dim=ffn_dim,\n        activation=activation,\n        device=device,\n        **kwargs,\n        )\n    \n    if checkpoint is not None:\n        model.load(checkpoint)\n\n    return model\n\ndef build_regressor(\n    categorical_columns=None,\n    numerical_columns=None,\n    binary_columns=None,\n    feature_extractor=None,\n    num_class=1,\n    hidden_dim=128,\n    num_layer=2,\n    num_attention_head=8,\n    hidden_dropout_prob=0,\n    ffn_dim=256,\n    activation='relu',\n    device='cuda:0',\n    checkpoint=None,\n    **kwargs) -> TransTabRegressor:\n    '''Build a :class:`transtab.modeling_transtab.TransTabRegressor`.\n\n    Parameters\n    ----------\n    categorical_columns: list \n        a list of categorical 
\n    '''\n    model = TransTabClassifier(\n        categorical_columns = categorical_columns,\n        numerical_columns = numerical_columns,\n        binary_columns = binary_columns,\n        feature_extractor = feature_extractor,\n        num_class=num_class,\n        hidden_dim=hidden_dim,\n        num_layer=num_layer,\n        num_attention_head=num_attention_head,\n        hidden_dropout_prob=hidden_dropout_prob,\n        ffn_dim=ffn_dim,\n        activation=activation,\n        device=device,\n        **kwargs,\n        )\n\n    if checkpoint is not None:\n        model.load(checkpoint)\n\n    return model\n\ndef build_regressor(\n    categorical_columns=None,\n    numerical_columns=None,\n    binary_columns=None,\n    feature_extractor=None,\n    num_class=1,\n    hidden_dim=128,\n    num_layer=2,\n    num_attention_head=8,\n    hidden_dropout_prob=0,\n    ffn_dim=256,\n    activation='relu',\n    device='cuda:0',\n    checkpoint=None,\n    **kwargs) -> TransTabRegressor:\n    '''Build a :class:`transtab.modeling_transtab.TransTabRegressor`.\n\n    Parameters\n    ----------\n    categorical_columns: list\n        a list of categorical feature names.\n\n    numerical_columns: list\n        a list of numerical feature names.\n\n    binary_columns: list\n        a list of binary feature names, accepts binary indicators like (yes,no), (true,false), (0,1).\n\n    feature_extractor: TransTabFeatureExtractor\n        a feature extractor to tokenize the input tables. if not passed, the model will build one itself.\n\n    num_class: int\n        the dimension of the regression output, usually 1.\n\n    hidden_dim: int\n        the dimension of hidden embeddings.\n\n    num_layer: int\n        the number of transformer layers used in the encoder.\n\n    num_attention_head: int\n        the number of heads of the multihead self-attention layer in the transformers.\n\n    hidden_dropout_prob: float\n        the dropout ratio in the transformer encoder.\n\n    ffn_dim: int\n        the dimension of the feed-forward layer in the transformer layer.\n\n    activation: str\n        the name of the activation function, supports ``\"relu\"``, ``\"gelu\"``, ``\"selu\"``, ``\"leakyrelu\"``.\n\n    device: str\n        the device, ``\"cpu\"`` or ``\"cuda:0\"``.\n\n    checkpoint: str\n        the directory to load the pretrained TransTab model from.\n\n    Returns\n    -------\n    A TransTabRegressor model.\n\n    '''\n    model = TransTabRegressor(\n        categorical_columns = categorical_columns,\n        numerical_columns = numerical_columns,\n        binary_columns = binary_columns,\n        feature_extractor = feature_extractor,\n        num_class=num_class,\n        hidden_dim=hidden_dim,\n        num_layer=num_layer,\n        num_attention_head=num_attention_head,\n        hidden_dropout_prob=hidden_dropout_prob,\n        ffn_dim=ffn_dim,\n        activation=activation,\n        device=device,\n        **kwargs,\n        )\n\n    if checkpoint is not None:\n        model.load(checkpoint)\n\n    return model\n\ndef build_extractor(\n    categorical_columns=None,\n    numerical_columns=None,\n    binary_columns=None,\n    ignore_duplicate_cols=False,\n    disable_tokenizer_parallel=False,\n    checkpoint=None,\n    **kwargs,) -> TransTabFeatureExtractor:\n    '''Build a feature extractor for the TransTab model.\n\n    Parameters\n    ----------\n    categorical_columns: list\n        a list of categorical feature names.\n\n    numerical_columns: list\n        a list of numerical feature names.\n\n    binary_columns: list\n        a list of binary feature names, accepts binary indicators like (yes,no), (true,false), (0,1).\n\n    ignore_duplicate_cols: bool\n        if one column is assigned to more than one type, e.g., the feature age is nominated as both a\n        categorical and a binary column, the model will raise an error. set True to avoid this error:\n        the model will simply ignore the duplicate feature.\n
\n    disable_tokenizer_parallel: bool\n        if the returned feature extractor is leveraged by the collate function for a dataloader, set\n        this True to disable tokenizer parallelism; otherwise the dataloader may raise errors because\n        the dataloader and the tokenizer both spawn multiple workers at the same time.\n\n    checkpoint: str\n        the directory of the predefined TransTabFeatureExtractor.\n\n    Returns\n    -------\n    A TransTabFeatureExtractor module.\n
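\n    Examples\n    --------\n    A small sketch (``x`` is a placeholder pandas DataFrame containing the named columns):\n\n    >>> extractor = build_extractor(categorical_columns=cat_cols, numerical_columns=num_cols)\n    >>> inputs = extractor(x)  # tokenized/encoded inputs ready for the model\n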
\n    '''\n    feature_extractor = TransTabFeatureExtractor(\n        categorical_columns=categorical_columns,\n        numerical_columns=numerical_columns,\n        binary_columns=binary_columns,\n        disable_tokenizer_parallel=disable_tokenizer_parallel,\n        ignore_duplicate_cols=ignore_duplicate_cols,\n    )\n    if checkpoint is not None:\n        extractor_path = os.path.join(checkpoint, constants.EXTRACTOR_STATE_DIR)\n        if os.path.exists(extractor_path):\n            feature_extractor.load(extractor_path)\n        else:\n            feature_extractor.load(checkpoint)\n    return feature_extractor\n\ndef build_encoder(\n    categorical_columns=None,\n    numerical_columns=None,\n    binary_columns=None,\n    hidden_dim=128,\n    num_layer=2,\n    num_attention_head=8,\n    hidden_dropout_prob=0,\n    ffn_dim=256,\n    activation='relu',\n    device='cuda:0',\n    checkpoint=None,\n    **kwargs,\n    ):\n    '''\n    Build a feature encoder that maps input tabular samples to embeddings.\n\n    Parameters\n    ----------\n    categorical_columns: list\n        a list of categorical feature names.\n\n    numerical_columns: list\n        a list of numerical feature names.\n\n    binary_columns: list\n        a list of binary feature names, accepts binary indicators like (yes,no), (true,false), (0,1).\n\n    hidden_dim: int\n        the dimension of hidden embeddings.\n\n    num_layer: int\n        the number of transformer layers used in the encoder. If set to zero, only the\n        embedding layer is used to get token-level embeddings.\n\n    num_attention_head: int\n        the number of heads of the multihead self-attention layer in the transformers.\n        Ignored if ``num_layer`` is zero.\n\n    hidden_dropout_prob: float\n        the dropout ratio in the transformer encoder.\n        Ignored if ``num_layer`` is zero.\n\n    ffn_dim: int\n        the dimension of the feed-forward layer in the transformer layer.\n        Ignored if ``num_layer`` is zero.\n\n    activation: str\n        the name of the activation function, supports ``\"relu\"``, ``\"gelu\"``, ``\"selu\"``, ``\"leakyrelu\"``.\n        Ignored if ``num_layer`` is zero.\n\n    device: str\n        the device, ``\"cpu\"`` or ``\"cuda:0\"``.\n\n    checkpoint: str\n        the directory to load the pretrained TransTab model from.\n    '''\n    if num_layer == 0:\n        feature_extractor = TransTabFeatureExtractor(\n            categorical_columns=categorical_columns,\n            numerical_columns=numerical_columns,\n            binary_columns=binary_columns,\n            )\n\n        feature_processor = TransTabFeatureProcessor(\n            vocab_size=feature_extractor.vocab_size,\n            pad_token_id=feature_extractor.pad_token_id,\n            hidden_dim=hidden_dim,\n            hidden_dropout_prob=hidden_dropout_prob,\n            device=device,\n            )\n\n        enc = TransTabInputEncoder(feature_extractor, feature_processor)\n        if checkpoint is not None: # only load when a checkpoint is actually given\n            enc.load(checkpoint)\n\n    else:\n        enc = TransTabModel(\n            categorical_columns=categorical_columns,\n            numerical_columns=numerical_columns,\n            binary_columns=binary_columns,\n            hidden_dim=hidden_dim,\n            num_layer=num_layer,\n            num_attention_head=num_attention_head,\n            hidden_dropout_prob=hidden_dropout_prob,\n            ffn_dim=ffn_dim,\n            activation=activation,\n            device=device,\n            )\n        if checkpoint is not None:\n            enc.load(checkpoint)\n\n    return enc\n\ndef build_contrastive_learner(\n    categorical_columns=None,\n    numerical_columns=None,\n    binary_columns=None,\n    projection_dim=128,\n    num_partition=3,\n    overlap_ratio=0.5,\n    supervised=True,\n    hidden_dim=128,\n    num_layer=2,\n    num_attention_head=8,\n    hidden_dropout_prob=0,\n    ffn_dim=256,\n    activation='relu',\n    device='cuda:0',\n    checkpoint=None,\n    ignore_duplicate_cols=True,\n    **kwargs,\n    ):\n    '''Build a contrastive learner for pretraining based on TransTab.\n    If no cat/num/bin columns are specified, the model treats ALL columns as categorical,\n    which may significantly undermine performance.\n\n    If one column is assigned to more than one type, e.g., the feature age is nominated as both a\n    categorical and a binary column, the model will raise an error. set ``ignore_duplicate_cols=True``\n    to avoid this error: the model will simply ignore the duplicate feature.\n\n    Parameters\n    ----------\n    categorical_columns: list\n        a list of categorical feature names.\n\n    numerical_columns: list\n        a list of numerical feature names.\n\n    binary_columns: list\n        a list of binary feature names, accepts binary indicators like (yes,no), (true,false), (0,1).\n
\n    hidden_dim: int\n        the dimension of hidden embeddings.\n\n    num_layer: int\n        the number of transformer layers used in the encoder.\n\n    num_attention_head: int\n        the number of heads of the multihead self-attention layer in the transformers.\n\n    hidden_dropout_prob: float\n        the dropout ratio in the transformer encoder.\n\n    ffn_dim: int\n        the dimension of the feed-forward layer in the transformer layer.\n\n    projection_dim: int\n        the dimension of the projection head on top of the encoder.\n\n    overlap_ratio: float\n        the overlap ratio of columns between different partitions when doing subsetting.\n\n    num_partition: int\n        the number of partitions made for vertical-partition contrastive learning.\n\n    supervised: bool\n        whether to use supervised VPCL; otherwise self-supervised VPCL is used.\n\n    temperature: float\n        temperature used to compute logits for contrastive learning (forwarded via ``**kwargs``).\n\n    base_temperature: float\n        base temperature used to normalize the temperature (forwarded via ``**kwargs``).\n\n    activation: str\n        the name of the activation function, supports ``\"relu\"``, ``\"gelu\"``, ``\"selu\"``, ``\"leakyrelu\"``.\n\n    device: str\n        the device, ``\"cpu\"`` or ``\"cuda:0\"``.\n\n    checkpoint: str\n        the directory of the pretrained TransTab model.\n\n    ignore_duplicate_cols: bool\n        if one column is assigned to more than one type, e.g., the feature age is nominated as both a\n        categorical and a binary column, the model will raise an error. set True to avoid this error:\n        the model will simply ignore the duplicate feature.\n\n    Returns\n    -------\n    A TransTabForCL model and the corresponding collate function.\n
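\n    Examples\n    --------\n    A minimal pretraining sketch (the column lists and the ``(x, y)`` splits are placeholders):\n\n    >>> model, collate_fn = build_contrastive_learner(cat_cols, num_cols, bin_cols,\n    ...     num_partition=3, overlap_ratio=0.5, supervised=True)\n    >>> transtab.train(model, (x_train, y_train), (x_val, y_val), collate_fn=collate_fn,\n    ...     eval_metric='val_loss', eval_less_is_better=True)\n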
\n    '''\n\n    model = TransTabForCL(\n        categorical_columns = categorical_columns,\n        numerical_columns = numerical_columns,\n        binary_columns = binary_columns,\n        num_partition=num_partition,\n        hidden_dim=hidden_dim,\n        num_layer=num_layer,\n        num_attention_head=num_attention_head,\n        hidden_dropout_prob=hidden_dropout_prob,\n        supervised=supervised,\n        ffn_dim=ffn_dim,\n        projection_dim=projection_dim,\n        overlap_ratio=overlap_ratio,\n        activation=activation,\n        device=device,\n        **kwargs, # forward optional arguments such as temperature/base_temperature\n    )\n    if checkpoint is not None:\n        model.load(checkpoint)\n\n    # build collate function for contrastive learning\n    collate_fn = TransTabCollatorForCL(\n        categorical_columns=categorical_columns,\n        numerical_columns=numerical_columns,\n        binary_columns=binary_columns,\n        overlap_ratio=overlap_ratio,\n        num_partition=num_partition,\n        ignore_duplicate_cols=ignore_duplicate_cols\n    )\n    if checkpoint is not None:\n        collate_fn.feature_extractor.load(os.path.join(checkpoint, constants.EXTRACTOR_STATE_DIR))\n\n    return model, collate_fn\n\ndef train(model,\n    trainset,\n    valset=None,\n    num_epoch=10,\n    batch_size=64,\n    eval_batch_size=256,\n    lr=1e-4,\n    weight_decay=0,\n    patience=5,\n    warmup_ratio=None,\n    warmup_steps=None,\n    eval_metric='auc',\n    output_dir='./ckpt',\n    collate_fn=None,\n    num_workers=0,\n    balance_sample=False,\n    load_best_at_last=True,\n    ignore_duplicate_cols=False,\n    eval_less_is_better=False,\n    **kwargs,\n    ):\n    '''\n    The shared train function for all TransTabModel based models.\n\n    Parameters\n    ----------\n    model: TransTabModel and its subclasses\n        A subclass of the base model. Should be able to output logits and loss in its forward pass, e.g.,\n        ``logit, loss = model(x, y)``.\n\n    trainset: list or tuple\n        a list of trainsets, or a single trainset consisting of (x, y). x: pd.DataFrame or dict, y: pd.Series.\n\n    valset: list or tuple\n        a list of valsets, or a single valset consisting of (x, y).\n\n    num_epoch: int\n        number of training epochs.\n\n    batch_size: int\n        training batch size.\n\n    eval_batch_size: int\n        evaluation batch size.\n\n    lr: float\n        training learning rate.\n\n    weight_decay: float\n        training weight decay.\n\n    patience: int\n        early stopping patience, only valid when ``valset`` is given.\n\n    warmup_ratio: float\n        the portion of training steps used for learning-rate warmup; ignored if ``warmup_steps`` is set.\n\n    warmup_steps: int\n        the number of training steps for learning-rate warmup.\n\n    eval_metric: str\n        the evaluation metric during training for early stopping, can be ``\"acc\"``, ``\"auc\"``, ``\"mse\"``, ``\"val_loss\"``.\n\n    output_dir: str\n        the output directory for trained model weights and feature extractor configurations.\n\n    collate_fn: function\n        specify a training collate function if it is not standard supervised learning, e.g., contrastive learning.\n\n    num_workers: int\n        the number of workers for the dataloader.\n\n    balance_sample: bool\n        whether to bootstrap each batch so that it contains balanced classes; only supports binary classification.\n\n    load_best_at_last: bool\n        whether to load the best checkpoint after the training completes.\n\n    ignore_duplicate_cols: bool\n        whether to ignore columns assigned to more than one of the cat/num/bin types.\n\n    eval_less_is_better: bool\n        whether a smaller eval_metric indicates better performance. should be set True for val_loss.\n\n    Returns\n    -------\n    None\n\n    '''\n    if isinstance(trainset, tuple): trainset = [trainset]\n\n    train_args = {\n        'num_epoch': num_epoch,\n        'batch_size': batch_size,\n        'eval_batch_size': eval_batch_size,\n        'lr': lr,\n        'weight_decay':weight_decay,\n        'patience':patience,\n        'warmup_ratio':warmup_ratio,\n        'warmup_steps':warmup_steps,\n        'eval_metric':eval_metric,\n        'output_dir':output_dir,\n        'collate_fn':collate_fn,\n        'num_workers':num_workers,\n        'balance_sample':balance_sample,\n        'load_best_at_last':load_best_at_last,\n        'ignore_duplicate_cols':ignore_duplicate_cols,\n        'eval_less_is_better':eval_less_is_better,\n    }\n    trainer = Trainer(\n        model,\n        trainset,\n        valset,\n        **train_args,\n    )\n    trainer.train()\n"
  }
]