[
  {
    "path": ".gitignore",
    "content": "# Byte-compiled / optimized / DLL files\n__pycache__/\n*.py[cod]\n*$py.class\n\nsftp-config.json\n\n.DS_Store\n\n*.py.swp\n\n# C extensions\n*.so\n\n# Distribution / packaging\n.Python\nbuild/\ndevelop-eggs/\ndist/\ndownloads/\neggs/\n.eggs/\nlib/\nlib64/\nparts/\nsdist/\nvar/\nwheels/\npip-wheel-metadata/\nshare/python-wheels/\n*.egg-info/\n.installed.cfg\n*.egg\nMANIFEST\n\n# PyInstaller\n#  Usually these files are written by a python script from a template\n#  before PyInstaller builds the exe, so as to inject date/other infos into it.\n*.manifest\n*.spec\n\n# Installer logs\npip-log.txt\npip-delete-this-directory.txt\n\n# Unit test / coverage reports\nhtmlcov/\n.tox/\n.nox/\n.coverage\n.coverage.*\n.cache\nnosetests.xml\ncoverage.xml\n*.cover\n*.py,cover\n.hypothesis/\n.pytest_cache/\n\n# Translations\n*.mo\n*.pot\n\n# Django stuff:\nlocal_settings.py\ndb.sqlite3\ndb.sqlite3-journal\n\n# Flask stuff:\ninstance/\n.webassets-cache\n\n# Scrapy stuff:\n.scrapy\n\n# Sphinx documentation\ndocs/_build/\n\n# PyBuilder\ntarget/\n\n# Jupyter Notebook\n.ipynb_checkpoints\n\n# IPython\nprofile_default/\nipython_config.py\n\n# pyenv\n.python-version\n\n# pipenv\n#   According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.\n#   However, in case of collaboration, if having platform-specific dependencies or dependencies\n#   having no cross-platform support, pipenv may install dependencies that don't work, or not\n#   install all needed dependencies.\n#Pipfile.lock\n\n# PEP 582; used by e.g. github.com/David-OConnor/pyflow\n__pypackages__/\n\n# Celery stuff\ncelerybeat-schedule\ncelerybeat.pid\n\n# SageMath parsed files\n*.sage.py\n\n# Environments\n.env\n.venv\nenv/\nvenv/\nENV/\nenv.bak/\nvenv.bak/\n\n# Spyder project settings\n.spyderproject\n.spyproject\n\n# Rope project settings\n.ropeproject\n\n# mkdocs documentation\n/site\n\n# mypy \n.mypy_cache/\n.dmypy.json\ndmypy.json\n\n# Pyre type checker\n.pyre/"
  },
  {
    "path": "README.md",
    "content": "# CorefQA: Coreference Resolution as Query-based Span Prediction\n\nThe repository contains the code of the recent research advances in [Shannon.AI](http://www.shannonai.com). Please post github issues or email xiaoya_li@shannonai.com for relevant questions.\n\n\n\n**CorefQA: Coreference Resolution as Query-based Span Prediction** <br>\nWei Wu, Fei Wang, Arianna Yuan, Fei Wu and Jiwei Li<br>\nIn ACL 2020. [paper](https://arxiv.org/abs/1911.01746)<br>\nIf you find this repo helpful, please cite the following:\n```latex\n@article{wu2019coreference,\n  title={Coreference Resolution as Query-based Span Prediction},\n  author={Wu, Wei and Wang, Fei and Yuan, Arianna and Wu, Fei and Li, Jiwei},\n  journal={arXiv preprint arXiv:1911.01746},\n  year={2019}\n}\n```\n\n\n## Contents \n- [Overview](#overview)\n- [Hardware Requirements](#hardware-requirements)\n- [Install Package Dependencies](#install-package-dependencies)\n- [Data Preprocess](#data-preprocess)\n- [Download Pretrained MLM](#download-pretrained-mlm)\n- [Training](#training)\n    - [Finetune the SpanBERT Model on the Combination of Squad and Quoref Datasets](#finetune-the-spanbert-model-on-the-combination-of-squad-and-quoref-datasets)\n    - [Train the CorefQA Model on the CoNLL-2012 Coreference Resolution Task](#train-the-corefqa-model-on-the-conll-2012-coreference-resolution-task)\n- [Evaluation and Prediction](#evaluation-and-prediction)\n- [Download the Final CorefQA Model](#download-the-final-corefqa-model)\n- [Descriptions of Directories](#descriptions-of-directories)\n- [Acknowledgement](#acknowledgement)\n- [Useful Materials](#useful-materials)\n- [Contact](#contact)\n\n\n## Overview \nThe model introduces +3.5 (83.1) F1 performance boost over previous SOTA coreference models on the CoNLL benchmark. The current codebase is written in Tensorflow. We plan to release the PyTorch version soon.  
The current code version only supports training on TPUs and testing on GPUs (due to TF/TPU constraints). You therefore need to transfer all saved checkpoints from TPUs to GPUs for evaluation (we will fix this soon). Please follow the parameter settings in the log directory to reproduce the performance.  \n\n\n| Model          | F1 (%) |\n| -------------- |:------:|\n| Previous SOTA  (Joshi et al., 2019a)  | 79.6  |\n| CorefQA + SpanBERT-large | 83.1   |\n\n\n## Hardware Requirements\nTPU for training: Cloud TPU v3-8 device (128G memory) with TensorFlow 1.15 and Python 3.5 \n\nGPU for evaluation: CUDA 10.0, TensorFlow 1.15, Python 3.5\n\n## Install Package Dependencies\n \n```shell\n$ python3 -m pip install --user virtualenv\n$ virtualenv --python=python3.5 ~/corefqa_venv\n$ source ~/corefqa_venv/bin/activate\n$ cd CorefQA\n$ pip install -r requirements.txt\n# If you are using TPU, please run the following commands:\n$ pip install --upgrade google-api-python-client \n$ pip install --upgrade oauth2client \n```\n\n## Data Preprocess \n\n1) Download the officially released [Ontonotes 5.0 (LDC2013T19)](https://catalog.ldc.upenn.edu/LDC2013T19). <br> \n2) Preprocess the OntoNotes 5.0 annotation files for the CoNLL-2012 coreference resolution task. <br> \nRun the command with **Python 2**\n`bash ./scripts/data/preprocess_ontonotes_annfiles.sh  <path_to_LDC2013T19-ontonotes5_directory>  <path_to_save_CoNLL12_coreference_resolution_directory> <language>`<br> \nand it will create `{train/dev/test}.{language}.v4_gold_conll` files in the directory `<path_to_save_CoNLL12_coreference_resolution_directory>`. <br> \n`<language>` can be `english`, `arabic` or `chinese`. In this paper, we set `<language>` to `english`. <br>\nIf you want to use **Python 3**, please refer to the\n[guideline](https://github.com/huggingface/neuralcoref/blob/master/neuralcoref/train/training.md#get-the-data) <br> \n3) Generate TFRecord files for experiments. 
<br> \nRun the command with **Python 3** `bash ./scripts/data/generate_tfrecord_dataset.sh <path_to_save_CoNLL12_coreference_resolution_directory>  <path_to_save_tfrecord_directory> <path_to_pretrain_mlm_vocab_file>`\nand it will create `{train/dev/test}.overlap.corefqa.{language}.tfrecord` files in the directory `<path_to_save_tfrecord_directory>`. <br> \n\n## Download Pretrained MLM\nIn our experiments, we used pretrained masked language models to initialize the mention_proposal and corefqa models. \n\n1) Download the pretrained models. <br> \nRun `bash ./scripts/data/download_pretrained_mlm.sh <path_to_save_pretrained_mlm> <model_sign>` to download and unzip the pretrained mlm models. <br> \n`<model_sign>` should take the value of `[bert_base, bert_large, spanbert_base, spanbert_large, bert_tiny]`.\n\n- `bert_base, bert_large, spanbert_base, spanbert_large` are trained with a cased (uppercase and lowercase tokens) vocabulary. Use the cased train/dev/test coreference datasets with them. \n- `bert_tiny` is trained with an uncased (lowercase tokens) vocabulary. We use the TinyBERT model for fast debugging. Use the uncased train/dev/test coreference datasets with it. <br> \n\n2) Transform SpanBERT from `PyTorch` to `TensorFlow`. <br> \n\nAfter downloading `bert_<scale>` to `<path_to_bert_<scale>_tf_dir>` and `spanbert_<scale>` to `<path_to_spanbert_<scale>_pytorch_dir>`, you can start transforming the SpanBERT model to TensorFlow; the transformed model is saved to the directory `<path_to_save_spanbert_tf_checkpoint_dir>`. `<scale>` should take the value of `[base, large]`. <br> \n\nWe need to transform the SpanBERT checkpoints from PyTorch to TF because the officially released models were trained with PyTorch. 
\nRun `bash ./scripts/data/transform_ckpt_pytorch_to_tf.sh <model_name>  <path_to_spanbert_<scale>_pytorch_dir> <path_to_bert_<scale>_tf_dir>  <path_to_save_spanbert_tf_checkpoint_dir>` \nand the `<model_name>` in TF will be saved in `<path_to_save_spanbert_tf_checkpoint_dir>`.\n\n- `<model_name>` should take the value of `[spanbert_base, spanbert_large]`. \n- `<scale>` indicates that the `bert_model.ckpt` in the `<path_to_bert_<scale>_tf_dir>` should have the same scale (base, large) as the `bert_model.bin` in `<path_to_spanbert_<scale>_pytorch_dir>`.\n\n\n## Training \n\nFollowing the pipeline described in the paper, you need to: <br> \n1) load a pretrained SpanBERT model. <br> \n2) finetune the SpanBERT model on the combination of Squad and Quoref datasets. <br> \n3) pretrain the mention proposal model on the coref dataset. <br>\n4) jointly train the mention proposal model and the mention linking model. <br>\n \n**Notice:** For 2) and 3), we provide the options of both pretraining these models yourself and loading our pretrained models. <br> \n\n### Finetune the SpanBERT Model on the Combination of Squad and Quoref Datasets\nWe finetune the SpanBERT model on the [SQuAD 2.0](https://rajpurkar.github.io/SQuAD-explorer/) and [Quoref](https://allennlp.org/quoref) QA tasks for data augmentation before the coreference resolution task. \n\n1. You can directly download the model finetuned on these datasets. \nDownload the Data Augmentation Models on Squad and Quoref [link](https://www.dropbox.com/s/lqjc6kfe0w34jt0/finetune_spanbert_large_squad2.tar.gz?dl=0) <br>\nRun `./scripts/data/download_squad2_finetune_model.sh <model-scale> <path-to-save-model>` to download the finetuned SpanBERT on SQuAD2.0. <br>\nThe `<model-scale>` should take the value of `[base, large]`. <br>\nThe `<path-to-save-model>` is the path to save the SpanBERT model finetuned on the SQuAD2.0 dataset. <br>\n\n\n2. Or start to finetune the SpanBERT model on QA tasks yourself. 
\n- Download SQuAD 2.0 [train](https://rajpurkar.github.io/SQuAD-explorer/dataset/train-v2.0.json) and [dev](https://rajpurkar.github.io/SQuAD-explorer/dataset/dev-v2.0.json) sets. \n- Download Quoref [train and dev](https://quoref-dataset.s3-us-west-2.amazonaws.com/train_and_dev/quoref-train-dev-v0.1.zip) sets.\n- Finetune the SpanBERT model on a Google Cloud v3-8 TPU. \n\nFor SQuAD 2.0, run the script in [./script/model/squad_tpu.sh](https://github.com/ShannonAI/CorefQA/blob/master/scripts/models/squad_tpu.sh)\n  ```bash \n  \n   REPO_PATH=/home/shannon/coref-tf\n   export TPU_NAME=tf-tpu\n   export PYTHONPATH=\"$PYTHONPATH:$REPO_PATH\"\n   SQUAD_DIR=gs://qa_tasks/squad2\n   BERT_DIR=gs://pretrained_mlm_checkpoint/spanbert_large_tf\n   OUTPUT_DIR=gs://corefqa_output_squad/spanbert_large_squad2_2e-5  \n\n   python3 ${REPO_PATH}/run/run_squad.py \\\n   --vocab_file=$BERT_DIR/vocab.txt \\\n   --bert_config_file=$BERT_DIR/bert_config.json \\\n   --init_checkpoint=$BERT_DIR/bert_model.ckpt \\\n   --do_train=True \\\n   --train_file=$SQUAD_DIR/train-v2.0.json \\\n   --do_predict=True \\\n   --predict_file=$SQUAD_DIR/dev-v2.0.json \\\n   --train_batch_size=8 \\\n   --learning_rate=2e-5 \\\n   --num_train_epochs=4.0 \\\n   --max_seq_length=384 \\\n   --do_lower_case=False \\\n   --doc_stride=128 \\\n   --output_dir=${OUTPUT_DIR} \\\n   --use_tpu=True \\\n   --tpu_name=$TPU_NAME \\\n   --version_2_with_negative=True\n  ```\nAfter getting the best model (chosen based on dev set performance) on `SQuAD2.0`, you should start finetuning the saved model on `Quoref`. 
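Picking the best model here simply means keeping the checkpoint with the highest dev-set F1. A minimal sketch (the checkpoint names and scores below are hypothetical, not outputs of this repo):

```python
# Hypothetical dev-set F1 per saved checkpoint; in practice these numbers
# come from evaluating each checkpoint in OUTPUT_DIR on the dev set.
dev_f1 = {
    "model.ckpt-1000": 85.2,
    "model.ckpt-2000": 86.9,
    "model.ckpt-3000": 86.1,
}

# Keep the checkpoint whose dev F1 is highest for the next finetuning stage.
best_ckpt = max(dev_f1, key=dev_f1.get)
print(best_ckpt)  # model.ckpt-2000
```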
<br>\n\nRun the script in [./script/model/quoref_tpu.sh](https://github.com/ShannonAI/CorefQA/blob/master/scripts/models/quoref_tpu.sh) \n  ```bash \n  \n   REPO_PATH=/home/shannon/coref-tf\n   export TPU_NAME=tf-tpu\n   export PYTHONPATH=\"$PYTHONPATH:$REPO_PATH\"\n   QUOREF_DIR=gs://qa_tasks/quoref\n   BERT_DIR=gs://corefqa_output_squad/spanbert_large_squad2_2e-5\n   OUTPUT_DIR=gs://corefqa_output_quoref/spanbert_large_squad2_best_quoref_3e-5 \n\n   python3 ${REPO_PATH}/run_quoref.py \\\n   --vocab_file=$BERT_DIR/vocab.txt \\\n   --bert_config_file=$BERT_DIR/bert_config.json \\\n   --init_checkpoint=$BERT_DIR/best_bert_model.ckpt \\\n   --do_train=True \\\n   --train_file=$QUOREF_DIR/quoref-train-v0.1.json \\\n   --do_predict=True \\\n   --predict_file=$QUOREF_DIR/quoref-dev-v0.1.json \\\n   --train_batch_size=8 \\\n   --learning_rate=3e-5 \\\n   --num_train_epochs=5 \\\n   --max_seq_length=384 \\\n   --do_lower_case=False \\\n   --doc_stride=128 \\\n   --output_dir=${OUTPUT_DIR} \\\n   --use_tpu=True \\\n   --tpu_name=$TPU_NAME \n  ```\nWe use the best model (chosen based on dev set performance) on `Quoref` to initialize the CorefQA model. \n  \n### Train the CorefQA Model on the CoNLL-2012 Coreference Resolution Task \n1.1 You can download the pre-trained mention proposal model (including [model](https://storage.googleapis.com/public_model_checkpoints/mention_proposal/model.ckpt-22000.data-00000-of-00001), [meta](https://storage.googleapis.com/public_model_checkpoints/mention_proposal/model.ckpt-22000.meta) and [index](https://storage.googleapis.com/public_model_checkpoints/mention_proposal/model.ckpt-22000.index)). \n\n1.2 Or train the mention proposal model yourself. 
\n\nThe script can be found in [./script/model/mention_tpu.sh](https://github.com/ShannonAI/CorefQA/blob/master/scripts/models/mention_tpu.sh).\n\n```bash \n\nREPO_PATH=/home/shannon/coref-tf\nexport PYTHONPATH=\"$PYTHONPATH:$REPO_PATH\"\nexport TPU_NAME=tf-tpu\nexport TPU_ZONE=europe-west4-a\nexport GCP_PROJECT=xiaoyli-20-01-4820\n\nBERT_DIR=gs://corefqa_output_quoref/spanbert_large_squad2_best_quoref_1e-5\nDATA_DIR=gs://corefqa_data/final_overlap_384_6\nOUTPUT_DIR=gs://corefqa_output_mention_proposal/squad_quoref_large_384_6_1e5_8_0.2\n\npython3 ${REPO_PATH}/run/run_mention_proposal.py \\\n--output_dir=$OUTPUT_DIR \\\n--bert_config_file=$BERT_DIR/bert_config.json \\\n--init_checkpoint=$BERT_DIR/bert_model.ckpt \\\n--vocab_file=$BERT_DIR/vocab.txt \\\n--logfile_path=$OUTPUT_DIR/train.log \\\n--num_epochs=8 \\\n--keep_checkpoint_max=50 \\\n--save_checkpoints_steps=500 \\\n--train_file=$DATA_DIR/train.corefqa.english.tfrecord \\\n--dev_file=$DATA_DIR/dev.corefqa.english.tfrecord \\\n--test_file=$DATA_DIR/test.corefqa.english.tfrecord \\\n--do_train=True \\\n--do_eval=False \\\n--do_predict=False \\\n--learning_rate=1e-5 \\\n--dropout_rate=0.2 \\\n--mention_threshold=0.5 \\\n--hidden_size=1024 \\\n--num_docs=5604 \\\n--window_size=384 \\\n--num_window=6 \\\n--max_num_mention=60 \\\n--start_end_share=False \\\n--loss_start_ratio=0.3 \\\n--loss_end_ratio=0.3 \\\n--loss_span_ratio=0.3 \\\n--use_tpu=True \\\n--tpu_name=$TPU_NAME \\\n--tpu_zone=$TPU_ZONE \\\n--gcp_project=$GCP_PROJECT \\\n--num_tpu_cores=1 \\\n--seed=2333\n```\n\n2. Jointly train the mention proposal model and linking model on CoNLL-12. <br> \n\nAfter getting the best mention proposal model on the dev set, start jointly training the mention proposal and linking tasks. 
\n\nRun the script in [./script/model/corefqa_tpu.sh](https://github.com/ShannonAI/CorefQA/blob/master/scripts/models/corefqa_tpu.sh)\n\n```bash\n\nREPO_PATH=/home/shannon/coref-tf\nexport PYTHONPATH=\"$PYTHONPATH:$REPO_PATH\"\nexport TPU_NAME=tf-tpu\nexport TPU_ZONE=europe-west4-a\nexport GCP_PROJECT=xiaoyli-20-01-4820\n\nBERT_DIR=gs://corefqa_output_mention_proposal/output_bertlarge\nDATA_DIR=gs://corefqa_data/final_overlap_384_6\nOUTPUT_DIR=gs://corefqa_output_corefqa/squad_quoref_mention_large_384_6_8e4_8_0.2\n\npython3 ${REPO_PATH}/run/run_corefqa.py \\\n--output_dir=$OUTPUT_DIR \\\n--bert_config_file=$BERT_DIR/bert_config.json \\\n--init_checkpoint=$BERT_DIR/best_bert_model.ckpt \\\n--vocab_file=$BERT_DIR/vocab.txt \\\n--logfile_path=$OUTPUT_DIR/train.log \\\n--num_epochs=8 \\\n--keep_checkpoint_max=50 \\\n--save_checkpoints_steps=500 \\\n--train_file=$DATA_DIR/train.corefqa.english.tfrecord \\\n--dev_file=$DATA_DIR/dev.corefqa.english.tfrecord \\\n--test_file=$DATA_DIR/test.corefqa.english.tfrecord \\\n--do_train=True \\\n--do_eval=False \\\n--do_predict=False \\\n--learning_rate=8e-4 \\\n--dropout_rate=0.2 \\\n--mention_threshold=0.5 \\\n--hidden_size=1024 \\\n--num_docs=5604 \\\n--window_size=384 \\\n--num_window=6 \\\n--max_num_mention=50 \\\n--start_end_share=False \\\n--max_span_width=10 \\\n--max_candiate_mentions=100 \\\n--top_span_ratio=0.2 \\\n--max_top_antecedents=30 \\\n--max_query_len=150 \\\n--max_context_len=150 \\\n--sec_qa_mention_score=False \\\n--use_tpu=True \\\n--tpu_name=$TPU_NAME \\\n--tpu_zone=$TPU_ZONE \\\n--gcp_project=$GCP_PROJECT \\\n--num_tpu_cores=1 \\\n--seed=2333\n```\n\n## Evaluation and Prediction\n\nCurrently, the evaluation is conducted on a set of saved checkpoints after the training process, and does NOT support evaluation during training. 
Please transfer all checkpoints (the output directory set via `--output_dir=<path_to_output_directory>` when running `run_<model_sign>.py`) from TPUs to GPUs for evaluation. \nThis can be achieved by downloading the output directory from Google Cloud Storage. <br>  \n\n\nThe performance on the test set is obtained by using the model that achieves the highest F1-score on the dev set. <br> \nPass `--do_eval=True`, `--do_train=False` and `--do_predict=False` to `run_<model_sign>.py` to start the evaluation process on a set of saved checkpoints. The other parameters should be the same as in the training process.\n`<model_sign>` should take the value of `[mention_proposal, corefqa]`. <br>\n\nThe codebase also provides the option of evaluating a single model/checkpoint. Please pass `--do_eval=False`, `--do_train=False` and `--do_predict=True` to `run_<model_sign>.py` with the checkpoint path `--eval_checkpoint=<path_to_eval_checkpoint_model>`.\n`<model_sign>` should take the value of `[mention_proposal, corefqa]`.\n<br>\n \n## Download the Final CorefQA Model\nYou can download the final CorefQA model at [link](https://drive.google.com/file/d/1RPYsS2dDxYyii7-3NkBNG7VtuA96NBLf/view?usp=sharing) and follow the prediction instructions above to obtain the score reported in the paper. \n\n\n## Descriptions of Directories\n\nName | Descriptions \n----------- | ------------- \nbert | BERT modules (model, tokenizer, optimization), adapted from the `google-research/bert` repository. \nconll-2012 | official evaluation scripts for the CoNLL-2012 shared task.\ndata_utils | modules for processing training data.  \nfunc_builders | the input dataloader and model constructor for CorefQA.\nlogs | the log files in our experiments. 
\nmodels | an implementation of the CorefQA/MentionProposal models based on TF.\nrun | modules for data preparation and training models.\nscripts/data | scripts for data preparation and loading pretrained models.\nscripts/models | scripts for {train/evaluate} {mention_proposal/corefqa} models on {TPU/GPU}. \nutils | modules including metrics and optimizers. \n\n\n\n## Acknowledgement\n\nMany thanks to `Yuxian Meng` and the previous work `https://github.com/mandarjoshi90/coref`.\n\n## Useful Materials\n\n- TPU Quick Start [link](https://cloud.google.com/tpu/docs/quickstart)\n- TPU Available Operations [link](https://cloud.google.com/tpu/docs/tensorflow-ops)\n\n## Contact \n\nFeel free to discuss papers/code with us through issues/emails!\n"
  },
  {
    "path": "bert/__init__.py",
    "content": "# coding=utf-8\n# Copyright 2018 The Google AI Language Team Authors.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n"
  },
  {
    "path": "bert/modeling.py",
    "content": "# coding=utf-8\n# Copyright 2018 The Google AI Language Team Authors.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"The main BERT model and related functions.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport collections\nimport copy\nimport json\nimport math\nimport re\n\nimport six\nimport tensorflow as tf\n\n\nclass BertConfig(object):\n    \"\"\"Configuration for `BertModel`.\"\"\"\n\n    def __init__(self,\n                 vocab_size,\n                 hidden_size=768,\n                 num_hidden_layers=12,\n                 num_attention_heads=12,\n                 intermediate_size=3072,\n                 hidden_act=\"gelu\",\n                 hidden_dropout_prob=0.1,\n                 attention_probs_dropout_prob=0.1,\n                 max_position_embeddings=512,\n                 type_vocab_size=16,\n                 initializer_range=0.02):\n        \"\"\"Constructs BertConfig.\n\n        Args:\n          vocab_size: Vocabulary size of `inputs_ids` in `BertModel`.\n          hidden_size: Size of the encoder layers and the pooler layer.\n          num_hidden_layers: Number of hidden layers in the Transformer encoder.\n          num_attention_heads: Number of attention heads for each attention layer in\n            the Transformer encoder.\n          intermediate_size: The size of the \"intermediate\" (i.e., feed-forward)\n            layer in the 
Transformer encoder.\n          hidden_act: The non-linear activation function (function or string) in the\n            encoder and pooler.\n          hidden_dropout_prob: The dropout probability for all fully connected\n            layers in the embeddings, encoder, and pooler.\n          attention_probs_dropout_prob: The dropout ratio for the attention\n            probabilities.\n          max_position_embeddings: The maximum sequence length that this model might\n            ever be used with. Typically set this to something large just in case\n            (e.g., 512 or 1024 or 2048).\n          type_vocab_size: The vocabulary size of the `token_type_ids` passed into\n            `BertModel`.\n          initializer_range: The stdev of the truncated_normal_initializer for\n            initializing all weight matrices.\n        \"\"\"\n        self.vocab_size = vocab_size\n        self.hidden_size = hidden_size\n        self.num_hidden_layers = num_hidden_layers\n        self.num_attention_heads = num_attention_heads\n        self.hidden_act = hidden_act\n        self.intermediate_size = intermediate_size\n        self.hidden_dropout_prob = hidden_dropout_prob\n        self.attention_probs_dropout_prob = attention_probs_dropout_prob\n        self.max_position_embeddings = max_position_embeddings\n        self.type_vocab_size = type_vocab_size\n        self.initializer_range = initializer_range\n\n    @classmethod\n    def from_dict(cls, json_object):\n        \"\"\"Constructs a `BertConfig` from a Python dictionary of parameters.\"\"\"\n        config = BertConfig(vocab_size=None)\n        for (key, value) in six.iteritems(json_object):\n            config.__dict__[key] = value\n        return config\n\n    @classmethod\n    def from_json_file(cls, json_file):\n        \"\"\"Constructs a `BertConfig` from a json file of parameters.\"\"\"\n        with tf.gfile.GFile(json_file, \"r\") as reader:\n            text = reader.read()\n        return 
cls.from_dict(json.loads(text))\n\n    def to_dict(self):\n        \"\"\"Serializes this instance to a Python dictionary.\"\"\"\n        output = copy.deepcopy(self.__dict__)\n        return output\n\n    def to_json_string(self):\n        \"\"\"Serializes this instance to a JSON string.\"\"\"\n        return json.dumps(self.to_dict(), indent=2, sort_keys=True) + \"\\n\"\n\n\nclass BertModel(object):\n    \"\"\"BERT model (\"Bidirectional Embedding Representations from a Transformer\").\n\n    Example usage:\n\n    ```python\n    # Already been converted into WordPiece token ids\n    input_ids = tf.constant([[31, 51, 99], [15, 5, 0]])\n    input_mask = tf.constant([[1, 1, 1], [1, 1, 0]])\n    token_type_ids = tf.constant([[0, 0, 1], [0, 2, 0]])\n\n    config = modeling.BertConfig(vocab_size=32000, hidden_size=512,\n      num_hidden_layers=8, num_attention_heads=6, intermediate_size=1024)\n\n    model = modeling.BertModel(config=config, is_training=True,\n      input_ids=input_ids, input_mask=input_mask, token_type_ids=token_type_ids)\n\n    label_embeddings = tf.get_variable(...)\n    pooled_output = model.get_pooled_output()\n    logits = tf.matmul(pooled_output, label_embeddings)\n    ...\n    ```\n    \"\"\"\n\n    def __init__(self,\n                 config,\n                 is_training,\n                 input_ids,\n                 input_mask=None,\n                 token_type_ids=None,\n                 use_one_hot_embeddings=True,\n                 scope=None):\n        \"\"\"Constructor for BertModel.\n\n        Args:\n          config: `BertConfig` instance.\n          is_training: bool. true for training model, false for eval model. 
Controls\n            whether dropout will be applied.\n          input_ids: int32 Tensor of shape [batch_size, seq_length].\n          input_mask: (optional) int32 Tensor of shape [batch_size, seq_length].\n          token_type_ids: (optional) int32 Tensor of shape [batch_size, seq_length].\n          use_one_hot_embeddings: (optional) bool. Whether to use one-hot word\n            embeddings or tf.embedding_lookup() for the word embeddings. On the TPU,\n            it is much faster if this is True, on the CPU or GPU, it is faster if\n            this is False.\n          scope: (optional) variable scope. Defaults to \"bert\".\n\n        Raises:\n          ValueError: The config is invalid or one of the input tensor shapes\n            is invalid.\n        \"\"\"\n        config = copy.deepcopy(config)\n        config.hidden_dropout_prob = tf.to_float(is_training) * config.hidden_dropout_prob\n        config.attention_probs_dropout_prob = tf.to_float(is_training) * config.attention_probs_dropout_prob\n        # config.hidden_dropout_prob = tf.Print(config.hidden_dropout_prob, [config.hidden_dropout_prob], 'hdden')\n        # if not is_training:\n        # config.hidden_dropout_prob = 0.0\n        # config.attention_probs_dropout_prob = 0.0\n\n        input_shape = get_shape_list(input_ids, expected_rank=2)\n        batch_size = input_shape[0]\n        seq_length = input_shape[1]\n\n        if input_mask is None:\n            input_mask = tf.ones(shape=[batch_size, seq_length], dtype=tf.int32)\n\n        if token_type_ids is None:\n            token_type_ids = tf.zeros(shape=[batch_size, seq_length], dtype=tf.int32)\n\n        with tf.variable_scope(scope, default_name=\"bert\", reuse=tf.AUTO_REUSE):\n            with tf.variable_scope(\"embeddings\"):\n                # Perform embedding lookup on the word ids.\n                (self.embedding_output, self.embedding_table) = embedding_lookup(\n                    input_ids=input_ids,\n                    
vocab_size=config.vocab_size,\n                    embedding_size=config.hidden_size,\n                    initializer_range=config.initializer_range,\n                    word_embedding_name=\"word_embeddings\",\n                    use_one_hot_embeddings=use_one_hot_embeddings)\n\n                # Add positional embeddings and token type embeddings, then layer\n                # normalize and perform dropout.\n                self.embedding_output = embedding_postprocessor(\n                    input_tensor=self.embedding_output,\n                    use_token_type=True,\n                    token_type_ids=token_type_ids,\n                    token_type_vocab_size=config.type_vocab_size,\n                    token_type_embedding_name=\"token_type_embeddings\",\n                    use_position_embeddings=True,\n                    position_embedding_name=\"position_embeddings\",\n                    initializer_range=config.initializer_range,\n                    max_position_embeddings=config.max_position_embeddings,\n                    dropout_prob=config.hidden_dropout_prob)\n\n            with tf.variable_scope(\"encoder\"):\n                # This converts a 2D mask of shape [batch_size, seq_length] to a 3D\n                # mask of shape [batch_size, seq_length, seq_length] which is used\n                # for the attention scores.\n                attention_mask = create_attention_mask_from_input_mask(\n                    input_ids, input_mask)\n\n                # Run the stacked transformer.\n                # `sequence_output` shape = [batch_size, seq_length, hidden_size].\n                self.all_encoder_layers = transformer_model(\n                    input_tensor=self.embedding_output,\n                    attention_mask=attention_mask,\n                    hidden_size=config.hidden_size,\n                    num_hidden_layers=config.num_hidden_layers,\n                    num_attention_heads=config.num_attention_heads,\n                    
intermediate_size=config.intermediate_size,\n                    intermediate_act_fn=get_activation(config.hidden_act),\n                    hidden_dropout_prob=config.hidden_dropout_prob,\n                    attention_probs_dropout_prob=config.attention_probs_dropout_prob,\n                    initializer_range=config.initializer_range,\n                    do_return_all_layers=True)\n\n            self.sequence_output = self.all_encoder_layers[-1]\n            # The \"pooler\" converts the encoded sequence tensor of shape\n            # [batch_size, seq_length, hidden_size] to a tensor of shape\n            # [batch_size, hidden_size]. This is necessary for segment-level\n            # (or segment-pair-level) classification tasks where we need a fixed\n            # dimensional representation of the segment.\n            with tf.variable_scope(\"pooler\"):\n                # We \"pool\" the model by simply taking the hidden state corresponding\n                # to the first token. We assume that this has been pre-trained\n                first_token_tensor = tf.squeeze(self.sequence_output[:, 0:1, :], axis=1)\n                self.pooled_output = tf.layers.dense(\n                    first_token_tensor,\n                    config.hidden_size,\n                    activation=tf.tanh,\n                    kernel_initializer=create_initializer(config.initializer_range))\n\n    def get_pooled_output(self):\n        return self.pooled_output\n\n    def get_sequence_output(self):\n        \"\"\"Gets final hidden layer of encoder.\n\n        Returns:\n          float Tensor of shape [batch_size, seq_length, hidden_size] corresponding\n          to the final hidden of the transformer encoder.\n        \"\"\"\n        return self.sequence_output\n\n    def get_all_encoder_layers(self):\n        return self.all_encoder_layers\n\n    def get_embedding_output(self):\n        \"\"\"Gets output of the embedding lookup (i.e., input to the transformer).\n\n        Returns:\n  
        float Tensor of shape [batch_size, seq_length, hidden_size] corresponding\n          to the output of the embedding layer, after summing the word\n          embeddings with the positional embeddings and the token type embeddings,\n          then performing layer normalization. This is the input to the transformer.\n        \"\"\"\n        return self.embedding_output\n\n    def get_embedding_table(self):\n        return self.embedding_table\n\n\ndef gelu(input_tensor):\n    \"\"\"Gaussian Error Linear Unit.\n\n    This is a smoother version of the RELU.\n    Original paper: https://arxiv.org/abs/1606.08415\n\n    Args:\n      input_tensor: float Tensor to perform activation.\n\n    Returns:\n      `input_tensor` with the GELU activation applied.\n    \"\"\"\n    cdf = 0.5 * (1.0 + tf.erf(input_tensor / tf.sqrt(2.0)))\n    return input_tensor * cdf\n\n\ndef get_activation(activation_string):\n    \"\"\"Maps a string to a Python function, e.g., \"relu\" => `tf.nn.relu`.\n\n    Args:\n      activation_string: String name of the activation function.\n\n    Returns:\n      A Python function corresponding to the activation function. 
If\n      `activation_string` is None, empty, or \"linear\", this will return None.\n      If `activation_string` is not a string, it will return `activation_string`.\n\n    Raises:\n      ValueError: The `activation_string` does not correspond to a known\n        activation.\n    \"\"\"\n\n    # We assume that anything that's not a string is already an activation\n    # function, so we just return it.\n    if not isinstance(activation_string, six.string_types):\n        return activation_string\n\n    if not activation_string:\n        return None\n\n    act = activation_string.lower()\n    if act == \"linear\":\n        return None\n    elif act == \"relu\":\n        return tf.nn.relu\n    elif act == \"gelu\":\n        return gelu\n    elif act == \"tanh\":\n        return tf.tanh\n    else:\n        raise ValueError(\"Unsupported activation: %s\" % act)\n\n\ndef get_assignment_map_from_checkpoint(tvars, init_checkpoint):\n    \"\"\"Compute the union of the current variables and checkpoint variables.\"\"\"\n    initialized_variable_names = {}\n\n    name_to_variable = collections.OrderedDict()\n    for var in tvars:\n        name = var.name\n        m = re.match(\"^(.*):\\\\d+$\", name)\n        if m is not None:\n            name = m.group(1)\n        name_to_variable[name] = var\n\n    init_vars = tf.train.list_variables(init_checkpoint)  # checkpoint variables\n\n    assignment_map = collections.OrderedDict()\n    for x in init_vars:\n        (name, var) = (x[0], x[1])\n        if name not in name_to_variable:\n            continue\n        assignment_map[name] = name\n        initialized_variable_names[name] = 1\n        initialized_variable_names[name + \":0\"] = 1\n\n    return assignment_map, initialized_variable_names\n\n\ndef dropout(input_tensor, dropout_prob):\n    \"\"\"Perform dropout.\n\n    Args:\n      input_tensor: float Tensor.\n      dropout_prob: Python float. 
The probability of dropping out a value (NOT of\n        *keeping* a dimension as in `tf.nn.dropout`).\n\n    Returns:\n      A version of `input_tensor` with dropout applied.\n    \"\"\"\n    if dropout_prob is None or dropout_prob == 0.0:\n        return input_tensor\n\n    output = tf.nn.dropout(input_tensor, 1.0 - dropout_prob)\n    return output\n\n\ndef layer_norm(input_tensor, name=None):\n    \"\"\"Run layer normalization on the last dimension of the tensor.\"\"\"\n    return tf.contrib.layers.layer_norm(\n        inputs=input_tensor, begin_norm_axis=-1, begin_params_axis=-1, scope=name)\n\n\ndef layer_norm_and_dropout(input_tensor, dropout_prob, name=None):\n    \"\"\"Runs layer normalization followed by dropout.\"\"\"\n    output_tensor = layer_norm(input_tensor, name)\n    output_tensor = dropout(output_tensor, dropout_prob)\n    return output_tensor\n\n\ndef create_initializer(initializer_range=0.02):\n    \"\"\"Creates a `truncated_normal_initializer` with the given range.\"\"\"\n    return tf.truncated_normal_initializer(stddev=initializer_range)\n\n\ndef embedding_lookup(input_ids,\n                     vocab_size,\n                     embedding_size=128,\n                     initializer_range=0.02,\n                     word_embedding_name=\"word_embeddings\",\n                     use_one_hot_embeddings=False):\n    \"\"\"Looks up word embeddings for an id tensor.\n\n    Args:\n      input_ids: int32 Tensor of shape [batch_size, seq_length] containing word\n        ids.\n      vocab_size: int. Size of the embedding vocabulary.\n      embedding_size: int. Width of the word embeddings.\n      initializer_range: float. Embedding initialization range.\n      word_embedding_name: string. Name of the embedding table.\n      use_one_hot_embeddings: bool. If True, use one-hot method for word\n        embeddings. If False, use `tf.nn.embedding_lookup()`. 
One hot is better\n        for TPUs.\n\n    Returns:\n      float Tensor of shape [batch_size, seq_length, embedding_size].\n    \"\"\"\n    # This function assumes that the input is of shape [batch_size, seq_length,\n    # num_inputs].\n    #\n    # If the input is a 2D tensor of shape [batch_size, seq_length], we\n    # reshape to [batch_size, seq_length, 1].\n    if input_ids.shape.ndims == 2:\n        input_ids = tf.expand_dims(input_ids, axis=[-1])\n\n    embedding_table = tf.get_variable(\n        name=word_embedding_name,\n        shape=[vocab_size, embedding_size],\n        initializer=create_initializer(initializer_range))\n\n    if use_one_hot_embeddings:\n        flat_input_ids = tf.reshape(input_ids, [-1])\n        one_hot_input_ids = tf.one_hot(flat_input_ids, depth=vocab_size)\n        output = tf.matmul(one_hot_input_ids, embedding_table)\n    else:\n        output = tf.nn.embedding_lookup(embedding_table, input_ids)\n\n    input_shape = get_shape_list(input_ids)\n\n    output = tf.reshape(output,\n                        input_shape[0:-1] + [input_shape[-1] * embedding_size])\n    return (output, embedding_table)\n\n\ndef embedding_postprocessor(input_tensor,\n                            use_token_type=False,\n                            token_type_ids=None,\n                            token_type_vocab_size=16,\n                            token_type_embedding_name=\"token_type_embeddings\",\n                            use_position_embeddings=True,\n                            position_embedding_name=\"position_embeddings\",\n                            initializer_range=0.02,\n                            max_position_embeddings=512,\n                            dropout_prob=0.1):\n    \"\"\"Performs various post-processing on a word embedding tensor.\n\n    Args:\n      input_tensor: float Tensor of shape [batch_size, seq_length,\n        embedding_size].\n      use_token_type: bool. 
Whether to add embeddings for `token_type_ids`.\n      token_type_ids: (optional) int32 Tensor of shape [batch_size, seq_length].\n        Must be specified if `use_token_type` is True.\n      token_type_vocab_size: int. The vocabulary size of `token_type_ids`.\n      token_type_embedding_name: string. The name of the embedding table variable\n        for token type ids.\n      use_position_embeddings: bool. Whether to add position embeddings for the\n        position of each token in the sequence.\n      position_embedding_name: string. The name of the embedding table variable\n        for positional embeddings.\n      initializer_range: float. Range of the weight initialization.\n      max_position_embeddings: int. Maximum sequence length that might ever be\n        used with this model. This can be longer than the sequence length of\n        input_tensor, but cannot be shorter.\n      dropout_prob: float. Dropout probability applied to the final output tensor.\n\n    Returns:\n      float tensor with same shape as `input_tensor`.\n\n    Raises:\n      ValueError: One of the tensor shapes or input values is invalid.\n    \"\"\"\n    input_shape = get_shape_list(input_tensor, expected_rank=3)\n    batch_size = input_shape[0]\n    seq_length = input_shape[1]\n    width = input_shape[2]\n\n    output = input_tensor\n\n    if use_token_type:\n        if token_type_ids is None:\n            raise ValueError(\"`token_type_ids` must be specified if \"\n                             \"`use_token_type` is True.\")\n        token_type_table = tf.get_variable(\n            name=token_type_embedding_name,\n            shape=[token_type_vocab_size, width],\n            initializer=create_initializer(initializer_range))\n        # This vocab will be small so we always do one-hot here, since it is always\n        # faster for a small vocabulary.\n        flat_token_type_ids = tf.reshape(token_type_ids, [-1])\n        one_hot_ids = tf.one_hot(flat_token_type_ids, 
depth=token_type_vocab_size)\n        token_type_embeddings = tf.matmul(one_hot_ids, token_type_table)\n        token_type_embeddings = tf.reshape(token_type_embeddings,\n                                           [batch_size, seq_length, width])\n        output += token_type_embeddings\n\n    if use_position_embeddings:\n        assert_op = tf.assert_less_equal(seq_length, max_position_embeddings)\n        with tf.control_dependencies([assert_op]):\n            full_position_embeddings = tf.get_variable(\n                name=position_embedding_name,\n                shape=[max_position_embeddings, width],\n                initializer=create_initializer(initializer_range))\n            # Since the position embedding table is a learned variable, we create it\n            # using a (long) sequence length `max_position_embeddings`. The actual\n            # sequence length might be shorter than this, for faster training of\n            # tasks that do not have long sequences.\n            #\n            # So `full_position_embeddings` is effectively an embedding table\n            # for position [0, 1, 2, ..., max_position_embeddings-1], and the current\n            # sequence has positions [0, 1, 2, ... 
seq_length-1], so we can just\n            # perform a slice.\n            position_embeddings = tf.slice(full_position_embeddings, [0, 0],\n                                           [seq_length, -1])\n            num_dims = len(output.shape.as_list())\n\n            # Only the last two dimensions are relevant (`seq_length` and `width`), so\n            # we broadcast among the first dimensions, which is typically just\n            # the batch size.\n            position_broadcast_shape = []\n            for _ in range(num_dims - 2):\n                position_broadcast_shape.append(1)\n            position_broadcast_shape.extend([seq_length, width])\n            position_embeddings = tf.reshape(position_embeddings,\n                                             position_broadcast_shape)\n            output += position_embeddings\n\n    output = layer_norm_and_dropout(output, dropout_prob)\n    return output\n\n\ndef create_attention_mask_from_input_mask(from_tensor, to_mask):\n    \"\"\"Create 3D attention mask from a 2D tensor mask.\n\n    Args:\n      from_tensor: 2D or 3D Tensor of shape [batch_size, from_seq_length, ...].\n      to_mask: int32 Tensor of shape [batch_size, to_seq_length].\n\n    Returns:\n      float Tensor of shape [batch_size, from_seq_length, to_seq_length].\n    \"\"\"\n    from_shape = get_shape_list(from_tensor, expected_rank=[2, 3])\n    batch_size = from_shape[0]\n    from_seq_length = from_shape[1]\n\n    to_shape = get_shape_list(to_mask, expected_rank=2)\n    to_seq_length = to_shape[1]\n\n    to_mask = tf.cast(\n        tf.reshape(to_mask, [batch_size, 1, to_seq_length]), tf.float32)\n\n    # We don't assume that `from_tensor` is a mask (although it could be). 
We\n    # don't actually care if we attend *from* padding tokens (only *to* padding\n    # tokens), so we create a tensor of all ones.\n    #\n    # `broadcast_ones` = [batch_size, from_seq_length, 1]\n    broadcast_ones = tf.ones(\n        shape=[batch_size, from_seq_length, 1], dtype=tf.float32)\n\n    # Here we broadcast along two dimensions to create the mask.\n    mask = broadcast_ones * to_mask\n\n    return mask\n\n\ndef attention_layer(from_tensor,\n                    to_tensor,\n                    attention_mask=None,\n                    num_attention_heads=1,\n                    size_per_head=512,\n                    query_act=None,\n                    key_act=None,\n                    value_act=None,\n                    attention_probs_dropout_prob=0.0,\n                    initializer_range=0.02,\n                    do_return_2d_tensor=False,\n                    batch_size=None,\n                    from_seq_length=None,\n                    to_seq_length=None):\n    \"\"\"Performs multi-headed attention from `from_tensor` to `to_tensor`.\n\n    This is an implementation of multi-headed attention based on \"Attention\n    Is All You Need\". If `from_tensor` and `to_tensor` are the same, then\n    this is self-attention. Each timestep in `from_tensor` attends to the\n    corresponding sequence in `to_tensor`, and returns a fixed-width vector.\n\n    This function first projects `from_tensor` into a \"query\" tensor and\n    `to_tensor` into \"key\" and \"value\" tensors. These are (effectively) a list\n    of tensors of length `num_attention_heads`, where each tensor is of shape\n    [batch_size, seq_length, size_per_head].\n\n    Then, the query and key tensors are dot-producted and scaled. These are\n    softmaxed to obtain attention probabilities. 
The value tensors are then\n    interpolated by these probabilities, then concatenated back to a single\n    tensor and returned.\n\n    In practice, the multi-headed attention is done with transposes and\n    reshapes rather than actual separate tensors.\n\n    Args:\n      from_tensor: float Tensor of shape [batch_size, from_seq_length,\n        from_width].\n      to_tensor: float Tensor of shape [batch_size, to_seq_length, to_width].\n      attention_mask: (optional) int32 Tensor of shape [batch_size,\n        from_seq_length, to_seq_length]. The values should be 1 or 0. The\n        attention scores will effectively be set to -infinity for any positions in\n        the mask that are 0, and will be unchanged for positions that are 1.\n      num_attention_heads: int. Number of attention heads.\n      size_per_head: int. Size of each attention head.\n      query_act: (optional) Activation function for the query transform.\n      key_act: (optional) Activation function for the key transform.\n      value_act: (optional) Activation function for the value transform.\n      attention_probs_dropout_prob: (optional) float. Dropout probability of the\n        attention probabilities.\n      initializer_range: float. Range of the weight initializer.\n      do_return_2d_tensor: bool. If True, the output will be of shape [batch_size\n        * from_seq_length, num_attention_heads * size_per_head]. If False, the\n        output will be of shape [batch_size, from_seq_length, num_attention_heads\n        * size_per_head].\n      batch_size: (Optional) int. 
If the input is 2D, this might be the batch size\n        of the 3D version of the `from_tensor` and `to_tensor`.\n      from_seq_length: (Optional) If the input is 2D, this might be the seq length\n        of the 3D version of the `from_tensor`.\n      to_seq_length: (Optional) If the input is 2D, this might be the seq length\n        of the 3D version of the `to_tensor`.\n\n    Returns:\n      float Tensor of shape [batch_size, from_seq_length,\n        num_attention_heads * size_per_head]. (If `do_return_2d_tensor` is\n        true, this will be of shape [batch_size * from_seq_length,\n        num_attention_heads * size_per_head]).\n\n    Raises:\n      ValueError: Any of the arguments or tensor shapes are invalid.\n    \"\"\"\n\n    def transpose_for_scores(input_tensor, batch_size, num_attention_heads,\n                             seq_length, width):\n        output_tensor = tf.reshape(\n            input_tensor, [batch_size, seq_length, num_attention_heads, width])\n\n        output_tensor = tf.transpose(output_tensor, [0, 2, 1, 3])\n        return output_tensor\n\n    from_shape = get_shape_list(from_tensor, expected_rank=[2, 3])\n    to_shape = get_shape_list(to_tensor, expected_rank=[2, 3])\n\n    if len(from_shape) != len(to_shape):\n        raise ValueError(\n            \"The rank of `from_tensor` must match the rank of `to_tensor`.\")\n\n    if len(from_shape) == 3:\n        batch_size = from_shape[0]\n        from_seq_length = from_shape[1]\n        to_seq_length = to_shape[1]\n    elif len(from_shape) == 2:\n        if (batch_size is None or from_seq_length is None or to_seq_length is None):\n            raise ValueError(\n                \"When passing in rank 2 tensors to attention_layer, the values \"\n                \"for `batch_size`, `from_seq_length`, and `to_seq_length` \"\n                \"must all be specified.\")\n\n    # Scalar dimensions referenced here:\n    #   B = batch size (number of sequences)\n    #   F = `from_tensor` sequence 
length\n    #   T = `to_tensor` sequence length\n    #   N = `num_attention_heads`\n    #   H = `size_per_head`\n\n    from_tensor_2d = reshape_to_matrix(from_tensor)\n    to_tensor_2d = reshape_to_matrix(to_tensor)\n\n    # `query_layer` = [B*F, N*H]\n    query_layer = tf.layers.dense(\n        from_tensor_2d,\n        num_attention_heads * size_per_head,\n        activation=query_act,\n        name=\"query\",\n        kernel_initializer=create_initializer(initializer_range))\n\n    # `key_layer` = [B*T, N*H]\n    key_layer = tf.layers.dense(\n        to_tensor_2d,\n        num_attention_heads * size_per_head,\n        activation=key_act,\n        name=\"key\",\n        kernel_initializer=create_initializer(initializer_range))\n\n    # `value_layer` = [B*T, N*H]\n    value_layer = tf.layers.dense(\n        to_tensor_2d,\n        num_attention_heads * size_per_head,\n        activation=value_act,\n        name=\"value\",\n        kernel_initializer=create_initializer(initializer_range))\n\n    # `query_layer` = [B, N, F, H]\n    query_layer = transpose_for_scores(query_layer, batch_size,\n                                       num_attention_heads, from_seq_length,\n                                       size_per_head)\n\n    # `key_layer` = [B, N, T, H]\n    key_layer = transpose_for_scores(key_layer, batch_size, num_attention_heads,\n                                     to_seq_length, size_per_head)\n\n    # Take the dot product between \"query\" and \"key\" to get the raw\n    # attention scores.\n    # `attention_scores` = [B, N, F, T]\n    attention_scores = tf.matmul(query_layer, key_layer, transpose_b=True)\n    attention_scores = tf.multiply(attention_scores,\n                                   1.0 / math.sqrt(float(size_per_head)))\n\n    if attention_mask is not None:\n        # `attention_mask` = [B, 1, F, T]\n        attention_mask = tf.expand_dims(attention_mask, axis=[1])\n\n        # Since attention_mask is 1.0 for positions we want to attend and 0.0 
for\n        # masked positions, this operation will create a tensor which is 0.0 for\n        # positions we want to attend and -10000.0 for masked positions.\n        adder = (1.0 - tf.cast(attention_mask, tf.float32)) * -10000.0\n\n        # Since we are adding it to the raw scores before the softmax, this is\n        # effectively the same as removing these entirely.\n        attention_scores += adder\n\n    # Normalize the attention scores to probabilities.\n    # `attention_probs` = [B, N, F, T]\n    attention_probs = tf.nn.softmax(attention_scores)\n\n    # This is actually dropping out entire tokens to attend to, which might\n    # seem a bit unusual, but is taken from the original Transformer paper.\n    attention_probs = dropout(attention_probs, attention_probs_dropout_prob)\n\n    # `value_layer` = [B, T, N, H]\n    value_layer = tf.reshape(\n        value_layer,\n        [batch_size, to_seq_length, num_attention_heads, size_per_head])\n\n    # `value_layer` = [B, N, T, H]\n    value_layer = tf.transpose(value_layer, [0, 2, 1, 3])\n\n    # `context_layer` = [B, N, F, H]\n    context_layer = tf.matmul(attention_probs, value_layer)\n\n    # `context_layer` = [B, F, N, H]\n    context_layer = tf.transpose(context_layer, [0, 2, 1, 3])\n\n    if do_return_2d_tensor:\n        # `context_layer` = [B*F, N*H]\n        context_layer = tf.reshape(\n            context_layer,\n            [batch_size * from_seq_length, num_attention_heads * size_per_head])\n    else:\n        # `context_layer` = [B, F, N*H]\n        context_layer = tf.reshape(\n            context_layer,\n            [batch_size, from_seq_length, num_attention_heads * size_per_head])\n\n    return context_layer\n\n\ndef transformer_model(input_tensor,\n                      attention_mask=None,\n                      hidden_size=768,\n                      num_hidden_layers=12,\n                      num_attention_heads=12,\n                      intermediate_size=3072,\n                      
intermediate_act_fn=gelu,\n                      hidden_dropout_prob=0.1,\n                      attention_probs_dropout_prob=0.1,\n                      initializer_range=0.02,\n                      do_return_all_layers=False):\n    \"\"\"Multi-headed, multi-layer Transformer from \"Attention is All You Need\".\n\n    This is almost an exact implementation of the original Transformer encoder.\n\n    See the original paper:\n    https://arxiv.org/abs/1706.03762\n\n    Also see:\n    https://github.com/tensorflow/tensor2tensor/blob/master/tensor2tensor/models/transformer.py\n\n    Args:\n      input_tensor: float Tensor of shape [batch_size, seq_length, hidden_size].\n      attention_mask: (optional) int32 Tensor of shape [batch_size, seq_length,\n        seq_length], with 1 for positions that can be attended to and 0 in\n        positions that should not be.\n      hidden_size: int. Hidden size of the Transformer.\n      num_hidden_layers: int. Number of layers (blocks) in the Transformer.\n      num_attention_heads: int. Number of attention heads in the Transformer.\n      intermediate_size: int. The size of the \"intermediate\" (a.k.a., feed\n        forward) layer.\n      intermediate_act_fn: function. The non-linear activation function to apply\n        to the output of the intermediate/feed-forward layer.\n      hidden_dropout_prob: float. Dropout probability for the hidden layers.\n      attention_probs_dropout_prob: float. Dropout probability of the attention\n        probabilities.\n      initializer_range: float. 
Range of the initializer (stddev of truncated\n        normal).\n      do_return_all_layers: Whether to also return all layers or just the final\n        layer.\n\n    Returns:\n      float Tensor of shape [batch_size, seq_length, hidden_size], the final\n      hidden layer of the Transformer.\n\n    Raises:\n      ValueError: A Tensor shape or parameter is invalid.\n    \"\"\"\n    if hidden_size % num_attention_heads != 0:\n        raise ValueError(\n            \"The hidden size (%d) is not a multiple of the number of attention \"\n            \"heads (%d)\" % (hidden_size, num_attention_heads))\n\n    attention_head_size = int(hidden_size / num_attention_heads)\n    input_shape = get_shape_list(input_tensor, expected_rank=3)\n    batch_size = input_shape[0]\n    seq_length = input_shape[1]\n    input_width = input_shape[2]\n\n    # The Transformer performs sum residuals on all layers so the input needs\n    # to be the same as the hidden size.\n    if input_width != hidden_size:\n        raise ValueError(\"The width of the input tensor (%d) != hidden size (%d)\" %\n                         (input_width, hidden_size))\n\n    # We keep the representation as a 2D tensor to avoid re-shaping it back and\n    # forth from a 3D tensor to a 2D tensor. 
Re-shapes are normally free on\n    # the GPU/CPU but may not be free on the TPU, so we want to minimize them to\n    # help the optimizer.\n    prev_output = reshape_to_matrix(input_tensor)\n\n    all_layer_outputs = []\n    for layer_idx in range(num_hidden_layers):\n        with tf.variable_scope(\"layer_%d\" % layer_idx):\n            layer_input = prev_output\n\n            with tf.variable_scope(\"attention\"):\n                attention_heads = []\n                with tf.variable_scope(\"self\"):\n                    attention_head = attention_layer(\n                        from_tensor=layer_input,\n                        to_tensor=layer_input,\n                        attention_mask=attention_mask,\n                        num_attention_heads=num_attention_heads,\n                        size_per_head=attention_head_size,\n                        attention_probs_dropout_prob=attention_probs_dropout_prob,\n                        initializer_range=initializer_range,\n                        do_return_2d_tensor=True,\n                        batch_size=batch_size,\n                        from_seq_length=seq_length,\n                        to_seq_length=seq_length)\n                    attention_heads.append(attention_head)\n\n                attention_output = None\n                if len(attention_heads) == 1:\n                    attention_output = attention_heads[0]\n                else:\n                    # In the case where we have other sequences, we just concatenate\n                    # them to the self-attention head before the projection.\n                    attention_output = tf.concat(attention_heads, axis=-1)\n\n                # Run a linear projection of `hidden_size` then add a residual\n                # with `layer_input`.\n                with tf.variable_scope(\"output\"):\n                    attention_output = tf.layers.dense(\n                        attention_output,\n                        hidden_size,\n                     
   kernel_initializer=create_initializer(initializer_range))\n                    attention_output = dropout(attention_output, hidden_dropout_prob)\n                    attention_output = layer_norm(attention_output + layer_input)\n\n            # The activation is only applied to the \"intermediate\" hidden layer.\n            with tf.variable_scope(\"intermediate\"):\n                intermediate_output = tf.layers.dense(\n                    attention_output,\n                    intermediate_size,\n                    activation=intermediate_act_fn,\n                    kernel_initializer=create_initializer(initializer_range))\n\n            # Down-project back to `hidden_size` then add the residual.\n            with tf.variable_scope(\"output\"):\n                layer_output = tf.layers.dense(\n                    intermediate_output,\n                    hidden_size,\n                    kernel_initializer=create_initializer(initializer_range))\n                layer_output = dropout(layer_output, hidden_dropout_prob)\n                layer_output = layer_norm(layer_output + attention_output)\n                prev_output = layer_output\n                all_layer_outputs.append(layer_output)\n\n    if do_return_all_layers:\n        final_outputs = []\n        for layer_output in all_layer_outputs:\n            final_output = reshape_from_matrix(layer_output, input_shape)\n            final_outputs.append(final_output)\n        return final_outputs\n    else:\n        final_output = reshape_from_matrix(prev_output, input_shape)\n        return final_output\n\n\ndef get_shape_list(tensor, expected_rank=None, name=None):\n    \"\"\"Returns a list of the shape of tensor, preferring static dimensions.\n\n    Args:\n      tensor: A tf.Tensor object to find the shape of.\n      expected_rank: (optional) int. The expected rank of `tensor`. 
If this is\n        specified and the `tensor` has a different rank, an exception will be\n        thrown.\n      name: Optional name of the tensor for the error message.\n\n    Returns:\n      A list of dimensions of the shape of tensor. All static dimensions will\n      be returned as python integers, and dynamic dimensions will be returned\n      as tf.Tensor scalars.\n    \"\"\"\n    if name is None:\n        name = tensor.name\n\n    if expected_rank is not None:\n        assert_rank(tensor, expected_rank, name)\n\n    shape = tensor.shape.as_list()\n\n    non_static_indexes = []\n    for (index, dim) in enumerate(shape):\n        if dim is None:\n            non_static_indexes.append(index)\n\n    if not non_static_indexes:\n        return shape\n\n    dyn_shape = tf.shape(tensor)\n    for index in non_static_indexes:\n        shape[index] = dyn_shape[index]\n    return shape\n\n\ndef reshape_to_matrix(input_tensor):\n    \"\"\"Reshapes a >= rank 2 tensor to a rank 2 tensor (i.e., a matrix).\"\"\"\n    ndims = input_tensor.shape.ndims\n    if ndims < 2:\n        raise ValueError(\"Input tensor must have at least rank 2. 
Shape = %s\" %\n                         (input_tensor.shape))\n    if ndims == 2:\n        return input_tensor\n\n    width = input_tensor.shape[-1]\n    output_tensor = tf.reshape(input_tensor, [-1, width])\n    return output_tensor\n\n\ndef reshape_from_matrix(output_tensor, orig_shape_list):\n    \"\"\"Reshapes a rank 2 tensor back to its original rank >= 2 tensor.\"\"\"\n    if len(orig_shape_list) == 2:\n        return output_tensor\n\n    output_shape = get_shape_list(output_tensor)\n\n    orig_dims = orig_shape_list[0:-1]\n    width = output_shape[-1]\n\n    return tf.reshape(output_tensor, orig_dims + [width])\n\n\ndef assert_rank(tensor, expected_rank, name=None):\n    \"\"\"Raises an exception if the tensor rank is not of the expected rank.\n\n    Args:\n      tensor: A tf.Tensor to check the rank of.\n      expected_rank: Python integer or list of integers, expected rank.\n      name: Optional name of the tensor for the error message.\n\n    Raises:\n      ValueError: If the expected shape doesn't match the actual shape.\n    \"\"\"\n    if name is None:\n        name = tensor.name\n\n    expected_rank_dict = {}\n    if isinstance(expected_rank, six.integer_types):\n        expected_rank_dict[expected_rank] = True\n    else:\n        for x in expected_rank:\n            expected_rank_dict[x] = True\n\n    actual_rank = tensor.shape.ndims\n    if actual_rank not in expected_rank_dict:\n        scope_name = tf.get_variable_scope().name\n        raise ValueError(\n            \"For the tensor `%s` in scope `%s`, the actual rank \"\n            \"`%d` (shape = %s) is not equal to the expected rank `%s`\" %\n            (name, scope_name, actual_rank, str(tensor.shape), str(expected_rank)))\n"
  },
  {
    "path": "bert/optimization.py",
    "content": "# coding=utf-8\n# Copyright 2018 The Google AI Language Team Authors.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Functions and classes related to optimization (weight updates).\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport re\nimport tensorflow as tf\n\n\ndef create_optimizer(loss, init_lr, num_train_steps, num_warmup_steps, use_tpu):\n  \"\"\"Creates an optimizer training op.\"\"\"\n  global_step = tf.train.get_or_create_global_step()\n\n  learning_rate = tf.constant(value=init_lr, shape=[], dtype=tf.float32)\n\n  # Implements linear decay of the learning rate.\n  learning_rate = tf.train.polynomial_decay(\n      learning_rate,\n      global_step,\n      num_train_steps,\n      end_learning_rate=0.0,\n      power=1.0,\n      cycle=False)\n\n  # Implements linear warmup. 
I.e., if global_step < num_warmup_steps, the\n  # learning rate will be `global_step/num_warmup_steps * init_lr`.\n  if num_warmup_steps:\n    global_steps_int = tf.cast(global_step, tf.int32)\n    warmup_steps_int = tf.constant(num_warmup_steps, dtype=tf.int32)\n\n    global_steps_float = tf.cast(global_steps_int, tf.float32)\n    warmup_steps_float = tf.cast(warmup_steps_int, tf.float32)\n\n    warmup_percent_done = global_steps_float / warmup_steps_float\n    warmup_learning_rate = init_lr * warmup_percent_done\n\n    is_warmup = tf.cast(global_steps_int < warmup_steps_int, tf.float32)\n    learning_rate = (\n        (1.0 - is_warmup) * learning_rate + is_warmup * warmup_learning_rate)\n\n  # It is recommended that you use this optimizer for fine tuning, since this\n  # is how the model was trained (note that the Adam m/v variables are NOT\n  # loaded from init_checkpoint.)\n  optimizer = AdamWeightDecayOptimizer(\n      learning_rate=learning_rate,\n      weight_decay_rate=0.01,\n      beta_1=0.9,\n      beta_2=0.999,\n      epsilon=1e-6,\n      exclude_from_weight_decay=[\"LayerNorm\", \"layer_norm\", \"bias\"])\n\n  if use_tpu:\n    optimizer = tf.contrib.tpu.CrossShardOptimizer(optimizer)\n\n  tvars = tf.trainable_variables()\n  grads = tf.gradients(loss, tvars)\n\n  # This is how the model was pre-trained.\n  (grads, _) = tf.clip_by_global_norm(grads, clip_norm=1.0)\n\n  train_op = optimizer.apply_gradients(\n      zip(grads, tvars), global_step=global_step)\n\n  # Normally the global step update is done inside of `apply_gradients`.\n  # However, `AdamWeightDecayOptimizer` doesn't do this. 
But if you use\n  # a different optimizer, you should probably take this line out.\n  new_global_step = global_step + 1\n  train_op = tf.group(train_op, [global_step.assign(new_global_step)])\n  return train_op\n\n\nclass AdamWeightDecayOptimizer(tf.train.Optimizer):\n  \"\"\"A basic Adam optimizer that includes \"correct\" L2 weight decay.\"\"\"\n\n  def __init__(self,\n               learning_rate,\n               weight_decay_rate=0.0,\n               beta_1=0.9,\n               beta_2=0.999,\n               epsilon=1e-6,\n               exclude_from_weight_decay=None,\n               name=\"AdamWeightDecayOptimizer\"):\n    \"\"\"Constructs an AdamWeightDecayOptimizer.\"\"\"\n    super(AdamWeightDecayOptimizer, self).__init__(False, name)\n\n    self.learning_rate = learning_rate\n    self.weight_decay_rate = weight_decay_rate\n    self.beta_1 = beta_1\n    self.beta_2 = beta_2\n    self.epsilon = epsilon\n    self.exclude_from_weight_decay = exclude_from_weight_decay\n\n  def apply_gradients(self, grads_and_vars, global_step=None, name=None):\n    \"\"\"See base class.\"\"\"\n    assignments = []\n    for (grad, param) in grads_and_vars:\n      if grad is None or param is None:\n        continue\n\n      param_name = self._get_variable_name(param.name)\n\n      m = tf.get_variable(\n          name=param_name + \"/adam_m\",\n          shape=param.shape.as_list(),\n          dtype=tf.float32,\n          trainable=False,\n          initializer=tf.zeros_initializer())\n      v = tf.get_variable(\n          name=param_name + \"/adam_v\",\n          shape=param.shape.as_list(),\n          dtype=tf.float32,\n          trainable=False,\n          initializer=tf.zeros_initializer())\n\n      # Standard Adam update.\n      next_m = (\n          tf.multiply(self.beta_1, m) + tf.multiply(1.0 - self.beta_1, grad))\n      next_v = (\n          tf.multiply(self.beta_2, v) + tf.multiply(1.0 - self.beta_2,\n                                                    
tf.square(grad)))\n\n      update = next_m / (tf.sqrt(next_v) + self.epsilon)\n\n      # Just adding the square of the weights to the loss function is *not*\n      # the correct way of using L2 regularization/weight decay with Adam,\n      # since that will interact with the m and v parameters in strange ways.\n      #\n      # Instead we want to decay the weights in a manner that doesn't interact\n      # with the m/v parameters. This is equivalent to adding the square\n      # of the weights to the loss with plain (non-momentum) SGD.\n      if self._do_use_weight_decay(param_name):\n        update += self.weight_decay_rate * param\n\n      update_with_lr = self.learning_rate * update\n\n      next_param = param - update_with_lr\n\n      assignments.extend(\n          [param.assign(next_param),\n           m.assign(next_m),\n           v.assign(next_v)])\n    return tf.group(*assignments, name=name)\n\n  def _do_use_weight_decay(self, param_name):\n    \"\"\"Whether to use L2 weight decay for `param_name`.\"\"\"\n    if not self.weight_decay_rate:\n      return False\n    if self.exclude_from_weight_decay:\n      for r in self.exclude_from_weight_decay:\n        if re.search(r, param_name) is not None:\n          return False\n    return True\n\n  def _get_variable_name(self, param_name):\n    \"\"\"Get the variable name from the tensor name.\"\"\"\n    m = re.match(\"^(.*):\\\\d+$\", param_name)\n    if m is not None:\n      param_name = m.group(1)\n    return param_name\n"
  },
  {
    "path": "bert/tokenization.py",
    "content": "# coding=utf-8\n# Copyright 2018 The Google AI Language Team Authors.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Tokenization classes.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport collections\nimport re\nimport unicodedata\nimport six\nimport tensorflow as tf\n\n\ndef validate_case_matches_checkpoint(do_lower_case, init_checkpoint):\n  \"\"\"Checks whether the casing config is consistent with the checkpoint name.\"\"\"\n\n  # The casing has to be passed in by the user and there is no explicit check\n  # as to whether it matches the checkpoint. 
The casing information probably\n  # should have been stored in the bert_config.json file, but it's not, so\n  # we have to heuristically detect it to validate.\n\n  if not init_checkpoint:\n    return\n\n  m = re.match(\"^.*?([A-Za-z0-9_-]+)/bert_model.ckpt\", init_checkpoint)\n  if m is None:\n    return\n\n  model_name = m.group(1)\n\n  lower_models = [\n      \"uncased_L-24_H-1024_A-16\", \"uncased_L-12_H-768_A-12\",\n      \"multilingual_L-12_H-768_A-12\", \"chinese_L-12_H-768_A-12\"\n  ]\n\n  cased_models = [\n      \"cased_L-12_H-768_A-12\", \"cased_L-24_H-1024_A-16\",\n      \"multi_cased_L-12_H-768_A-12\"\n  ]\n\n  is_bad_config = False\n  if model_name in lower_models and not do_lower_case:\n    is_bad_config = True\n    actual_flag = \"False\"\n    case_name = \"lowercased\"\n    opposite_flag = \"True\"\n\n  if model_name in cased_models and do_lower_case:\n    is_bad_config = True\n    actual_flag = \"True\"\n    case_name = \"cased\"\n    opposite_flag = \"False\"\n\n  if is_bad_config:\n    raise ValueError(\n        \"You passed in `--do_lower_case=%s` with `--init_checkpoint=%s`. \"\n        \"However, `%s` seems to be a %s model, so you \"\n        \"should pass in `--do_lower_case=%s` so that the fine-tuning matches \"\n        \"how the model was pre-trained. 
If this error is wrong, please \"\n        \"just comment out this check.\" % (actual_flag, init_checkpoint,\n                                          model_name, case_name, opposite_flag))\n\n\ndef convert_to_unicode(text):\n  \"\"\"Converts `text` to Unicode (if it's not already), assuming utf-8 input.\"\"\"\n  if six.PY3:\n    if isinstance(text, str):\n      return text\n    elif isinstance(text, bytes):\n      return text.decode(\"utf-8\", \"ignore\")\n    else:\n      raise ValueError(\"Unsupported string type: %s\" % (type(text)))\n  elif six.PY2:\n    if isinstance(text, str):\n      return text.decode(\"utf-8\", \"ignore\")\n    elif isinstance(text, unicode):\n      return text\n    else:\n      raise ValueError(\"Unsupported string type: %s\" % (type(text)))\n  else:\n    raise ValueError(\"Not running on Python 2 or Python 3?\")\n\n\ndef printable_text(text):\n  \"\"\"Returns text encoded in a way suitable for print or `tf.logging`.\"\"\"\n\n  # These functions want `str` for both Python 2 and Python 3, but in one case\n  # it's a Unicode string and in the other it's a byte string.\n  if six.PY3:\n    if isinstance(text, str):\n      return text\n    elif isinstance(text, bytes):\n      return text.decode(\"utf-8\", \"ignore\")\n    else:\n      raise ValueError(\"Unsupported string type: %s\" % (type(text)))\n  elif six.PY2:\n    if isinstance(text, str):\n      return text\n    elif isinstance(text, unicode):\n      return text.encode(\"utf-8\")\n    else:\n      raise ValueError(\"Unsupported string type: %s\" % (type(text)))\n  else:\n    raise ValueError(\"Not running on Python 2 or Python 3?\")\n\n\ndef load_vocab(vocab_file):\n  \"\"\"Loads a vocabulary file into a dictionary.\"\"\"\n  vocab = collections.OrderedDict()\n  index = 0\n  with tf.gfile.GFile(vocab_file, \"r\") as reader:\n    while True:\n      token = convert_to_unicode(reader.readline())\n      if not token:\n        break\n      token = token.strip()\n      vocab[token] = index\n     
 index += 1\n  return vocab\n\n\ndef convert_by_vocab(vocab, items):\n  \"\"\"Converts a sequence of [tokens|ids] using the vocab.\"\"\"\n  output = []\n  for item in items:\n    output.append(vocab[item])\n  return output\n\n\ndef convert_tokens_to_ids(vocab, tokens):\n  return convert_by_vocab(vocab, tokens)\n\n\ndef convert_ids_to_tokens(inv_vocab, ids):\n  return convert_by_vocab(inv_vocab, ids)\n\n\ndef whitespace_tokenize(text):\n  \"\"\"Runs basic whitespace cleaning and splitting on a piece of text.\"\"\"\n  text = text.strip()\n  if not text:\n    return []\n  tokens = text.split()\n  return tokens\n\n\nclass FullTokenizer(object):\n  \"\"\"Runs end-to-end tokenization.\"\"\"\n\n  def __init__(self, vocab_file, do_lower_case=True):\n    self.vocab = load_vocab(vocab_file)\n    self.inv_vocab = {v: k for k, v in self.vocab.items()}\n    self.basic_tokenizer = BasicTokenizer(do_lower_case=do_lower_case)\n    self.wordpiece_tokenizer = WordpieceTokenizer(vocab=self.vocab)\n\n  def tokenize(self, text):\n    split_tokens = []\n    for token in self.basic_tokenizer.tokenize(text):\n      for sub_token in self.wordpiece_tokenizer.tokenize(token):\n        split_tokens.append(sub_token)\n\n    return split_tokens\n\n  def convert_tokens_to_ids(self, tokens):\n    return convert_by_vocab(self.vocab, tokens)\n\n  def convert_ids_to_tokens(self, ids):\n    return convert_by_vocab(self.inv_vocab, ids)\n\n\nclass BasicTokenizer(object):\n  \"\"\"Runs basic tokenization (punctuation splitting, lower casing, etc.).\"\"\"\n\n  def __init__(self, do_lower_case=True):\n    \"\"\"Constructs a BasicTokenizer.\n\n    Args:\n      do_lower_case: Whether to lower case the input.\n    \"\"\"\n    self.do_lower_case = do_lower_case\n\n  def tokenize(self, text):\n    \"\"\"Tokenizes a piece of text.\"\"\"\n    text = convert_to_unicode(text)\n    text = self._clean_text(text)\n\n    # This was added on November 1st, 2018 for the multilingual and Chinese\n    # models. 
This is also applied to the English models now, but it doesn't\n    # matter since the English models were not trained on any Chinese data\n    # and generally don't have any Chinese data in them (there are Chinese\n    # characters in the vocabulary because Wikipedia does have some Chinese\n    # words in the English Wikipedia.).\n    text = self._tokenize_chinese_chars(text)\n\n    orig_tokens = whitespace_tokenize(text)\n    split_tokens = []\n    for token in orig_tokens:\n      if self.do_lower_case:\n        token = token.lower()\n        token = self._run_strip_accents(token)\n      split_tokens.extend(self._run_split_on_punc(token))\n\n    output_tokens = whitespace_tokenize(\" \".join(split_tokens))\n    return output_tokens\n\n  def _run_strip_accents(self, text):\n    \"\"\"Strips accents from a piece of text.\"\"\"\n    text = unicodedata.normalize(\"NFD\", text)\n    output = []\n    for char in text:\n      cat = unicodedata.category(char)\n      if cat == \"Mn\":\n        continue\n      output.append(char)\n    return \"\".join(output)\n\n  def _run_split_on_punc(self, text):\n    \"\"\"Splits punctuation on a piece of text.\"\"\"\n    chars = list(text)\n    i = 0\n    start_new_word = True\n    output = []\n    while i < len(chars):\n      char = chars[i]\n      if _is_punctuation(char):\n        output.append([char])\n        start_new_word = True\n      else:\n        if start_new_word:\n          output.append([])\n        start_new_word = False\n        output[-1].append(char)\n      i += 1\n\n    return [\"\".join(x) for x in output]\n\n  def _tokenize_chinese_chars(self, text):\n    \"\"\"Adds whitespace around any CJK character.\"\"\"\n    output = []\n    for char in text:\n      cp = ord(char)\n      if self._is_chinese_char(cp):\n        output.append(\" \")\n        output.append(char)\n        output.append(\" \")\n      else:\n        output.append(char)\n    return \"\".join(output)\n\n  def _is_chinese_char(self, cp):\n    
\"\"\"Checks whether CP is the codepoint of a CJK character.\"\"\"\n    # This defines a \"chinese character\" as anything in the CJK Unicode block:\n    #   https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_(Unicode_block)\n    #\n    # Note that the CJK Unicode block is NOT all Japanese and Korean characters,\n    # despite its name. The modern Korean Hangul alphabet is a different block,\n    # as is Japanese Hiragana and Katakana. Those alphabets are used to write\n    # space-separated words, so they are not treated specially and handled\n    # like all of the other languages.\n    if ((cp >= 0x4E00 and cp <= 0x9FFF) or  #\n        (cp >= 0x3400 and cp <= 0x4DBF) or  #\n        (cp >= 0x20000 and cp <= 0x2A6DF) or  #\n        (cp >= 0x2A700 and cp <= 0x2B73F) or  #\n        (cp >= 0x2B740 and cp <= 0x2B81F) or  #\n        (cp >= 0x2B820 and cp <= 0x2CEAF) or\n        (cp >= 0xF900 and cp <= 0xFAFF) or  #\n        (cp >= 0x2F800 and cp <= 0x2FA1F)):  #\n      return True\n\n    return False\n\n  def _clean_text(self, text):\n    \"\"\"Performs invalid character removal and whitespace cleanup on text.\"\"\"\n    output = []\n    for char in text:\n      cp = ord(char)\n      if cp == 0 or cp == 0xfffd or _is_control(char):\n        continue\n      if _is_whitespace(char):\n        output.append(\" \")\n      else:\n        output.append(char)\n    return \"\".join(output)\n\n\nclass WordpieceTokenizer(object):\n  \"\"\"Runs WordPiece tokenization.\"\"\"\n\n  def __init__(self, vocab, unk_token=\"[UNK]\", max_input_chars_per_word=200):\n    self.vocab = vocab\n    self.unk_token = unk_token\n    self.max_input_chars_per_word = max_input_chars_per_word\n\n  def tokenize(self, text):\n    \"\"\"Tokenizes a piece of text into its word pieces.\n\n    This uses a greedy longest-match-first algorithm to perform tokenization\n    using the given vocabulary.\n\n    For example:\n      input = \"unaffable\"\n      output = [\"un\", \"##aff\", \"##able\"]\n\n    
Args:\n      text: A single token or whitespace separated tokens. This should have\n        already been passed through `BasicTokenizer`.\n\n    Returns:\n      A list of wordpiece tokens.\n    \"\"\"\n\n    text = convert_to_unicode(text)\n\n    output_tokens = []\n    for token in whitespace_tokenize(text):\n      chars = list(token)\n      if len(chars) > self.max_input_chars_per_word:\n        output_tokens.append(self.unk_token)\n        continue\n\n      is_bad = False\n      start = 0\n      sub_tokens = []\n      while start < len(chars):\n        end = len(chars)\n        cur_substr = None\n        while start < end:\n          substr = \"\".join(chars[start:end])\n          if start > 0:\n            substr = \"##\" + substr\n          if substr in self.vocab:\n            cur_substr = substr\n            break\n          end -= 1\n        if cur_substr is None:\n          is_bad = True\n          break\n        sub_tokens.append(cur_substr)\n        start = end\n\n      if is_bad:\n        output_tokens.append(self.unk_token)\n      else:\n        output_tokens.extend(sub_tokens)\n    return output_tokens\n\n\ndef _is_whitespace(char):\n  \"\"\"Checks whether `char` is a whitespace character.\"\"\"\n  # \\t, \\n, and \\r are technically control characters but we treat them\n  # as whitespace since they are generally considered as such.\n  if char == \" \" or char == \"\\t\" or char == \"\\n\" or char == \"\\r\":\n    return True\n  cat = unicodedata.category(char)\n  if cat == \"Zs\":\n    return True\n  return False\n\n\ndef _is_control(char):\n  \"\"\"Checks whether `char` is a control character.\"\"\"\n  # These are technically control characters but we count them as whitespace\n  # characters.\n  if char == \"\\t\" or char == \"\\n\" or char == \"\\r\":\n    return False\n  cat = unicodedata.category(char)\n  if cat.startswith(\"C\"):\n    return True\n  return False\n\n\ndef _is_punctuation(char):\n  \"\"\"Checks whether `char` is a punctuation 
character.\"\"\"\n  cp = ord(char)\n  # We treat all non-letter/number ASCII as punctuation.\n  # Characters such as \"^\", \"$\", and \"`\" are not in the Unicode\n  # Punctuation class but we treat them as punctuation anyway, for\n  # consistency.\n  if ((cp >= 33 and cp <= 47) or (cp >= 58 and cp <= 64) or\n      (cp >= 91 and cp <= 96) or (cp >= 123 and cp <= 126)):\n    return True\n  cat = unicodedata.category(char)\n  if cat.startswith(\"P\"):\n    return True\n  return False\n"
  },
  {
    "path": "conll-2012/scorer/v8.01/README.txt",
    "content": "NAME\n   CorScorer: Perl package for scoring coreference resolution systems\n   using different metrics.\n\n\nVERSION\n   v8.01 -- reference implementations of MUC, B-cubed, CEAF and BLANC metrics.\n\n\nCHANGES SINCE v8.0\n   - fixed a bug that crashed the BLANC scorer when a duplicate singleton\n     mention was present in the response.\n\nINSTALLATION\n   Requirements:\n      1. Perl: downloadable from http://perl.org\n      2. Algorithm-Munkres: included in this package and downloadable\n         from CPAN http://search.cpan.org/~tpederse/Algorithm-Munkres-0.08\n\nUSE\n   This package is distributed with two scripts to execute the scorer from\n   the command line.\n\n   Windows (tm): scorer.bat\n   Linux: scorer.pl\n\n\nSYNOPSIS\n   use CorScorer;\n\n   $metric = 'ceafm';\n\n   # Scores the whole dataset\n   &CorScorer::Score($metric, $keys_file, $response_file);\n\n   # Scores one file\n   &CorScorer::Score($metric, $keys_file, $response_file, $name);\n\n\nINPUT\n   metric: the metric desired to score the results:\n     muc: MUCScorer (Vilain et al, 1995)\n     bcub: B-Cubed (Bagga and Baldwin, 1998)\n     ceafm: CEAF (Luo et al., 2005) using mention-based similarity\n     ceafe: CEAF (Luo et al., 2005) using entity-based similarity\n     blanc: BLANC (Luo et al., 2014) BLANC metric for gold and predicted mentions\n     all: uses all the metrics to score\n\n   keys_file: file with expected coreference chains in CoNLL-2011/2012 format\n\n   response_file: file with output of coreference system (CoNLL-2011/2012 format)\n\n   name: [optional] the name of the document to score. If name is not\n     given, all the documents in the dataset will be scored. 
If given\n     name is \"none\" then all the documents are scored but only total\n     results are shown.\n\n\nOUTPUT\n   The score subroutine returns an array with four values in this order:\n   1) Recall numerator\n   2) Recall denominator\n   3) Precision numerator\n   4) Precision denominator\n\n   Recall, precision and F1 are also printed to the standard output when the variable\n   $VERBOSE is not null.\n\n   Final scores:\n   Recall = recall_numerator / recall_denominator\n   Precision = precision_numerator / precision_denominator\n   F1 = 2 * Recall * Precision / (Recall + Precision)\n\n   Identification of mentions\n   A scorer for identification of mentions (recall, precision and F1) is also included.\n   Mentions from the system response are compared with key mentions. This version performs\n   strict mention matching as was used in the CoNLL-2011 and 2012 shared tasks.\n\nAUTHORS\n   Emili Sapena, Universitat Politècnica de Catalunya, http://www.lsi.upc.edu/~esapena, esapena <at> lsi.upc.edu\n   Sameer Pradhan, sameer.pradhan <at> childrens.harvard.edu\n   Sebastian Martschat, sebastian.martschat <at> h-its.org\n   Xiaoqiang Luo, xql <at> google.com\n\nCOPYRIGHT AND LICENSE\n   Copyright (C) 2009-2011, Emili Sapena esapena <at> lsi.upc.edu\n                 2011-2014, Sameer Pradhan sameer.pradhan <at> childrens.harvard.edu\n\n   This program is free software; you can redistribute it and/or modify it\n   under the terms of the GNU General Public License as published by the\n   Free Software Foundation; either version 2 of the License, or (at your\n   option) any later version. This program is distributed in the hope that\n   it will be useful, but WITHOUT ANY WARRANTY; without even the implied\n   warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the\n   GNU General Public License for more details.\n\n   You should have received a copy of the GNU General Public License along\n   with this program; if not, write to the Free Software Foundation, Inc.,\n   59 Temple Place - Suite 330, Boston, MA 02111-1307, USA.\n\n"
  },
  {
    "path": "conll-2012/scorer/v8.01/scorer.bat",
    "content": "@rem = '--*-Perl-*--\n@echo off\nif \"%OS%\" == \"Windows_NT\" goto WinNT\nperl -x -S \"%0\" %1 %2 %3 %4 %5 %6 %7 %8 %9\ngoto endofperl\n:WinNT\nperl -x -S %0 %*\nif NOT \"%COMSPEC%\" == \"%SystemRoot%\\system32\\cmd.exe\" goto endofperl\nif %errorlevel% == 9009 echo You do not have Perl in your PATH.\nif errorlevel 1 goto script_failed_so_exit_with_non_zero_val 2>nul\ngoto endofperl\n@rem ';\n#!perl\n#line 15\n\nBEGIN {\n    $d = $0;\n    $d =~ s/\\/[^\\/][^\\/]*$//g;\n    push(@INC, $d.\"/lib\");\n}\n\nuse strict;\nuse CorScorer;\n\nif (@ARGV < 3) {\n  print q|\n  use: scorer.bat <metric> <keys_file> <response_file> [name]\n  \n  metric: the metric desired to score the results:\n     muc: MUCScorer (Vilain et al, 1995)\n     bcub: B-Cubed (Bagga and Baldwin, 1998)\n     ceafm: CEAF (Luo et al, 2005) using mention-based similarity\n     ceafe: CEAF (Luo et al, 2005) using entity-based similarity\n     all: uses all the metrics to score\n  \n  keys_file: file with expected coreference chains in SemEval format\n  \n  response_file: file with output of coreference system (SemEval format)\n  \n  name: [optional] the name of the document to score. If name is not\n     given, all the documents in the dataset will be scored. If given\n     name is \"none\" then all the documents are scored but only total\n     results are shown.\n  \n  |;\n  exit;\n}\n\nmy $metric = shift (@ARGV);\nif ($metric !~ /^(muc|bcub|ceafm|ceafe|all)/i) {\n  print \"Invalid metric\\n\";\n  exit;\n}\n\n\nif ($metric eq 'all') {\n  foreach my $m ('muc', 'bcub', 'ceafm', 'ceafe') {\n    print \"\\nMETRIC $m:\\n\";\n    &CorScorer::Score( $m, @ARGV );\n  }\n}\nelse {\n  &CorScorer::Score( $metric, @ARGV );\n}\n\n__END__\n:endofperl\n"
  },
  {
    "path": "conll-2012/scorer/v8.01/scorer.pl",
    "content": "#!/usr/bin/perl\n\nBEGIN {\n  $d = $0;\n  $d =~ s/\\/[^\\/][^\\/]*$//g;\n\n  if ($d eq $0) {\n    unshift(@INC, \"lib\");\n  }\n  else {\n    unshift(@INC, $d . \"/lib\");\n  }\n}\n\nuse strict;\nuse CorScorer;\n\nif (@ARGV < 3) {\n  print q|\nuse: scorer.pl <metric> <keys_file> <response_file> [name]\n\n  metric: the metric desired to score the results:\n    muc: MUCScorer (Vilain et al, 1995)\n    bcub: B-Cubed (Bagga and Baldwin, 1998)\n    ceafm: CEAF (Luo et al, 2005) using mention-based similarity\n    ceafe: CEAF (Luo et al, 2005) using entity-based similarity\n    blanc: BLANC\n    all: uses all the metrics to score\n\n  keys_file: file with expected coreference chains in SemEval format\n\n  response_file: file with output of coreference system (SemEval format)\n\n  name: [optional] the name of the document to score. If name is not\n    given, all the documents in the dataset will be scored. If given\n    name is \"none\" then all the documents are scored but only total\n    results are shown.\n\n|;\n  exit;\n}\n\nmy $metric = shift(@ARGV);\nif ($metric !~ /^(muc|bcub|ceafm|ceafe|blanc|all)/i) {\n  print \"Invalid metric\\n\";\n  exit;\n}\n\nif ($metric eq 'all') {\n  foreach my $m ('muc', 'bcub', 'ceafm', 'ceafe', 'blanc') {\n    print \"\\nMETRIC $m:\\n\";\n    &CorScorer::Score($m, @ARGV);\n  }\n}\nelse {\n  &CorScorer::Score($metric, @ARGV);\n}\n\n"
  },
  {
    "path": "data_utils/config_utils.py",
    "content": "#!/usr/bin/env python3 \n# -*- coding: utf-8 -*- \n\n\n\n# author: xiaoy li \n# description:\n# config utils for the mention proposal and corefqa \n\n\n\nimport os \nimport json \nimport tensorflow as tf \n\n\nclass ModelConfig(object):\n    def __init__(self, tf_flags, output_dir, model_sign=\"model\"):\n        key_value_pairs = tf_flags.flag_values_dict()\n\n        for item_key, item_value in key_value_pairs.items():\n            self.__dict__[item_key] = item_value \n\n        self.output_dir = output_dir \n        config_path = os.path.join(self.output_dir, \"{}_config.json\".format(model_sign))\n\n    def logging_configs(self):\n        tf.logging.info(\"$*$\"*30)\n        tf.logging.info(\"****** print model configs : ******\")\n        tf.logging.info(\"$*$\"*30)\n\n        for item_key, item_value in self.__dict__.items():\n            tf.logging.info(\"{} : {}\".format(str(item_key), str(item_value)))\n"
  },
  {
    "path": "data_utils/conll.py",
"content": "#!/usr/bin/env python3 \n# -*- coding: utf-8 -*- \n\n\n\n# author: xiaoy li \n# description:\n# use the official conll-2012 evaluation scripts to evaluate the trained model and save the evaluation results into files. \n\n\nimport os \nimport re\nimport collections\nimport operator\nimport subprocess\nimport tempfile\n\n\n\nREPO_PATH = \"/\".join(os.path.realpath(__file__).split(\"/\")[:-2])\nBEGIN_DOCUMENT_REGEX = re.compile(r\"#begin document \\((.*)\\); part (\\d+)\")\nCOREF_RESULTS_REGEX = re.compile(\n    r\".*Coreference: Recall: \\([0-9.]+ / [0-9.]+\\) ([0-9.]+)%\\tPrecision: \\([0-9.]+ / [0-9.]+\\) ([0-9.]+)%\\tF1: ([0-9.]+)%.*\",\n    re.DOTALL)\n\n\n\ndef get_doc_key(doc_id, part):\n    return \"{}_{}\".format(doc_id, int(part))\n\n\ndef output_conll(input_file, output_file, predictions, subtoken_map):\n    prediction_map = {}\n    for doc_key, clusters in predictions.items():\n        start_map = collections.defaultdict(list)\n        end_map = collections.defaultdict(list)\n        word_map = collections.defaultdict(list)\n        for cluster_id, mentions in enumerate(clusters):\n            for start, end in mentions:\n                start, end = subtoken_map[doc_key][start], subtoken_map[doc_key][end]\n                if start == end:\n                    word_map[start].append(cluster_id)\n                else:\n                    start_map[start].append((cluster_id, end))\n                    end_map[end].append((cluster_id, start))\n        for k, v in start_map.items():\n            start_map[k] = [cluster_id for cluster_id, end in sorted(v, key=operator.itemgetter(1), reverse=True)]\n        for k, v in end_map.items():\n            end_map[k] = [cluster_id for cluster_id, start in sorted(v, key=operator.itemgetter(1), reverse=True)]\n        prediction_map[doc_key] = (start_map, end_map, word_map)\n\n    word_index = 0\n    for line in input_file.readlines():\n        row = line.split()\n        if len(row) == 0:\n            
output_file.write(\"\\n\")\n        elif row[0].startswith(\"#\"):\n            begin_match = re.match(BEGIN_DOCUMENT_REGEX, line)\n            if begin_match:\n                doc_key = get_doc_key(begin_match.group(1), begin_match.group(2))\n                start_map, end_map, word_map = prediction_map[doc_key]\n                word_index = 0\n            output_file.write(line)\n            output_file.write(\"\\n\")\n        else:\n            assert get_doc_key(row[0], row[1]) == doc_key\n            coref_list = []\n            if word_index in end_map:\n                for cluster_id in end_map[word_index]:\n                    coref_list.append(\"{})\".format(cluster_id))\n            if word_index in word_map:\n                for cluster_id in word_map[word_index]:\n                    coref_list.append(\"({})\".format(cluster_id))\n            if word_index in start_map:\n                for cluster_id in start_map[word_index]:\n                    coref_list.append(\"({}\".format(cluster_id))\n\n            if len(coref_list) == 0:\n                row[-1] = \"-\"\n            else:\n                row[-1] = \"|\".join(coref_list)\n\n            output_file.write(\"   \".join(row))\n            output_file.write(\"\\n\")\n            word_index += 1\n\n\ndef official_conll_eval(gold_path, predicted_path, metric, official_stdout=False):\n    cmd = [\"perl\", os.path.join(REPO_PATH, \"conll-2012/scorer/v8.01/scorer.pl\"), metric, gold_path, predicted_path, \"none\"]\n    process = subprocess.Popen(cmd, stdout=subprocess.PIPE)\n    stdout, stderr = process.communicate()\n    process.wait()\n\n    stdout = stdout.decode(\"utf-8\")\n    if stderr is not None:\n        print(stderr)\n\n    if official_stdout:\n        print(\"Official result for {}\".format(metric))\n        print(stdout)\n\n    coref_results_match = re.match(COREF_RESULTS_REGEX, stdout)\n    recall = float(coref_results_match.group(1))\n    precision = float(coref_results_match.group(2))\n  
  f1 = float(coref_results_match.group(3))\n    return {\"r\": recall, \"p\": precision, \"f\": f1}\n\n\ndef evaluate_conll(gold_path, predictions, subtoken_maps, official_stdout=False):\n    with tempfile.NamedTemporaryFile(delete=False, mode=\"w\") as prediction_file:\n        with open(gold_path, \"r\") as gold_file:\n            output_conll(gold_file, prediction_file, predictions, subtoken_maps)\n        print(\"Predicted conll file: {}\".format(prediction_file.name))\n    return {m: official_conll_eval(gold_file.name, prediction_file.name, m, official_stdout) for m in\n            (\"muc\", \"bcub\", \"ceafe\")}\n"
  },
  {
    "path": "data_utils/lowercase_vocab.txt",
    "content": "[PAD]\n[unused0]\n[unused1]\n[unused2]\n[unused3]\n[unused4]\n[unused5]\n[unused6]\n[unused7]\n[unused8]\n[unused9]\n[unused10]\n[unused11]\n[unused12]\n[unused13]\n[unused14]\n[unused15]\n[unused16]\n[unused17]\n[unused18]\n[unused19]\n[unused20]\n[unused21]\n[unused22]\n[unused23]\n[unused24]\n[unused25]\n[unused26]\n[unused27]\n[unused28]\n[unused29]\n[unused30]\n[unused31]\n[unused32]\n[unused33]\n[unused34]\n[unused35]\n[unused36]\n[unused37]\n[unused38]\n[unused39]\n[unused40]\n[unused41]\n[unused42]\n[unused43]\n[unused44]\n[unused45]\n[unused46]\n[unused47]\n[unused48]\n[unused49]\n[unused50]\n[unused51]\n[unused52]\n[unused53]\n[unused54]\n[unused55]\n[unused56]\n[unused57]\n[unused58]\n[unused59]\n[unused60]\n[unused61]\n[unused62]\n[unused63]\n[unused64]\n[unused65]\n[unused66]\n[unused67]\n[unused68]\n[unused69]\n[unused70]\n[unused71]\n[unused72]\n[unused73]\n[unused74]\n[unused75]\n[unused76]\n[unused77]\n[unused78]\n[unused79]\n[unused80]\n[unused81]\n[unused82]\n[unused83]\n[unused84]\n[unused85]\n[unused86]\n[unused87]\n[unused88]\n[unused89]\n[unused90]\n[unused91]\n[unused92]\n[unused93]\n[unused94]\n[unused95]\n[unused96]\n[unused97]\n[unused98]\n[UNK]\n[CLS]\n[SEP]\n[MASK]\n[unused99]\n[unused100]\n[unused101]\n[unused102]\n[unused103]\n[unused104]\n[unused105]\n[unused106]\n[unused107]\n[unused108]\n[unused109]\n[unused110]\n[unused111]\n[unused112]\n[unused113]\n[unused114]\n[unused115]\n[unused116]\n[unused117]\n[unused118]\n[unused119]\n[unused120]\n[unused121]\n[unused122]\n[unused123]\n[unused124]\n[unused125]\n[unused126]\n[unused127]\n[unused128]\n[unused129]\n[unused130]\n[unused131]\n[unused132]\n[unused133]\n[unused134]\n[unused135]\n[unused136]\n[unused137]\n[unused138]\n[unused139]\n[unused140]\n[unused141]\n[unused142]\n[unused143]\n[unused144]\n[unused145]\n[unused146]\n[unused147]\n[unused148]\n[unused149]\n[unused150]\n[unused151]\n[unused152]\n[unused153]\n[unused154]\n[unused155]\n[unused156]\n[unused157]\n[unu
sed158]\n[unused159]\n[unused160]\n[unused161]\n[unused162]\n[unused163]\n[unused164]\n[unused165]\n[unused166]\n[unused167]\n[unused168]\n[unused169]\n[unused170]\n[unused171]\n[unused172]\n[unused173]\n[unused174]\n[unused175]\n[unused176]\n[unused177]\n[unused178]\n[unused179]\n[unused180]\n[unused181]\n[unused182]\n[unused183]\n[unused184]\n[unused185]\n[unused186]\n[unused187]\n[unused188]\n[unused189]\n[unused190]\n[unused191]\n[unused192]\n[unused193]\n[unused194]\n[unused195]\n[unused196]\n[unused197]\n[unused198]\n[unused199]\n[unused200]\n[unused201]\n[unused202]\n[unused203]\n[unused204]\n[unused205]\n[unused206]\n[unused207]\n[unused208]\n[unused209]\n[unused210]\n[unused211]\n[unused212]\n[unused213]\n[unused214]\n[unused215]\n[unused216]\n[unused217]\n[unused218]\n[unused219]\n[unused220]\n[unused221]\n[unused222]\n[unused223]\n[unused224]\n[unused225]\n[unused226]\n[unused227]\n[unused228]\n[unused229]\n[unused230]\n[unused231]\n[unused232]\n[unused233]\n[unused234]\n[unused235]\n[unused236]\n[unused237]\n[unused238]\n[unused239]\n[unused240]\n[unused241]\n[unused242]\n[unused243]\n[unused244]\n[unused245]\n[unused246]\n[unused247]\n[unused248]\n[unused249]\n[unused250]\n[unused251]\n[unused252]\n[unused253]\n[unused254]\n[unused255]\n[unused256]\n[unused257]\n[unused258]\n[unused259]\n[unused260]\n[unused261]\n[unused262]\n[unused263]\n[unused264]\n[unused265]\n[unused266]\n[unused267]\n[unused268]\n[unused269]\n[unused270]\n[unused271]\n[unused272]\n[unused273]\n[unused274]\n[unused275]\n[unused276]\n[unused277]\n[unused278]\n[unused279]\n[unused280]\n[unused281]\n[unused282]\n[unused283]\n[unused284]\n[unused285]\n[unused286]\n[unused287]\n[unused288]\n[unused289]\n[unused290]\n[unused291]\n[unused292]\n[unused293]\n[unused294]\n[unused295]\n[unused296]\n[unused297]\n[unused298]\n[unused299]\n[unused300]\n[unused301]\n[unused302]\n[unused303]\n[unused304]\n[unused305]\n[unused306]\n[unused307]\n[unused308]\n[unused309]\n[unused310]\n[unused311]\n[u
nused312]\n[unused313]\n[unused314]\n[unused315]\n[unused316]\n[unused317]\n[unused318]\n[unused319]\n[unused320]\n[unused321]\n[unused322]\n[unused323]\n[unused324]\n[unused325]\n[unused326]\n[unused327]\n[unused328]\n[unused329]\n[unused330]\n[unused331]\n[unused332]\n[unused333]\n[unused334]\n[unused335]\n[unused336]\n[unused337]\n[unused338]\n[unused339]\n[unused340]\n[unused341]\n[unused342]\n[unused343]\n[unused344]\n[unused345]\n[unused346]\n[unused347]\n[unused348]\n[unused349]\n[unused350]\n[unused351]\n[unused352]\n[unused353]\n[unused354]\n[unused355]\n[unused356]\n[unused357]\n[unused358]\n[unused359]\n[unused360]\n[unused361]\n[unused362]\n[unused363]\n[unused364]\n[unused365]\n[unused366]\n[unused367]\n[unused368]\n[unused369]\n[unused370]\n[unused371]\n[unused372]\n[unused373]\n[unused374]\n[unused375]\n[unused376]\n[unused377]\n[unused378]\n[unused379]\n[unused380]\n[unused381]\n[unused382]\n[unused383]\n[unused384]\n[unused385]\n[unused386]\n[unused387]\n[unused388]\n[unused389]\n[unused390]\n[unused391]\n[unused392]\n[unused393]\n[unused394]\n[unused395]\n[unused396]\n[unused397]\n[unused398]\n[unused399]\n[unused400]\n[unused401]\n[unused402]\n[unused403]\n[unused404]\n[unused405]\n[unused406]\n[unused407]\n[unused408]\n[unused409]\n[unused410]\n[unused411]\n[unused412]\n[unused413]\n[unused414]\n[unused415]\n[unused416]\n[unused417]\n[unused418]\n[unused419]\n[unused420]\n[unused421]\n[unused422]\n[unused423]\n[unused424]\n[unused425]\n[unused426]\n[unused427]\n[unused428]\n[unused429]\n[unused430]\n[unused431]\n[unused432]\n[unused433]\n[unused434]\n[unused435]\n[unused436]\n[unused437]\n[unused438]\n[unused439]\n[unused440]\n[unused441]\n[unused442]\n[unused443]\n[unused444]\n[unused445]\n[unused446]\n[unused447]\n[unused448]\n[unused449]\n[unused450]\n[unused451]\n[unused452]\n[unused453]\n[unused454]\n[unused455]\n[unused456]\n[unused457]\n[unused458]\n[unused459]\n[unused460]\n[unused461]\n[unused462]\n[unused463]\n[unused464]\n[unused465]\n
[unused466]\n[unused467]\n[unused468]\n[unused469]\n[unused470]\n[unused471]\n[unused472]\n[unused473]\n[unused474]\n[unused475]\n[unused476]\n[unused477]\n[unused478]\n[unused479]\n[unused480]\n[unused481]\n[unused482]\n[unused483]\n[unused484]\n[unused485]\n[unused486]\n[unused487]\n[unused488]\n[unused489]\n[unused490]\n[unused491]\n[unused492]\n[unused493]\n[unused494]\n[unused495]\n[unused496]\n[unused497]\n[unused498]\n[unused499]\n[unused500]\n[unused501]\n[unused502]\n[unused503]\n[unused504]\n[unused505]\n[unused506]\n[unused507]\n[unused508]\n[unused509]\n[unused510]\n[unused511]\n[unused512]\n[unused513]\n[unused514]\n[unused515]\n[unused516]\n[unused517]\n[unused518]\n[unused519]\n[unused520]\n[unused521]\n[unused522]\n[unused523]\n[unused524]\n[unused525]\n[unused526]\n[unused527]\n[unused528]\n[unused529]\n[unused530]\n[unused531]\n[unused532]\n[unused533]\n[unused534]\n[unused535]\n[unused536]\n[unused537]\n[unused538]\n[unused539]\n[unused540]\n[unused541]\n[unused542]\n[unused543]\n[unused544]\n[unused545]\n[unused546]\n[unused547]\n[unused548]\n[unused549]\n[unused550]\n[unused551]\n[unused552]\n[unused553]\n[unused554]\n[unused555]\n[unused556]\n[unused557]\n[unused558]\n[unused559]\n[unused560]\n[unused561]\n[unused562]\n[unused563]\n[unused564]\n[unused565]\n[unused566]\n[unused567]\n[unused568]\n[unused569]\n[unused570]\n[unused571]\n[unused572]\n[unused573]\n[unused574]\n[unused575]\n[unused576]\n[unused577]\n[unused578]\n[unused579]\n[unused580]\n[unused581]\n[unused582]\n[unused583]\n[unused584]\n[unused585]\n[unused586]\n[unused587]\n[unused588]\n[unused589]\n[unused590]\n[unused591]\n[unused592]\n[unused593]\n[unused594]\n[unused595]\n[unused596]\n[unused597]\n[unused598]\n[unused599]\n[unused600]\n[unused601]\n[unused602]\n[unused603]\n[unused604]\n[unused605]\n[unused606]\n[unused607]\n[unused608]\n[unused609]\n[unused610]\n[unused611]\n[unused612]\n[unused613]\n[unused614]\n[unused615]\n[unused616]\n[unused617]\n[unused618]\n[unused619]
\n[unused620]\n[unused621]\n[unused622]\n[unused623]\n[unused624]\n[unused625]\n[unused626]\n[unused627]\n[unused628]\n[unused629]\n[unused630]\n[unused631]\n[unused632]\n[unused633]\n[unused634]\n[unused635]\n[unused636]\n[unused637]\n[unused638]\n[unused639]\n[unused640]\n[unused641]\n[unused642]\n[unused643]\n[unused644]\n[unused645]\n[unused646]\n[unused647]\n[unused648]\n[unused649]\n[unused650]\n[unused651]\n[unused652]\n[unused653]\n[unused654]\n[unused655]\n[unused656]\n[unused657]\n[unused658]\n[unused659]\n[unused660]\n[unused661]\n[unused662]\n[unused663]\n[unused664]\n[unused665]\n[unused666]\n[unused667]\n[unused668]\n[unused669]\n[unused670]\n[unused671]\n[unused672]\n[unused673]\n[unused674]\n[unused675]\n[unused676]\n[unused677]\n[unused678]\n[unused679]\n[unused680]\n[unused681]\n[unused682]\n[unused683]\n[unused684]\n[unused685]\n[unused686]\n[unused687]\n[unused688]\n[unused689]\n[unused690]\n[unused691]\n[unused692]\n[unused693]\n[unused694]\n[unused695]\n[unused696]\n[unused697]\n[unused698]\n[unused699]\n[unused700]\n[unused701]\n[unused702]\n[unused703]\n[unused704]\n[unused705]\n[unused706]\n[unused707]\n[unused708]\n[unused709]\n[unused710]\n[unused711]\n[unused712]\n[unused713]\n[unused714]\n[unused715]\n[unused716]\n[unused717]\n[unused718]\n[unused719]\n[unused720]\n[unused721]\n[unused722]\n[unused723]\n[unused724]\n[unused725]\n[unused726]\n[unused727]\n[unused728]\n[unused729]\n[unused730]\n[unused731]\n[unused732]\n[unused733]\n[unused734]\n[unused735]\n[unused736]\n[unused737]\n[unused738]\n[unused739]\n[unused740]\n[unused741]\n[unused742]\n[unused743]\n[unused744]\n[unused745]\n[unused746]\n[unused747]\n[unused748]\n[unused749]\n[unused750]\n[unused751]\n[unused752]\n[unused753]\n[unused754]\n[unused755]\n[unused756]\n[unused757]\n[unused758]\n[unused759]\n[unused760]\n[unused761]\n[unused762]\n[unused763]\n[unused764]\n[unused765]\n[unused766]\n[unused767]\n[unused768]\n[unused769]\n[unused770]\n[unused771]\n[unused772]\n[unused77
3]\n[unused774]\n[unused775]\n[unused776]\n[unused777]\n[unused778]\n[unused779]\n[unused780]\n[unused781]\n[unused782]\n[unused783]\n[unused784]\n[unused785]\n[unused786]\n[unused787]\n[unused788]\n[unused789]\n[unused790]\n[unused791]\n[unused792]\n[unused793]\n[unused794]\n[unused795]\n[unused796]\n[unused797]\n[unused798]\n[unused799]\n[unused800]\n[unused801]\n[unused802]\n[unused803]\n[unused804]\n[unused805]\n[unused806]\n[unused807]\n[unused808]\n[unused809]\n[unused810]\n[unused811]\n[unused812]\n[unused813]\n[unused814]\n[unused815]\n[unused816]\n[unused817]\n[unused818]\n[unused819]\n[unused820]\n[unused821]\n[unused822]\n[unused823]\n[unused824]\n[unused825]\n[unused826]\n[unused827]\n[unused828]\n[unused829]\n[unused830]\n[unused831]\n[unused832]\n[unused833]\n[unused834]\n[unused835]\n[unused836]\n[unused837]\n[unused838]\n[unused839]\n[unused840]\n[unused841]\n[unused842]\n[unused843]\n[unused844]\n[unused845]\n[unused846]\n[unused847]\n[unused848]\n[unused849]\n[unused850]\n[unused851]\n[unused852]\n[unused853]\n[unused854]\n[unused855]\n[unused856]\n[unused857]\n[unused858]\n[unused859]\n[unused860]\n[unused861]\n[unused862]\n[unused863]\n[unused864]\n[unused865]\n[unused866]\n[unused867]\n[unused868]\n[unused869]\n[unused870]\n[unused871]\n[unused872]\n[unused873]\n[unused874]\n[unused875]\n[unused876]\n[unused877]\n[unused878]\n[unused879]\n[unused880]\n[unused881]\n[unused882]\n[unused883]\n[unused884]\n[unused885]\n[unused886]\n[unused887]\n[unused888]\n[unused889]\n[unused890]\n[unused891]\n[unused892]\n[unused893]\n[unused894]\n[unused895]\n[unused896]\n[unused897]\n[unused898]\n[unused899]\n[unused900]\n[unused901]\n[unused902]\n[unused903]\n[unused904]\n[unused905]\n[unused906]\n[unused907]\n[unused908]\n[unused909]\n[unused910]\n[unused911]\n[unused912]\n[unused913]\n[unused914]\n[unused915]\n[unused916]\n[unused917]\n[unused918]\n[unused919]\n[unused920]\n[unused921]\n[unused922]\n[unused923]\n[unused924]\n[unused925]\n[unused926]\n[unused
927]\n[unused928]\n[unused929]\n[unused930]\n[unused931]\n[unused932]\n[unused933]\n[unused934]\n[unused935]\n[unused936]\n[unused937]\n[unused938]\n[unused939]\n[unused940]\n[unused941]\n[unused942]\n[unused943]\n[unused944]\n[unused945]\n[unused946]\n[unused947]\n[unused948]\n[unused949]\n[unused950]\n[unused951]\n[unused952]\n[unused953]\n[unused954]\n[unused955]\n[unused956]\n[unused957]\n[unused958]\n[unused959]\n[unused960]\n[unused961]\n[unused962]\n[unused963]\n[unused964]\n[unused965]\n[unused966]\n[unused967]\n[unused968]\n[unused969]\n[unused970]\n[unused971]\n[unused972]\n[unused973]\n[unused974]\n[unused975]\n[unused976]\n[unused977]\n[unused978]\n[unused979]\n[unused980]\n[unused981]\n[unused982]\n[unused983]\n[unused984]\n[unused985]\n[unused986]\n[unused987]\n[unused988]\n[unused989]\n[unused990]\n[unused991]\n[unused992]\n[unused993]\n!\n\"\n#\n$\n%\n&\n'\n(\n)\n*\n+\n,\n-\n.\n/\n0\n1\n2\n3\n4\n5\n6\n7\n8\n9\n:\n;\n<\n=\n>\n?\n@\n[\n\\\n]\n^\n_\n`\na\nb\nc\nd\ne\nf\ng\nh\ni\nj\nk\nl\nm\nn\no\np\nq\nr\ns\nt\nu\nv\nw\nx\ny\nz\n{\n|\n}\n~\n¡\n¢\n£\n¤\n¥\n¦\n§\n¨\n©\nª\n«\n¬\n®\n°\n±\n²\n³\n´\nµ\n¶\n·\n¹\nº\n»\n¼\n½\n¾\n¿\n×\nß\næ\nð\n÷\nø\nþ\nđ\nħ\nı\nł\nŋ\nœ\nƒ\nɐ\nɑ\nɒ\nɔ\nɕ\nə\nɛ\nɡ\nɣ\nɨ\nɪ\nɫ\nɬ\nɯ\nɲ\nɴ\nɹ\nɾ\nʀ\nʁ\nʂ\nʃ\nʉ\nʊ\nʋ\nʌ\nʎ\nʐ\nʑ\nʒ\nʔ\nʰ\nʲ\nʳ\nʷ\nʸ\nʻ\nʼ\nʾ\nʿ\nˈ\nː\nˡ\nˢ\nˣ\nˤ\nα\nβ\nγ\nδ\nε\nζ\nη\nθ\nι\nκ\nλ\nμ\nν\nξ\nο\nπ\nρ\nς\nσ\nτ\nυ\nφ\nχ\nψ\nω\nа\nб\nв\nг\nд\nе\nж\nз\nи\nк\nл\nм\nн\nо\nп\nр\nс\nт\nу\nф\nх\nц\nч\nш\nщ\nъ\nы\nь\nэ\nю\nя\nђ\nє\nі\nј\nљ\nњ\nћ\nӏ\nա\nբ\nգ\nդ\nե\nթ\nի\nլ\nկ\nհ\nմ\nյ\nն\nո\nպ\nս\nվ\nտ\nր\nւ\nք\n־\nא\nב\nג\nד\nה\nו\nז\nח\nט\nי\nך\nכ\nל\nם\nמ\nן\nנ\nס\nע\nף\nפ\nץ\nצ\nק\nר\nש\nת\n،\nء\nا\nب\nة\nت\nث\nج\nح\nخ\nد\nذ\nر\nز\nس\nش\nص\nض\nط\nظ\nع\nغ\nـ\nف\nق\nك\nل\nم\nن\nه\nو\nى\nي\nٹ\nپ\nچ\nک\nگ\nں\nھ\nہ\nی\nے\nअ\nआ\nउ\nए\nक\nख\nग\nच\nज\nट\nड\nण\nत\nथ\nद\nध\nन\nप\nब\nभ\nम\nय\nर\nल\nव\nश\nष\nस\nह\nा\nि\nी\nो\n।\n॥\nং\nঅ\nআ\nই\nউ\nএ\nও\nক\nখ\nগ\nচ\nছ\nজ\nট\nড\nণ\nত\nথ\nদ\nধ\nন\nপ\nব\nভ\nম\nয\nর\nল\nশ\nষ\nস\n
হ\nা\nি\nী\nে\nக\nச\nட\nத\nந\nன\nப\nம\nய\nர\nல\nள\nவ\nா\nி\nு\nே\nை\nನ\nರ\nಾ\nක\nය\nර\nල\nව\nා\nก\nง\nต\nท\nน\nพ\nม\nย\nร\nล\nว\nส\nอ\nา\nเ\n་\n།\nག\nང\nད\nན\nཔ\nབ\nམ\nའ\nར\nལ\nས\nမ\nა\nბ\nგ\nდ\nე\nვ\nთ\nი\nკ\nლ\nმ\nნ\nო\nრ\nს\nტ\nუ\nᄀ\nᄂ\nᄃ\nᄅ\nᄆ\nᄇ\nᄉ\nᄊ\nᄋ\nᄌ\nᄎ\nᄏ\nᄐ\nᄑ\nᄒ\nᅡ\nᅢ\nᅥ\nᅦ\nᅧ\nᅩ\nᅪ\nᅭ\nᅮ\nᅯ\nᅲ\nᅳ\nᅴ\nᅵ\nᆨ\nᆫ\nᆯ\nᆷ\nᆸ\nᆼ\nᴬ\nᴮ\nᴰ\nᴵ\nᴺ\nᵀ\nᵃ\nᵇ\nᵈ\nᵉ\nᵍ\nᵏ\nᵐ\nᵒ\nᵖ\nᵗ\nᵘ\nᵢ\nᵣ\nᵤ\nᵥ\nᶜ\nᶠ\n‐\n‑\n‒\n–\n—\n―\n‖\n‘\n’\n‚\n“\n”\n„\n†\n‡\n•\n…\n‰\n′\n″\n›\n‿\n⁄\n⁰\nⁱ\n⁴\n⁵\n⁶\n⁷\n⁸\n⁹\n⁺\n⁻\nⁿ\n₀\n₁\n₂\n₃\n₄\n₅\n₆\n₇\n₈\n₉\n₊\n₍\n₎\nₐ\nₑ\nₒ\nₓ\nₕ\nₖ\nₗ\nₘ\nₙ\nₚ\nₛ\nₜ\n₤\n₩\n€\n₱\n₹\nℓ\n№\nℝ\n™\n⅓\n⅔\n←\n↑\n→\n↓\n↔\n↦\n⇄\n⇌\n⇒\n∂\n∅\n∆\n∇\n∈\n−\n∗\n∘\n√\n∞\n∧\n∨\n∩\n∪\n≈\n≡\n≤\n≥\n⊂\n⊆\n⊕\n⊗\n⋅\n─\n│\n■\n▪\n●\n★\n☆\n☉\n♠\n♣\n♥\n♦\n♭\n♯\n⟨\n⟩\nⱼ\n⺩\n⺼\n⽥\n、\n。\n〈\n〉\n《\n》\n「\n」\n『\n』\n〜\nあ\nい\nう\nえ\nお\nか\nき\nく\nけ\nこ\nさ\nし\nす\nせ\nそ\nた\nち\nっ\nつ\nて\nと\nな\nに\nぬ\nね\nの\nは\nひ\nふ\nへ\nほ\nま\nみ\nむ\nめ\nも\nや\nゆ\nよ\nら\nり\nる\nれ\nろ\nを\nん\nァ\nア\nィ\nイ\nウ\nェ\nエ\nオ\nカ\nキ\nク\nケ\nコ\nサ\nシ\nス\nセ\nタ\nチ\nッ\nツ\nテ\nト\nナ\nニ\nノ\nハ\nヒ\nフ\nヘ\nホ\nマ\nミ\nム\nメ\nモ\nャ\nュ\nョ\nラ\nリ\nル\nレ\nロ\nワ\nン\n・\nー\n一\n三\n上\n下\n不\n世\n中\n主\n久\n之\n也\n事\n二\n五\n井\n京\n人\n亻\n仁\n介\n代\n仮\n伊\n会\n佐\n侍\n保\n信\n健\n元\n光\n八\n公\n内\n出\n分\n前\n劉\n力\n加\n勝\n北\n区\n十\n千\n南\n博\n原\n口\n古\n史\n司\n合\n吉\n同\n名\n和\n囗\n四\n国\n國\n土\n地\n坂\n城\n堂\n場\n士\n夏\n外\n大\n天\n太\n夫\n奈\n女\n子\n学\n宀\n宇\n安\n宗\n定\n宣\n宮\n家\n宿\n寺\n將\n小\n尚\n山\n岡\n島\n崎\n川\n州\n巿\n帝\n平\n年\n幸\n广\n弘\n張\n彳\n後\n御\n德\n心\n忄\n志\n忠\n愛\n成\n我\n戦\n戸\n手\n扌\n政\n文\n新\n方\n日\n明\n星\n春\n昭\n智\n曲\n書\n月\n有\n朝\n木\n本\n李\n村\n東\n松\n林\n森\n楊\n樹\n橋\n歌\n止\n正\n武\n比\n氏\n民\n水\n氵\n氷\n永\n江\n沢\n河\n治\n法\n海\n清\n漢\n瀬\n火\n版\n犬\n王\n生\n田\n男\n疒\n発\n白\n的\n皇\n目\n相\n省\n真\n石\n示\n社\n神\n福\n禾\n秀\n秋\n空\n立\n章\n竹\n糹\n美\n義\n耳\n良\n艹\n花\n英\n華\n葉\n藤\n行\n街\n西\n見\n訁\n語\n谷\n貝\n貴\n車\n軍\n辶\n道\n郎\n郡\n部\n都\n里\n野\n金\n鈴\n镇\n長\n門\n間\n阝\n阿\n陳\n陽\n雄\n青\n面\n風\n食\n香\n馬\n高\n龍\n龸\nﬁ\nﬂ\n！\n（\n）\n，\n－\n．\n／\n：\n？\n～\nthe\nof\nand\nin\nto\nwas\nhe\nis\nas\nfor\non\nwith\nthat\nit\nhis\nby\nat\nfrom\nher\n##s\nshe\nyou\nhad\nan\nwere\nbut\nbe\nthis\nare\nnot\n
my\nthey\none\nwhich\nor\nhave\nhim\nme\nfirst\nall\nalso\ntheir\nhas\nup\nwho\nout\nbeen\nwhen\nafter\nthere\ninto\nnew\ntwo\nits\n##a\ntime\nwould\nno\nwhat\nabout\nsaid\nwe\nover\nthen\nother\nso\nmore\n##e\ncan\nif\nlike\nback\nthem\nonly\nsome\ncould\n##i\nwhere\njust\n##ing\nduring\nbefore\n##n\ndo\n##o\nmade\nschool\nthrough\nthan\nnow\nyears\nmost\nworld\nmay\nbetween\ndown\nwell\nthree\n##d\nyear\nwhile\nwill\n##ed\n##r\n##y\nlater\n##t\ncity\nunder\naround\ndid\nsuch\nbeing\nused\nstate\npeople\npart\nknow\nagainst\nyour\nmany\nsecond\nuniversity\nboth\nnational\n##er\nthese\ndon\nknown\noff\nway\nuntil\nre\nhow\neven\nget\nhead\n...\ndidn\n##ly\nteam\namerican\nbecause\nde\n##l\nborn\nunited\nfilm\nsince\nstill\nlong\nwork\nsouth\nus\nbecame\nany\nhigh\nagain\nday\nfamily\nsee\nright\nman\neyes\nhouse\nseason\nwar\nstates\nincluding\ntook\nlife\nnorth\nsame\neach\ncalled\nname\nmuch\nplace\nhowever\ngo\nfour\ngroup\nanother\nfound\nwon\narea\nhere\ngoing\n10\naway\nseries\nleft\nhome\nmusic\nbest\nmake\nhand\nnumber\ncompany\nseveral\nnever\nlast\njohn\n000\nvery\nalbum\ntake\nend\ngood\ntoo\nfollowing\nreleased\ngame\nplayed\nlittle\nbegan\ndistrict\n##m\nold\nwant\nthose\nside\nheld\nown\nearly\ncounty\nll\nleague\nuse\nwest\n##u\nface\nthink\n##es\n2010\ngovernment\n##h\nmarch\ncame\nsmall\ngeneral\ntown\njune\n##on\nline\nbased\nsomething\n##k\nseptember\nthought\nlooked\nalong\ninternational\n2011\nair\njuly\nclub\nwent\njanuary\noctober\nour\naugust\napril\nyork\n12\nfew\n2012\n2008\neast\nshow\nmember\ncollege\n2009\nfather\npublic\n##us\ncome\nmen\nfive\nset\nstation\nchurch\n##c\nnext\nformer\nnovember\nroom\nparty\nlocated\ndecember\n2013\nage\ngot\n2007\n##g\nsystem\nlet\nlove\n2006\nthough\nevery\n2014\nlook\nsong\nwater\ncentury\nwithout\nbody\nblack\nnight\nwithin\ngreat\nwomen\nsingle\nve\nbuilding\nlarge\npopulation\nriver\nnamed\nband\nwhite\nstarted\n##an\nonce\n15\n20\nshould\n18\n2015\nservice\ntop\nbuilt\nbritish\nopen\ndeath\nking\nm
oved\nlocal\ntimes\nchildren\nfebruary\nbook\nwhy\n11\ndoor\nneed\npresident\norder\nfinal\nroad\nwasn\nalthough\ndue\nmajor\ndied\nvillage\nthird\nknew\n2016\nasked\nturned\nst\nwanted\nsay\n##p\ntogether\nreceived\nmain\nson\nserved\ndifferent\n##en\nbehind\nhimself\nfelt\nmembers\npower\nfootball\nlaw\nvoice\nplay\n##in\nnear\npark\nhistory\n30\nhaving\n2005\n16\n##man\nsaw\nmother\n##al\narmy\npoint\nfront\nhelp\nenglish\nstreet\nart\nlate\nhands\ngames\naward\n##ia\nyoung\n14\nput\npublished\ncountry\ndivision\nacross\ntold\n13\noften\never\nfrench\nlondon\ncenter\nsix\nred\n2017\nled\ndays\ninclude\nlight\n25\nfind\ntell\namong\nspecies\nreally\naccording\ncentral\nhalf\n2004\nform\noriginal\ngave\noffice\nmaking\nenough\nlost\nfull\nopened\nmust\nincluded\nlive\ngiven\ngerman\nplayer\nrun\nbusiness\nwoman\ncommunity\ncup\nmight\nmillion\nland\n2000\ncourt\ndevelopment\n17\nshort\nround\nii\nkm\nseen\nclass\nstory\nalways\nbecome\nsure\nresearch\nalmost\ndirector\ncouncil\nla\n##2\ncareer\nthings\nusing\nisland\n##z\ncouldn\ncar\n##is\n24\nclose\nforce\n##1\nbetter\nfree\nsupport\ncontrol\nfield\nstudents\n2003\neducation\nmarried\n##b\nnothing\nworked\nothers\nrecord\nbig\ninside\nlevel\nanything\ncontinued\ngive\njames\n##3\nmilitary\nestablished\nnon\nreturned\nfeel\ndoes\ntitle\nwritten\nthing\nfeet\nwilliam\nfar\nco\nassociation\nhard\nalready\n2002\n##ra\nchampionship\nhuman\nwestern\n100\n##na\ndepartment\nhall\nrole\nvarious\nproduction\n21\n19\nheart\n2001\nliving\nfire\nversion\n##ers\n##f\ntelevision\nroyal\n##4\nproduced\nworking\nact\ncase\nsociety\nregion\npresent\nradio\nperiod\nlooking\nleast\ntotal\nkeep\nengland\nwife\nprogram\nper\nbrother\nmind\nspecial\n22\n##le\nam\nworks\nsoon\n##6\npolitical\ngeorge\nservices\ntaken\ncreated\n##7\nfurther\nable\nreached\ndavid\nunion\njoined\nupon\ndone\nimportant\nsocial\ninformation\neither\n##ic\n##x\nappeared\nposition\nground\nlead\nrock\ndark\nelection\n23\nboard\nfrance\nhair\ncourse\narms\nsite\
npolice\ngirl\ninstead\nreal\nsound\n##v\nwords\nmoment\n##te\nsomeone\n##8\nsummer\nproject\nannounced\nsan\nless\nwrote\npast\nfollowed\n##5\nblue\nfounded\nal\nfinally\nindia\ntaking\nrecords\namerica\n##ne\n1999\ndesign\nconsidered\nnorthern\ngod\nstop\nbattle\ntoward\neuropean\noutside\ndescribed\ntrack\ntoday\nplaying\nlanguage\n28\ncall\n26\nheard\nprofessional\nlow\naustralia\nmiles\ncalifornia\nwin\nyet\ngreen\n##ie\ntrying\nblood\n##ton\nsouthern\nscience\nmaybe\neverything\nmatch\nsquare\n27\nmouth\nvideo\nrace\nrecorded\nleave\nabove\n##9\ndaughter\npoints\nspace\n1998\nmuseum\nchange\nmiddle\ncommon\n##0\nmove\ntv\npost\n##ta\nlake\nseven\ntried\nelected\nclosed\nten\npaul\nminister\n##th\nmonths\nstart\nchief\nreturn\ncanada\nperson\nsea\nrelease\nsimilar\nmodern\nbrought\nrest\nhit\nformed\nmr\n##la\n1997\nfloor\nevent\ndoing\nthomas\n1996\nrobert\ncare\nkilled\ntraining\nstar\nweek\nneeded\nturn\nfinished\nrailway\nrather\nnews\nhealth\nsent\nexample\nran\nterm\nmichael\ncoming\ncurrently\nyes\nforces\ndespite\ngold\nareas\n50\nstage\nfact\n29\ndead\nsays\npopular\n2018\noriginally\ngermany\nprobably\ndeveloped\nresult\npulled\nfriend\nstood\nmoney\nrunning\nmi\nsigned\nword\nsongs\nchild\neventually\nmet\ntour\naverage\nteams\nminutes\nfestival\ncurrent\ndeep\nkind\n1995\ndecided\nusually\neastern\nseemed\n##ness\nepisode\nbed\nadded\ntable\nindian\nprivate\ncharles\nroute\navailable\nidea\nthroughout\ncentre\naddition\nappointed\nstyle\n1994\nbooks\neight\nconstruction\npress\nmean\nwall\nfriends\nremained\nschools\nstudy\n##ch\n##um\ninstitute\noh\nchinese\nsometimes\nevents\npossible\n1992\naustralian\ntype\nbrown\nforward\ntalk\nprocess\nfood\ndebut\nseat\nperformance\ncommittee\nfeatures\ncharacter\narts\nherself\nelse\nlot\nstrong\nrussian\nrange\nhours\npeter\narm\n##da\nmorning\ndr\nsold\n##ry\nquickly\ndirected\n1993\nguitar\nchina\n##w\n31\nlist\n##ma\nperformed\nmedia\nuk\nplayers\nsmile\n##rs\nmyself\n40\nplaced\ncoach\nprovince\ntowards
\nwouldn\nleading\nwhole\nboy\nofficial\ndesigned\ngrand\ncensus\n##el\neurope\nattack\njapanese\nhenry\n1991\n##re\n##os\ncross\ngetting\nalone\naction\nlower\nnetwork\nwide\nwashington\njapan\n1990\nhospital\nbelieve\nchanged\nsister\n##ar\nhold\ngone\nsir\nhadn\nship\n##ka\nstudies\nacademy\nshot\nrights\nbelow\nbase\nbad\ninvolved\nkept\nlargest\n##ist\nbank\nfuture\nespecially\nbeginning\nmark\nmovement\nsection\nfemale\nmagazine\nplan\nprofessor\nlord\nlonger\n##ian\nsat\nwalked\nhill\nactually\ncivil\nenergy\nmodel\nfamilies\nsize\nthus\naircraft\ncompleted\nincludes\ndata\ncaptain\n##or\nfight\nvocals\nfeatured\nrichard\nbridge\nfourth\n1989\nofficer\nstone\nhear\n##ism\nmeans\nmedical\ngroups\nmanagement\nself\nlips\ncompetition\nentire\nlived\ntechnology\nleaving\nfederal\ntournament\nbit\npassed\nhot\nindependent\nawards\nkingdom\nmary\nspent\nfine\ndoesn\nreported\n##ling\njack\nfall\nraised\nitself\nstay\ntrue\nstudio\n1988\nsports\nreplaced\nparis\nsystems\nsaint\nleader\ntheatre\nwhose\nmarket\ncapital\nparents\nspanish\ncanadian\nearth\n##ity\ncut\ndegree\nwriting\nbay\nchristian\nawarded\nnatural\nhigher\nbill\n##as\ncoast\nprovided\nprevious\nsenior\nft\nvalley\norganization\nstopped\nonto\ncountries\nparts\nconference\nqueen\nsecurity\ninterest\nsaying\nallowed\nmaster\nearlier\nphone\nmatter\nsmith\nwinning\ntry\nhappened\nmoving\ncampaign\nlos\n##ley\nbreath\nnearly\nmid\n1987\ncertain\ngirls\ndate\nitalian\nafrican\nstanding\nfell\nartist\n##ted\nshows\ndeal\nmine\nindustry\n1986\n##ng\neveryone\nrepublic\nprovide\ncollection\nlibrary\nstudent\n##ville\nprimary\nowned\nolder\nvia\nheavy\n1st\nmakes\n##able\nattention\nanyone\nafrica\n##ri\nstated\nlength\nended\nfingers\ncommand\nstaff\nskin\nforeign\nopening\ngovernor\nokay\nmedal\nkill\nsun\ncover\njob\n1985\nintroduced\nchest\nhell\nfeeling\n##ies\nsuccess\nmeet\nreason\nstandard\nmeeting\nnovel\n1984\ntrade\nsource\nbuildings\n##land\nrose\nguy\ngoal\n##ur\nchapter\nnative\nhusband\npreviou
sly\nunit\nlimited\nentered\nweeks\nproducer\noperations\nmountain\ntakes\ncovered\nforced\nrelated\nroman\ncomplete\nsuccessful\nkey\ntexas\ncold\n##ya\nchannel\n1980\ntraditional\nfilms\ndance\nclear\napproximately\n500\nnine\nvan\nprince\nquestion\nactive\ntracks\nireland\nregional\nsilver\nauthor\npersonal\nsense\noperation\n##ine\neconomic\n1983\nholding\ntwenty\nisbn\nadditional\nspeed\nhour\nedition\nregular\nhistoric\nplaces\nwhom\nshook\nmovie\nkm²\nsecretary\nprior\nreport\nchicago\nread\nfoundation\nview\nengine\nscored\n1982\nunits\nask\nairport\nproperty\nready\nimmediately\nlady\nmonth\nlisted\ncontract\n##de\nmanager\nthemselves\nlines\n##ki\nnavy\nwriter\nmeant\n##ts\nruns\n##ro\npractice\nchampionships\nsinger\nglass\ncommission\nrequired\nforest\nstarting\nculture\ngenerally\ngiving\naccess\nattended\ntest\ncouple\nstand\ncatholic\nmartin\ncaught\nexecutive\n##less\neye\n##ey\nthinking\nchair\nquite\nshoulder\n1979\nhope\ndecision\nplays\ndefeated\nmunicipality\nwhether\nstructure\noffered\nslowly\npain\nice\ndirection\n##ion\npaper\nmission\n1981\nmostly\n200\nnoted\nindividual\nmanaged\nnature\nlives\nplant\n##ha\nhelped\nexcept\nstudied\ncomputer\nfigure\nrelationship\nissue\nsignificant\nloss\ndie\nsmiled\ngun\nago\nhighest\n1972\n##am\nmale\nbring\ngoals\nmexico\nproblem\ndistance\ncommercial\ncompletely\nlocation\nannual\nfamous\ndrive\n1976\nneck\n1978\nsurface\ncaused\nitaly\nunderstand\ngreek\nhighway\nwrong\nhotel\ncomes\nappearance\njoseph\ndouble\nissues\nmusical\ncompanies\ncastle\nincome\nreview\nassembly\nbass\ninitially\nparliament\nartists\nexperience\n1974\nparticular\nwalk\nfoot\nengineering\ntalking\nwindow\ndropped\n##ter\nmiss\nbaby\nboys\nbreak\n1975\nstars\nedge\nremember\npolicy\ncarried\ntrain\nstadium\nbar\nsex\nangeles\nevidence\n##ge\nbecoming\nassistant\nsoviet\n1977\nupper\nstep\nwing\n1970\nyouth\nfinancial\nreach\n##ll\nactor\nnumerous\n##se\n##st\nnodded\narrived\n##ation\nminute\n##nt\nbelieved\nsorry\ncomplex\nbe
autiful\nvictory\nassociated\ntemple\n1968\n1973\nchance\nperhaps\nmetal\n##son\n1945\nbishop\n##et\nlee\nlaunched\nparticularly\ntree\nle\nretired\nsubject\nprize\ncontains\nyeah\ntheory\nempire\n##ce\nsuddenly\nwaiting\ntrust\nrecording\n##to\nhappy\nterms\ncamp\nchampion\n1971\nreligious\npass\nzealand\nnames\n2nd\nport\nancient\ntom\ncorner\nrepresented\nwatch\nlegal\nanti\njustice\ncause\nwatched\nbrothers\n45\nmaterial\nchanges\nsimply\nresponse\nlouis\nfast\n##ting\nanswer\n60\nhistorical\n1969\nstories\nstraight\ncreate\nfeature\nincreased\nrate\nadministration\nvirginia\nel\nactivities\ncultural\noverall\nwinner\nprograms\nbasketball\nlegs\nguard\nbeyond\ncast\ndoctor\nmm\nflight\nresults\nremains\ncost\neffect\nwinter\n##ble\nlarger\nislands\nproblems\nchairman\ngrew\ncommander\nisn\n1967\npay\nfailed\nselected\nhurt\nfort\nbox\nregiment\nmajority\njournal\n35\nedward\nplans\n##ke\n##ni\nshown\npretty\nirish\ncharacters\ndirectly\nscene\nlikely\noperated\nallow\nspring\n##j\njunior\nmatches\nlooks\nmike\nhouses\nfellow\n##tion\nbeach\nmarriage\n##ham\n##ive\nrules\noil\n65\nflorida\nexpected\nnearby\ncongress\nsam\npeace\nrecent\niii\nwait\nsubsequently\ncell\n##do\nvariety\nserving\nagreed\nplease\npoor\njoe\npacific\nattempt\nwood\ndemocratic\npiece\nprime\n##ca\nrural\nmile\ntouch\nappears\ntownship\n1964\n1966\nsoldiers\n##men\n##ized\n1965\npennsylvania\ncloser\nfighting\nclaimed\nscore\njones\nphysical\neditor\n##ous\nfilled\ngenus\nspecific\nsitting\nsuper\nmom\n##va\ntherefore\nsupported\nstatus\nfear\ncases\nstore\nmeaning\nwales\nminor\nspain\ntower\nfocus\nvice\nfrank\nfollow\nparish\nseparate\ngolden\nhorse\nfifth\nremaining\nbranch\n32\npresented\nstared\n##id\nuses\nsecret\nforms\n##co\nbaseball\nexactly\n##ck\nchoice\nnote\ndiscovered\ntravel\ncomposed\ntruth\nrussia\nball\ncolor\nkiss\ndad\nwind\ncontinue\nring\nreferred\nnumbers\ndigital\ngreater\n##ns\nmetres\nslightly\ndirect\nincrease\n1960\nresponsible\ncrew\nrule\ntrees\ntroops\n##no\
nbroke\ngoes\nindividuals\nhundred\nweight\ncreek\nsleep\nmemory\ndefense\nprovides\nordered\ncode\nvalue\njewish\nwindows\n1944\nsafe\njudge\nwhatever\ncorps\nrealized\ngrowing\npre\n##ga\ncities\nalexander\ngaze\nlies\nspread\nscott\nletter\nshowed\nsituation\nmayor\ntransport\nwatching\nworkers\nextended\n##li\nexpression\nnormal\n##ment\nchart\nmultiple\nborder\n##ba\nhost\n##ner\ndaily\nmrs\nwalls\npiano\n##ko\nheat\ncannot\n##ate\nearned\nproducts\ndrama\nera\nauthority\nseasons\njoin\ngrade\n##io\nsign\ndifficult\nmachine\n1963\nterritory\nmainly\n##wood\nstations\nsquadron\n1962\nstepped\niron\n19th\n##led\nserve\nappear\nsky\nspeak\nbroken\ncharge\nknowledge\nkilometres\nremoved\nships\narticle\ncampus\nsimple\n##ty\npushed\nbritain\n##ve\nleaves\nrecently\ncd\nsoft\nboston\nlatter\neasy\nacquired\npoland\n##sa\nquality\nofficers\npresence\nplanned\nnations\nmass\nbroadcast\njean\nshare\nimage\ninfluence\nwild\noffer\nemperor\nelectric\nreading\nheaded\nability\npromoted\nyellow\nministry\n1942\nthroat\nsmaller\npolitician\n##by\nlatin\nspoke\ncars\nwilliams\nmales\nlack\npop\n80\n##ier\nacting\nseeing\nconsists\n##ti\nestate\n1961\npressure\njohnson\nnewspaper\njr\nchris\nolympics\nonline\nconditions\nbeat\nelements\nwalking\nvote\n##field\nneeds\ncarolina\ntext\nfeaturing\nglobal\nblock\nshirt\nlevels\nfrancisco\npurpose\nfemales\net\ndutch\nduke\nahead\ngas\ntwice\nsafety\nserious\nturning\nhighly\nlieutenant\nfirm\nmaria\namount\nmixed\ndaniel\nproposed\nperfect\nagreement\naffairs\n3rd\nseconds\ncontemporary\npaid\n1943\nprison\nsave\nkitchen\nlabel\nadministrative\nintended\nconstructed\nacademic\nnice\nteacher\nraces\n1956\nformerly\ncorporation\nben\nnation\nissued\nshut\n1958\ndrums\nhousing\nvictoria\nseems\nopera\n1959\ngraduated\nfunction\nvon\nmentioned\npicked\nbuild\nrecognized\nshortly\nprotection\npicture\nnotable\nexchange\nelections\n1980s\nloved\npercent\nracing\nfish\nelizabeth\ngarden\nvolume\nhockey\n1941\nbeside\nsettled\n##ford\n194
0\ncompeted\nreplied\ndrew\n1948\nactress\nmarine\nscotland\nsteel\nglanced\nfarm\nsteve\n1957\nrisk\ntonight\npositive\nmagic\nsingles\neffects\ngray\nscreen\ndog\n##ja\nresidents\nbus\nsides\nnone\nsecondary\nliterature\npolish\ndestroyed\nflying\nfounder\nhouseholds\n1939\nlay\nreserve\nusa\ngallery\n##ler\n1946\nindustrial\nyounger\napproach\nappearances\nurban\nones\n1950\nfinish\navenue\npowerful\nfully\ngrowth\npage\nhonor\njersey\nprojects\nadvanced\nrevealed\nbasic\n90\ninfantry\npair\nequipment\nvisit\n33\nevening\nsearch\ngrant\neffort\nsolo\ntreatment\nburied\nrepublican\nprimarily\nbottom\nowner\n1970s\nisrael\ngives\njim\ndream\nbob\nremain\nspot\n70\nnotes\nproduce\nchampions\ncontact\ned\nsoul\naccepted\nways\ndel\n##ally\nlosing\nsplit\nprice\ncapacity\nbasis\ntrial\nquestions\n##ina\n1955\n20th\nguess\nofficially\nmemorial\nnaval\ninitial\n##ization\nwhispered\nmedian\nengineer\n##ful\nsydney\n##go\ncolumbia\nstrength\n300\n1952\ntears\nsenate\n00\ncard\nasian\nagent\n1947\nsoftware\n44\ndraw\nwarm\nsupposed\ncom\npro\n##il\ntransferred\nleaned\n##at\ncandidate\nescape\nmountains\nasia\npotential\nactivity\nentertainment\nseem\ntraffic\njackson\nmurder\n36\nslow\nproduct\norchestra\nhaven\nagency\nbbc\ntaught\nwebsite\ncomedy\nunable\nstorm\nplanning\nalbums\nrugby\nenvironment\nscientific\ngrabbed\nprotect\n##hi\nboat\ntypically\n1954\n1953\ndamage\nprincipal\ndivided\ndedicated\nmount\nohio\n##berg\npick\nfought\ndriver\n##der\nempty\nshoulders\nsort\nthank\nberlin\nprominent\naccount\nfreedom\nnecessary\nefforts\nalex\nheadquarters\nfollows\nalongside\ndes\nsimon\nandrew\nsuggested\noperating\nlearning\nsteps\n1949\nsweet\ntechnical\nbegin\neasily\n34\nteeth\nspeaking\nsettlement\nscale\n##sh\nrenamed\nray\nmax\nenemy\nsemi\njoint\ncompared\n##rd\nscottish\nleadership\nanalysis\noffers\ngeorgia\npieces\ncaptured\nanimal\ndeputy\nguest\norganized\n##lin\ntony\ncombined\nmethod\nchallenge\n1960s\nhuge\nwants\nbattalion\nsons\nrise\ncrime\ntypes\nf
acilities\ntelling\npath\n1951\nplatform\nsit\n1990s\n##lo\ntells\nassigned\nrich\npull\n##ot\ncommonly\nalive\n##za\nletters\nconcept\nconducted\nwearing\nhappen\nbought\nbecomes\nholy\ngets\nocean\ndefeat\nlanguages\npurchased\ncoffee\noccurred\ntitled\n##q\ndeclared\napplied\nsciences\nconcert\nsounds\njazz\nbrain\n##me\npainting\nfleet\ntax\nnick\n##ius\nmichigan\ncount\nanimals\nleaders\nepisodes\n##line\ncontent\n##den\nbirth\n##it\nclubs\n64\npalace\ncritical\nrefused\nfair\nleg\nlaughed\nreturning\nsurrounding\nparticipated\nformation\nlifted\npointed\nconnected\nrome\nmedicine\nlaid\ntaylor\nsanta\npowers\nadam\ntall\nshared\nfocused\nknowing\nyards\nentrance\nfalls\n##wa\ncalling\n##ad\nsources\nchosen\nbeneath\nresources\nyard\n##ite\nnominated\nsilence\nzone\ndefined\n##que\ngained\nthirty\n38\nbodies\nmoon\n##ard\nadopted\nchristmas\nwidely\nregister\napart\niran\npremier\nserves\ndu\nunknown\nparties\n##les\ngeneration\n##ff\ncontinues\nquick\nfields\nbrigade\nquiet\nteaching\nclothes\nimpact\nweapons\npartner\nflat\ntheater\nsupreme\n1938\n37\nrelations\n##tor\nplants\nsuffered\n1936\nwilson\nkids\nbegins\n##age\n1918\nseats\narmed\ninternet\nmodels\nworth\nlaws\n400\ncommunities\nclasses\nbackground\nknows\nthanks\nquarter\nreaching\nhumans\ncarry\nkilling\nformat\nkong\nhong\nsetting\n75\narchitecture\ndisease\nrailroad\ninc\npossibly\nwish\narthur\nthoughts\nharry\ndoors\ndensity\n##di\ncrowd\nillinois\nstomach\ntone\nunique\nreports\nanyway\n##ir\nliberal\nder\nvehicle\nthick\ndry\ndrug\nfaced\nlargely\nfacility\ntheme\nholds\ncreation\nstrange\ncolonel\n##mi\nrevolution\nbell\npolitics\nturns\nsilent\nrail\nrelief\nindependence\ncombat\nshape\nwrite\ndetermined\nsales\nlearned\n4th\nfinger\noxford\nproviding\n1937\nheritage\nfiction\nsituated\ndesignated\nallowing\ndistribution\nhosted\n##est\nsight\ninterview\nestimated\nreduced\n##ria\ntoronto\nfootballer\nkeeping\nguys\ndamn\nclaim\nmotion\nsport\nsixth\nstayed\n##ze\nen\nrear\nreceive\nhanded
\ntwelve\ndress\naudience\ngranted\nbrazil\n##well\nspirit\n##ated\nnoticed\netc\nolympic\nrepresentative\neric\ntight\ntrouble\nreviews\ndrink\nvampire\nmissing\nroles\nranked\nnewly\nhousehold\nfinals\nwave\ncritics\n##ee\nphase\nmassachusetts\npilot\nunlike\nphiladelphia\nbright\nguns\ncrown\norganizations\nroof\n42\nrespectively\nclearly\ntongue\nmarked\ncircle\nfox\nkorea\nbronze\nbrian\nexpanded\nsexual\nsupply\nyourself\ninspired\nlabour\nfc\n##ah\nreference\nvision\ndraft\nconnection\nbrand\nreasons\n1935\nclassic\ndriving\ntrip\njesus\ncells\nentry\n1920\nneither\ntrail\nclaims\natlantic\norders\nlabor\nnose\nafraid\nidentified\nintelligence\ncalls\ncancer\nattacked\npassing\nstephen\npositions\nimperial\ngrey\njason\n39\nsunday\n48\nswedish\navoid\nextra\nuncle\nmessage\ncovers\nallows\nsurprise\nmaterials\nfame\nhunter\n##ji\n1930\ncitizens\nfigures\ndavis\nenvironmental\nconfirmed\nshit\ntitles\ndi\nperforming\ndifference\nacts\nattacks\n##ov\nexisting\nvotes\nopportunity\nnor\nshop\nentirely\ntrains\nopposite\npakistan\n##pa\ndevelop\nresulted\nrepresentatives\nactions\nreality\npressed\n##ish\nbarely\nwine\nconversation\nfaculty\nnorthwest\nends\ndocumentary\nnuclear\nstock\ngrace\nsets\neat\nalternative\n##ps\nbag\nresulting\ncreating\nsurprised\ncemetery\n1919\ndrop\nfinding\nsarah\ncricket\nstreets\ntradition\nride\n1933\nexhibition\ntarget\near\nexplained\nrain\ncomposer\ninjury\napartment\nmunicipal\neducational\noccupied\nnetherlands\nclean\nbillion\nconstitution\nlearn\n1914\nmaximum\nclassical\nfrancis\nlose\nopposition\njose\nontario\nbear\ncore\nhills\nrolled\nending\ndrawn\npermanent\nfun\n##tes\n##lla\nlewis\nsites\nchamber\nryan\n##way\nscoring\nheight\n1934\n##house\nlyrics\nstaring\n55\nofficials\n1917\nsnow\noldest\n##tic\norange\n##ger\nqualified\ninterior\napparently\nsucceeded\nthousand\ndinner\nlights\nexistence\nfans\nheavily\n41\ngreatest\nconservative\nsend\nbowl\nplus\nenter\ncatch\n##un\neconomy\nduty\n1929\nspeech\nauthorities
\nprincess\nperformances\nversions\nshall\ngraduate\npictures\neffective\nremembered\npoetry\ndesk\ncrossed\nstarring\nstarts\npassenger\nsharp\n##ant\nacres\nass\nweather\nfalling\nrank\nfund\nsupporting\ncheck\nadult\npublishing\nheads\ncm\nsoutheast\nlane\n##burg\napplication\nbc\n##ura\nles\ncondition\ntransfer\nprevent\ndisplay\nex\nregions\nearl\nfederation\ncool\nrelatively\nanswered\nbesides\n1928\nobtained\nportion\n##town\nmix\n##ding\nreaction\nliked\ndean\nexpress\npeak\n1932\n##tte\ncounter\nreligion\nchain\nrare\nmiller\nconvention\naid\nlie\nvehicles\nmobile\nperform\nsquad\nwonder\nlying\ncrazy\nsword\n##ping\nattempted\ncenturies\nweren\nphilosophy\ncategory\n##ize\nanna\ninterested\n47\nsweden\nwolf\nfrequently\nabandoned\nkg\nliterary\nalliance\ntask\nentitled\n##ay\nthrew\npromotion\nfactory\ntiny\nsoccer\nvisited\nmatt\nfm\nachieved\n52\ndefence\ninternal\npersian\n43\nmethods\n##ging\narrested\notherwise\ncambridge\nprogramming\nvillages\nelementary\ndistricts\nrooms\ncriminal\nconflict\nworry\ntrained\n1931\nattempts\nwaited\nsignal\nbird\ntruck\nsubsequent\nprogramme\n##ol\nad\n49\ncommunist\ndetails\nfaith\nsector\npatrick\ncarrying\nlaugh\n##ss\ncontrolled\nkorean\nshowing\norigin\nfuel\nevil\n1927\n##ent\nbrief\nidentity\ndarkness\naddress\npool\nmissed\npublication\nweb\nplanet\nian\nanne\nwings\ninvited\n##tt\nbriefly\nstandards\nkissed\n##be\nideas\nclimate\ncausing\nwalter\nworse\nalbert\narticles\nwinners\ndesire\naged\nnortheast\ndangerous\ngate\ndoubt\n1922\nwooden\nmulti\n##ky\npoet\nrising\nfunding\n46\ncommunications\ncommunication\nviolence\ncopies\nprepared\nford\ninvestigation\nskills\n1924\npulling\nelectronic\n##ak\n##ial\n##han\ncontaining\nultimately\noffices\nsinging\nunderstanding\nrestaurant\ntomorrow\nfashion\nchrist\nward\nda\npope\nstands\n5th\nflow\nstudios\naired\ncommissioned\ncontained\nexist\nfresh\namericans\n##per\nwrestling\napproved\nkid\nemployed\nrespect\nsuit\n1925\nangel\nasking\nincreasing\nframe\nangry
\nselling\n1950s\nthin\nfinds\n##nd\ntemperature\nstatement\nali\nexplain\ninhabitants\ntowns\nextensive\nnarrow\n51\njane\nflowers\nimages\npromise\nsomewhere\nobject\nfly\nclosely\n##ls\n1912\nbureau\ncape\n1926\nweekly\npresidential\nlegislative\n1921\n##ai\n##au\nlaunch\nfounding\n##ny\n978\n##ring\nartillery\nstrike\nun\ninstitutions\nroll\nwriters\nlanding\nchose\nkevin\nanymore\npp\n##ut\nattorney\nfit\ndan\nbillboard\nreceiving\nagricultural\nbreaking\nsought\ndave\nadmitted\nlands\nmexican\n##bury\ncharlie\nspecifically\nhole\niv\nhoward\ncredit\nmoscow\nroads\naccident\n1923\nproved\nwear\nstruck\nhey\nguards\nstuff\nslid\nexpansion\n1915\ncat\nanthony\n##kin\nmelbourne\nopposed\nsub\nsouthwest\narchitect\nfailure\nplane\n1916\n##ron\nmap\ncamera\ntank\nlisten\nregarding\nwet\nintroduction\nmetropolitan\nlink\nep\nfighter\ninch\ngrown\ngene\nanger\nfixed\nbuy\ndvd\nkhan\ndomestic\nworldwide\nchapel\nmill\nfunctions\nexamples\n##head\ndeveloping\n1910\nturkey\nhits\npocket\nantonio\npapers\ngrow\nunless\ncircuit\n18th\nconcerned\nattached\njournalist\nselection\njourney\nconverted\nprovincial\npainted\nhearing\naren\nbands\nnegative\naside\nwondered\nknight\nlap\nsurvey\nma\n##ow\nnoise\nbilly\n##ium\nshooting\nguide\nbedroom\npriest\nresistance\nmotor\nhomes\nsounded\ngiant\n##mer\n150\nscenes\nequal\ncomic\npatients\nhidden\nsolid\nactual\nbringing\nafternoon\ntouched\nfunds\nwedding\nconsisted\nmarie\ncanal\nsr\nkim\ntreaty\nturkish\nrecognition\nresidence\ncathedral\nbroad\nknees\nincident\nshaped\nfired\nnorwegian\nhandle\ncheek\ncontest\nrepresent\n##pe\nrepresenting\nbeauty\n##sen\nbirds\nadvantage\nemergency\nwrapped\ndrawing\nnotice\npink\nbroadcasting\n##ong\nsomehow\nbachelor\nseventh\ncollected\nregistered\nestablishment\nalan\nassumed\nchemical\npersonnel\nroger\nretirement\njeff\nportuguese\nwore\ntied\ndevice\nthreat\nprogress\nadvance\n##ised\nbanks\nhired\nmanchester\nnfl\nteachers\nstructures\nforever\n##bo\ntennis\nhelping\nsaturday\nsale
\napplications\njunction\nhip\nincorporated\nneighborhood\ndressed\nceremony\n##ds\ninfluenced\nhers\nvisual\nstairs\ndecades\ninner\nkansas\nhung\nhoped\ngain\nscheduled\ndowntown\nengaged\naustria\nclock\nnorway\ncertainly\npale\nprotected\n1913\nvictor\nemployees\nplate\nputting\nsurrounded\n##ists\nfinishing\nblues\ntropical\n##ries\nminnesota\nconsider\nphilippines\naccept\n54\nretrieved\n1900\nconcern\nanderson\nproperties\ninstitution\ngordon\nsuccessfully\nvietnam\n##dy\nbacking\noutstanding\nmuslim\ncrossing\nfolk\nproducing\nusual\ndemand\noccurs\nobserved\nlawyer\neducated\n##ana\nkelly\nstring\npleasure\nbudget\nitems\nquietly\ncolorado\nphilip\ntypical\n##worth\nderived\n600\nsurvived\nasks\nmental\n##ide\n56\njake\njews\ndistinguished\nltd\n1911\nsri\nextremely\n53\nathletic\nloud\nthousands\nworried\nshadow\ntransportation\nhorses\nweapon\narena\nimportance\nusers\ntim\nobjects\ncontributed\ndragon\ndouglas\naware\nsenator\njohnny\njordan\nsisters\nengines\nflag\ninvestment\nsamuel\nshock\ncapable\nclark\nrow\nwheel\nrefers\nsession\nfamiliar\nbiggest\nwins\nhate\nmaintained\ndrove\nhamilton\nrequest\nexpressed\ninjured\nunderground\nchurches\nwalker\nwars\ntunnel\npasses\nstupid\nagriculture\nsoftly\ncabinet\nregarded\njoining\nindiana\n##ea\n##ms\npush\ndates\nspend\nbehavior\nwoods\nprotein\ngently\nchase\nmorgan\nmention\nburning\nwake\ncombination\noccur\nmirror\nleads\njimmy\nindeed\nimpossible\nsingapore\npaintings\ncovering\n##nes\nsoldier\nlocations\nattendance\nsell\nhistorian\nwisconsin\ninvasion\nargued\npainter\ndiego\nchanging\negypt\n##don\nexperienced\ninches\n##ku\nmissouri\nvol\ngrounds\nspoken\nswitzerland\n##gan\nreform\nrolling\nha\nforget\nmassive\nresigned\nburned\nallen\ntennessee\nlocked\nvalues\nimproved\n##mo\nwounded\nuniverse\nsick\ndating\nfacing\npack\npurchase\nuser\n##pur\nmoments\n##ul\nmerged\nanniversary\n1908\ncoal\nbrick\nunderstood\ncauses\ndynasty\nqueensland\nestablish\nstores\ncrisis\npromote\nhoping\nviews\nc
ards\nreferee\nextension\n##si\nraise\narizona\nimprove\ncolonial\nformal\ncharged\n##rt\npalm\nlucky\nhide\nrescue\nfaces\n95\nfeelings\ncandidates\njuan\n##ell\ngoods\n6th\ncourses\nweekend\n59\nluke\ncash\nfallen\n##om\ndelivered\naffected\ninstalled\ncarefully\ntries\nswiss\nhollywood\ncosts\nlincoln\nresponsibility\n##he\nshore\nfile\nproper\nnormally\nmaryland\nassistance\njump\nconstant\noffering\nfriendly\nwaters\npersons\nrealize\ncontain\ntrophy\n800\npartnership\nfactor\n58\nmusicians\ncry\nbound\noregon\nindicated\nhero\nhouston\nmedium\n##ure\nconsisting\nsomewhat\n##ara\n57\ncycle\n##che\nbeer\nmoore\nfrederick\ngotten\neleven\nworst\nweak\napproached\narranged\nchin\nloan\nuniversal\nbond\nfifteen\npattern\ndisappeared\n##ney\ntranslated\n##zed\nlip\narab\ncapture\ninterests\ninsurance\n##chi\nshifted\ncave\nprix\nwarning\nsections\ncourts\ncoat\nplot\nsmell\nfeed\ngolf\nfavorite\nmaintain\nknife\nvs\nvoted\ndegrees\nfinance\nquebec\nopinion\ntranslation\nmanner\nruled\noperate\nproductions\nchoose\nmusician\ndiscovery\nconfused\ntired\nseparated\nstream\ntechniques\ncommitted\nattend\nranking\nkings\nthrow\npassengers\nmeasure\nhorror\nfan\nmining\nsand\ndanger\nsalt\ncalm\ndecade\ndam\nrequire\nrunner\n##ik\nrush\nassociate\ngreece\n##ker\nrivers\nconsecutive\nmatthew\n##ski\nsighed\nsq\ndocuments\nsteam\nedited\nclosing\ntie\naccused\n1905\n##ini\nislamic\ndistributed\ndirectors\norganisation\nbruce\n7th\nbreathing\nmad\nlit\narrival\nconcrete\ntaste\n08\ncomposition\nshaking\nfaster\namateur\nadjacent\nstating\n1906\ntwin\nflew\n##ran\ntokyo\npublications\n##tone\nobviously\nridge\nstorage\n1907\ncarl\npages\nconcluded\ndesert\ndriven\nuniversities\nages\nterminal\nsequence\nborough\n250\nconstituency\ncreative\ncousin\neconomics\ndreams\nmargaret\nnotably\nreduce\nmontreal\nmode\n17th\nears\nsaved\njan\nvocal\n##ica\n1909\nandy\n##jo\nriding\nroughly\nthreatened\n##ise\nmeters\nmeanwhile\nlanded\ncompete\nrepeated\ngrass\nczech\nregularly\ncharge
s\ntea\nsudden\nappeal\n##ung\nsolution\ndescribes\npierre\nclassification\nglad\nparking\n##ning\nbelt\nphysics\n99\nrachel\nadd\nhungarian\nparticipate\nexpedition\ndamaged\ngift\nchildhood\n85\nfifty\n##red\nmathematics\njumped\nletting\ndefensive\nmph\n##ux\n##gh\ntesting\n##hip\nhundreds\nshoot\nowners\nmatters\nsmoke\nisraeli\nkentucky\ndancing\nmounted\ngrandfather\nemma\ndesigns\nprofit\nargentina\n##gs\ntruly\nli\nlawrence\ncole\nbegun\ndetroit\nwilling\nbranches\nsmiling\ndecide\nmiami\nenjoyed\nrecordings\n##dale\npoverty\nethnic\ngay\n##bi\ngary\narabic\n09\naccompanied\n##one\n##ons\nfishing\ndetermine\nresidential\nacid\n##ary\nalice\nreturns\nstarred\nmail\n##ang\njonathan\nstrategy\n##ue\nnet\nforty\ncook\nbusinesses\nequivalent\ncommonwealth\ndistinct\nill\n##cy\nseriously\n##ors\n##ped\nshift\nharris\nreplace\nrio\nimagine\nformula\nensure\n##ber\nadditionally\nscheme\nconservation\noccasionally\npurposes\nfeels\nfavor\n##and\n##ore\n1930s\ncontrast\nhanging\nhunt\nmovies\n1904\ninstruments\nvictims\ndanish\nchristopher\nbusy\ndemon\nsugar\nearliest\ncolony\nstudying\nbalance\nduties\n##ks\nbelgium\nslipped\ncarter\n05\nvisible\nstages\niraq\nfifa\n##im\ncommune\nforming\nzero\n07\ncontinuing\ntalked\ncounties\nlegend\nbathroom\noption\ntail\nclay\ndaughters\nafterwards\nsevere\njaw\nvisitors\n##ded\ndevices\naviation\nrussell\nkate\n##vi\nentering\nsubjects\n##ino\ntemporary\nswimming\nforth\nsmooth\nghost\naudio\nbush\noperates\nrocks\nmovements\nsigns\neddie\n##tz\nann\nvoices\nhonorary\n06\nmemories\ndallas\npure\nmeasures\nracial\npromised\n66\nharvard\nceo\n16th\nparliamentary\nindicate\nbenefit\nflesh\ndublin\nlouisiana\n1902\n1901\npatient\nsleeping\n1903\nmembership\ncoastal\nmedieval\nwanting\nelement\nscholars\nrice\n62\nlimit\nsurvive\nmakeup\nrating\ndefinitely\ncollaboration\nobvious\n##tan\nboss\nms\nbaron\nbirthday\nlinked\nsoil\ndiocese\n##lan\nncaa\n##mann\noffensive\nshell\nshouldn\nwaist\n##tus\nplain\nross\norgan\nresolution\nm
anufacturing\nadding\nrelative\nkennedy\n98\nwhilst\nmoth\nmarketing\ngardens\ncrash\n72\nheading\npartners\ncredited\ncarlos\nmoves\ncable\n##zi\nmarshall\n##out\ndepending\nbottle\nrepresents\nrejected\nresponded\nexisted\n04\njobs\ndenmark\nlock\n##ating\ntreated\ngraham\nroutes\ntalent\ncommissioner\ndrugs\nsecure\ntests\nreign\nrestored\nphotography\n##gi\ncontributions\noklahoma\ndesigner\ndisc\ngrin\nseattle\nrobin\npaused\natlanta\nunusual\n##gate\npraised\nlas\nlaughing\nsatellite\nhungary\nvisiting\n##sky\ninteresting\nfactors\ndeck\npoems\nnorman\n##water\nstuck\nspeaker\nrifle\ndomain\npremiered\n##her\ndc\ncomics\nactors\n01\nreputation\neliminated\n8th\nceiling\nprisoners\nscript\n##nce\nleather\naustin\nmississippi\nrapidly\nadmiral\nparallel\ncharlotte\nguilty\ntools\ngender\ndivisions\nfruit\n##bs\nlaboratory\nnelson\nfantasy\nmarry\nrapid\naunt\ntribe\nrequirements\naspects\nsuicide\namongst\nadams\nbone\nukraine\nabc\nkick\nsees\nedinburgh\nclothing\ncolumn\nrough\ngods\nhunting\nbroadway\ngathered\nconcerns\n##ek\nspending\nty\n12th\nsnapped\nrequires\nsolar\nbones\ncavalry\n##tta\niowa\ndrinking\nwaste\nindex\nfranklin\ncharity\nthompson\nstewart\ntip\nflash\nlandscape\nfriday\nenjoy\nsingh\npoem\nlistening\n##back\neighth\nfred\ndifferences\nadapted\nbomb\nukrainian\nsurgery\ncorporate\nmasters\nanywhere\n##more\nwaves\nodd\nsean\nportugal\norleans\ndick\ndebate\nkent\neating\npuerto\ncleared\n96\nexpect\ncinema\n97\nguitarist\nblocks\nelectrical\nagree\ninvolving\ndepth\ndying\npanel\nstruggle\n##ged\npeninsula\nadults\nnovels\nemerged\nvienna\nmetro\ndebuted\nshoes\ntamil\nsongwriter\nmeets\nprove\nbeating\ninstance\nheaven\nscared\nsending\nmarks\nartistic\npassage\nsuperior\n03\nsignificantly\nshopping\n##tive\nretained\n##izing\nmalaysia\ntechnique\ncheeks\n##ola\nwarren\nmaintenance\ndestroy\nextreme\nallied\n120\nappearing\n##yn\nfill\nadvice\nalabama\nqualifying\npolicies\ncleveland\nhat\nbattery\nsmart\nauthors\n10th\nsoundtrack\nacted
\ndated\nlb\nglance\nequipped\ncoalition\nfunny\nouter\nambassador\nroy\npossibility\ncouples\ncampbell\ndna\nloose\nethan\nsupplies\n1898\ngonna\n88\nmonster\n##res\nshake\nagents\nfrequency\nsprings\ndogs\npractices\n61\ngang\nplastic\neasier\nsuggests\ngulf\nblade\nexposed\ncolors\nindustries\nmarkets\npan\nnervous\nelectoral\ncharts\nlegislation\nownership\n##idae\nmac\nappointment\nshield\ncopy\nassault\nsocialist\nabbey\nmonument\nlicense\nthrone\nemployment\njay\n93\nreplacement\ncharter\ncloud\npowered\nsuffering\naccounts\noak\nconnecticut\nstrongly\nwright\ncolour\ncrystal\n13th\ncontext\nwelsh\nnetworks\nvoiced\ngabriel\njerry\n##cing\nforehead\nmp\n##ens\nmanage\nschedule\ntotally\nremix\n##ii\nforests\noccupation\nprint\nnicholas\nbrazilian\nstrategic\nvampires\nengineers\n76\nroots\nseek\ncorrect\ninstrumental\nund\nalfred\nbacked\nhop\n##des\nstanley\nrobinson\ntraveled\nwayne\nwelcome\naustrian\nachieve\n67\nexit\nrates\n1899\nstrip\nwhereas\n##cs\nsing\ndeeply\nadventure\nbobby\nrick\njamie\ncareful\ncomponents\ncap\nuseful\npersonality\nknee\n##shi\npushing\nhosts\n02\nprotest\nca\nottoman\nsymphony\n##sis\n63\nboundary\n1890\nprocesses\nconsidering\nconsiderable\ntons\n##work\n##ft\n##nia\ncooper\ntrading\ndear\nconduct\n91\nillegal\napple\nrevolutionary\nholiday\ndefinition\nharder\n##van\njacob\ncircumstances\ndestruction\n##lle\npopularity\ngrip\nclassified\nliverpool\ndonald\nbaltimore\nflows\nseeking\nhonour\napproval\n92\nmechanical\ntill\nhappening\nstatue\ncritic\nincreasingly\nimmediate\ndescribe\ncommerce\nstare\n##ster\nindonesia\nmeat\nrounds\nboats\nbaker\northodox\ndepression\nformally\nworn\nnaked\nclaire\nmuttered\nsentence\n11th\nemily\ndocument\n77\ncriticism\nwished\nvessel\nspiritual\nbent\nvirgin\nparker\nminimum\nmurray\nlunch\ndanny\nprinted\ncompilation\nkeyboards\nfalse\nblow\nbelonged\n68\nraising\n78\ncutting\n##board\npittsburgh\n##up\n9th\nshadows\n81\nhated\nindigenous\njon\n15th\nbarry\nscholar\nah\n##zer\noliver\n##
gy\nstick\nsusan\nmeetings\nattracted\nspell\nromantic\n##ver\nye\n1895\nphoto\ndemanded\ncustomers\n##ac\n1896\nlogan\nrevival\nkeys\nmodified\ncommanded\njeans\n##ious\nupset\nraw\nphil\ndetective\nhiding\nresident\nvincent\n##bly\nexperiences\ndiamond\ndefeating\ncoverage\nlucas\nexternal\nparks\nfranchise\nhelen\nbible\nsuccessor\npercussion\ncelebrated\nil\nlift\nprofile\nclan\nromania\n##ied\nmills\n##su\nnobody\nachievement\nshrugged\nfault\n1897\nrhythm\ninitiative\nbreakfast\ncarbon\n700\n69\nlasted\nviolent\n74\nwound\nken\nkiller\ngradually\nfilmed\n°c\ndollars\nprocessing\n94\nremove\ncriticized\nguests\nsang\nchemistry\n##vin\nlegislature\ndisney\n##bridge\nuniform\nescaped\nintegrated\nproposal\npurple\ndenied\nliquid\nkarl\ninfluential\nmorris\nnights\nstones\nintense\nexperimental\ntwisted\n71\n84\n##ld\npace\nnazi\nmitchell\nny\nblind\nreporter\nnewspapers\n14th\ncenters\nburn\nbasin\nforgotten\nsurviving\nfiled\ncollections\nmonastery\nlosses\nmanual\ncouch\ndescription\nappropriate\nmerely\ntag\nmissions\nsebastian\nrestoration\nreplacing\ntriple\n73\nelder\njulia\nwarriors\nbenjamin\njulian\nconvinced\nstronger\namazing\ndeclined\nversus\nmerchant\nhappens\noutput\nfinland\nbare\nbarbara\nabsence\nignored\ndawn\ninjuries\n##port\nproducers\n##ram\n82\nluis\n##ities\nkw\nadmit\nexpensive\nelectricity\nnba\nexception\nsymbol\n##ving\nladies\nshower\nsheriff\ncharacteristics\n##je\naimed\nbutton\nratio\neffectively\nsummit\nangle\njury\nbears\nfoster\nvessels\npants\nexecuted\nevans\ndozen\nadvertising\nkicked\npatrol\n1889\ncompetitions\nlifetime\nprinciples\nathletics\n##logy\nbirmingham\nsponsored\n89\nrob\nnomination\n1893\nacoustic\n##sm\ncreature\nlongest\n##tra\ncredits\nharbor\ndust\njosh\n##so\nterritories\nmilk\ninfrastructure\ncompletion\nthailand\nindians\nleon\narchbishop\n##sy\nassist\npitch\nblake\narrangement\ngirlfriend\nserbian\noperational\nhence\nsad\nscent\nfur\ndj\nsessions\nhp\nrefer\nrarely\n##ora\nexists\n1892\n##ten\nscient
ists\ndirty\npenalty\nburst\nportrait\nseed\n79\npole\nlimits\nrival\n1894\nstable\nalpha\ngrave\nconstitutional\nalcohol\narrest\nflower\nmystery\ndevil\narchitectural\nrelationships\ngreatly\nhabitat\n##istic\nlarry\nprogressive\nremote\ncotton\n##ics\n##ok\npreserved\nreaches\n##ming\ncited\n86\nvast\nscholarship\ndecisions\ncbs\njoy\nteach\n1885\neditions\nknocked\neve\nsearching\npartly\nparticipation\ngap\nanimated\nfate\nexcellent\n##ett\nna\n87\nalternate\nsaints\nyoungest\n##ily\nclimbed\n##ita\n##tors\nsuggest\n##ct\ndiscussion\nstaying\nchoir\nlakes\njacket\nrevenue\nnevertheless\npeaked\ninstrument\nwondering\nannually\nmanaging\nneil\n1891\nsigning\nterry\n##ice\napply\nclinical\nbrooklyn\naim\ncatherine\nfuck\nfarmers\nfigured\nninth\npride\nhugh\nevolution\nordinary\ninvolvement\ncomfortable\nshouted\ntech\nencouraged\ntaiwan\nrepresentation\nsharing\n##lia\n##em\npanic\nexact\ncargo\ncompeting\nfat\ncried\n83\n1920s\noccasions\npa\ncabin\nborders\nutah\nmarcus\n##isation\nbadly\nmuscles\n##ance\nvictorian\ntransition\nwarner\nbet\npermission\n##rin\nslave\nterrible\nsimilarly\nshares\nseth\nuefa\npossession\nmedals\nbenefits\ncolleges\nlowered\nperfectly\nmall\ntransit\n##ye\n##kar\npublisher\n##ened\nharrison\ndeaths\nelevation\n##ae\nasleep\nmachines\nsigh\nash\nhardly\nargument\noccasion\nparent\nleo\ndecline\n1888\ncontribution\n##ua\nconcentration\n1000\nopportunities\nhispanic\nguardian\nextent\nemotions\nhips\nmason\nvolumes\nbloody\ncontroversy\ndiameter\nsteady\nmistake\nphoenix\nidentify\nviolin\n##sk\ndeparture\nrichmond\nspin\nfuneral\nenemies\n1864\ngear\nliterally\nconnor\nrandom\nsergeant\ngrab\nconfusion\n1865\ntransmission\ninformed\nop\nleaning\nsacred\nsuspended\nthinks\ngates\nportland\nluck\nagencies\nyours\nhull\nexpert\nmuscle\nlayer\npractical\nsculpture\njerusalem\nlatest\nlloyd\nstatistics\ndeeper\nrecommended\nwarrior\narkansas\nmess\nsupports\ngreg\neagle\n1880\nrecovered\nrated\nconcerts\nrushed\n##ano\nstops\neggs\nfiles
\npremiere\nkeith\n##vo\ndelhi\nturner\npit\naffair\nbelief\npaint\n##zing\nmate\n##ach\n##ev\nvictim\n##ology\nwithdrew\nbonus\nstyles\nfled\n##ud\nglasgow\ntechnologies\nfunded\nnbc\nadaptation\n##ata\nportrayed\ncooperation\nsupporters\njudges\nbernard\njustin\nhallway\nralph\n##ick\ngraduating\ncontroversial\ndistant\ncontinental\nspider\nbite\n##ho\nrecognize\nintention\nmixing\n##ese\negyptian\nbow\ntourism\nsuppose\nclaiming\ntiger\ndominated\nparticipants\nvi\n##ru\nnurse\npartially\ntape\n##rum\npsychology\n##rn\nessential\ntouring\nduo\nvoting\ncivilian\nemotional\nchannels\n##king\napparent\nhebrew\n1887\ntommy\ncarrier\nintersection\nbeast\nhudson\n##gar\n##zo\nlab\nnova\nbench\ndiscuss\ncosta\n##ered\ndetailed\nbehalf\ndrivers\nunfortunately\nobtain\n##lis\nrocky\n##dae\nsiege\nfriendship\nhoney\n##rian\n1861\namy\nhang\nposted\ngovernments\ncollins\nrespond\nwildlife\npreferred\noperator\n##po\nlaura\npregnant\nvideos\ndennis\nsuspected\nboots\ninstantly\nweird\nautomatic\nbusinessman\nalleged\nplacing\nthrowing\nph\nmood\n1862\nperry\nvenue\njet\nremainder\n##lli\n##ci\npassion\nbiological\nboyfriend\n1863\ndirt\nbuffalo\nron\nsegment\nfa\nabuse\n##era\ngenre\nthrown\nstroke\ncolored\nstress\nexercise\ndisplayed\n##gen\nstruggled\n##tti\nabroad\ndramatic\nwonderful\nthereafter\nmadrid\ncomponent\nwidespread\n##sed\ntale\ncitizen\ntodd\nmonday\n1886\nvancouver\noverseas\nforcing\ncrying\ndescent\n##ris\ndiscussed\nsubstantial\nranks\nregime\n1870\nprovinces\nswitch\ndrum\nzane\nted\ntribes\nproof\nlp\ncream\nresearchers\nvolunteer\nmanor\nsilk\nmilan\ndonated\nallies\nventure\nprinciple\ndelivery\nenterprise\n##ves\n##ans\nbars\ntraditionally\nwitch\nreminded\ncopper\n##uk\npete\ninter\nlinks\ncolin\ngrinned\nelsewhere\ncompetitive\nfrequent\n##oy\nscream\n##hu\ntension\ntexts\nsubmarine\nfinnish\ndefending\ndefend\npat\ndetail\n1884\naffiliated\nstuart\nthemes\nvilla\nperiods\ntool\nbelgian\nruling\ncrimes\nanswers\nfolded\nlicensed\nresort\ndemolishe
d\nhans\nlucy\n1881\nlion\ntraded\nphotographs\nwrites\ncraig\n##fa\ntrials\ngenerated\nbeth\nnoble\ndebt\npercentage\nyorkshire\nerected\nss\nviewed\ngrades\nconfidence\nceased\nislam\ntelephone\nretail\n##ible\nchile\nm²\nroberts\nsixteen\n##ich\ncommented\nhampshire\ninnocent\ndual\npounds\nchecked\nregulations\nafghanistan\nsung\nrico\nliberty\nassets\nbigger\noptions\nangels\nrelegated\ntribute\nwells\nattending\nleaf\n##yan\nbutler\nromanian\nforum\nmonthly\nlisa\npatterns\ngmina\n##tory\nmadison\nhurricane\nrev\n##ians\nbristol\n##ula\nelite\nvaluable\ndisaster\ndemocracy\nawareness\ngermans\nfreyja\n##ins\nloop\nabsolutely\npaying\npopulations\nmaine\nsole\nprayer\nspencer\nreleases\ndoorway\nbull\n##ani\nlover\nmidnight\nconclusion\n##sson\nthirteen\nlily\nmediterranean\n##lt\nnhl\nproud\nsample\n##hill\ndrummer\nguinea\n##ova\nmurphy\nclimb\n##ston\ninstant\nattributed\nhorn\nain\nrailways\nsteven\n##ao\nautumn\nferry\nopponent\nroot\ntraveling\nsecured\ncorridor\nstretched\ntales\nsheet\ntrinity\ncattle\nhelps\nindicates\nmanhattan\nmurdered\nfitted\n1882\ngentle\ngrandmother\nmines\nshocked\nvegas\nproduces\n##light\ncaribbean\n##ou\nbelong\ncontinuous\ndesperate\ndrunk\nhistorically\ntrio\nwaved\nraf\ndealing\nnathan\nbat\nmurmured\ninterrupted\nresiding\nscientist\npioneer\nharold\naaron\n##net\ndelta\nattempting\nminority\nmini\nbelieves\nchorus\ntend\nlots\neyed\nindoor\nload\nshots\nupdated\njail\n##llo\nconcerning\nconnecting\nwealth\n##ved\nslaves\narrive\nrangers\nsufficient\nrebuilt\n##wick\ncardinal\nflood\nmuhammad\nwhenever\nrelation\nrunners\nmoral\nrepair\nviewers\narriving\nrevenge\npunk\nassisted\nbath\nfairly\nbreathe\nlists\ninnings\nillustrated\nwhisper\nnearest\nvoters\nclinton\nties\nultimate\nscreamed\nbeijing\nlions\nandre\nfictional\ngathering\ncomfort\nradar\nsuitable\ndismissed\nhms\nban\npine\nwrist\natmosphere\nvoivodeship\nbid\ntimber\n##ned\n##nan\ngiants\n##ane\ncameron\nrecovery\nuss\nidentical\ncategories\nswitched\nserbi
a\nlaughter\nnoah\nensemble\ntherapy\npeoples\ntouching\n##off\nlocally\npearl\nplatforms\neverywhere\nballet\ntables\nlanka\nherbert\noutdoor\ntoured\nderek\n1883\nspaces\ncontested\nswept\n1878\nexclusive\nslight\nconnections\n##dra\nwinds\nprisoner\ncollective\nbangladesh\ntube\npublicly\nwealthy\nthai\n##ys\nisolated\nselect\n##ric\ninsisted\npen\nfortune\nticket\nspotted\nreportedly\nanimation\nenforcement\ntanks\n110\ndecides\nwider\nlowest\nowen\n##time\nnod\nhitting\n##hn\ngregory\nfurthermore\nmagazines\nfighters\nsolutions\n##ery\npointing\nrequested\nperu\nreed\nchancellor\nknights\nmask\nworker\neldest\nflames\nreduction\n1860\nvolunteers\n##tis\nreporting\n##hl\nwire\nadvisory\nendemic\norigins\nsettlers\npursue\nknock\nconsumer\n1876\neu\ncompound\ncreatures\nmansion\nsentenced\nivan\ndeployed\nguitars\nfrowned\ninvolves\nmechanism\nkilometers\nperspective\nshops\nmaps\nterminus\nduncan\nalien\nfist\nbridges\n##pers\nheroes\nfed\nderby\nswallowed\n##ros\npatent\nsara\nillness\ncharacterized\nadventures\nslide\nhawaii\njurisdiction\n##op\norganised\n##side\nadelaide\nwalks\nbiology\nse\n##ties\nrogers\nswing\ntightly\nboundaries\n##rie\nprepare\nimplementation\nstolen\n##sha\ncertified\ncolombia\nedwards\ngarage\n##mm\nrecalled\n##ball\nrage\nharm\nnigeria\nbreast\n##ren\nfurniture\npupils\nsettle\n##lus\ncuba\nballs\nclient\nalaska\n21st\nlinear\nthrust\ncelebration\nlatino\ngenetic\nterror\n##cia\n##ening\nlightning\nfee\nwitness\nlodge\nestablishing\nskull\n##ique\nearning\nhood\n##ei\nrebellion\nwang\nsporting\nwarned\nmissile\ndevoted\nactivist\nporch\nworship\nfourteen\npackage\n1871\ndecorated\n##shire\nhoused\n##ock\nchess\nsailed\ndoctors\noscar\njoan\ntreat\ngarcia\nharbour\njeremy\n##ire\ntraditions\ndominant\njacques\n##gon\n##wan\nrelocated\n1879\namendment\nsized\ncompanion\nsimultaneously\nvolleyball\nspun\nacre\nincreases\nstopping\nloves\nbelongs\naffect\ndrafted\ntossed\nscout\nbattles\n1875\nfilming\nshoved\nmunich\ntenure\nvertical\n
romance\npc\n##cher\nargue\n##ical\ncraft\nranging\nwww\nopens\nhonest\ntyler\nyesterday\nvirtual\n##let\nmuslims\nreveal\nsnake\nimmigrants\nradical\nscreaming\nspeakers\nfiring\nsaving\nbelonging\nease\nlighting\nprefecture\nblame\nfarmer\nhungry\ngrows\nrubbed\nbeam\nsur\nsubsidiary\n##cha\narmenian\nsao\ndropping\nconventional\n##fer\nmicrosoft\nreply\nqualify\nspots\n1867\nsweat\nfestivals\n##ken\nimmigration\nphysician\ndiscover\nexposure\nsandy\nexplanation\nisaac\nimplemented\n##fish\nhart\ninitiated\nconnect\nstakes\npresents\nheights\nhouseholder\npleased\ntourist\nregardless\nslip\nclosest\n##ction\nsurely\nsultan\nbrings\nriley\npreparation\naboard\nslammed\nbaptist\nexperiment\nongoing\ninterstate\norganic\nplayoffs\n##ika\n1877\n130\n##tar\nhindu\nerror\ntours\ntier\nplenty\narrangements\ntalks\ntrapped\nexcited\nsank\nho\nathens\n1872\ndenver\nwelfare\nsuburb\nathletes\ntrick\ndiverse\nbelly\nexclusively\nyelled\n1868\n##med\nconversion\n##ette\n1874\ninternationally\ncomputers\nconductor\nabilities\nsensitive\nhello\ndispute\nmeasured\nglobe\nrocket\nprices\namsterdam\nflights\ntigers\ninn\nmunicipalities\nemotion\nreferences\n3d\n##mus\nexplains\nairlines\nmanufactured\npm\narchaeological\n1873\ninterpretation\ndevon\ncomment\n##ites\nsettlements\nkissing\nabsolute\nimprovement\nsuite\nimpressed\nbarcelona\nsullivan\njefferson\ntowers\njesse\njulie\n##tin\n##lu\ngrandson\nhi\ngauge\nregard\nrings\ninterviews\ntrace\nraymond\nthumb\ndepartments\nburns\nserial\nbulgarian\nscores\ndemonstrated\n##ix\n1866\nkyle\nalberta\nunderneath\nromanized\n##ward\nrelieved\nacquisition\nphrase\ncliff\nreveals\nhan\ncuts\nmerger\ncustom\n##dar\nnee\ngilbert\ngraduation\n##nts\nassessment\ncafe\ndifficulty\ndemands\nswung\ndemocrat\njennifer\ncommons\n1940s\ngrove\n##yo\ncompleting\nfocuses\nsum\nsubstitute\nbearing\nstretch\nreception\n##py\nreflected\nessentially\ndestination\npairs\n##ched\nsurvival\nresource\n##bach\npromoting\ndoubles\nmessages\ntear\n##down\n##
fully\nparade\nflorence\nharvey\nincumbent\npartial\nframework\n900\npedro\nfrozen\nprocedure\nolivia\ncontrols\n##mic\nshelter\npersonally\ntemperatures\n##od\nbrisbane\ntested\nsits\nmarble\ncomprehensive\noxygen\nleonard\n##kov\ninaugural\niranian\nreferring\nquarters\nattitude\n##ivity\nmainstream\nlined\nmars\ndakota\nnorfolk\nunsuccessful\n##°\nexplosion\nhelicopter\ncongressional\n##sing\ninspector\nbitch\nseal\ndeparted\ndivine\n##ters\ncoaching\nexamination\npunishment\nmanufacturer\nsink\ncolumns\nunincorporated\nsignals\nnevada\nsqueezed\ndylan\ndining\nphotos\nmartial\nmanuel\neighteen\nelevator\nbrushed\nplates\nministers\nivy\ncongregation\n##len\nslept\nspecialized\ntaxes\ncurve\nrestricted\nnegotiations\nlikes\nstatistical\narnold\ninspiration\nexecution\nbold\nintermediate\nsignificance\nmargin\nruler\nwheels\ngothic\nintellectual\ndependent\nlistened\neligible\nbuses\nwidow\nsyria\nearn\ncincinnati\ncollapsed\nrecipient\nsecrets\naccessible\nphilippine\nmaritime\ngoddess\nclerk\nsurrender\nbreaks\nplayoff\ndatabase\n##ified\n##lon\nideal\nbeetle\naspect\nsoap\nregulation\nstrings\nexpand\nanglo\nshorter\ncrosses\nretreat\ntough\ncoins\nwallace\ndirections\npressing\n##oon\nshipping\nlocomotives\ncomparison\ntopics\nnephew\n##mes\ndistinction\nhonors\ntravelled\nsierra\nibn\n##over\nfortress\nsa\nrecognised\ncarved\n1869\nclients\n##dan\nintent\n##mar\ncoaches\ndescribing\nbread\n##ington\nbeaten\nnorthwestern\n##ona\nmerit\nyoutube\ncollapse\nchallenges\nem\nhistorians\nobjective\nsubmitted\nvirus\nattacking\ndrake\nassume\n##ere\ndiseases\nmarc\nstem\nleeds\n##cus\n##ab\nfarming\nglasses\n##lock\nvisits\nnowhere\nfellowship\nrelevant\ncarries\nrestaurants\nexperiments\n101\nconstantly\nbases\ntargets\nshah\ntenth\nopponents\nverse\nterritorial\n##ira\nwritings\ncorruption\n##hs\ninstruction\ninherited\nreverse\nemphasis\n##vic\nemployee\narch\nkeeps\nrabbi\nwatson\npayment\nuh\n##ala\nnancy\n##tre\nvenice\nfastest\nsexy\nbanned\nadrian\nproperly\n
ruth\ntouchdown\ndollar\nboards\nmetre\ncircles\nedges\nfavour\ncomments\nok\ntravels\nliberation\nscattered\nfirmly\n##ular\nholland\npermitted\ndiesel\nkenya\nden\noriginated\n##ral\ndemons\nresumed\ndragged\nrider\n##rus\nservant\nblinked\nextend\ntorn\n##ias\n##sey\ninput\nmeal\neverybody\ncylinder\nkinds\ncamps\n##fe\nbullet\nlogic\n##wn\ncroatian\nevolved\nhealthy\nfool\nchocolate\nwise\npreserve\npradesh\n##ess\nrespective\n1850\n##ew\nchicken\nartificial\ngross\ncorresponding\nconvicted\ncage\ncaroline\ndialogue\n##dor\nnarrative\nstranger\nmario\nbr\nchristianity\nfailing\ntrent\ncommanding\nbuddhist\n1848\nmaurice\nfocusing\nyale\nbike\naltitude\n##ering\nmouse\nrevised\n##sley\nveteran\n##ig\npulls\ntheology\ncrashed\ncampaigns\nlegion\n##ability\ndrag\nexcellence\ncustomer\ncancelled\nintensity\nexcuse\n##lar\nliga\nparticipating\ncontributing\nprinting\n##burn\nvariable\n##rk\ncurious\nbin\nlegacy\nrenaissance\n##my\nsymptoms\nbinding\nvocalist\ndancer\n##nie\ngrammar\ngospel\ndemocrats\nya\nenters\nsc\ndiplomatic\nhitler\n##ser\nclouds\nmathematical\nquit\ndefended\noriented\n##heim\nfundamental\nhardware\nimpressive\nequally\nconvince\nconfederate\nguilt\nchuck\nsliding\n##ware\nmagnetic\nnarrowed\npetersburg\nbulgaria\notto\nphd\nskill\n##ama\nreader\nhopes\npitcher\nreservoir\nhearts\nautomatically\nexpecting\nmysterious\nbennett\nextensively\nimagined\nseeds\nmonitor\nfix\n##ative\njournalism\nstruggling\nsignature\nranch\nencounter\nphotographer\nobservation\nprotests\n##pin\ninfluences\n##hr\ncalendar\n##all\ncruz\ncroatia\nlocomotive\nhughes\nnaturally\nshakespeare\nbasement\nhook\nuncredited\nfaded\ntheories\napproaches\ndare\nphillips\nfilling\nfury\nobama\n##ain\nefficient\narc\ndeliver\nmin\nraid\nbreeding\ninducted\nleagues\nefficiency\naxis\nmontana\neagles\n##ked\nsupplied\ninstructions\nkaren\npicking\nindicating\ntrap\nanchor\npractically\nchristians\ntomb\nvary\noccasional\nelectronics\nlords\nreaders\nnewcastle\nfaint\ninnovation\ncol
lect\nsituations\nengagement\n160\nclaude\nmixture\n##feld\npeer\ntissue\nlogo\nlean\n##ration\n°f\nfloors\n##ven\narchitects\nreducing\n##our\n##ments\nrope\n1859\nottawa\n##har\nsamples\nbanking\ndeclaration\nproteins\nresignation\nfrancois\nsaudi\nadvocate\nexhibited\narmor\ntwins\ndivorce\n##ras\nabraham\nreviewed\njo\ntemporarily\nmatrix\nphysically\npulse\ncurled\n##ena\ndifficulties\nbengal\nusage\n##ban\nannie\nriders\ncertificate\n##pi\nholes\nwarsaw\ndistinctive\njessica\n##mon\nmutual\n1857\ncustoms\ncircular\neugene\nremoval\nloaded\nmere\nvulnerable\ndepicted\ngenerations\ndame\nheir\nenormous\nlightly\nclimbing\npitched\nlessons\npilots\nnepal\nram\ngoogle\npreparing\nbrad\nlouise\nrenowned\n##₂\nliam\n##ably\nplaza\nshaw\nsophie\nbrilliant\nbills\n##bar\n##nik\nfucking\nmainland\nserver\npleasant\nseized\nveterans\njerked\nfail\nbeta\nbrush\nradiation\nstored\nwarmth\nsoutheastern\nnate\nsin\nraced\nberkeley\njoke\nathlete\ndesignation\ntrunk\n##low\nroland\nqualification\narchives\nheels\nartwork\nreceives\njudicial\nreserves\n##bed\nwoke\ninstallation\nabu\nfloating\nfake\nlesser\nexcitement\ninterface\nconcentrated\naddressed\ncharacteristic\namanda\nsaxophone\nmonk\nauto\n##bus\nreleasing\negg\ndies\ninteraction\ndefender\nce\noutbreak\nglory\nloving\n##bert\nsequel\nconsciousness\nhttp\nawake\nski\nenrolled\n##ress\nhandling\nrookie\nbrow\nsomebody\nbiography\nwarfare\namounts\ncontracts\npresentation\nfabric\ndissolved\nchallenged\nmeter\npsychological\nlt\nelevated\nrally\naccurate\n##tha\nhospitals\nundergraduate\nspecialist\nvenezuela\nexhibit\nshed\nnursing\nprotestant\nfluid\nstructural\nfootage\njared\nconsistent\nprey\n##ska\nsuccession\nreflect\nexile\nlebanon\nwiped\nsuspect\nshanghai\nresting\nintegration\npreservation\nmarvel\nvariant\npirates\nsheep\nrounded\ncapita\nsailing\ncolonies\nmanuscript\ndeemed\nvariations\nclarke\nfunctional\nemerging\nboxing\nrelaxed\ncurse\nazerbaijan\nheavyweight\nnickname\neditorial\nrang\ngrid\ntighte
ned\nearthquake\nflashed\nmiguel\nrushing\n##ches\nimprovements\nboxes\nbrooks\n180\nconsumption\nmolecular\nfelix\nsocieties\nrepeatedly\nvariation\naids\ncivic\ngraphics\nprofessionals\nrealm\nautonomous\nreceiver\ndelayed\nworkshop\nmilitia\nchairs\ntrump\ncanyon\n##point\nharsh\nextending\nlovely\nhappiness\n##jan\nstake\neyebrows\nembassy\nwellington\nhannah\n##ella\nsony\ncorners\nbishops\nswear\ncloth\ncontents\nxi\nnamely\ncommenced\n1854\nstanford\nnashville\ncourage\ngraphic\ncommitment\ngarrison\n##bin\nhamlet\nclearing\nrebels\nattraction\nliteracy\ncooking\nruins\ntemples\njenny\nhumanity\ncelebrate\nhasn\nfreight\nsixty\nrebel\nbastard\n##art\nnewton\n##ada\ndeer\n##ges\n##ching\nsmiles\ndelaware\nsingers\n##ets\napproaching\nassists\nflame\n##ph\nboulevard\nbarrel\nplanted\n##ome\npursuit\n##sia\nconsequences\nposts\nshallow\ninvitation\nrode\ndepot\nernest\nkane\nrod\nconcepts\npreston\ntopic\nchambers\nstriking\nblast\narrives\ndescendants\nmontgomery\nranges\nworlds\n##lay\n##ari\nspan\nchaos\npraise\n##ag\nfewer\n1855\nsanctuary\nmud\nfbi\n##ions\nprogrammes\nmaintaining\nunity\nharper\nbore\nhandsome\nclosure\ntournaments\nthunder\nnebraska\nlinda\nfacade\nputs\nsatisfied\nargentine\ndale\ncork\ndome\npanama\n##yl\n1858\ntasks\nexperts\n##ates\nfeeding\nequation\n##las\n##ida\n##tu\nengage\nbryan\n##ax\num\nquartet\nmelody\ndisbanded\nsheffield\nblocked\ngasped\ndelay\nkisses\nmaggie\nconnects\n##non\nsts\npoured\ncreator\npublishers\n##we\nguided\nellis\nextinct\nhug\ngaining\n##ord\ncomplicated\n##bility\npoll\nclenched\ninvestigate\n##use\nthereby\nquantum\nspine\ncdp\nhumor\nkills\nadministered\nsemifinals\n##du\nencountered\nignore\n##bu\ncommentary\n##maker\nbother\nroosevelt\n140\nplains\nhalfway\nflowing\ncultures\ncrack\nimprisoned\nneighboring\nairline\n##ses\n##view\n##mate\n##ec\ngather\nwolves\nmarathon\ntransformed\n##ill\ncruise\norganisations\ncarol\npunch\nexhibitions\nnumbered\nalarm\nratings\ndaddy\nsilently\n##stein\nqueens\nc
olours\nimpression\nguidance\nliu\ntactical\n##rat\nmarshal\ndella\narrow\n##ings\nrested\nfeared\ntender\nowns\nbitter\nadvisor\nescort\n##ides\nspare\nfarms\ngrants\n##ene\ndragons\nencourage\ncolleagues\ncameras\n##und\nsucked\npile\nspirits\nprague\nstatements\nsuspension\nlandmark\nfence\ntorture\nrecreation\nbags\npermanently\nsurvivors\npond\nspy\npredecessor\nbombing\ncoup\n##og\nprotecting\ntransformation\nglow\n##lands\n##book\ndug\npriests\nandrea\nfeat\nbarn\njumping\n##chen\n##ologist\n##con\ncasualties\nstern\nauckland\npipe\nserie\nrevealing\nba\n##bel\ntrevor\nmercy\nspectrum\nyang\nconsist\ngoverning\ncollaborated\npossessed\nepic\ncomprises\nblew\nshane\n##ack\nlopez\nhonored\nmagical\nsacrifice\njudgment\nperceived\nhammer\nmtv\nbaronet\ntune\ndas\nmissionary\nsheets\n350\nneutral\noral\nthreatening\nattractive\nshade\naims\nseminary\n##master\nestates\n1856\nmichel\nwounds\nrefugees\nmanufacturers\n##nic\nmercury\nsyndrome\nporter\n##iya\n##din\nhamburg\nidentification\nupstairs\npurse\nwidened\npause\ncared\nbreathed\naffiliate\nsantiago\nprevented\nceltic\nfisher\n125\nrecruited\nbyzantine\nreconstruction\nfarther\n##mp\ndiet\nsake\nau\nspite\nsensation\n##ert\nblank\nseparation\n105\n##hon\nvladimir\narmies\nanime\n##lie\naccommodate\norbit\ncult\nsofia\narchive\n##ify\n##box\nfounders\nsustained\ndisorder\nhonours\nnortheastern\nmia\ncrops\nviolet\nthreats\nblanket\nfires\ncanton\nfollowers\nsouthwestern\nprototype\nvoyage\nassignment\naltered\nmoderate\nprotocol\npistol\n##eo\nquestioned\nbrass\nlifting\n1852\nmath\nauthored\n##ual\ndoug\ndimensional\ndynamic\n##san\n1851\npronounced\ngrateful\nquest\nuncomfortable\nboom\npresidency\nstevens\nrelating\npoliticians\nchen\nbarrier\nquinn\ndiana\nmosque\ntribal\ncheese\npalmer\nportions\nsometime\nchester\ntreasure\nwu\nbend\ndownload\nmillions\nreforms\nregistration\n##osa\nconsequently\nmonitoring\nate\npreliminary\nbrandon\ninvented\nps\neaten\nexterior\nintervention\nports\ndocumented\nlog\
ndisplays\nlecture\nsally\nfavourite\n##itz\nvermont\nlo\ninvisible\nisle\nbreed\n##ator\njournalists\nrelay\nspeaks\nbackward\nexplore\nmidfielder\nactively\nstefan\nprocedures\ncannon\nblond\nkenneth\ncentered\nservants\nchains\nlibraries\nmalcolm\nessex\nhenri\nslavery\n##hal\nfacts\nfairy\ncoached\ncassie\ncats\nwashed\ncop\n##fi\nannouncement\nitem\n2000s\nvinyl\nactivated\nmarco\nfrontier\ngrowled\ncurriculum\n##das\nloyal\naccomplished\nleslie\nritual\nkenny\n##00\nvii\nnapoleon\nhollow\nhybrid\njungle\nstationed\nfriedrich\ncounted\n##ulated\nplatinum\ntheatrical\nseated\ncol\nrubber\nglen\n1840\ndiversity\nhealing\nextends\nid\nprovisions\nadministrator\ncolumbus\n##oe\ntributary\nte\nassured\norg\n##uous\nprestigious\nexamined\nlectures\ngrammy\nronald\nassociations\nbailey\nallan\nessays\nflute\nbelieving\nconsultant\nproceedings\ntravelling\n1853\nkit\nkerala\nyugoslavia\nbuddy\nmethodist\n##ith\nburial\ncentres\nbatman\n##nda\ndiscontinued\nbo\ndock\nstockholm\nlungs\nseverely\n##nk\nciting\nmanga\n##ugh\nsteal\nmumbai\niraqi\nrobot\ncelebrity\nbride\nbroadcasts\nabolished\npot\njoel\noverhead\nfranz\npacked\nreconnaissance\njohann\nacknowledged\nintroduce\nhandled\ndoctorate\ndevelopments\ndrinks\nalley\npalestine\n##nis\n##aki\nproceeded\nrecover\nbradley\ngrain\npatch\nafford\ninfection\nnationalist\nlegendary\n##ath\ninterchange\nvirtually\ngen\ngravity\nexploration\namber\nvital\nwishes\npowell\ndoctrine\nelbow\nscreenplay\n##bird\ncontribute\nindonesian\npet\ncreates\n##com\nenzyme\nkylie\ndiscipline\ndrops\nmanila\nhunger\n##ien\nlayers\nsuffer\nfever\nbits\nmonica\nkeyboard\nmanages\n##hood\nsearched\nappeals\n##bad\ntestament\ngrande\nreid\n##war\nbeliefs\ncongo\n##ification\n##dia\nsi\nrequiring\n##via\ncasey\n1849\nregret\nstreak\nrape\ndepends\nsyrian\nsprint\npound\ntourists\nupcoming\npub\n##xi\ntense\n##els\npracticed\necho\nnationwide\nguild\nmotorcycle\nliz\n##zar\nchiefs\ndesired\nelena\nbye\nprecious\nabsorbed\nrelatives\nbooth\npiani
st\n##mal\ncitizenship\nexhausted\nwilhelm\n##ceae\n##hed\nnoting\nquarterback\nurge\nhectares\n##gue\nace\nholly\n##tal\nblonde\ndavies\nparked\nsustainable\nstepping\ntwentieth\nairfield\ngalaxy\nnest\nchip\n##nell\ntan\nshaft\npaulo\nrequirement\n##zy\nparadise\ntobacco\ntrans\nrenewed\nvietnamese\n##cker\n##ju\nsuggesting\ncatching\nholmes\nenjoying\nmd\ntrips\ncolt\nholder\nbutterfly\nnerve\nreformed\ncherry\nbowling\ntrailer\ncarriage\ngoodbye\nappreciate\ntoy\njoshua\ninteractive\nenabled\ninvolve\n##kan\ncollar\ndetermination\nbunch\nfacebook\nrecall\nshorts\nsuperintendent\nepiscopal\nfrustration\ngiovanni\nnineteenth\nlaser\nprivately\narray\ncirculation\n##ovic\narmstrong\ndeals\npainful\npermit\ndiscrimination\n##wi\naires\nretiring\ncottage\nni\n##sta\nhorizon\nellen\njamaica\nripped\nfernando\nchapters\nplaystation\npatron\nlecturer\nnavigation\nbehaviour\ngenes\ngeorgian\nexport\nsolomon\nrivals\nswift\nseventeen\nrodriguez\nprinceton\nindependently\nsox\n1847\narguing\nentity\ncasting\nhank\ncriteria\noakland\ngeographic\nmilwaukee\nreflection\nexpanding\nconquest\ndubbed\n##tv\nhalt\nbrave\nbrunswick\ndoi\narched\ncurtis\ndivorced\npredominantly\nsomerset\nstreams\nugly\nzoo\nhorrible\ncurved\nbuenos\nfierce\ndictionary\nvector\ntheological\nunions\nhandful\nstability\nchan\npunjab\nsegments\n##lly\naltar\nignoring\ngesture\nmonsters\npastor\n##stone\nthighs\nunexpected\noperators\nabruptly\ncoin\ncompiled\nassociates\nimproving\nmigration\npin\n##ose\ncompact\ncollegiate\nreserved\n##urs\nquarterfinals\nroster\nrestore\nassembled\nhurry\noval\n##cies\n1846\nflags\nmartha\n##del\nvictories\nsharply\n##rated\nargues\ndeadly\nneo\ndrawings\nsymbols\nperformer\n##iel\ngriffin\nrestrictions\nediting\nandrews\njava\njournals\narabia\ncompositions\ndee\npierce\nremoving\nhindi\ncasino\nrunway\ncivilians\nminds\nnasa\nhotels\n##zation\nrefuge\nrent\nretain\npotentially\nconferences\nsuburban\nconducting\n##tto\n##tions\n##tle\ndescended\nmassacre\n##cal\na
mmunition\nterrain\nfork\nsouls\ncounts\nchelsea\ndurham\ndrives\ncab\n##bank\nperth\nrealizing\npalestinian\nfinn\nsimpson\n##dal\nbetty\n##ule\nmoreover\nparticles\ncardinals\ntent\nevaluation\nextraordinary\n##oid\ninscription\n##works\nwednesday\nchloe\nmaintains\npanels\nashley\ntrucks\n##nation\ncluster\nsunlight\nstrikes\nzhang\n##wing\ndialect\ncanon\n##ap\ntucked\n##ws\ncollecting\n##mas\n##can\n##sville\nmaker\nquoted\nevan\nfranco\naria\nbuying\ncleaning\neva\ncloset\nprovision\napollo\nclinic\nrat\n##ez\nnecessarily\nac\n##gle\n##ising\nvenues\nflipped\ncent\nspreading\ntrustees\nchecking\nauthorized\n##sco\ndisappointed\n##ado\nnotion\nduration\ntrumpet\nhesitated\ntopped\nbrussels\nrolls\ntheoretical\nhint\ndefine\naggressive\nrepeat\nwash\npeaceful\noptical\nwidth\nallegedly\nmcdonald\nstrict\ncopyright\n##illa\ninvestors\nmar\njam\nwitnesses\nsounding\nmiranda\nmichelle\nprivacy\nhugo\nharmony\n##pp\nvalid\nlynn\nglared\nnina\n102\nheadquartered\ndiving\nboarding\ngibson\n##ncy\nalbanian\nmarsh\nroutine\ndealt\nenhanced\ner\nintelligent\nsubstance\ntargeted\nenlisted\ndiscovers\nspinning\nobservations\npissed\nsmoking\nrebecca\ncapitol\nvisa\nvaried\ncostume\nseemingly\nindies\ncompensation\nsurgeon\nthursday\narsenal\nwestminster\nsuburbs\nrid\nanglican\n##ridge\nknots\nfoods\nalumni\nlighter\nfraser\nwhoever\nportal\nscandal\n##ray\ngavin\nadvised\ninstructor\nflooding\nterrorist\n##ale\nteenage\ninterim\nsenses\nduck\nteen\nthesis\nabby\neager\novercome\n##ile\nnewport\nglenn\nrises\nshame\n##cc\nprompted\npriority\nforgot\nbomber\nnicolas\nprotective\n360\ncartoon\nkatherine\nbreeze\nlonely\ntrusted\nhenderson\nrichardson\nrelax\nbanner\ncandy\npalms\nremarkable\n##rio\nlegends\ncricketer\nessay\nordained\nedmund\nrifles\ntrigger\n##uri\n##away\nsail\nalert\n1830\naudiences\npenn\nsussex\nsiblings\npursued\nindianapolis\nresist\nrosa\nconsequence\nsucceed\navoided\n1845\n##ulation\ninland\n##tie\n##nna\ncounsel\nprofession\nchronicle\nhurried\n##
una\neyebrow\neventual\nbleeding\ninnovative\ncure\n##dom\ncommittees\naccounting\ncon\nscope\nhardy\nheather\ntenor\ngut\nherald\ncodes\ntore\nscales\nwagon\n##oo\nluxury\ntin\nprefer\nfountain\ntriangle\nbonds\ndarling\nconvoy\ndried\ntraced\nbeings\ntroy\naccidentally\nslam\nfindings\nsmelled\njoey\nlawyers\noutcome\nsteep\nbosnia\nconfiguration\nshifting\ntoll\nbrook\nperformers\nlobby\nphilosophical\nconstruct\nshrine\naggregate\nboot\ncox\nphenomenon\nsavage\ninsane\nsolely\nreynolds\nlifestyle\n##ima\nnationally\nholdings\nconsideration\nenable\nedgar\nmo\nmama\n##tein\nfights\nrelegation\nchances\natomic\nhub\nconjunction\nawkward\nreactions\ncurrency\nfinale\nkumar\nunderwent\nsteering\nelaborate\ngifts\ncomprising\nmelissa\nveins\nreasonable\nsunshine\nchi\nsolve\ntrails\ninhabited\nelimination\nethics\nhuh\nana\nmolly\nconsent\napartments\nlayout\nmarines\n##ces\nhunters\nbulk\n##oma\nhometown\n##wall\n##mont\ncracked\nreads\nneighbouring\nwithdrawn\nadmission\nwingspan\ndamned\nanthology\nlancashire\nbrands\nbatting\nforgive\ncuban\nawful\n##lyn\n104\ndimensions\nimagination\n##ade\ndante\n##ship\ntracking\ndesperately\ngoalkeeper\n##yne\ngroaned\nworkshops\nconfident\nburton\ngerald\nmilton\ncircus\nuncertain\nslope\ncopenhagen\nsophia\nfog\nphilosopher\nportraits\naccent\ncycling\nvarying\ngripped\nlarvae\ngarrett\nspecified\nscotia\nmature\nluther\nkurt\nrap\n##kes\naerial\n750\nferdinand\nheated\nes\ntransported\n##shan\nsafely\nnonetheless\n##orn\n##gal\nmotors\ndemanding\n##sburg\nstartled\n##brook\nally\ngenerate\ncaps\nghana\nstained\ndemo\nmentions\nbeds\nap\nafterward\ndiary\n##bling\nutility\n##iro\nrichards\n1837\nconspiracy\nconscious\nshining\nfootsteps\nobserver\ncyprus\nurged\nloyalty\ndeveloper\nprobability\nolive\nupgraded\ngym\nmiracle\ninsects\ngraves\n1844\nourselves\nhydrogen\namazon\nkatie\ntickets\npoets\n##pm\nplanes\n##pan\nprevention\nwitnessed\ndense\njin\nrandy\ntang\nwarehouse\nmonroe\nbang\narchived\nelderly\ninvestigations
\nalec\ngranite\nmineral\nconflicts\ncontrolling\naboriginal\ncarlo\n##zu\nmechanics\nstan\nstark\nrhode\nskirt\nest\n##berry\nbombs\nrespected\n##horn\nimposed\nlimestone\ndeny\nnominee\nmemphis\ngrabbing\ndisabled\n##als\namusement\naa\nfrankfurt\ncorn\nreferendum\nvaries\nslowed\ndisk\nfirms\nunconscious\nincredible\nclue\nsue\n##zhou\ntwist\n##cio\njoins\nidaho\nchad\ndevelopers\ncomputing\ndestroyer\n103\nmortal\ntucker\nkingston\nchoices\nyu\ncarson\n1800\nos\nwhitney\ngeneva\npretend\ndimension\nstaged\nplateau\nmaya\n##une\nfreestyle\n##bc\nrovers\nhiv\n##ids\ntristan\nclassroom\nprospect\n##hus\nhonestly\ndiploma\nlied\nthermal\nauxiliary\nfeast\nunlikely\niata\n##tel\nmorocco\npounding\ntreasury\nlithuania\nconsiderably\n1841\ndish\n1812\ngeological\nmatching\nstumbled\ndestroying\nmarched\nbrien\nadvances\ncake\nnicole\nbelle\nsettling\nmeasuring\ndirecting\n##mie\ntuesday\nbassist\ncapabilities\nstunned\nfraud\ntorpedo\n##list\n##phone\nanton\nwisdom\nsurveillance\nruined\n##ulate\nlawsuit\nhealthcare\ntheorem\nhalls\ntrend\naka\nhorizontal\ndozens\nacquire\nlasting\nswim\nhawk\ngorgeous\nfees\nvicinity\ndecrease\nadoption\ntactics\n##ography\npakistani\n##ole\ndraws\n##hall\nwillie\nburke\nheath\nalgorithm\nintegral\npowder\nelliott\nbrigadier\njackie\ntate\nvarieties\ndarker\n##cho\nlately\ncigarette\nspecimens\nadds\n##ree\n##ensis\n##inger\nexploded\nfinalist\ncia\nmurders\nwilderness\narguments\nnicknamed\nacceptance\nonwards\nmanufacture\nrobertson\njets\ntampa\nenterprises\nblog\nloudly\ncomposers\nnominations\n1838\nai\nmalta\ninquiry\nautomobile\nhosting\nviii\nrays\ntilted\ngrief\nmuseums\nstrategies\nfurious\neuro\nequality\ncohen\npoison\nsurrey\nwireless\ngoverned\nridiculous\nmoses\n##esh\n##room\nvanished\n##ito\nbarnes\nattract\nmorrison\nistanbul\n##iness\nabsent\nrotation\npetition\njanet\n##logical\nsatisfaction\ncustody\ndeliberately\nobservatory\ncomedian\nsurfaces\npinyin\nnovelist\nstrictly\ncanterbury\noslo\nmonks\nembrace\nibm\nj
ealous\nphotograph\ncontinent\ndorothy\nmarina\ndoc\nexcess\nholden\nallegations\nexplaining\nstack\navoiding\nlance\nstoryline\nmajesty\npoorly\nspike\ndos\nbradford\nraven\ntravis\nclassics\nproven\nvoltage\npillow\nfists\nbutt\n1842\ninterpreted\n##car\n1839\ngage\ntelegraph\nlens\npromising\nexpelled\ncasual\ncollector\nzones\n##min\nsilly\nnintendo\n##kh\n##bra\ndownstairs\nchef\nsuspicious\nafl\nflies\nvacant\nuganda\npregnancy\ncondemned\nlutheran\nestimates\ncheap\ndecree\nsaxon\nproximity\nstripped\nidiot\ndeposits\ncontrary\npresenter\nmagnus\nglacier\nim\noffense\nedwin\n##ori\nupright\n##long\nbolt\n##ois\ntoss\ngeographical\n##izes\nenvironments\ndelicate\nmarking\nabstract\nxavier\nnails\nwindsor\nplantation\noccurring\nequity\nsaskatchewan\nfears\ndrifted\nsequences\nvegetation\nrevolt\n##stic\n1843\nsooner\nfusion\nopposing\nnato\nskating\n1836\nsecretly\nruin\nlease\n##oc\nedit\n##nne\nflora\nanxiety\nruby\n##ological\n##mia\ntel\nbout\ntaxi\nemmy\nfrost\nrainbow\ncompounds\nfoundations\nrainfall\nassassination\nnightmare\ndominican\n##win\nachievements\ndeserve\norlando\nintact\narmenia\n##nte\ncalgary\nvalentine\n106\nmarion\nproclaimed\ntheodore\nbells\ncourtyard\nthigh\ngonzalez\nconsole\ntroop\nminimal\nmonte\neveryday\n##ence\n##if\nsupporter\nterrorism\nbuck\nopenly\npresbyterian\nactivists\ncarpet\n##iers\nrubbing\nuprising\n##yi\ncute\nconceived\nlegally\n##cht\nmillennium\ncello\nvelocity\nji\nrescued\ncardiff\n1835\nrex\nconcentrate\nsenators\nbeard\nrendered\nglowing\nbattalions\nscouts\ncompetitors\nsculptor\ncatalogue\narctic\nion\nraja\nbicycle\nwow\nglancing\nlawn\n##woman\ngentleman\nlighthouse\npublish\npredicted\ncalculated\n##val\nvariants\n##gne\nstrain\n##ui\nwinston\ndeceased\n##nus\ntouchdowns\nbrady\ncaleb\nsinking\nechoed\ncrush\nhon\nblessed\nprotagonist\nhayes\nendangered\nmagnitude\neditors\n##tine\nestimate\nresponsibilities\n##mel\nbackup\nlaying\nconsumed\nsealed\nzurich\nlovers\nfrustrated\n##eau\nahmed\nkicking\nmit
\ntreasurer\n1832\nbiblical\nrefuse\nterrified\npump\nagrees\ngenuine\nimprisonment\nrefuses\nplymouth\n##hen\nlou\n##nen\ntara\ntrembling\nantarctic\nton\nlearns\n##tas\ncrap\ncrucial\nfaction\natop\n##borough\nwrap\nlancaster\nodds\nhopkins\nerik\nlyon\n##eon\nbros\n##ode\nsnap\nlocality\ntips\nempress\ncrowned\ncal\nacclaimed\nchuckled\n##ory\nclara\nsends\nmild\ntowel\n##fl\n##day\n##а\nwishing\nassuming\ninterviewed\n##bal\n##die\ninteractions\neden\ncups\nhelena\n##lf\nindie\nbeck\n##fire\nbatteries\nfilipino\nwizard\nparted\n##lam\ntraces\n##born\nrows\nidol\nalbany\ndelegates\n##ees\n##sar\ndiscussions\n##ex\nnotre\ninstructed\nbelgrade\nhighways\nsuggestion\nlauren\npossess\norientation\nalexandria\nabdul\nbeats\nsalary\nreunion\nludwig\nalright\nwagner\nintimate\npockets\nslovenia\nhugged\nbrighton\nmerchants\ncruel\nstole\ntrek\nslopes\nrepairs\nenrollment\npolitically\nunderlying\npromotional\ncounting\nboeing\n##bb\nisabella\nnaming\n##и\nkeen\nbacteria\nlisting\nseparately\nbelfast\nussr\n450\nlithuanian\nanybody\nribs\nsphere\nmartinez\ncock\nembarrassed\nproposals\nfragments\nnationals\n##fs\n##wski\npremises\nfin\n1500\nalpine\nmatched\nfreely\nbounded\njace\nsleeve\n##af\ngaming\npier\npopulated\nevident\n##like\nfrances\nflooded\n##dle\nfrightened\npour\ntrainer\nframed\nvisitor\nchallenging\npig\nwickets\n##fold\ninfected\nemail\n##pes\narose\n##aw\nreward\necuador\noblast\nvale\nch\nshuttle\n##usa\nbach\nrankings\nforbidden\ncornwall\naccordance\nsalem\nconsumers\nbruno\nfantastic\ntoes\nmachinery\nresolved\njulius\nremembering\npropaganda\niceland\nbombardment\ntide\ncontacts\nwives\n##rah\nconcerto\nmacdonald\nalbania\nimplement\ndaisy\ntapped\nsudan\nhelmet\nangela\nmistress\n##lic\ncrop\nsunk\nfinest\n##craft\nhostile\n##ute\n##tsu\nboxer\nfr\npaths\nadjusted\nhabit\nballot\nsupervision\nsoprano\n##zen\nbullets\nwicked\nsunset\nregiments\ndisappear\nlamp\nperforms\napp\n##gia\n##oa\nrabbit\ndigging\nincidents\nentries\n##cion\ndishes\n##oi\n
introducing\n##ati\n##fied\nfreshman\nslot\njill\ntackles\nbaroque\nbacks\n##iest\nlone\nsponsor\ndestiny\naltogether\nconvert\n##aro\nconsensus\nshapes\ndemonstration\nbasically\nfeminist\nauction\nartifacts\n##bing\nstrongest\ntwitter\nhalifax\n2019\nallmusic\nmighty\nsmallest\nprecise\nalexandra\nviola\n##los\n##ille\nmanuscripts\n##illo\ndancers\nari\nmanagers\nmonuments\nblades\nbarracks\nspringfield\nmaiden\nconsolidated\nelectron\n##end\nberry\nairing\nwheat\nnobel\ninclusion\nblair\npayments\ngeography\nbee\ncc\neleanor\nreact\n##hurst\nafc\nmanitoba\n##yu\nsu\nlineup\nfitness\nrecreational\ninvestments\nairborne\ndisappointment\n##dis\nedmonton\nviewing\n##row\nrenovation\n##cast\ninfant\nbankruptcy\nroses\naftermath\npavilion\n##yer\ncarpenter\nwithdrawal\nladder\n##hy\ndiscussing\npopped\nreliable\nagreements\nrochester\n##abad\ncurves\nbombers\n220\nrao\nreverend\ndecreased\nchoosing\n107\nstiff\nconsulting\nnaples\ncrawford\ntracy\nka\nribbon\ncops\n##lee\ncrushed\ndeciding\nunified\nteenager\naccepting\nflagship\nexplorer\npoles\nsanchez\ninspection\nrevived\nskilled\ninduced\nexchanged\nflee\nlocals\ntragedy\nswallow\nloading\nhanna\ndemonstrate\n##ela\nsalvador\nflown\ncontestants\ncivilization\n##ines\nwanna\nrhodes\nfletcher\nhector\nknocking\nconsiders\n##ough\nnash\nmechanisms\nsensed\nmentally\nwalt\nunclear\n##eus\nrenovated\nmadame\n##cks\ncrews\ngovernmental\n##hin\nundertaken\nmonkey\n##ben\n##ato\nfatal\narmored\ncopa\ncaves\ngovernance\ngrasp\nperception\ncertification\nfroze\ndamp\ntugged\nwyoming\n##rg\n##ero\nnewman\n##lor\nnerves\ncuriosity\ngraph\n115\n##ami\nwithdraw\ntunnels\ndull\nmeredith\nmoss\nexhibits\nneighbors\ncommunicate\naccuracy\nexplored\nraiders\nrepublicans\nsecular\nkat\nsuperman\npenny\ncriticised\n##tch\nfreed\nupdate\nconviction\nwade\nham\nlikewise\ndelegation\ngotta\ndoll\npromises\ntechnological\nmyth\nnationality\nresolve\nconvent\n##mark\nsharon\ndig\nsip\ncoordinator\nentrepreneur\nfold\n##dine\ncapability\nc
ouncillor\nsynonym\nblown\nswan\ncursed\n1815\njonas\nhaired\nsofa\ncanvas\nkeeper\nrivalry\n##hart\nrapper\nspeedway\nswords\npostal\nmaxwell\nestonia\npotter\nrecurring\n##nn\n##ave\nerrors\n##oni\ncognitive\n1834\n##²\nclaws\nnadu\nroberto\nbce\nwrestler\nellie\n##ations\ninfinite\nink\n##tia\npresumably\nfinite\nstaircase\n108\nnoel\npatricia\nnacional\n##cation\nchill\neternal\ntu\npreventing\nprussia\nfossil\nlimbs\n##logist\nernst\nfrog\nperez\nrene\n##ace\npizza\nprussian\n##ios\n##vy\nmolecules\nregulatory\nanswering\nopinions\nsworn\nlengths\nsupposedly\nhypothesis\nupward\nhabitats\nseating\nancestors\ndrank\nyield\nhd\nsynthesis\nresearcher\nmodest\n##var\nmothers\npeered\nvoluntary\nhomeland\n##the\nacclaim\n##igan\nstatic\nvalve\nluxembourg\nalto\ncarroll\nfe\nreceptor\nnorton\nambulance\n##tian\njohnston\ncatholics\ndepicting\njointly\nelephant\ngloria\nmentor\nbadge\nahmad\ndistinguish\nremarked\ncouncils\nprecisely\nallison\nadvancing\ndetection\ncrowded\n##10\ncooperative\nankle\nmercedes\ndagger\nsurrendered\npollution\ncommit\nsubway\njeffrey\nlesson\nsculptures\nprovider\n##fication\nmembrane\ntimothy\nrectangular\nfiscal\nheating\nteammate\nbasket\nparticle\nanonymous\ndeployment\n##ple\nmissiles\ncourthouse\nproportion\nshoe\nsec\n##ller\ncomplaints\nforbes\nblacks\nabandon\nremind\nsizes\noverwhelming\nautobiography\nnatalie\n##awa\nrisks\ncontestant\ncountryside\nbabies\nscorer\ninvaded\nenclosed\nproceed\nhurling\ndisorders\n##cu\nreflecting\ncontinuously\ncruiser\ngraduates\nfreeway\ninvestigated\nore\ndeserved\nmaid\nblocking\nphillip\njorge\nshakes\ndove\nmann\nvariables\nlacked\nburden\naccompanying\nque\nconsistently\norganizing\nprovisional\ncomplained\nendless\n##rm\ntubes\njuice\ngeorges\nkrishna\nmick\nlabels\nthriller\n##uch\nlaps\narcade\nsage\nsnail\n##table\nshannon\nfi\nlaurence\nseoul\nvacation\npresenting\nhire\nchurchill\nsurprisingly\nprohibited\nsavannah\ntechnically\n##oli\n170\n##lessly\ntestimony\nsuited\nspeeds\ntoys\
nromans\nmlb\nflowering\nmeasurement\ntalented\nkay\nsettings\ncharleston\nexpectations\nshattered\nachieving\ntriumph\nceremonies\nportsmouth\nlanes\nmandatory\nloser\nstretching\ncologne\nrealizes\nseventy\ncornell\ncareers\nwebb\n##ulating\namericas\nbudapest\nava\nsuspicion\n##ison\nyo\nconrad\n##hai\nsterling\njessie\nrector\n##az\n1831\ntransform\norganize\nloans\nchristine\nvolcanic\nwarrant\nslender\nsummers\nsubfamily\nnewer\ndanced\ndynamics\nrhine\nproceeds\nheinrich\ngastropod\ncommands\nsings\nfacilitate\neaster\nra\npositioned\nresponses\nexpense\nfruits\nyanked\nimported\n25th\nvelvet\nvic\nprimitive\ntribune\nbaldwin\nneighbourhood\ndonna\nrip\nhay\npr\n##uro\n1814\nespn\nwelcomed\n##aria\nqualifier\nglare\nhighland\ntiming\n##cted\nshells\neased\ngeometry\nlouder\nexciting\nslovakia\n##sion\n##iz\n##lot\nsavings\nprairie\n##ques\nmarching\nrafael\ntonnes\n##lled\ncurtain\npreceding\nshy\nheal\ngreene\nworthy\n##pot\ndetachment\nbury\nsherman\n##eck\nreinforced\nseeks\nbottles\ncontracted\nduchess\noutfit\nwalsh\n##sc\nmickey\n##ase\ngeoffrey\narcher\nsqueeze\ndawson\neliminate\ninvention\n##enberg\nneal\n##eth\nstance\ndealer\ncoral\nmaple\nretire\npolo\nsimplified\n##ht\n1833\nhid\nwatts\nbackwards\njules\n##oke\ngenesis\nmt\nframes\nrebounds\nburma\nwoodland\nmoist\nsantos\nwhispers\ndrained\nsubspecies\n##aa\nstreaming\nulster\nburnt\ncorrespondence\nmaternal\ngerard\ndenis\nstealing\n##load\ngenius\nduchy\n##oria\ninaugurated\nmomentum\nsuits\nplacement\nsovereign\nclause\nthames\n##hara\nconfederation\nreservation\nsketch\nyankees\nlets\nrotten\ncharm\nhal\nverses\nultra\ncommercially\ndot\nsalon\ncitation\nadopt\nwinnipeg\nmist\nallocated\ncairo\n##boy\njenkins\ninterference\nobjectives\n##wind\n1820\nportfolio\narmoured\nsectors\n##eh\ninitiatives\n##world\nintegrity\nexercises\nrobe\ntap\nab\ngazed\n##tones\ndistracted\nrulers\n111\nfavorable\njerome\ntended\ncart\nfactories\n##eri\ndiplomat\nvalued\ngravel\ncharitable\n##try\ncalvin\nexplor
ing\nchang\nshepherd\nterrace\npdf\npupil\n##ural\nreflects\nups\n##rch\ngovernors\nshelf\ndepths\n##nberg\ntrailed\ncrest\ntackle\n##nian\n##ats\nhatred\n##kai\nclare\nmakers\nethiopia\nlongtime\ndetected\nembedded\nlacking\nslapped\nrely\nthomson\nanticipation\niso\nmorton\nsuccessive\nagnes\nscreenwriter\nstraightened\nphilippe\nplaywright\nhaunted\nlicence\niris\nintentions\nsutton\n112\nlogical\ncorrectly\n##weight\nbranded\nlicked\ntipped\nsilva\nricky\nnarrator\nrequests\n##ents\ngreeted\nsupernatural\ncow\n##wald\nlung\nrefusing\nemployer\nstrait\ngaelic\nliner\n##piece\nzoe\nsabha\n##mba\ndriveway\nharvest\nprints\nbates\nreluctantly\nthreshold\nalgebra\nira\nwherever\ncoupled\n240\nassumption\npicks\n##air\ndesigners\nraids\ngentlemen\n##ean\nroller\nblowing\nleipzig\nlocks\nscrew\ndressing\nstrand\n##lings\nscar\ndwarf\ndepicts\n##nu\nnods\n##mine\ndiffer\nboris\n##eur\nyuan\nflip\n##gie\nmob\ninvested\nquestioning\napplying\n##ture\nshout\n##sel\ngameplay\nblamed\nillustrations\nbothered\nweakness\nrehabilitation\n##of\n##zes\nenvelope\nrumors\nminers\nleicester\nsubtle\nkerry\n##ico\nferguson\n##fu\npremiership\nne\n##cat\nbengali\nprof\ncatches\nremnants\ndana\n##rily\nshouting\npresidents\nbaltic\nought\nghosts\ndances\nsailors\nshirley\nfancy\ndominic\n##bie\nmadonna\n##rick\nbark\nbuttons\ngymnasium\nashes\nliver\ntoby\noath\nprovidence\ndoyle\nevangelical\nnixon\ncement\ncarnegie\nembarked\nhatch\nsurroundings\nguarantee\nneeding\npirate\nessence\n##bee\nfilter\ncrane\nhammond\nprojected\nimmune\npercy\ntwelfth\n##ult\nregent\ndoctoral\ndamon\nmikhail\n##ichi\nlu\ncritically\nelect\nrealised\nabortion\nacute\nscreening\nmythology\nsteadily\n##fc\nfrown\nnottingham\nkirk\nwa\nminneapolis\n##rra\nmodule\nalgeria\nmc\nnautical\nencounters\nsurprising\nstatues\navailability\nshirts\npie\nalma\nbrows\nmunster\nmack\nsoup\ncrater\ntornado\nsanskrit\ncedar\nexplosive\nbordered\ndixon\nplanets\nstamp\nexam\nhappily\n##bble\ncarriers\nkidnapped\n##vis\nacco
mmodation\nemigrated\n##met\nknockout\ncorrespondent\nviolation\nprofits\npeaks\nlang\nspecimen\nagenda\nancestry\npottery\nspelling\nequations\nobtaining\nki\nlinking\n1825\ndebris\nasylum\n##20\nbuddhism\nteddy\n##ants\ngazette\n##nger\n##sse\ndental\neligibility\nutc\nfathers\naveraged\nzimbabwe\nfrancesco\ncoloured\nhissed\ntranslator\nlynch\nmandate\nhumanities\nmackenzie\nuniforms\nlin\n##iana\n##gio\nasset\nmhz\nfitting\nsamantha\ngenera\nwei\nrim\nbeloved\nshark\nriot\nentities\nexpressions\nindo\ncarmen\nslipping\nowing\nabbot\nneighbor\nsidney\n##av\nrats\nrecommendations\nencouraging\nsquadrons\nanticipated\ncommanders\nconquered\n##oto\ndonations\ndiagnosed\n##mond\ndivide\n##iva\nguessed\ndecoration\nvernon\nauditorium\nrevelation\nconversations\n##kers\n##power\nherzegovina\ndash\nalike\nprotested\nlateral\nherman\naccredited\nmg\n##gent\nfreeman\nmel\nfiji\ncrow\ncrimson\n##rine\nlivestock\n##pped\nhumanitarian\nbored\noz\nwhip\n##lene\n##ali\nlegitimate\nalter\ngrinning\nspelled\nanxious\noriental\nwesley\n##nin\n##hole\ncarnival\ncontroller\ndetect\n##ssa\nbowed\neducator\nkosovo\nmacedonia\n##sin\noccupy\nmastering\nstephanie\njaneiro\npara\nunaware\nnurses\nnoon\n135\ncam\nhopefully\nranger\ncombine\nsociology\npolar\nrica\n##eer\nneill\n##sman\nholocaust\n##ip\ndoubled\nlust\n1828\n109\ndecent\ncooling\nunveiled\n##card\n1829\nnsw\nhomer\nchapman\nmeyer\n##gin\ndive\nmae\nreagan\nexpertise\n##gled\ndarwin\nbrooke\nsided\nprosecution\ninvestigating\ncomprised\npetroleum\ngenres\nreluctant\ndifferently\ntrilogy\njohns\nvegetables\ncorpse\nhighlighted\nlounge\npension\nunsuccessfully\nelegant\naided\nivory\nbeatles\namelia\ncain\ndubai\nsunny\nimmigrant\nbabe\nclick\n##nder\nunderwater\npepper\ncombining\nmumbled\natlas\nhorns\naccessed\nballad\nphysicians\nhomeless\ngestured\nrpm\nfreak\nlouisville\ncorporations\npatriots\nprizes\nrational\nwarn\nmodes\ndecorative\novernight\ndin\ntroubled\nphantom\n##ort\nmonarch\nsheer\n##dorf\ngenerals\nguidelin
es\norgans\naddresses\n##zon\nenhance\ncurling\nparishes\ncord\n##kie\nlinux\ncaesar\ndeutsche\nbavaria\n##bia\ncoleman\ncyclone\n##eria\nbacon\npetty\n##yama\n##old\nhampton\ndiagnosis\n1824\nthrows\ncomplexity\nrita\ndisputed\n##₃\npablo\n##sch\nmarketed\ntrafficking\n##ulus\nexamine\nplague\nformats\n##oh\nvault\nfaithful\n##bourne\nwebster\n##ox\nhighlights\n##ient\n##ann\nphones\nvacuum\nsandwich\nmodeling\n##gated\nbolivia\nclergy\nqualities\nisabel\n##nas\n##ars\nwears\nscreams\nreunited\nannoyed\nbra\n##ancy\n##rate\ndifferential\ntransmitter\ntattoo\ncontainer\npoker\n##och\nexcessive\nresides\ncowboys\n##tum\naugustus\ntrash\nproviders\nstatute\nretreated\nbalcony\nreversed\nvoid\nstorey\npreceded\nmasses\nleap\nlaughs\nneighborhoods\nwards\nschemes\nfalcon\nsanto\nbattlefield\npad\nronnie\nthread\nlesbian\nvenus\n##dian\nbeg\nsandstone\ndaylight\npunched\ngwen\nanalog\nstroked\nwwe\nacceptable\nmeasurements\ndec\ntoxic\n##kel\nadequate\nsurgical\neconomist\nparameters\nvarsity\n##sberg\nquantity\nella\n##chy\n##rton\ncountess\ngenerating\nprecision\ndiamonds\nexpressway\nga\n##ı\n1821\nuruguay\ntalents\ngalleries\nexpenses\nscanned\ncolleague\noutlets\nryder\nlucien\n##ila\nparamount\n##bon\nsyracuse\ndim\nfangs\ngown\nsweep\n##sie\ntoyota\nmissionaries\nwebsites\n##nsis\nsentences\nadviser\nval\ntrademark\nspells\n##plane\npatience\nstarter\nslim\n##borg\ntoe\nincredibly\nshoots\nelliot\nnobility\n##wyn\ncowboy\nendorsed\ngardner\ntendency\npersuaded\norganisms\nemissions\nkazakhstan\namused\nboring\nchips\nthemed\n##hand\nllc\nconstantinople\nchasing\nsystematic\nguatemala\nborrowed\nerin\ncarey\n##hard\nhighlands\nstruggles\n1810\n##ifying\n##ced\nwong\nexceptions\ndevelops\nenlarged\nkindergarten\ncastro\n##ern\n##rina\nleigh\nzombie\njuvenile\n##most\nconsul\n##nar\nsailor\nhyde\nclarence\nintensive\npinned\nnasty\nuseless\njung\nclayton\nstuffed\nexceptional\nix\napostolic\n230\ntransactions\n##dge\nexempt\nswinging\ncove\nreligions\n##ash\nshields\
ndairy\nbypass\n190\npursuing\nbug\njoyce\nbombay\nchassis\nsouthampton\nchat\ninteract\nredesignated\n##pen\nnascar\npray\nsalmon\nrigid\nregained\nmalaysian\ngrim\npublicity\nconstituted\ncapturing\ntoilet\ndelegate\npurely\ntray\ndrift\nloosely\nstriker\nweakened\ntrinidad\nmitch\nitv\ndefines\ntransmitted\nming\nscarlet\nnodding\nfitzgerald\nfu\nnarrowly\nsp\ntooth\nstandings\nvirtue\n##₁\n##wara\n##cting\nchateau\ngloves\nlid\n##nel\nhurting\nconservatory\n##pel\nsinclair\nreopened\nsympathy\nnigerian\nstrode\nadvocated\noptional\nchronic\ndischarge\n##rc\nsuck\ncompatible\nlaurel\nstella\nshi\nfails\nwage\ndodge\n128\ninformal\nsorts\nlevi\nbuddha\nvillagers\n##aka\nchronicles\nheavier\nsummoned\ngateway\n3000\neleventh\njewelry\ntranslations\naccordingly\nseas\n##ency\nfiber\npyramid\ncubic\ndragging\n##ista\ncaring\n##ops\nandroid\ncontacted\nlunar\n##dt\nkai\nlisbon\npatted\n1826\nsacramento\ntheft\nmadagascar\nsubtropical\ndisputes\nta\nholidays\npiper\nwillow\nmare\ncane\nitunes\nnewfoundland\nbenny\ncompanions\ndong\nraj\nobserve\nroar\ncharming\nplaque\ntibetan\nfossils\nenacted\nmanning\nbubble\ntina\ntanzania\n##eda\n##hir\nfunk\nswamp\ndeputies\ncloak\nufc\nscenario\npar\nscratch\nmetals\nanthem\nguru\nengaging\nspecially\n##boat\ndialects\nnineteen\ncecil\nduet\ndisability\nmessenger\nunofficial\n##lies\ndefunct\neds\nmoonlight\ndrainage\nsurname\npuzzle\nhonda\nswitching\nconservatives\nmammals\nknox\nbroadcaster\nsidewalk\ncope\n##ried\nbenson\nprinces\npeterson\n##sal\nbedford\nsharks\neli\nwreck\nalberto\ngasp\narchaeology\nlgbt\nteaches\nsecurities\nmadness\ncompromise\nwaving\ncoordination\ndavidson\nvisions\nleased\npossibilities\neighty\njun\nfernandez\nenthusiasm\nassassin\nsponsorship\nreviewer\nkingdoms\nestonian\nlaboratories\n##fy\n##nal\napplies\nverb\ncelebrations\n##zzo\nrowing\nlightweight\nsadness\nsubmit\nmvp\nbalanced\ndude\n##vas\nexplicitly\nmetric\nmagnificent\nmound\nbrett\nmohammad\nmistakes\nirregular\n##hing\n##ass\nsander
s\nbetrayed\nshipped\nsurge\n##enburg\nreporters\ntermed\ngeorg\npity\nverbal\nbulls\nabbreviated\nenabling\nappealed\n##are\n##atic\nsicily\nsting\nheel\nsweetheart\nbart\nspacecraft\nbrutal\nmonarchy\n##tter\naberdeen\ncameo\ndiane\n##ub\nsurvivor\nclyde\n##aries\ncomplaint\n##makers\nclarinet\ndelicious\nchilean\nkarnataka\ncoordinates\n1818\npanties\n##rst\npretending\nar\ndramatically\nkiev\nbella\ntends\ndistances\n113\ncatalog\nlaunching\ninstances\ntelecommunications\nportable\nlindsay\nvatican\n##eim\nangles\naliens\nmarker\nstint\nscreens\nbolton\n##rne\njudy\nwool\nbenedict\nplasma\neuropa\nspark\nimaging\nfilmmaker\nswiftly\n##een\ncontributor\n##nor\nopted\nstamps\napologize\nfinancing\nbutter\ngideon\nsophisticated\nalignment\navery\nchemicals\nyearly\nspeculation\nprominence\nprofessionally\n##ils\nimmortal\ninstitutional\ninception\nwrists\nidentifying\ntribunal\nderives\ngains\n##wo\npapal\npreference\nlinguistic\nvince\noperative\nbrewery\n##ont\nunemployment\nboyd\n##ured\n##outs\nalbeit\nprophet\n1813\nbi\n##rr\n##face\n##rad\nquarterly\nasteroid\ncleaned\nradius\ntemper\n##llen\ntelugu\njerk\nviscount\nmenu\n##ote\nglimpse\n##aya\nyacht\nhawaiian\nbaden\n##rl\nlaptop\nreadily\n##gu\nmonetary\noffshore\nscots\nwatches\n##yang\n##arian\nupgrade\nneedle\nxbox\nlea\nencyclopedia\nflank\nfingertips\n##pus\ndelight\nteachings\nconfirm\nroth\nbeaches\nmidway\nwinters\n##iah\nteasing\ndaytime\nbeverly\ngambling\nbonnie\n##backs\nregulated\nclement\nhermann\ntricks\nknot\n##shing\n##uring\n##vre\ndetached\necological\nowed\nspecialty\nbyron\ninventor\nbats\nstays\nscreened\nunesco\nmidland\ntrim\naffection\n##ander\n##rry\njess\nthoroughly\nfeedback\n##uma\nchennai\nstrained\nheartbeat\nwrapping\novertime\npleaded\n##sworth\nmon\nleisure\noclc\n##tate\n##ele\nfeathers\nangelo\nthirds\nnuts\nsurveys\nclever\ngill\ncommentator\n##dos\ndarren\nrides\ngibraltar\n##nc\n##mu\ndissolution\ndedication\nshin\nmeals\nsaddle\nelvis\nreds\nchaired\ntaller\nappreciat
ion\nfunctioning\nniece\nfavored\nadvocacy\nrobbie\ncriminals\nsuffolk\nyugoslav\npassport\nconstable\ncongressman\nhastings\nvera\n##rov\nconsecrated\nsparks\necclesiastical\nconfined\n##ovich\nmuller\nfloyd\nnora\n1822\npaved\n1827\ncumberland\nned\nsaga\nspiral\n##flow\nappreciated\nyi\ncollaborative\ntreating\nsimilarities\nfeminine\nfinishes\n##ib\njade\nimport\n##nse\n##hot\nchampagne\nmice\nsecuring\ncelebrities\nhelsinki\nattributes\n##gos\ncousins\nphases\nache\nlucia\ngandhi\nsubmission\nvicar\nspear\nshine\ntasmania\nbiting\ndetention\nconstitute\ntighter\nseasonal\n##gus\nterrestrial\nmatthews\n##oka\neffectiveness\nparody\nphilharmonic\n##onic\n1816\nstrangers\nencoded\nconsortium\nguaranteed\nregards\nshifts\ntortured\ncollision\nsupervisor\ninform\nbroader\ninsight\ntheaters\narmour\nemeritus\nblink\nincorporates\nmapping\n##50\n##ein\nhandball\nflexible\n##nta\nsubstantially\ngenerous\nthief\n##own\ncarr\nloses\n1793\nprose\nucla\nromeo\ngeneric\nmetallic\nrealization\ndamages\nmk\ncommissioners\nzach\ndefault\n##ther\nhelicopters\nlengthy\nstems\nspa\npartnered\nspectators\nrogue\nindication\npenalties\nteresa\n1801\nsen\n##tric\ndalton\n##wich\nirving\nphotographic\n##vey\ndell\ndeaf\npeters\nexcluded\nunsure\n##vable\npatterson\ncrawled\n##zio\nresided\nwhipped\nlatvia\nslower\necole\npipes\nemployers\nmaharashtra\ncomparable\nva\ntextile\npageant\n##gel\nalphabet\nbinary\nirrigation\nchartered\nchoked\nantoine\noffs\nwaking\nsupplement\n##wen\nquantities\ndemolition\nregain\nlocate\nurdu\nfolks\nalt\n114\n##mc\nscary\nandreas\nwhites\n##ava\nclassrooms\nmw\naesthetic\npublishes\nvalleys\nguides\ncubs\njohannes\nbryant\nconventions\naffecting\n##itt\ndrain\nawesome\nisolation\nprosecutor\nambitious\napology\ncaptive\ndowns\natmospheric\nlorenzo\naisle\nbeef\nfoul\n##onia\nkidding\ncomposite\ndisturbed\nillusion\nnatives\n##ffer\nemi\nrockets\nriverside\nwartime\npainters\nadolf\nmelted\n##ail\nuncertainty\nsimulation\nhawks\nprogressed\nmeantime\n
builder\nspray\nbreach\nunhappy\nregina\nrussians\n##urg\ndetermining\n##tation\ntram\n1806\n##quin\naging\n##12\n1823\ngarion\nrented\nmister\ndiaz\nterminated\nclip\n1817\ndepend\nnervously\ndisco\nowe\ndefenders\nshiva\nnotorious\ndisbelief\nshiny\nworcester\n##gation\n##yr\ntrailing\nundertook\nislander\nbelarus\nlimitations\nwatershed\nfuller\noverlooking\nutilized\nraphael\n1819\nsynthetic\nbreakdown\nklein\n##nate\nmoaned\nmemoir\nlamb\npracticing\n##erly\ncellular\narrows\nexotic\n##graphy\nwitches\n117\ncharted\nrey\nhut\nhierarchy\nsubdivision\nfreshwater\ngiuseppe\naloud\nreyes\nqatar\nmarty\nsideways\nutterly\nsexually\njude\nprayers\nmccarthy\nsoftball\nblend\ndamien\n##gging\n##metric\nwholly\nerupted\nlebanese\nnegro\nrevenues\ntasted\ncomparative\nteamed\ntransaction\nlabeled\nmaori\nsovereignty\nparkway\ntrauma\ngran\nmalay\n121\nadvancement\ndescendant\n2020\nbuzz\nsalvation\ninventory\nsymbolic\n##making\nantarctica\nmps\n##gas\n##bro\nmohammed\nmyanmar\nholt\nsubmarines\ntones\n##lman\nlocker\npatriarch\nbangkok\nemerson\nremarks\npredators\nkin\nafghan\nconfession\nnorwich\nrental\nemerge\nadvantages\n##zel\nrca\n##hold\nshortened\nstorms\naidan\n##matic\nautonomy\ncompliance\n##quet\ndudley\natp\n##osis\n1803\nmotto\ndocumentation\nsummary\nprofessors\nspectacular\nchristina\narchdiocese\nflashing\ninnocence\nremake\n##dell\npsychic\nreef\nscare\nemploy\nrs\nsticks\nmeg\ngus\nleans\n##ude\naccompany\nbergen\ntomas\n##iko\ndoom\nwages\npools\n##nch\n##bes\nbreasts\nscholarly\nalison\noutline\nbrittany\nbreakthrough\nwillis\nrealistic\n##cut\n##boro\ncompetitor\n##stan\npike\npicnic\nicon\ndesigning\ncommercials\nwashing\nvillain\nskiing\nmicro\ncostumes\nauburn\nhalted\nexecutives\n##hat\nlogistics\ncycles\nvowel\napplicable\nbarrett\nexclaimed\neurovision\neternity\nramon\n##umi\n##lls\nmodifications\nsweeping\ndisgust\n##uck\ntorch\naviv\nensuring\nrude\ndusty\nsonic\ndonovan\noutskirts\ncu\npathway\n##band\n##gun\n##lines\ndisciplines\nacids\
ncadet\npaired\n##40\nsketches\n##sive\nmarriages\n##⁺\nfolding\npeers\nslovak\nimplies\nadmired\n##beck\n1880s\nleopold\ninstinct\nattained\nweston\nmegan\nhorace\n##ination\ndorsal\ningredients\nevolutionary\n##its\ncomplications\ndeity\nlethal\nbrushing\nlevy\ndeserted\ninstitutes\nposthumously\ndelivering\ntelescope\ncoronation\nmotivated\nrapids\nluc\nflicked\npays\nvolcano\ntanner\nweighed\n##nica\ncrowds\nfrankie\ngifted\naddressing\ngranddaughter\nwinding\n##rna\nconstantine\ngomez\n##front\nlandscapes\nrudolf\nanthropology\nslate\nwerewolf\n##lio\nastronomy\ncirca\nrouge\ndreaming\nsack\nknelt\ndrowned\nnaomi\nprolific\ntracked\nfreezing\nherb\n##dium\nagony\nrandall\ntwisting\nwendy\ndeposit\ntouches\nvein\nwheeler\n##bbled\n##bor\nbatted\nretaining\ntire\npresently\ncompare\nspecification\ndaemon\nnigel\n##grave\nmerry\nrecommendation\nczechoslovakia\nsandra\nng\nroma\n##sts\nlambert\ninheritance\nsheikh\nwinchester\ncries\nexamining\n##yle\ncomeback\ncuisine\nnave\n##iv\nko\nretrieve\ntomatoes\nbarker\npolished\ndefining\nirene\nlantern\npersonalities\nbegging\ntract\nswore\n1809\n175\n##gic\nomaha\nbrotherhood\n##rley\nhaiti\n##ots\nexeter\n##ete\n##zia\nsteele\ndumb\npearson\n210\nsurveyed\nelisabeth\ntrends\n##ef\nfritz\n##rf\npremium\nbugs\nfraction\ncalmly\nviking\n##birds\ntug\ninserted\nunusually\n##ield\nconfronted\ndistress\ncrashing\nbrent\nturks\nresign\n##olo\ncambodia\ngabe\nsauce\n##kal\nevelyn\n116\nextant\nclusters\nquarry\nteenagers\nluna\n##lers\n##ister\naffiliation\ndrill\n##ashi\npanthers\nscenic\nlibya\nanita\nstrengthen\ninscriptions\n##cated\nlace\nsued\njudith\nriots\n##uted\nmint\n##eta\npreparations\nmidst\ndub\nchallenger\n##vich\nmock\ncf\ndisplaced\nwicket\nbreaths\nenables\nschmidt\nanalyst\n##lum\nag\nhighlight\nautomotive\naxe\njosef\nnewark\nsufficiently\nresembles\n50th\n##pal\nflushed\nmum\ntraits\n##ante\ncommodore\nincomplete\nwarming\ntitular\nceremonial\nethical\n118\ncelebrating\neighteenth\ncao\nlima\nmedalist\nm
obility\nstrips\nsnakes\n##city\nminiature\nzagreb\nbarton\nescapes\numbrella\nautomated\ndoubted\ndiffers\ncooled\ngeorgetown\ndresden\ncooked\nfade\nwyatt\nrna\njacobs\ncarlton\nabundant\nstereo\nboost\nmadras\ninning\n##hia\nspur\nip\nmalayalam\nbegged\nosaka\ngroan\nescaping\ncharging\ndose\nvista\n##aj\nbud\npapa\ncommunists\nadvocates\nedged\ntri\n##cent\nresemble\npeaking\nnecklace\nfried\nmontenegro\nsaxony\ngoose\nglances\nstuttgart\ncurator\nrecruit\ngrocery\nsympathetic\n##tting\n##fort\n127\nlotus\nrandolph\nancestor\n##rand\nsucceeding\njupiter\n1798\nmacedonian\n##heads\nhiking\n1808\nhanding\nfischer\n##itive\ngarbage\nnode\n##pies\nprone\nsingular\npapua\ninclined\nattractions\nitalia\npouring\nmotioned\ngrandma\ngarnered\njacksonville\ncorp\nego\nringing\naluminum\n##hausen\nordering\n##foot\ndrawer\ntraders\nsynagogue\n##play\n##kawa\nresistant\nwandering\nfragile\nfiona\nteased\nvar\nhardcore\nsoaked\njubilee\ndecisive\nexposition\nmercer\nposter\nvalencia\nhale\nkuwait\n1811\n##ises\n##wr\n##eed\ntavern\ngamma\n122\njohan\n##uer\nairways\namino\ngil\n##ury\nvocational\ndomains\ntorres\n##sp\ngenerator\nfolklore\noutcomes\n##keeper\ncanberra\nshooter\nfl\nbeams\nconfrontation\n##lling\n##gram\nfeb\naligned\nforestry\npipeline\njax\nmotorway\nconception\ndecay\n##tos\ncoffin\n##cott\nstalin\n1805\nescorted\nminded\n##nam\nsitcom\npurchasing\ntwilight\nveronica\nadditions\npassive\ntensions\nstraw\n123\nfrequencies\n1804\nrefugee\ncultivation\n##iate\nchristie\nclary\nbulletin\ncrept\ndisposal\n##rich\n##zong\nprocessor\ncrescent\n##rol\nbmw\nemphasized\nwhale\nnazis\naurora\n##eng\ndwelling\nhauled\nsponsors\ntoledo\nmega\nideology\ntheatres\ntessa\ncerambycidae\nsaves\nturtle\ncone\nsuspects\nkara\nrusty\nyelling\ngreeks\nmozart\nshades\ncocked\nparticipant\n##tro\nshire\nspit\nfreeze\nnecessity\n##cos\ninmates\nnielsen\ncouncillors\nloaned\nuncommon\nomar\npeasants\nbotanical\noffspring\ndaniels\nformations\njokes\n1794\npioneers\nsigma\nlicensin
g\n##sus\nwheelchair\npolite\n1807\nliquor\npratt\ntrustee\n##uta\nforewings\nballoon\n##zz\nkilometre\ncamping\nexplicit\ncasually\nshawn\nfoolish\nteammates\nnm\nhassan\ncarrie\njudged\nsatisfy\nvanessa\nknives\nselective\ncnn\nflowed\n##lice\neclipse\nstressed\neliza\nmathematician\ncease\ncultivated\n##roy\ncommissions\nbrowns\n##ania\ndestroyers\nsheridan\nmeadow\n##rius\nminerals\n##cial\ndownstream\nclash\ngram\nmemoirs\nventures\nbaha\nseymour\narchie\nmidlands\nedith\nfare\nflynn\ninvite\ncanceled\ntiles\nstabbed\nboulder\nincorporate\namended\ncamden\nfacial\nmollusk\nunreleased\ndescriptions\nyoga\ngrabs\n550\nraises\nramp\nshiver\n##rose\ncoined\npioneering\ntunes\nqing\nwarwick\ntops\n119\nmelanie\ngiles\n##rous\nwandered\n##inal\nannexed\nnov\n30th\nunnamed\n##ished\norganizational\nairplane\nnormandy\nstoke\nwhistle\nblessing\nviolations\nchased\nholders\nshotgun\n##ctic\noutlet\nreactor\n##vik\ntires\ntearing\nshores\nfortified\nmascot\nconstituencies\nnc\ncolumnist\nproductive\ntibet\n##rta\nlineage\nhooked\noct\ntapes\njudging\ncody\n##gger\nhansen\nkashmir\ntriggered\n##eva\nsolved\ncliffs\n##tree\nresisted\nanatomy\nprotesters\ntransparent\nimplied\n##iga\ninjection\nmattress\nexcluding\n##mbo\ndefenses\nhelpless\ndevotion\n##elli\ngrowl\nliberals\nweber\nphenomena\natoms\nplug\n##iff\nmortality\napprentice\nhowe\nconvincing\naaa\nswimmer\nbarber\nleone\npromptly\nsodium\ndef\nnowadays\narise\n##oning\ngloucester\ncorrected\ndignity\nnorm\nerie\n##ders\nelders\nevacuated\nsylvia\ncompression\n##yar\nhartford\npose\nbackpack\nreasoning\naccepts\n24th\nwipe\nmillimetres\nmarcel\n##oda\ndodgers\nalbion\n1790\noverwhelmed\naerospace\noaks\n1795\nshowcase\nacknowledge\nrecovering\nnolan\nashe\nhurts\ngeology\nfashioned\ndisappearance\nfarewell\nswollen\nshrug\nmarquis\nwimbledon\n124\nrue\n1792\ncommemorate\nreduces\nexperiencing\ninevitable\ncalcutta\nintel\n##court\nmurderer\nsticking\nfisheries\nimagery\nbloom\n280\nbrake\n##inus\ngustav\nhesitatio
n\nmemorable\npo\nviral\nbeans\naccidents\ntunisia\nantenna\nspilled\nconsort\ntreatments\naye\nperimeter\n##gard\ndonation\nhostage\nmigrated\nbanker\naddiction\napex\nlil\ntrout\n##ously\nconscience\n##nova\nrams\nsands\ngenome\npassionate\ntroubles\n##lets\n##set\namid\n##ibility\n##ret\nhiggins\nexceed\nvikings\n##vie\npayne\n##zan\nmuscular\n##ste\ndefendant\nsucking\n##wal\nibrahim\nfuselage\nclaudia\nvfl\neuropeans\nsnails\ninterval\n##garh\npreparatory\nstatewide\ntasked\nlacrosse\nviktor\n##lation\nangola\n##hra\nflint\nimplications\nemploys\nteens\npatrons\nstall\nweekends\nbarriers\nscrambled\nnucleus\ntehran\njenna\nparsons\nlifelong\nrobots\ndisplacement\n5000\n##bles\nprecipitation\n##gt\nknuckles\nclutched\n1802\nmarrying\necology\nmarx\naccusations\ndeclare\nscars\nkolkata\nmat\nmeadows\nbermuda\nskeleton\nfinalists\nvintage\ncrawl\ncoordinate\naffects\nsubjected\norchestral\nmistaken\n##tc\nmirrors\ndipped\nrelied\n260\narches\ncandle\n##nick\nincorporating\nwildly\nfond\nbasilica\nowl\nfringe\nrituals\nwhispering\nstirred\nfeud\ntertiary\nslick\ngoat\nhonorable\nwhereby\nskip\nricardo\nstripes\nparachute\nadjoining\nsubmerged\nsynthesizer\n##gren\nintend\npositively\nninety\nphi\nbeaver\npartition\nfellows\nalexis\nprohibition\ncarlisle\nbizarre\nfraternity\n##bre\ndoubts\nicy\ncbc\naquatic\nsneak\nsonny\ncombines\nairports\ncrude\nsupervised\nspatial\nmerge\nalfonso\n##bic\ncorrupt\nscan\nundergo\n##ams\ndisabilities\ncolombian\ncomparing\ndolphins\nperkins\n##lish\nreprinted\nunanimous\nbounced\nhairs\nunderworld\nmidwest\nsemester\nbucket\npaperback\nminiseries\ncoventry\ndemise\n##leigh\ndemonstrations\nsensor\nrotating\nyan\n##hler\narrange\nsoils\n##idge\nhyderabad\nlabs\n##dr\nbrakes\ngrandchildren\n##nde\nnegotiated\nrover\nferrari\ncontinuation\ndirectorate\naugusta\nstevenson\ncounterpart\ngore\n##rda\nnursery\nrican\nave\ncollectively\nbroadly\npastoral\nrepertoire\nasserted\ndiscovering\nnordic\nstyled\nfiba\ncunningham\nharley\nmiddles
ex\nsurvives\ntumor\ntempo\nzack\naiming\nlok\nurgent\n##rade\n##nto\ndevils\n##ement\ncontractor\nturin\n##wl\n##ool\nbliss\nrepaired\nsimmons\nmoan\nastronomical\ncr\nnegotiate\nlyric\n1890s\nlara\nbred\nclad\nangus\npbs\n##ience\nengineered\nposed\n##lk\nhernandez\npossessions\nelbows\npsychiatric\nstrokes\nconfluence\nelectorate\nlifts\ncampuses\nlava\nalps\n##ep\n##ution\n##date\nphysicist\nwoody\n##page\n##ographic\n##itis\njuliet\nreformation\nsparhawk\n320\ncomplement\nsuppressed\njewel\n##½\nfloated\n##kas\ncontinuity\nsadly\n##ische\ninability\nmelting\nscanning\npaula\nflour\njudaism\nsafer\nvague\n##lm\nsolving\ncurb\n##stown\nfinancially\ngable\nbees\nexpired\nmiserable\ncassidy\ndominion\n1789\ncupped\n145\nrobbery\nfacto\namos\nwarden\nresume\ntallest\nmarvin\ning\npounded\nusd\ndeclaring\ngasoline\n##aux\ndarkened\n270\n650\nsophomore\n##mere\nerection\ngossip\ntelevised\nrisen\ndial\n##eu\npillars\n##link\npassages\nprofound\n##tina\narabian\nashton\nsilicon\nnail\n##ead\n##lated\n##wer\n##hardt\nfleming\nfirearms\nducked\ncircuits\nblows\nwaterloo\ntitans\n##lina\natom\nfireplace\ncheshire\nfinanced\nactivation\nalgorithms\n##zzi\nconstituent\ncatcher\ncherokee\npartnerships\nsexuality\nplatoon\ntragic\nvivian\nguarded\nwhiskey\nmeditation\npoetic\n##late\n##nga\n##ake\nporto\nlisteners\ndominance\nkendra\nmona\nchandler\nfactions\n22nd\nsalisbury\nattitudes\nderivative\n##ido\n##haus\nintake\npaced\njavier\nillustrator\nbarrels\nbias\ncockpit\nburnett\ndreamed\nensuing\n##anda\nreceptors\nsomeday\nhawkins\nmattered\n##lal\nslavic\n1799\njesuit\ncameroon\nwasted\ntai\nwax\nlowering\nvictorious\nfreaking\noutright\nhancock\nlibrarian\nsensing\nbald\ncalcium\nmyers\ntablet\nannouncing\nbarack\nshipyard\npharmaceutical\n##uan\ngreenwich\nflush\nmedley\npatches\nwolfgang\npt\nspeeches\nacquiring\nexams\nnikolai\n##gg\nhayden\nkannada\n##type\nreilly\n##pt\nwaitress\nabdomen\ndevastated\ncapped\npseudonym\npharmacy\nfulfill\nparaguay\n1796\nclicked\n##t
rom\narchipelago\nsyndicated\n##hman\nlumber\norgasm\nrejection\nclifford\nlorraine\nadvent\nmafia\nrodney\nbrock\n##ght\n##used\n##elia\ncassette\nchamberlain\ndespair\nmongolia\nsensors\ndevelopmental\nupstream\n##eg\n##alis\nspanning\n165\ntrombone\nbasque\nseeded\ninterred\nrenewable\nrhys\nleapt\nrevision\nmolecule\n##ages\nchord\nvicious\nnord\nshivered\n23rd\narlington\ndebts\ncorpus\nsunrise\nbays\nblackburn\ncentimetres\n##uded\nshuddered\ngm\nstrangely\ngripping\ncartoons\nisabelle\norbital\n##ppa\nseals\nproving\n##lton\nrefusal\nstrengthened\nbust\nassisting\nbaghdad\nbatsman\nportrayal\nmara\npushes\nspears\nog\n##cock\nreside\nnathaniel\nbrennan\n1776\nconfirmation\ncaucus\n##worthy\nmarkings\nyemen\nnobles\nku\nlazy\nviewer\ncatalan\nencompasses\nsawyer\n##fall\nsparked\nsubstances\npatents\nbraves\narranger\nevacuation\nsergio\npersuade\ndover\ntolerance\npenguin\ncum\njockey\ninsufficient\ntownships\noccupying\ndeclining\nplural\nprocessed\nprojection\npuppet\nflanders\nintroduces\nliability\n##yon\ngymnastics\nantwerp\ntaipei\nhobart\ncandles\njeep\nwes\nobservers\n126\nchaplain\nbundle\nglorious\n##hine\nhazel\nflung\nsol\nexcavations\ndumped\nstares\nsh\nbangalore\ntriangular\nicelandic\nintervals\nexpressing\nturbine\n##vers\nsongwriting\ncrafts\n##igo\njasmine\nditch\nrite\n##ways\nentertaining\ncomply\nsorrow\nwrestlers\nbasel\nemirates\nmarian\nrivera\nhelpful\n##some\ncaution\ndownward\nnetworking\n##atory\n##tered\ndarted\ngenocide\nemergence\nreplies\nspecializing\nspokesman\nconvenient\nunlocked\nfading\naugustine\nconcentrations\nresemblance\nelijah\ninvestigator\nandhra\n##uda\npromotes\nbean\n##rrell\nfleeing\nwan\nsimone\nannouncer\n##ame\n##bby\nlydia\nweaver\n132\nresidency\nmodification\n##fest\nstretches\n##ast\nalternatively\nnat\nlowe\nlacks\n##ented\npam\ntile\nconcealed\ninferior\nabdullah\nresidences\ntissues\nvengeance\n##ided\nmoisture\npeculiar\ngroove\nzip\nbologna\njennings\nninja\noversaw\nzombies\npumping\nbatch\nlivin
gston\nemerald\ninstallations\n1797\npeel\nnitrogen\nrama\n##fying\n##star\nschooling\nstrands\nresponding\nwerner\n##ost\nlime\ncasa\naccurately\ntargeting\n##rod\nunderway\n##uru\nhemisphere\nlester\n##yard\noccupies\n2d\ngriffith\nangrily\nreorganized\n##owing\ncourtney\ndeposited\n##dd\n##30\nestadio\n##ifies\ndunn\nexiled\n##ying\nchecks\n##combe\n##о\n##fly\nsuccesses\nunexpectedly\nblu\nassessed\n##flower\n##ه\nobserving\nsacked\nspiders\nkn\n##tail\nmu\nnodes\nprosperity\naudrey\ndivisional\n155\nbroncos\ntangled\nadjust\nfeeds\nerosion\npaolo\nsurf\ndirectory\nsnatched\nhumid\nadmiralty\nscrewed\ngt\nreddish\n##nese\nmodules\ntrench\nlamps\nbind\nleah\nbucks\ncompetes\n##nz\n##form\ntranscription\n##uc\nisles\nviolently\nclutching\npga\ncyclist\ninflation\nflats\nragged\nunnecessary\n##hian\nstubborn\ncoordinated\nharriet\nbaba\ndisqualified\n330\ninsect\nwolfe\n##fies\nreinforcements\nrocked\nduel\nwinked\nembraced\nbricks\n##raj\nhiatus\ndefeats\npending\nbrightly\njealousy\n##xton\n##hm\n##uki\nlena\ngdp\ncolorful\n##dley\nstein\nkidney\n##shu\nunderwear\nwanderers\n##haw\n##icus\nguardians\nm³\nroared\nhabits\n##wise\npermits\ngp\nuranium\npunished\ndisguise\nbundesliga\nelise\ndundee\nerotic\npartisan\npi\ncollectors\nfloat\nindividually\nrendering\nbehavioral\nbucharest\nser\nhare\nvalerie\ncorporal\nnutrition\nproportional\n##isa\nimmense\n##kis\npavement\n##zie\n##eld\nsutherland\ncrouched\n1775\n##lp\nsuzuki\ntrades\nendurance\noperas\ncrosby\nprayed\npriory\nrory\nsocially\n##urn\ngujarat\n##pu\nwalton\ncube\npasha\nprivilege\nlennon\nfloods\nthorne\nwaterfall\nnipple\nscouting\napprove\n##lov\nminorities\nvoter\ndwight\nextensions\nassure\nballroom\nslap\ndripping\nprivileges\nrejoined\nconfessed\ndemonstrating\npatriotic\nyell\ninvestor\n##uth\npagan\nslumped\nsquares\n##cle\n##kins\nconfront\nbert\nembarrassment\n##aid\naston\nurging\nsweater\nstarr\nyuri\nbrains\nwilliamson\ncommuter\nmortar\nstructured\nselfish\nexports\n##jon\ncds\n##him\nun
finished\n##rre\nmortgage\ndestinations\n##nagar\ncanoe\nsolitary\nbuchanan\ndelays\nmagistrate\nfk\n##pling\nmotivation\n##lier\n##vier\nrecruiting\nassess\n##mouth\nmalik\nantique\n1791\npius\nrahman\nreich\ntub\nzhou\nsmashed\nairs\ngalway\nxii\nconditioning\nhonduras\ndischarged\ndexter\n##pf\nlionel\n129\ndebates\nlemon\ntiffany\nvolunteered\ndom\ndioxide\nprocession\ndevi\nsic\ntremendous\nadvertisements\ncolts\ntransferring\nverdict\nhanover\ndecommissioned\nutter\nrelate\npac\nracism\n##top\nbeacon\nlimp\nsimilarity\nterra\noccurrence\nant\n##how\nbecky\ncapt\nupdates\narmament\nrichie\npal\n##graph\nhalloween\nmayo\n##ssen\n##bone\ncara\nserena\nfcc\ndolls\nobligations\n##dling\nviolated\nlafayette\njakarta\nexploitation\n##ime\ninfamous\niconic\n##lah\n##park\nkitty\nmoody\nreginald\ndread\nspill\ncrystals\nolivier\nmodeled\nbluff\nequilibrium\nseparating\nnotices\nordnance\nextinction\nonset\ncosmic\nattachment\nsammy\nexpose\nprivy\nanchored\n##bil\nabbott\nadmits\nbending\nbaritone\nemmanuel\npoliceman\nvaughan\nwinged\nclimax\ndresses\ndenny\npolytechnic\nmohamed\nburmese\nauthentic\nnikki\ngenetics\ngrandparents\nhomestead\ngaza\npostponed\nmetacritic\nuna\n##sby\n##bat\nunstable\ndissertation\n##rial\n##cian\ncurls\nobscure\nuncovered\nbronx\npraying\ndisappearing\n##hoe\nprehistoric\ncoke\nturret\nmutations\nnonprofit\npits\nmonaco\n##ي\n##usion\nprominently\ndispatched\npodium\n##mir\nuci\n##uation\n133\nfortifications\nbirthplace\nkendall\n##lby\n##oll\npreacher\nrack\ngoodman\n##rman\npersistent\n##ott\ncountless\njaime\nrecorder\nlexington\npersecution\njumps\nrenewal\nwagons\n##11\ncrushing\n##holder\ndecorations\n##lake\nabundance\nwrath\nlaundry\n£1\ngarde\n##rp\njeanne\nbeetles\npeasant\n##sl\nsplitting\ncaste\nsergei\n##rer\n##ema\nscripts\n##ively\nrub\nsatellites\n##vor\ninscribed\nverlag\nscrapped\ngale\npackages\nchick\npotato\nslogan\nkathleen\narabs\n##culture\ncounterparts\nreminiscent\nchoral\n##tead\nrand\nretains\nbushes\ndane\nac
complish\ncourtesy\ncloses\n##oth\nslaughter\nhague\nkrakow\nlawson\ntailed\nelias\nginger\n##ttes\ncanopy\nbetrayal\nrebuilding\nturf\n##hof\nfrowning\nallegiance\nbrigades\nkicks\nrebuild\npolls\nalias\nnationalism\ntd\nrowan\naudition\nbowie\nfortunately\nrecognizes\nharp\ndillon\nhorrified\n##oro\nrenault\n##tics\nropes\n##α\npresumed\nrewarded\ninfrared\nwiping\naccelerated\nillustration\n##rid\npresses\npractitioners\nbadminton\n##iard\ndetained\n##tera\nrecognizing\nrelates\nmisery\n##sies\n##tly\nreproduction\npiercing\npotatoes\nthornton\nesther\nmanners\nhbo\n##aan\nours\nbullshit\nernie\nperennial\nsensitivity\nilluminated\nrupert\n##jin\n##iss\n##ear\nrfc\nnassau\n##dock\nstaggered\nsocialism\n##haven\nappointments\nnonsense\nprestige\nsharma\nhaul\n##tical\nsolidarity\ngps\n##ook\n##rata\nigor\npedestrian\n##uit\nbaxter\ntenants\nwires\nmedication\nunlimited\nguiding\nimpacts\ndiabetes\n##rama\nsasha\npas\nclive\nextraction\n131\ncontinually\nconstraints\n##bilities\nsonata\nhunted\nsixteenth\nchu\nplanting\nquote\nmayer\npretended\nabs\nspat\n##hua\nceramic\n##cci\ncurtains\npigs\npitching\n##dad\nlatvian\nsore\ndayton\n##sted\n##qi\npatrols\nslice\nplayground\n##nted\nshone\nstool\napparatus\ninadequate\nmates\ntreason\n##ija\ndesires\n##liga\n##croft\nsomalia\nlaurent\nmir\nleonardo\noracle\ngrape\nobliged\nchevrolet\nthirteenth\nstunning\nenthusiastic\n##ede\naccounted\nconcludes\ncurrents\nbasil\n##kovic\ndrought\n##rica\nmai\n##aire\nshove\nposting\n##shed\npilgrimage\nhumorous\npacking\nfry\npencil\nwines\nsmells\n144\nmarilyn\naching\nnewest\nclung\nbon\nneighbours\nsanctioned\n##pie\nmug\n##stock\ndrowning\n##mma\nhydraulic\n##vil\nhiring\nreminder\nlilly\ninvestigators\n##ncies\nsour\n##eous\ncompulsory\npacket\n##rion\n##graphic\n##elle\ncannes\n##inate\ndepressed\n##rit\nheroic\nimportantly\ntheresa\n##tled\nconway\nsaturn\nmarginal\nrae\n##xia\ncorresponds\nroyce\npact\njasper\nexplosives\npackaging\naluminium\n##ttered\ndenotes\nrhythmic\n
spans\nassignments\nhereditary\noutlined\noriginating\nsundays\nlad\nreissued\ngreeting\nbeatrice\n##dic\npillar\nmarcos\nplots\nhandbook\nalcoholic\njudiciary\navant\nslides\nextract\nmasculine\nblur\n##eum\n##force\nhomage\ntrembled\nowens\nhymn\ntrey\nomega\nsignaling\nsocks\naccumulated\nreacted\nattic\ntheo\nlining\nangie\ndistraction\nprimera\ntalbot\n##key\n1200\nti\ncreativity\nbilled\n##hey\ndeacon\neduardo\nidentifies\nproposition\ndizzy\ngunner\nhogan\n##yam\n##pping\n##hol\nja\n##chan\njensen\nreconstructed\n##berger\nclearance\ndarius\n##nier\nabe\nharlem\nplea\ndei\ncircled\nemotionally\nnotation\nfascist\nneville\nexceeded\nupwards\nviable\nducks\n##fo\nworkforce\nracer\nlimiting\nshri\n##lson\npossesses\n1600\nkerr\nmoths\ndevastating\nladen\ndisturbing\nlocking\n##cture\ngal\nfearing\naccreditation\nflavor\naide\n1870s\nmountainous\n##baum\nmelt\n##ures\nmotel\ntexture\nservers\nsoda\n##mb\nherd\n##nium\nerect\npuzzled\nhum\npeggy\nexaminations\ngould\ntestified\ngeoff\nren\ndevised\nsacks\n##law\ndenial\nposters\ngrunted\ncesar\ntutor\nec\ngerry\nofferings\nbyrne\nfalcons\ncombinations\nct\nincoming\npardon\nrocking\n26th\navengers\nflared\nmankind\nseller\nuttar\nloch\nnadia\nstroking\nexposing\n##hd\nfertile\nancestral\ninstituted\n##has\nnoises\nprophecy\ntaxation\neminent\nvivid\npol\n##bol\ndart\nindirect\nmultimedia\nnotebook\nupside\ndisplaying\nadrenaline\nreferenced\ngeometric\n##iving\nprogression\n##ddy\nblunt\nannounce\n##far\nimplementing\n##lav\naggression\nliaison\ncooler\ncares\nheadache\nplantations\ngorge\ndots\nimpulse\nthickness\nashamed\naveraging\nkathy\nobligation\nprecursor\n137\nfowler\nsymmetry\nthee\n225\nhears\n##rai\nundergoing\nads\nbutcher\nbowler\n##lip\ncigarettes\nsubscription\ngoodness\n##ically\nbrowne\n##hos\n##tech\nkyoto\ndonor\n##erty\ndamaging\nfriction\ndrifting\nexpeditions\nhardened\nprostitution\n152\nfauna\nblankets\nclaw\ntossing\nsnarled\nbutterflies\nrecruits\ninvestigative\ncoated\nhealed\n138\ncomm
unal\nhai\nxiii\nacademics\nboone\npsychologist\nrestless\nlahore\nstephens\nmba\nbrendan\nforeigners\nprinter\n##pc\nached\nexplode\n27th\ndeed\nscratched\ndared\n##pole\ncardiac\n1780\nokinawa\nproto\ncommando\ncompelled\noddly\nelectrons\n##base\nreplica\nthanksgiving\n##rist\nsheila\ndeliberate\nstafford\ntidal\nrepresentations\nhercules\nou\n##path\n##iated\nkidnapping\nlenses\n##tling\ndeficit\nsamoa\nmouths\nconsuming\ncomputational\nmaze\ngranting\nsmirk\nrazor\nfixture\nideals\ninviting\naiden\nnominal\n##vs\nissuing\njulio\npitt\nramsey\ndocks\n##oss\nexhaust\n##owed\nbavarian\ndraped\nanterior\nmating\nethiopian\nexplores\nnoticing\n##nton\ndiscarded\nconvenience\nhoffman\nendowment\nbeasts\ncartridge\nmormon\npaternal\nprobe\nsleeves\ninterfere\nlump\ndeadline\n##rail\njenks\nbulldogs\nscrap\nalternating\njustified\nreproductive\nnam\nseize\ndescending\nsecretariat\nkirby\ncoupe\ngrouped\nsmash\npanther\nsedan\ntapping\n##18\nlola\ncheer\ngermanic\nunfortunate\n##eter\nunrelated\n##fan\nsubordinate\n##sdale\nsuzanne\nadvertisement\n##ility\nhorsepower\n##lda\ncautiously\ndiscourse\nluigi\n##mans\n##fields\nnoun\nprevalent\nmao\nschneider\neverett\nsurround\ngovernorate\nkira\n##avia\nwestward\n##take\nmisty\nrails\nsustainability\n134\nunused\n##rating\npacks\ntoast\nunwilling\nregulate\nthy\nsuffrage\nnile\nawe\nassam\ndefinitions\ntravelers\naffordable\n##rb\nconferred\nsells\nundefeated\nbeneficial\ntorso\nbasal\nrepeating\nremixes\n##pass\nbahrain\ncables\nfang\n##itated\nexcavated\nnumbering\nstatutory\n##rey\ndeluxe\n##lian\nforested\nramirez\nderbyshire\nzeus\nslamming\ntransfers\nastronomer\nbanana\nlottery\nberg\nhistories\nbamboo\n##uchi\nresurrection\nposterior\nbowls\nvaguely\n##thi\nthou\npreserving\ntensed\noffence\n##inas\nmeyrick\ncallum\nridden\nwatt\nlangdon\ntying\nlowland\nsnorted\ndaring\ntruman\n##hale\n##girl\naura\noverly\nfiling\nweighing\ngoa\ninfections\nphilanthropist\nsaunders\neponymous\n##owski\nlatitude\nperspectives\nrevi
ewing\nmets\ncommandant\nradial\n##kha\nflashlight\nreliability\nkoch\nvowels\namazed\nada\nelaine\nsupper\n##rth\n##encies\npredator\ndebated\nsoviets\ncola\n##boards\n##nah\ncompartment\ncrooked\narbitrary\nfourteenth\n##ctive\nhavana\nmajors\nsteelers\nclips\nprofitable\nambush\nexited\npackers\n##tile\nnude\ncracks\nfungi\n##е\nlimb\ntrousers\njosie\nshelby\ntens\nfrederic\n##ος\ndefinite\nsmoothly\nconstellation\ninsult\nbaton\ndiscs\nlingering\n##nco\nconclusions\nlent\nstaging\nbecker\ngrandpa\nshaky\n##tron\neinstein\nobstacles\nsk\nadverse\nelle\neconomically\n##moto\nmccartney\nthor\ndismissal\nmotions\nreadings\nnostrils\ntreatise\n##pace\nsqueezing\nevidently\nprolonged\n1783\nvenezuelan\nje\nmarguerite\nbeirut\ntakeover\nshareholders\n##vent\ndenise\ndigit\nairplay\nnorse\n##bbling\nimaginary\npills\nhubert\nblaze\nvacated\neliminating\n##ello\nvine\nmansfield\n##tty\nretrospective\nbarrow\nborne\nclutch\nbail\nforensic\nweaving\n##nett\n##witz\ndesktop\ncitadel\npromotions\nworrying\ndorset\nieee\nsubdivided\n##iating\nmanned\nexpeditionary\npickup\nsynod\nchuckle\n185\nbarney\n##rz\n##ffin\nfunctionality\nkarachi\nlitigation\nmeanings\nuc\nlick\nturbo\nanders\n##ffed\nexecute\ncurl\noppose\nankles\ntyphoon\n##د\n##ache\n##asia\nlinguistics\ncompassion\npressures\ngrazing\nperfection\n##iting\nimmunity\nmonopoly\nmuddy\nbackgrounds\n136\nnamibia\nfrancesca\nmonitors\nattracting\nstunt\ntuition\n##ии\nvegetable\n##mates\n##quent\nmgm\njen\ncomplexes\nforts\n##ond\ncellar\nbites\nseventeenth\nroyals\nflemish\nfailures\nmast\ncharities\n##cular\nperuvian\ncapitals\nmacmillan\nipswich\noutward\nfrigate\npostgraduate\nfolds\nemploying\n##ouse\nconcurrently\nfiery\n##tai\ncontingent\nnightmares\nmonumental\nnicaragua\n##kowski\nlizard\nmal\nfielding\ngig\nreject\n##pad\nharding\n##ipe\ncoastline\n##cin\n##nos\nbeethoven\nhumphrey\ninnovations\n##tam\n##nge\nnorris\ndoris\nsolicitor\nhuang\nobey\n141\n##lc\nniagara\n##tton\nshelves\naug\nbourbon\ncurry\nnight
club\nspecifications\nhilton\n##ndo\ncentennial\ndispersed\nworm\nneglected\nbriggs\nsm\nfont\nkuala\nuneasy\nplc\n##nstein\n##bound\n##aking\n##burgh\nawaiting\npronunciation\n##bbed\n##quest\neh\noptimal\nzhu\nraped\ngreens\npresided\nbrenda\nworries\n##life\nvenetian\nmarxist\nturnout\n##lius\nrefined\nbraced\nsins\ngrasped\nsunderland\nnickel\nspeculated\nlowell\ncyrillic\ncommunism\nfundraising\nresembling\ncolonists\nmutant\nfreddie\nusc\n##mos\ngratitude\n##run\nmural\n##lous\nchemist\nwi\nreminds\n28th\nsteals\ntess\npietro\n##ingen\npromoter\nri\nmicrophone\nhonoured\nrai\nsant\n##qui\nfeather\n##nson\nburlington\nkurdish\nterrorists\ndeborah\nsickness\n##wed\n##eet\nhazard\nirritated\ndesperation\nveil\nclarity\n##rik\njewels\nxv\n##gged\n##ows\n##cup\nberkshire\nunfair\nmysteries\norchid\nwinced\nexhaustion\nrenovations\nstranded\nobe\ninfinity\n##nies\nadapt\nredevelopment\nthanked\nregistry\nolga\ndomingo\nnoir\ntudor\nole\n##atus\ncommenting\nbehaviors\n##ais\ncrisp\npauline\nprobable\nstirling\nwigan\n##bian\nparalympics\npanting\nsurpassed\n##rew\nluca\nbarred\npony\nfamed\n##sters\ncassandra\nwaiter\ncarolyn\nexported\n##orted\nandres\ndestructive\ndeeds\njonah\ncastles\nvacancy\nsuv\n##glass\n1788\norchard\nyep\nfamine\nbelarusian\nsprang\n##forth\nskinny\n##mis\nadministrators\nrotterdam\nzambia\nzhao\nboiler\ndiscoveries\n##ride\n##physics\nlucius\ndisappointing\noutreach\nspoon\n##frame\nqualifications\nunanimously\nenjoys\nregency\n##iidae\nstade\nrealism\nveterinary\nrodgers\ndump\nalain\nchestnut\ncastile\ncensorship\nrumble\ngibbs\n##itor\ncommunion\nreggae\ninactivated\nlogs\nloads\n##houses\nhomosexual\n##iano\nale\ninforms\n##cas\nphrases\nplaster\nlinebacker\nambrose\nkaiser\nfascinated\n850\nlimerick\nrecruitment\nforge\nmastered\n##nding\nleinster\nrooted\nthreaten\n##strom\nborneo\n##hes\nsuggestions\nscholarships\npropeller\ndocumentaries\npatronage\ncoats\nconstructing\ninvest\nneurons\ncomet\nentirety\nshouts\nidentities\nannoying\
nunchanged\nwary\n##antly\n##ogy\nneat\noversight\n##kos\nphillies\nreplay\nconstance\n##kka\nincarnation\nhumble\nskies\nminus\n##acy\nsmithsonian\n##chel\nguerrilla\njar\ncadets\n##plate\nsurplus\naudit\n##aru\ncracking\njoanna\nlouisa\npacing\n##lights\nintentionally\n##iri\ndiner\nnwa\nimprint\naustralians\ntong\nunprecedented\nbunker\nnaive\nspecialists\nark\nnichols\nrailing\nleaked\npedal\n##uka\nshrub\nlonging\nroofs\nv8\ncaptains\nneural\ntuned\n##ntal\n##jet\nemission\nmedina\nfrantic\ncodex\ndefinitive\nsid\nabolition\nintensified\nstocks\nenrique\nsustain\ngenoa\noxide\n##written\nclues\ncha\n##gers\ntributaries\nfragment\nvenom\n##rity\n##ente\n##sca\nmuffled\nvain\nsire\nlaos\n##ingly\n##hana\nhastily\nsnapping\nsurfaced\nsentiment\nmotive\n##oft\ncontests\napproximate\nmesa\nluckily\ndinosaur\nexchanges\npropelled\naccord\nbourne\nrelieve\ntow\nmasks\noffended\n##ues\ncynthia\n##mmer\nrains\nbartender\nzinc\nreviewers\nlois\n##sai\nlegged\narrogant\nrafe\nrosie\ncomprise\nhandicap\nblockade\ninlet\nlagoon\ncopied\ndrilling\nshelley\npetals\n##inian\nmandarin\nobsolete\n##inated\nonward\narguably\nproductivity\ncindy\npraising\nseldom\nbusch\ndiscusses\nraleigh\nshortage\nranged\nstanton\nencouragement\nfirstly\nconceded\novers\ntemporal\n##uke\ncbe\n##bos\nwoo\ncertainty\npumps\n##pton\nstalked\n##uli\nlizzie\nperiodic\nthieves\nweaker\n##night\ngases\nshoving\nchooses\nwc\n##chemical\nprompting\nweights\n##kill\nrobust\nflanked\nsticky\nhu\ntuberculosis\n##eb\n##eal\nchristchurch\nresembled\nwallet\nreese\ninappropriate\npictured\ndistract\nfixing\nfiddle\ngiggled\nburger\nheirs\nhairy\nmechanic\ntorque\napache\nobsessed\nchiefly\ncheng\nlogging\n##tag\nextracted\nmeaningful\nnumb\n##vsky\ngloucestershire\nreminding\n##bay\nunite\n##lit\nbreeds\ndiminished\nclown\nglove\n1860s\n##ن\n##ug\narchibald\nfocal\nfreelance\nsliced\ndepiction\n##yk\norganism\nswitches\nsights\nstray\ncrawling\n##ril\nlever\nleningrad\ninterpretations\nloops\nanytime\nreel\na
licia\ndelighted\n##ech\ninhaled\nxiv\nsuitcase\nbernie\nvega\nlicenses\nnorthampton\nexclusion\ninduction\nmonasteries\nracecourse\nhomosexuality\n##right\n##sfield\n##rky\ndimitri\nmichele\nalternatives\nions\ncommentators\ngenuinely\nobjected\npork\nhospitality\nfencing\nstephan\nwarships\nperipheral\nwit\ndrunken\nwrinkled\nquentin\nspends\ndeparting\nchung\nnumerical\nspokesperson\n##zone\njohannesburg\ncaliber\nkillers\n##udge\nassumes\nneatly\ndemographic\nabigail\nbloc\n##vel\nmounting\n##lain\nbentley\nslightest\nxu\nrecipients\n##jk\nmerlin\n##writer\nseniors\nprisons\nblinking\nhindwings\nflickered\nkappa\n##hel\n80s\nstrengthening\nappealing\nbrewing\ngypsy\nmali\nlashes\nhulk\nunpleasant\nharassment\nbio\ntreaties\npredict\ninstrumentation\npulp\ntroupe\nboiling\nmantle\n##ffe\nins\n##vn\ndividing\nhandles\nverbs\n##onal\ncoconut\nsenegal\n340\nthorough\ngum\nmomentarily\n##sto\ncocaine\npanicked\ndestined\n##turing\nteatro\ndenying\nweary\ncaptained\nmans\n##hawks\n##code\nwakefield\nbollywood\nthankfully\n##16\ncyril\n##wu\namendments\n##bahn\nconsultation\nstud\nreflections\nkindness\n1787\ninternally\n##ovo\ntex\nmosaic\ndistribute\npaddy\nseeming\n143\n##hic\npiers\n##15\n##mura\n##verse\npopularly\nwinger\nkang\nsentinel\nmccoy\n##anza\ncovenant\n##bag\nverge\nfireworks\nsuppress\nthrilled\ndominate\n##jar\nswansea\n##60\n142\nreconciliation\n##ndi\nstiffened\ncue\ndorian\n##uf\ndamascus\namor\nida\nforemost\n##aga\nporsche\nunseen\ndir\n##had\n##azi\nstony\nlexi\nmelodies\n##nko\nangular\ninteger\npodcast\nants\ninherent\njaws\njustify\npersona\n##olved\njosephine\n##nr\n##ressed\ncustomary\nflashes\ngala\ncyrus\nglaring\nbackyard\nariel\nphysiology\ngreenland\nhtml\nstir\navon\natletico\nfinch\nmethodology\nked\n##lent\nmas\ncatholicism\ntownsend\nbranding\nquincy\nfits\ncontainers\n1777\nashore\naragon\n##19\nforearm\npoisoning\n##sd\nadopting\nconquer\ngrinding\namnesty\nkeller\nfinances\nevaluate\nforged\nlankan\ninstincts\n##uto\nguam\nbosni
an\nphotographed\nworkplace\ndesirable\nprotector\n##dog\nallocation\nintently\nencourages\nwilly\n##sten\nbodyguard\nelectro\nbrighter\n##ν\nbihar\n##chev\nlasts\nopener\namphibious\nsal\nverde\narte\n##cope\ncaptivity\nvocabulary\nyields\n##tted\nagreeing\ndesmond\npioneered\n##chus\nstrap\ncampaigned\nrailroads\n##ович\nemblem\n##dre\nstormed\n501\n##ulous\nmarijuana\nnorthumberland\n##gn\n##nath\nbowen\nlandmarks\nbeaumont\n##qua\ndanube\n##bler\nattorneys\nth\nge\nflyers\ncritique\nvillains\ncass\nmutation\nacc\n##0s\ncolombo\nmckay\nmotif\nsampling\nconcluding\nsyndicate\n##rell\nneon\nstables\nds\nwarnings\nclint\nmourning\nwilkinson\n##tated\nmerrill\nleopard\nevenings\nexhaled\nemil\nsonia\nezra\ndiscrete\nstove\nfarrell\nfifteenth\nprescribed\nsuperhero\n##rier\nworms\nhelm\nwren\n##duction\n##hc\nexpo\n##rator\nhq\nunfamiliar\nantony\nprevents\nacceleration\nfiercely\nmari\npainfully\ncalculations\ncheaper\nign\nclifton\nirvine\ndavenport\nmozambique\n##np\npierced\n##evich\nwonders\n##wig\n##cate\n##iling\ncrusade\nware\n##uel\nenzymes\nreasonably\nmls\n##coe\nmater\nambition\nbunny\neliot\nkernel\n##fin\nasphalt\nheadmaster\ntorah\naden\nlush\npins\nwaived\n##care\n##yas\njoao\nsubstrate\nenforce\n##grad\n##ules\nalvarez\nselections\nepidemic\ntempted\n##bit\nbremen\ntranslates\nensured\nwaterfront\n29th\nforrest\nmanny\nmalone\nkramer\nreigning\ncookies\nsimpler\nabsorption\n205\nengraved\n##ffy\nevaluated\n1778\nhaze\n146\ncomforting\ncrossover\n##abe\nthorn\n##rift\n##imo\n##pop\nsuppression\nfatigue\ncutter\n##tr\n201\nwurttemberg\n##orf\nenforced\nhovering\nproprietary\ngb\nsamurai\nsyllable\nascent\nlacey\ntick\nlars\ntractor\nmerchandise\nrep\nbouncing\ndefendants\n##yre\nhuntington\n##ground\n##oko\nstandardized\n##hor\n##hima\nassassinated\nnu\npredecessors\nrainy\nliar\nassurance\nlyrical\n##uga\nsecondly\nflattened\nios\nparameter\nundercover\n##mity\nbordeaux\npunish\nridges\nmarkers\nexodus\ninactive\nhesitate\ndebbie\nnyc\npledge\nsavoy\nn
agar\noffset\norganist\n##tium\nhesse\nmarin\nconverting\n##iver\ndiagram\npropulsion\npu\nvalidity\nreverted\nsupportive\n##dc\nministries\nclans\nresponds\nproclamation\n##inae\n##ø\n##rea\nein\npleading\npatriot\nsf\nbirch\nislanders\nstrauss\nhates\n##dh\nbrandenburg\nconcession\nrd\n##ob\n1900s\nkillings\ntextbook\nantiquity\ncinematography\nwharf\nembarrassing\nsetup\ncreed\nfarmland\ninequality\ncentred\nsignatures\nfallon\n370\n##ingham\n##uts\nceylon\ngazing\ndirective\nlaurie\n##tern\nglobally\n##uated\n##dent\nallah\nexcavation\nthreads\n##cross\n148\nfrantically\nicc\nutilize\ndetermines\nrespiratory\nthoughtful\nreceptions\n##dicate\nmerging\nchandra\nseine\n147\nbuilders\nbuilds\ndiagnostic\ndev\nvisibility\ngoddamn\nanalyses\ndhaka\ncho\nproves\nchancel\nconcurrent\ncuriously\ncanadians\npumped\nrestoring\n1850s\nturtles\njaguar\nsinister\nspinal\ntraction\ndeclan\nvows\n1784\nglowed\ncapitalism\nswirling\ninstall\nuniversidad\n##lder\n##oat\nsoloist\n##genic\n##oor\ncoincidence\nbeginnings\nnissan\ndip\nresorts\ncaucasus\ncombustion\ninfectious\n##eno\npigeon\nserpent\n##itating\nconclude\nmasked\nsalad\njew\n##gr\nsurreal\ntoni\n##wc\nharmonica\n151\n##gins\n##etic\n##coat\nfishermen\nintending\nbravery\n##wave\nklaus\ntitan\nwembley\ntaiwanese\nransom\n40th\nincorrect\nhussein\neyelids\njp\ncooke\ndramas\nutilities\n##etta\n##print\neisenhower\nprincipally\ngranada\nlana\n##rak\nopenings\nconcord\n##bl\nbethany\nconnie\nmorality\nsega\n##mons\n##nard\nearnings\n##kara\n##cine\nwii\ncommunes\n##rel\ncoma\ncomposing\nsoftened\nsevered\ngrapes\n##17\nnguyen\nanalyzed\nwarlord\nhubbard\nheavenly\nbehave\nslovenian\n##hit\n##ony\nhailed\nfilmmakers\ntrance\ncaldwell\nskye\nunrest\ncoward\nlikelihood\n##aging\nbern\nsci\ntaliban\nhonolulu\npropose\n##wang\n1700\nbrowser\nimagining\ncobra\ncontributes\ndukes\ninstinctively\nconan\nviolinist\n##ores\naccessories\ngradual\n##amp\nquotes\nsioux\n##dating\nundertake\nintercepted\nsparkling\ncompressed\n139\nf
ungus\ntombs\nhaley\nimposing\nrests\ndegradation\nlincolnshire\nretailers\nwetlands\ntulsa\ndistributor\ndungeon\nnun\ngreenhouse\nconvey\natlantis\naft\nexits\noman\ndresser\nlyons\n##sti\njoking\neddy\njudgement\nomitted\ndigits\n##cts\n##game\njuniors\n##rae\ncents\nstricken\nune\n##ngo\nwizards\nweir\nbreton\nnan\ntechnician\nfibers\nliking\nroyalty\n##cca\n154\npersia\nterribly\nmagician\n##rable\n##unt\nvance\ncafeteria\nbooker\ncamille\nwarmer\n##static\nconsume\ncavern\ngaps\ncompass\ncontemporaries\nfoyer\nsoothing\ngraveyard\nmaj\nplunged\nblush\n##wear\ncascade\ndemonstrates\nordinance\n##nov\nboyle\n##lana\nrockefeller\nshaken\nbanjo\nizzy\n##ense\nbreathless\nvines\n##32\n##eman\nalterations\nchromosome\ndwellings\nfeudal\nmole\n153\ncatalonia\nrelics\ntenant\nmandated\n##fm\nfridge\nhats\nhonesty\npatented\nraul\nheap\ncruisers\naccusing\nenlightenment\ninfants\nwherein\nchatham\ncontractors\nzen\naffinity\nhc\nosborne\npiston\n156\ntraps\nmaturity\n##rana\nlagos\n##zal\npeering\n##nay\nattendant\ndealers\nprotocols\nsubset\nprospects\nbiographical\n##cre\nartery\n##zers\ninsignia\nnuns\nendured\n##eration\nrecommend\nschwartz\nserbs\nberger\ncromwell\ncrossroads\n##ctor\nenduring\nclasped\ngrounded\n##bine\nmarseille\ntwitched\nabel\nchoke\nhttps\ncatalyst\nmoldova\nitalians\n##tist\ndisastrous\nwee\n##oured\n##nti\nwwf\nnope\n##piration\n##asa\nexpresses\nthumbs\n167\n##nza\ncoca\n1781\ncheating\n##ption\nskipped\nsensory\nheidelberg\nspies\nsatan\ndangers\nsemifinal\n202\nbohemia\nwhitish\nconfusing\nshipbuilding\nrelies\nsurgeons\nlandings\nravi\nbaku\nmoor\nsuffix\nalejandro\n##yana\nlitre\nupheld\n##unk\nrajasthan\n##rek\ncoaster\ninsists\nposture\nscenarios\netienne\nfavoured\nappoint\ntransgender\nelephants\npoked\ngreenwood\ndefences\nfulfilled\nmilitant\nsomali\n1758\nchalk\npotent\n##ucci\nmigrants\nwink\nassistants\nnos\nrestriction\nactivism\nniger\n##ario\ncolon\nshaun\n##sat\ndaphne\n##erated\nswam\ncongregations\nreprise\nconsideration
s\nmagnet\nplayable\nxvi\n##р\noverthrow\ntobias\nknob\nchavez\ncoding\n##mers\npropped\nkatrina\norient\nnewcomer\n##suke\ntemperate\n##pool\nfarmhouse\ninterrogation\n##vd\ncommitting\n##vert\nforthcoming\nstrawberry\njoaquin\nmacau\nponds\nshocking\nsiberia\n##cellular\nchant\ncontributors\n##nant\n##ologists\nsped\nabsorb\nhail\n1782\nspared\n##hore\nbarbados\nkarate\nopus\noriginates\nsaul\n##xie\nevergreen\nleaped\n##rock\ncorrelation\nexaggerated\nweekday\nunification\nbump\ntracing\nbrig\nafb\npathways\nutilizing\n##ners\nmod\nmb\ndisturbance\nkneeling\n##stad\n##guchi\n100th\npune\n##thy\ndecreasing\n168\nmanipulation\nmiriam\nacademia\necosystem\noccupational\nrbi\n##lem\nrift\n##14\nrotary\nstacked\nincorporation\nawakening\ngenerators\nguerrero\nracist\n##omy\ncyber\nderivatives\nculminated\nallie\nannals\npanzer\nsainte\nwikipedia\npops\nzu\naustro\n##vate\nalgerian\npolitely\nnicholson\nmornings\neducate\ntastes\nthrill\ndartmouth\n##gating\ndb\n##jee\nregan\ndiffering\nconcentrating\nchoreography\ndivinity\n##media\npledged\nalexandre\nrouting\ngregor\nmadeline\n##idal\napocalypse\n##hora\ngunfire\nculminating\nelves\nfined\nliang\nlam\nprogrammed\ntar\nguessing\ntransparency\ngabrielle\n##gna\ncancellation\nflexibility\n##lining\naccession\nshea\nstronghold\nnets\nspecializes\n##rgan\nabused\nhasan\nsgt\nling\nexceeding\n##₄\nadmiration\nsupermarket\n##ark\nphotographers\nspecialised\ntilt\nresonance\nhmm\nperfume\n380\nsami\nthreatens\ngarland\nbotany\nguarding\nboiled\ngreet\npuppy\nrusso\nsupplier\nwilmington\nvibrant\nvijay\n##bius\nparalympic\ngrumbled\npaige\nfaa\nlicking\nmargins\nhurricanes\n##gong\nfest\ngrenade\nripping\n##uz\ncounseling\nweigh\n##sian\nneedles\nwiltshire\nedison\ncostly\n##not\nfulton\ntramway\nredesigned\nstaffordshire\ncache\ngasping\nwatkins\nsleepy\ncandidacy\n##group\nmonkeys\ntimeline\nthrobbing\n##bid\n##sos\nberth\nuzbekistan\nvanderbilt\nbothering\noverturned\nballots\ngem\n##iger\nsunglasses\nsubscribers\nhooker\
ncompelling\nang\nexceptionally\nsaloon\nstab\n##rdi\ncarla\nterrifying\nrom\n##vision\ncoil\n##oids\nsatisfying\nvendors\n31st\nmackay\ndeities\noverlooked\nambient\nbahamas\nfelipe\nolympia\nwhirled\nbotanist\nadvertised\ntugging\n##dden\ndisciples\nmorales\nunionist\nrites\nfoley\nmorse\nmotives\ncreepy\n##₀\nsoo\n##sz\nbargain\nhighness\nfrightening\nturnpike\ntory\nreorganization\n##cer\ndepict\nbiographer\n##walk\nunopposed\nmanifesto\n##gles\ninstitut\nemile\naccidental\nkapoor\n##dam\nkilkenny\ncortex\nlively\n##13\nromanesque\njain\nshan\ncannons\n##ood\n##ske\npetrol\nechoing\namalgamated\ndisappears\ncautious\nproposes\nsanctions\ntrenton\n##ر\nflotilla\naus\ncontempt\ntor\ncanary\ncote\ntheirs\n##hun\nconceptual\ndeleted\nfascinating\npaso\nblazing\nelf\nhonourable\nhutchinson\n##eiro\n##outh\n##zin\nsurveyor\ntee\namidst\nwooded\nreissue\nintro\n##ono\ncobb\nshelters\nnewsletter\nhanson\nbrace\nencoding\nconfiscated\ndem\ncaravan\nmarino\nscroll\nmelodic\ncows\nimam\n##adi\n##aneous\nnorthward\nsearches\nbiodiversity\ncora\n310\nroaring\n##bers\nconnell\ntheologian\nhalo\ncompose\npathetic\nunmarried\ndynamo\n##oot\naz\ncalculation\ntoulouse\ndeserves\nhumour\nnr\nforgiveness\ntam\nundergone\nmartyr\npamela\nmyths\nwhore\ncounselor\nhicks\n290\nheavens\nbattleship\nelectromagnetic\n##bbs\nstellar\nestablishments\npresley\nhopped\n##chin\ntemptation\n90s\nwills\nnas\n##yuan\nnhs\n##nya\nseminars\n##yev\nadaptations\ngong\nasher\nlex\nindicator\nsikh\ntobago\ncites\ngoin\n##yte\nsatirical\n##gies\ncharacterised\ncorrespond\nbubbles\nlure\nparticipates\n##vid\neruption\nskate\ntherapeutic\n1785\ncanals\nwholesale\ndefaulted\nsac\n460\npetit\n##zzled\nvirgil\nleak\nravens\n256\nportraying\n##yx\nghetto\ncreators\ndams\nportray\nvicente\n##rington\nfae\nnamesake\nbounty\n##arium\njoachim\n##ota\n##iser\naforementioned\naxle\nsnout\ndepended\ndismantled\nreuben\n480\n##ibly\ngallagher\n##lau\n##pd\nearnest\n##ieu\n##iary\ninflicted\nobjections\n##llar\nasa\ng
ritted\n##athy\njericho\n##sea\n##was\nflick\nunderside\nceramics\nundead\nsubstituted\n195\neastward\nundoubtedly\nwheeled\nchimney\n##iche\nguinness\ncb\n##ager\nsiding\n##bell\ntraitor\nbaptiste\ndisguised\ninauguration\n149\ntipperary\nchoreographer\nperched\nwarmed\nstationary\neco\n##ike\n##ntes\nbacterial\n##aurus\nflores\nphosphate\n##core\nattacker\ninvaders\nalvin\nintersects\na1\nindirectly\nimmigrated\nbusinessmen\ncornelius\nvalves\nnarrated\npill\nsober\nul\nnationale\nmonastic\napplicants\nscenery\n##jack\n161\nmotifs\nconstitutes\ncpu\n##osh\njurisdictions\nsd\ntuning\nirritation\nwoven\n##uddin\nfertility\ngao\n##erie\nantagonist\nimpatient\nglacial\nhides\nboarded\ndenominations\ninterception\n##jas\ncookie\nnicola\n##tee\nalgebraic\nmarquess\nbahn\nparole\nbuyers\nbait\nturbines\npaperwork\nbestowed\nnatasha\nrenee\noceans\npurchases\n157\nvaccine\n215\n##tock\nfixtures\nplayhouse\nintegrate\njai\noswald\nintellectuals\n##cky\nbooked\nnests\nmortimer\n##isi\nobsession\nsept\n##gler\n##sum\n440\nscrutiny\nsimultaneous\nsquinted\n##shin\ncollects\noven\nshankar\npenned\nremarkably\n##я\nslips\nluggage\nspectral\n1786\ncollaborations\nlouie\nconsolidation\n##ailed\n##ivating\n420\nhoover\nblackpool\nharness\nignition\nvest\ntails\nbelmont\nmongol\nskinner\n##nae\nvisually\nmage\nderry\n##tism\n##unce\nstevie\ntransitional\n##rdy\nredskins\ndrying\nprep\nprospective\n##21\nannoyance\noversee\n##loaded\nfills\n##books\n##iki\nannounces\nfda\nscowled\nrespects\nprasad\nmystic\ntucson\n##vale\nrevue\nspringer\nbankrupt\n1772\naristotle\nsalvatore\nhabsburg\n##geny\ndal\nnatal\nnut\npod\nchewing\ndarts\nmoroccan\nwalkover\nrosario\nlenin\npunjabi\n##ße\ngrossed\nscattering\nwired\ninvasive\nhui\npolynomial\ncorridors\nwakes\ngina\nportrays\n##cratic\narid\nretreating\nerich\nirwin\nsniper\n##dha\nlinen\nlindsey\nmaneuver\nbutch\nshutting\nsocio\nbounce\ncommemorative\npostseason\njeremiah\npines\n275\nmystical\nbeads\nbp\nabbas\nfurnace\nbidding\nconsulte
d\nassaulted\nempirical\nrubble\nenclosure\nsob\nweakly\ncancel\npolly\nyielded\n##emann\ncurly\nprediction\nbattered\n70s\nvhs\njacqueline\nrender\nsails\nbarked\ndetailing\ngrayson\nriga\nsloane\nraging\n##yah\nherbs\nbravo\n##athlon\nalloy\ngiggle\nimminent\nsuffers\nassumptions\nwaltz\n##itate\naccomplishments\n##ited\nbathing\nremixed\ndeception\nprefix\n##emia\ndeepest\n##tier\n##eis\nbalkan\nfrogs\n##rong\nslab\n##pate\nphilosophers\npeterborough\ngrains\nimports\ndickinson\nrwanda\n##atics\n1774\ndirk\nlan\ntablets\n##rove\nclone\n##rice\ncaretaker\nhostilities\nmclean\n##gre\nregimental\ntreasures\nnorms\nimpose\ntsar\ntango\ndiplomacy\nvariously\ncomplain\n192\nrecognise\narrests\n1779\ncelestial\npulitzer\n##dus\nbing\nlibretto\n##moor\nadele\nsplash\n##rite\nexpectation\nlds\nconfronts\n##izer\nspontaneous\nharmful\nwedge\nentrepreneurs\nbuyer\n##ope\nbilingual\ntranslate\nrugged\nconner\ncirculated\nuae\neaton\n##gra\n##zzle\nlingered\nlockheed\nvishnu\nreelection\nalonso\n##oom\njoints\nyankee\nheadline\ncooperate\nheinz\nlaureate\ninvading\n##sford\nechoes\nscandinavian\n##dham\nhugging\nvitamin\nsalute\nmicah\nhind\ntrader\n##sper\nradioactive\n##ndra\nmilitants\npoisoned\nratified\nremark\ncampeonato\ndeprived\nwander\nprop\n##dong\noutlook\n##tani\n##rix\n##eye\nchiang\ndarcy\n##oping\nmandolin\nspice\nstatesman\nbabylon\n182\nwalled\nforgetting\nafro\n##cap\n158\ngiorgio\nbuffer\n##polis\nplanetary\n##gis\noverlap\nterminals\nkinda\ncentenary\n##bir\narising\nmanipulate\nelm\nke\n1770\nak\n##tad\nchrysler\nmapped\nmoose\npomeranian\nquad\nmacarthur\nassemblies\nshoreline\nrecalls\nstratford\n##rted\nnoticeable\n##evic\nimp\n##rita\n##sque\naccustomed\nsupplying\ntents\ndisgusted\nvogue\nsipped\nfilters\nkhz\nreno\nselecting\nluftwaffe\nmcmahon\ntyne\nmasterpiece\ncarriages\ncollided\ndunes\nexercised\nflare\nremembers\nmuzzle\n##mobile\nheck\n##rson\nburgess\nlunged\nmiddleton\nboycott\nbilateral\n##sity\nhazardous\nlumpur\nmultiplayer\nspotlight\
njackets\ngoldman\nliege\nporcelain\nrag\nwaterford\nbenz\nattracts\nhopeful\nbattling\nottomans\nkensington\nbaked\nhymns\ncheyenne\nlattice\nlevine\nborrow\npolymer\nclashes\nmichaels\nmonitored\ncommitments\ndenounced\n##25\n##von\ncavity\n##oney\nhobby\nakin\n##holders\nfutures\nintricate\ncornish\npatty\n##oned\nillegally\ndolphin\n##lag\nbarlow\nyellowish\nmaddie\napologized\nluton\nplagued\n##puram\nnana\n##rds\nsway\nfanny\nłodz\n##rino\npsi\nsuspicions\nhanged\n##eding\ninitiate\ncharlton\n##por\nnak\ncompetent\n235\nanalytical\nannex\nwardrobe\nreservations\n##rma\nsect\n162\nfairfax\nhedge\npiled\nbuckingham\nuneven\nbauer\nsimplicity\nsnyder\ninterpret\naccountability\ndonors\nmoderately\nbyrd\ncontinents\n##cite\n##max\ndisciple\nhr\njamaican\nping\nnominees\n##uss\nmongolian\ndiver\nattackers\neagerly\nideological\npillows\nmiracles\napartheid\nrevolver\nsulfur\nclinics\nmoran\n163\n##enko\nile\nkaty\nrhetoric\n##icated\nchronology\nrecycling\n##hrer\nelongated\nmughal\npascal\nprofiles\nvibration\ndatabases\ndomination\n##fare\n##rant\nmatthias\ndigest\nrehearsal\npolling\nweiss\ninitiation\nreeves\nclinging\nflourished\nimpress\nngo\n##hoff\n##ume\nbuckley\nsymposium\nrhythms\nweed\nemphasize\ntransforming\n##taking\n##gence\n##yman\naccountant\nanalyze\nflicker\nfoil\npriesthood\nvoluntarily\ndecreases\n##80\n##hya\nslater\nsv\ncharting\nmcgill\n##lde\nmoreno\n##iu\nbesieged\nzur\nrobes\n##phic\nadmitting\napi\ndeported\nturmoil\npeyton\nearthquakes\n##ares\nnationalists\nbeau\nclair\nbrethren\ninterrupt\nwelch\ncurated\ngalerie\nrequesting\n164\n##ested\nimpending\nsteward\nviper\n##vina\ncomplaining\nbeautifully\nbrandy\nfoam\nnl\n1660\n##cake\nalessandro\npunches\nlaced\nexplanations\n##lim\nattribute\nclit\nreggie\ndiscomfort\n##cards\nsmoothed\nwhales\n##cene\nadler\ncountered\nduffy\ndisciplinary\nwidening\nrecipe\nreliance\nconducts\ngoats\ngradient\npreaching\n##shaw\nmatilda\nquasi\nstriped\nmeridian\ncannabis\ncordoba\ncertificates\n##agh\
n##tering\ngraffiti\nhangs\npilgrims\nrepeats\n##ych\nrevive\nurine\netat\n##hawk\nfueled\nbelts\nfuzzy\nsusceptible\n##hang\nmauritius\nsalle\nsincere\nbeers\nhooks\n##cki\narbitration\nentrusted\nadvise\nsniffed\nseminar\njunk\ndonnell\nprocessors\nprincipality\nstrapped\ncelia\nmendoza\neverton\nfortunes\nprejudice\nstarving\nreassigned\nsteamer\n##lund\ntuck\nevenly\nforeman\n##ffen\ndans\n375\nenvisioned\nslit\n##xy\nbaseman\nliberia\nrosemary\n##weed\nelectrified\nperiodically\npotassium\nstride\ncontexts\nsperm\nslade\nmariners\ninflux\nbianca\nsubcommittee\n##rane\nspilling\nicao\nestuary\n##nock\ndelivers\niphone\n##ulata\nisa\nmira\nbohemian\ndessert\n##sbury\nwelcoming\nproudly\nslowing\n##chs\nmusee\nascension\nruss\n##vian\nwaits\n##psy\nafricans\nexploit\n##morphic\ngov\neccentric\ncrab\npeck\n##ull\nentrances\nformidable\nmarketplace\ngroom\nbolted\nmetabolism\npatton\nrobbins\ncourier\npayload\nendure\n##ifier\nandes\nrefrigerator\n##pr\nornate\n##uca\nruthless\nillegitimate\nmasonry\nstrasbourg\nbikes\nadobe\n##³\napples\nquintet\nwillingly\nniche\nbakery\ncorpses\nenergetic\n##cliffe\n##sser\n##ards\n177\ncentimeters\ncentro\nfuscous\ncretaceous\nrancho\n##yde\nandrei\ntelecom\ntottenham\noasis\nordination\nvulnerability\npresiding\ncorey\ncp\npenguins\nsims\n##pis\nmalawi\npiss\n##48\ncorrection\n##cked\n##ffle\n##ryn\ncountdown\ndetectives\npsychiatrist\npsychedelic\ndinosaurs\nblouse\n##get\nchoi\nvowed\n##oz\nrandomly\n##pol\n49ers\nscrub\nblanche\nbruins\ndusseldorf\n##using\nunwanted\n##ums\n212\ndominique\nelevations\nheadlights\nom\nlaguna\n##oga\n1750\nfamously\nignorance\nshrewsbury\n##aine\najax\nbreuning\nche\nconfederacy\ngreco\noverhaul\n##screen\npaz\nskirts\ndisagreement\ncruelty\njagged\nphoebe\nshifter\nhovered\nviruses\n##wes\nmandy\n##lined\n##gc\nlandlord\nsquirrel\ndashed\n##ι\nornamental\ngag\nwally\ngrange\nliteral\nspurs\nundisclosed\nproceeding\nyin\n##text\nbillie\norphan\nspanned\nhumidity\nindy\nweighted\npresentations\
nexplosions\nlucian\n##tary\nvaughn\nhindus\n##anga\n##hell\npsycho\n171\ndaytona\nprotects\nefficiently\nrematch\nsly\ntandem\n##oya\nrebranded\nimpaired\nhee\nmetropolis\npeach\ngodfrey\ndiaspora\nethnicity\nprosperous\ngleaming\ndar\ngrossing\nplayback\n##rden\nstripe\npistols\n##tain\nbirths\nlabelled\n##cating\n172\nrudy\nalba\n##onne\naquarium\nhostility\n##gb\n##tase\nshudder\nsumatra\nhardest\nlakers\nconsonant\ncreeping\ndemos\nhomicide\ncapsule\nzeke\nliberties\nexpulsion\npueblo\n##comb\ntrait\ntransporting\n##ddin\n##neck\n##yna\ndepart\ngregg\nmold\nledge\nhangar\noldham\nplayboy\ntermination\nanalysts\ngmbh\nromero\n##itic\ninsist\ncradle\nfilthy\nbrightness\nslash\nshootout\ndeposed\nbordering\n##truct\nisis\nmicrowave\ntumbled\nsheltered\ncathy\nwerewolves\nmessy\nandersen\nconvex\nclapped\nclinched\nsatire\nwasting\nedo\nvc\nrufus\n##jak\nmont\n##etti\npoznan\n##keeping\nrestructuring\ntransverse\n##rland\nazerbaijani\nslovene\ngestures\nroommate\nchoking\nshear\n##quist\nvanguard\noblivious\n##hiro\ndisagreed\nbaptism\n##lich\ncoliseum\n##aceae\nsalvage\nsociete\ncory\nlocke\nrelocation\nrelying\nversailles\nahl\nswelling\n##elo\ncheerful\n##word\n##edes\ngin\nsarajevo\nobstacle\ndiverted\n##nac\nmessed\nthoroughbred\nfluttered\nutrecht\nchewed\nacquaintance\nassassins\ndispatch\nmirza\n##wart\nnike\nsalzburg\nswell\nyen\n##gee\nidle\nligue\nsamson\n##nds\n##igh\nplayful\nspawned\n##cise\ntease\n##case\nburgundy\n##bot\nstirring\nskeptical\ninterceptions\nmarathi\n##dies\nbedrooms\naroused\npinch\n##lik\npreferences\ntattoos\nbuster\ndigitally\nprojecting\nrust\n##ital\nkitten\npriorities\naddison\npseudo\n##guard\ndusk\nicons\nsermon\n##psis\n##iba\nbt\n##lift\n##xt\nju\ntruce\nrink\n##dah\n##wy\ndefects\npsychiatry\noffences\ncalculate\nglucose\n##iful\n##rized\n##unda\nfrancaise\n##hari\nrichest\nwarwickshire\ncarly\n1763\npurity\nredemption\nlending\n##cious\nmuse\nbruises\ncerebral\naero\ncarving\n##name\npreface\nterminology\ninvade\nmonty\n#
#int\nanarchist\nblurred\n##iled\nrossi\ntreats\nguts\nshu\nfoothills\nballads\nundertaking\npremise\ncecilia\naffiliates\nblasted\nconditional\nwilder\nminors\ndrone\nrudolph\nbuffy\nswallowing\nhorton\nattested\n##hop\nrutherford\nhowell\nprimetime\nlivery\npenal\n##bis\nminimize\nhydro\nwrecked\nwrought\npalazzo\n##gling\ncans\nvernacular\nfriedman\nnobleman\nshale\nwalnut\ndanielle\n##ection\n##tley\nsears\n##kumar\nchords\nlend\nflipping\nstreamed\npor\ndracula\ngallons\nsacrifices\ngamble\norphanage\n##iman\nmckenzie\n##gible\nboxers\ndaly\n##balls\n##ان\n208\n##ific\n##rative\n##iq\nexploited\nslated\n##uity\ncircling\nhillary\npinched\ngoldberg\nprovost\ncampaigning\nlim\npiles\nironically\njong\nmohan\nsuccessors\nusaf\n##tem\n##ught\nautobiographical\nhaute\npreserves\n##ending\nacquitted\ncomparisons\n203\nhydroelectric\ngangs\ncypriot\ntorpedoes\nrushes\nchrome\nderive\nbumps\ninstability\nfiat\npets\n##mbe\nsilas\ndye\nreckless\nsettler\n##itation\ninfo\nheats\n##writing\n176\ncanonical\nmaltese\nfins\nmushroom\nstacy\naspen\navid\n##kur\n##loading\nvickers\ngaston\nhillside\nstatutes\nwilde\ngail\nkung\nsabine\ncomfortably\nmotorcycles\n##rgo\n169\npneumonia\nfetch\n##sonic\naxel\nfaintly\nparallels\n##oop\nmclaren\nspouse\ncompton\ninterdisciplinary\nminer\n##eni\n181\nclamped\n##chal\n##llah\nseparates\nversa\n##mler\nscarborough\nlabrador\n##lity\n##osing\nrutgers\nhurdles\ncomo\n166\nburt\ndivers\n##100\nwichita\ncade\ncoincided\n##erson\nbruised\nmla\n##pper\nvineyard\n##ili\n##brush\nnotch\nmentioning\njase\nhearted\nkits\ndoe\n##acle\npomerania\n##ady\nronan\nseizure\npavel\nproblematic\n##zaki\ndomenico\n##ulin\ncatering\npenelope\ndependence\nparental\nemilio\nministerial\natkinson\n##bolic\nclarkson\nchargers\ncolby\ngrill\npeeked\narises\nsummon\n##aged\nfools\n##grapher\nfaculties\nqaeda\n##vial\ngarner\nrefurbished\n##hwa\ngeelong\ndisasters\nnudged\nbs\nshareholder\nlori\nalgae\nreinstated\nrot\n##ades\n##nous\ninvites\nstainless\n183\nin
clusive\n##itude\ndiocesan\ntil\n##icz\ndenomination\n##xa\nbenton\nfloral\nregisters\n##ider\n##erman\n##kell\nabsurd\nbrunei\nguangzhou\nhitter\nretaliation\n##uled\n##eve\nblanc\nnh\nconsistency\ncontamination\n##eres\n##rner\ndire\npalermo\nbroadcasters\ndiaries\ninspire\nvols\nbrewer\ntightening\nky\nmixtape\nhormone\n##tok\nstokes\n##color\n##dly\n##ssi\npg\n##ometer\n##lington\nsanitation\n##tility\nintercontinental\napps\n##adt\n¹⁄₂\ncylinders\neconomies\nfavourable\nunison\ncroix\ngertrude\nodyssey\nvanity\ndangling\n##logists\nupgrades\ndice\nmiddleweight\npractitioner\n##ight\n206\nhenrik\nparlor\norion\nangered\nlac\npython\nblurted\n##rri\nsensual\nintends\nswings\nangled\n##phs\nhusky\nattain\npeerage\nprecinct\ntextiles\ncheltenham\nshuffled\ndai\nconfess\ntasting\nbhutan\n##riation\ntyrone\nsegregation\nabrupt\nruiz\n##rish\nsmirked\nblackwell\nconfidential\nbrowning\namounted\n##put\nvase\nscarce\nfabulous\nraided\nstaple\nguyana\nunemployed\nglider\nshay\n##tow\ncarmine\ntroll\nintervene\nsquash\nsuperstar\n##uce\ncylindrical\nlen\nroadway\nresearched\nhandy\n##rium\n##jana\nmeta\nlao\ndeclares\n##rring\n##tadt\n##elin\n##kova\nwillem\nshrubs\nnapoleonic\nrealms\nskater\nqi\nvolkswagen\n##ł\ntad\nhara\narchaeologist\nawkwardly\neerie\n##kind\nwiley\n##heimer\n##24\ntitus\norganizers\ncfl\ncrusaders\nlama\nusb\nvent\nenraged\nthankful\noccupants\nmaximilian\n##gaard\npossessing\ntextbooks\n##oran\ncollaborator\nquaker\n##ulo\navalanche\nmono\nsilky\nstraits\nisaiah\nmustang\nsurged\nresolutions\npotomac\ndescend\ncl\nkilograms\nplato\nstrains\nsaturdays\n##olin\nbernstein\n##ype\nholstein\nponytail\n##watch\nbelize\nconversely\nheroine\nperpetual\n##ylus\ncharcoal\npiedmont\nglee\nnegotiating\nbackdrop\nprologue\n##jah\n##mmy\npasadena\nclimbs\nramos\nsunni\n##holm\n##tner\n##tri\nanand\ndeficiency\nhertfordshire\nstout\n##avi\naperture\norioles\n##irs\ndoncaster\nintrigued\nbombed\ncoating\notis\n##mat\ncocktail\n##jit\n##eto\namir\narousal\nsar\n#
#proof\n##act\n##ories\ndixie\npots\n##bow\nwhereabouts\n159\n##fted\ndrains\nbullying\ncottages\nscripture\ncoherent\nfore\npoe\nappetite\n##uration\nsampled\n##ators\n##dp\nderrick\nrotor\njays\npeacock\ninstallment\n##rro\nadvisors\n##coming\nrodeo\nscotch\n##mot\n##db\n##fen\n##vant\nensued\nrodrigo\ndictatorship\nmartyrs\ntwenties\n##н\ntowed\nincidence\nmarta\nrainforest\nsai\nscaled\n##cles\noceanic\nqualifiers\nsymphonic\nmcbride\ndislike\ngeneralized\naubrey\ncolonization\n##iation\n##lion\n##ssing\ndisliked\nlublin\nsalesman\n##ulates\nspherical\nwhatsoever\nsweating\navalon\ncontention\npunt\nseverity\nalderman\natari\n##dina\n##grant\n##rop\nscarf\nseville\nvertices\nannexation\nfairfield\nfascination\ninspiring\nlaunches\npalatinate\nregretted\n##rca\nferal\n##iom\nelk\nnap\nolsen\nreddy\nyong\n##leader\n##iae\ngarment\ntransports\nfeng\ngracie\noutrage\nviceroy\ninsides\n##esis\nbreakup\ngrady\norganizer\nsofter\ngrimaced\n222\nmurals\ngalicia\narranging\nvectors\n##rsten\nbas\n##sb\n##cens\nsloan\n##eka\nbitten\nara\nfender\nnausea\nbumped\nkris\nbanquet\ncomrades\ndetector\npersisted\n##llan\nadjustment\nendowed\ncinemas\n##shot\nsellers\n##uman\npeek\nepa\nkindly\nneglect\nsimpsons\ntalon\nmausoleum\nrunaway\nhangul\nlookout\n##cic\nrewards\ncoughed\nacquainted\nchloride\n##ald\nquicker\naccordion\nneolithic\n##qa\nartemis\ncoefficient\nlenny\npandora\ntx\n##xed\necstasy\nlitter\nsegunda\nchairperson\ngemma\nhiss\nrumor\nvow\nnasal\nantioch\ncompensate\npatiently\ntransformers\n##eded\njudo\nmorrow\npenis\nposthumous\nphilips\nbandits\nhusbands\ndenote\nflaming\n##any\n##phones\nlangley\nyorker\n1760\nwalters\n##uo\n##kle\ngubernatorial\nfatty\nsamsung\nleroy\noutlaw\n##nine\nunpublished\npoole\njakob\n##ᵢ\n##ₙ\ncrete\ndistorted\nsuperiority\n##dhi\nintercept\ncrust\nmig\nclaus\ncrashes\npositioning\n188\nstallion\n301\nfrontal\narmistice\n##estinal\nelton\naj\nencompassing\ncamel\ncommemorated\nmalaria\nwoodward\ncalf\ncigar\npenetrate\n##oso\nwill
ard\n##rno\n##uche\nillustrate\namusing\nconvergence\nnoteworthy\n##lma\n##rva\njourneys\nrealise\nmanfred\n##sable\n410\n##vocation\nhearings\nfiance\n##posed\neducators\nprovoked\nadjusting\n##cturing\nmodular\nstockton\npaterson\nvlad\nrejects\nelectors\nselena\nmaureen\n##tres\nuber\n##rce\nswirled\n##num\nproportions\nnanny\npawn\nnaturalist\nparma\napostles\nawoke\nethel\nwen\n##bey\nmonsoon\noverview\n##inating\nmccain\nrendition\nrisky\nadorned\n##ih\nequestrian\ngermain\nnj\nconspicuous\nconfirming\n##yoshi\nshivering\n##imeter\nmilestone\nrumours\nflinched\nbounds\nsmacked\ntoken\n##bei\nlectured\nautomobiles\n##shore\nimpacted\n##iable\nnouns\nnero\n##leaf\nismail\nprostitute\ntrams\n##lace\nbridget\nsud\nstimulus\nimpressions\nreins\nrevolves\n##oud\n##gned\ngiro\nhoneymoon\n##swell\ncriterion\n##sms\n##uil\nlibyan\nprefers\n##osition\n211\npreview\nsucks\naccusation\nbursts\nmetaphor\ndiffusion\ntolerate\nfaye\nbetting\ncinematographer\nliturgical\nspecials\nbitterly\nhumboldt\n##ckle\nflux\nrattled\n##itzer\narchaeologists\nodor\nauthorised\nmarshes\ndiscretion\n##ов\nalarmed\narchaic\ninverse\n##leton\nexplorers\n##pine\ndrummond\ntsunami\nwoodlands\n##minate\n##tland\nbooklet\ninsanity\nowning\ninsert\ncrafted\ncalculus\n##tore\nreceivers\n##bt\nstung\n##eca\n##nched\nprevailing\ntravellers\neyeing\nlila\ngraphs\n##borne\n178\njulien\n##won\nmorale\nadaptive\ntherapist\nerica\ncw\nlibertarian\nbowman\npitches\nvita\n##ional\ncrook\n##ads\n##entation\ncaledonia\nmutiny\n##sible\n1840s\nautomation\n##ß\nflock\n##pia\nironic\npathology\n##imus\nremarried\n##22\njoker\nwithstand\nenergies\n##att\nshropshire\nhostages\nmadeleine\ntentatively\nconflicting\nmateo\nrecipes\neuros\nol\nmercenaries\nnico\n##ndon\nalbuquerque\naugmented\nmythical\nbel\nfreud\n##child\ncough\n##lica\n365\nfreddy\nlillian\ngenetically\nnuremberg\ncalder\n209\nbonn\noutdoors\npaste\nsuns\nurgency\nvin\nrestraint\ntyson\n##cera\n##selle\nbarrage\nbethlehem\nkahn\n##par\nmounts\nnip
pon\nbarony\nhappier\nryu\nmakeshift\nsheldon\nblushed\ncastillo\nbarking\nlistener\ntaped\nbethel\nfluent\nheadlines\npornography\nrum\ndisclosure\nsighing\nmace\ndoubling\ngunther\nmanly\n##plex\nrt\ninterventions\nphysiological\nforwards\nemerges\n##tooth\n##gny\ncompliment\nrib\nrecession\nvisibly\nbarge\nfaults\nconnector\nexquisite\nprefect\n##rlin\npatio\n##cured\nelevators\nbrandt\nitalics\npena\n173\nwasp\nsatin\nea\nbotswana\ngraceful\nrespectable\n##jima\n##rter\n##oic\nfranciscan\ngenerates\n##dl\nalfredo\ndisgusting\n##olate\n##iously\nsherwood\nwarns\ncod\npromo\ncheryl\nsino\n##ة\n##escu\ntwitch\n##zhi\nbrownish\nthom\nortiz\n##dron\ndensely\n##beat\ncarmel\nreinforce\n##bana\n187\nanastasia\ndownhill\nvertex\ncontaminated\nremembrance\nharmonic\nhomework\n##sol\nfiancee\ngears\nolds\nangelica\nloft\nramsay\nquiz\ncolliery\nsevens\n##cape\nautism\n##hil\nwalkway\n##boats\nruben\nabnormal\nounce\nkhmer\n##bbe\nzachary\nbedside\nmorphology\npunching\n##olar\nsparrow\nconvinces\n##35\nhewitt\nqueer\nremastered\nrods\nmabel\nsolemn\nnotified\nlyricist\nsymmetric\n##xide\n174\nencore\npassports\nwildcats\n##uni\nbaja\n##pac\nmildly\n##ease\nbleed\ncommodity\nmounds\nglossy\norchestras\n##omo\ndamian\nprelude\nambitions\n##vet\nawhile\nremotely\n##aud\nasserts\nimply\n##iques\ndistinctly\nmodelling\nremedy\n##dded\nwindshield\ndani\nxiao\n##endra\naudible\npowerplant\n1300\ninvalid\nelemental\nacquisitions\n##hala\nimmaculate\nlibby\nplata\nsmuggling\nventilation\ndenoted\nminh\n##morphism\n430\ndiffered\ndion\nkelley\nlore\nmocking\nsabbath\nspikes\nhygiene\ndrown\nrunoff\nstylized\ntally\nliberated\naux\ninterpreter\nrighteous\naba\nsiren\nreaper\npearce\nmillie\n##cier\n##yra\ngaius\n##iso\ncaptures\n##ttering\ndorm\nclaudio\n##sic\nbenches\nknighted\nblackness\n##ored\ndiscount\nfumble\noxidation\nrouted\n##ς\nnovak\nperpendicular\nspoiled\nfracture\nsplits\n##urt\npads\ntopology\n##cats\naxes\nfortunate\noffenders\nprotestants\nesteem\n221\nbroadband\n
convened\nfrankly\nhound\nprototypes\nisil\nfacilitated\nkeel\n##sher\nsahara\nawaited\nbubba\norb\nprosecutors\n186\nhem\n520\n##xing\nrelaxing\nremnant\nromney\nsorted\nslalom\nstefano\nulrich\n##active\nexemption\nfolder\npauses\nfoliage\nhitchcock\nepithet\n204\ncriticisms\n##aca\nballistic\nbrody\nhinduism\nchaotic\nyouths\nequals\n##pala\npts\nthicker\nanalogous\ncapitalist\nimprovised\noverseeing\nsinatra\nascended\nbeverage\n##tl\nstraightforward\n##kon\ncurran\n##west\nbois\n325\ninduce\nsurveying\nemperors\nsax\nunpopular\n##kk\ncartoonist\nfused\n##mble\nunto\n##yuki\nlocalities\n##cko\n##ln\ndarlington\nslain\nacademie\nlobbying\nsediment\npuzzles\n##grass\ndefiance\ndickens\nmanifest\ntongues\nalumnus\narbor\ncoincide\n184\nappalachian\nmustafa\nexaminer\ncabaret\ntraumatic\nyves\nbracelet\ndraining\nheroin\nmagnum\nbaths\nodessa\nconsonants\nmitsubishi\n##gua\nkellan\nvaudeville\n##fr\njoked\nnull\nstraps\nprobation\n##ław\nceded\ninterfaces\n##pas\n##zawa\nblinding\nviet\n224\nrothschild\nmuseo\n640\nhuddersfield\n##vr\ntactic\n##storm\nbrackets\ndazed\nincorrectly\n##vu\nreg\nglazed\nfearful\nmanifold\nbenefited\nirony\n##sun\nstumbling\n##rte\nwillingness\nbalkans\nmei\nwraps\n##aba\ninjected\n##lea\ngu\nsyed\nharmless\n##hammer\nbray\ntakeoff\npoppy\ntimor\ncardboard\nastronaut\npurdue\nweeping\nsouthbound\ncursing\nstalls\ndiagonal\n##neer\nlamar\nbryce\ncomte\nweekdays\nharrington\n##uba\nnegatively\n##see\nlays\ngrouping\n##cken\n##henko\naffirmed\nhalle\nmodernist\n##lai\nhodges\nsmelling\naristocratic\nbaptized\ndismiss\njustification\noilers\n##now\ncoupling\nqin\nsnack\nhealer\n##qing\ngardener\nlayla\nbattled\nformulated\nstephenson\ngravitational\n##gill\n##jun\n1768\ngranny\ncoordinating\nsuites\n##cd\n##ioned\nmonarchs\n##cote\n##hips\nsep\nblended\napr\nbarrister\ndeposition\nfia\nmina\npolicemen\nparanoid\n##pressed\nchurchyard\ncovert\ncrumpled\ncreep\nabandoning\ntr\ntransmit\nconceal\nbarr\nunderstands\nreadiness\nspire\n##cology\n#
#enia\n##erry\n610\nstartling\nunlock\nvida\nbowled\nslots\n##nat\n##islav\nspaced\ntrusting\nadmire\nrig\n##ink\nslack\n##70\nmv\n207\ncasualty\n##wei\nclassmates\n##odes\n##rar\n##rked\namherst\nfurnished\nevolve\nfoundry\nmenace\nmead\n##lein\nflu\nwesleyan\n##kled\nmonterey\nwebber\n##vos\nwil\n##mith\n##на\nbartholomew\njustices\nrestrained\n##cke\namenities\n191\nmediated\nsewage\ntrenches\nml\nmainz\n##thus\n1800s\n##cula\n##inski\ncaine\nbonding\n213\nconverts\nspheres\nsuperseded\nmarianne\ncrypt\nsweaty\nensign\nhistoria\n##br\nspruce\n##post\n##ask\nforks\nthoughtfully\nyukon\npamphlet\names\n##uter\nkarma\n##yya\nbryn\nnegotiation\nsighs\nincapable\n##mbre\n##ntial\nactresses\ntaft\n##mill\nluce\nprevailed\n##amine\n1773\nmotionless\nenvoy\ntestify\ninvesting\nsculpted\ninstructors\nprovence\nkali\ncullen\nhorseback\n##while\ngoodwin\n##jos\ngaa\nnorte\n##ldon\nmodify\nwavelength\nabd\n214\nskinned\nsprinter\nforecast\nscheduling\nmarries\nsquared\ntentative\n##chman\nboer\n##isch\nbolts\nswap\nfisherman\nassyrian\nimpatiently\nguthrie\nmartins\nmurdoch\n194\ntanya\nnicely\ndolly\nlacy\nmed\n##45\nsyn\ndecks\nfashionable\nmillionaire\n##ust\nsurfing\n##ml\n##ision\nheaved\ntammy\nconsulate\nattendees\nroutinely\n197\nfuse\nsaxophonist\nbackseat\nmalaya\n##lord\nscowl\ntau\n##ishly\n193\nsighted\nsteaming\n##rks\n303\n911\n##holes\n##hong\nching\n##wife\nbless\nconserved\njurassic\nstacey\nunix\nzion\nchunk\nrigorous\nblaine\n198\npeabody\nslayer\ndismay\nbrewers\nnz\n##jer\ndet\n##glia\nglover\npostwar\nint\npenetration\nsylvester\nimitation\nvertically\nairlift\nheiress\nknoxville\nviva\n##uin\n390\nmacon\n##rim\n##fighter\n##gonal\njanice\n##orescence\n##wari\nmarius\nbelongings\nleicestershire\n196\nblanco\ninverted\npreseason\nsanity\nsobbing\n##due\n##elt\n##dled\ncollingwood\nregeneration\nflickering\nshortest\n##mount\n##osi\nfeminism\n##lat\nsherlock\ncabinets\nfumbled\nnorthbound\nprecedent\nsnaps\n##mme\nresearching\n##akes\nguillaume\ninsights
\nmanipulated\nvapor\nneighbour\nsap\ngangster\nfrey\nf1\nstalking\nscarcely\ncallie\nbarnett\ntendencies\naudi\ndoomed\nassessing\nslung\npanchayat\nambiguous\nbartlett\n##etto\ndistributing\nviolating\nwolverhampton\n##hetic\nswami\nhistoire\n##urus\nliable\npounder\ngroin\nhussain\nlarsen\npopping\nsurprises\n##atter\nvie\ncurt\n##station\nmute\nrelocate\nmusicals\nauthorization\nrichter\n##sef\nimmortality\ntna\nbombings\n##press\ndeteriorated\nyiddish\n##acious\nrobbed\ncolchester\ncs\npmid\nao\nverified\nbalancing\napostle\nswayed\nrecognizable\noxfordshire\nretention\nnottinghamshire\ncontender\njudd\ninvitational\nshrimp\nuhf\n##icient\ncleaner\nlongitudinal\ntanker\n##mur\nacronym\nbroker\nkoppen\nsundance\nsuppliers\n##gil\n4000\nclipped\nfuels\npetite\n##anne\nlandslide\nhelene\ndiversion\npopulous\nlandowners\nauspices\nmelville\nquantitative\n##xes\nferries\nnicky\n##llus\ndoo\nhaunting\nroche\ncarver\ndowned\nunavailable\n##pathy\napproximation\nhiroshima\n##hue\ngarfield\nvalle\ncomparatively\nkeyboardist\ntraveler\n##eit\ncongestion\ncalculating\nsubsidiaries\n##bate\nserb\nmodernization\nfairies\ndeepened\nville\naverages\n##lore\ninflammatory\ntonga\n##itch\nco₂\nsquads\n##hea\ngigantic\nserum\nenjoyment\nretailer\nverona\n35th\ncis\n##phobic\nmagna\ntechnicians\n##vati\narithmetic\n##sport\nlevin\n##dation\namtrak\nchow\nsienna\n##eyer\nbackstage\nentrepreneurship\n##otic\nlearnt\ntao\n##udy\nworcestershire\nformulation\nbaggage\nhesitant\nbali\nsabotage\n##kari\nbarren\nenhancing\nmurmur\npl\nfreshly\nputnam\nsyntax\naces\nmedicines\nresentment\nbandwidth\n##sier\ngrins\nchili\nguido\n##sei\nframing\nimplying\ngareth\nlissa\ngenevieve\npertaining\nadmissions\ngeo\nthorpe\nproliferation\nsato\nbela\nanalyzing\nparting\n##gor\nawakened\n##isman\nhuddled\nsecrecy\n##kling\nhush\ngentry\n540\ndungeons\n##ego\ncoasts\n##utz\nsacrificed\n##chule\nlandowner\nmutually\nprevalence\nprogrammer\nadolescent\ndisrupted\nseaside\ngee\ntrusts\nvamp\ngeorgie\n##
nesian\n##iol\nschedules\nsindh\n##market\netched\nhm\nsparse\nbey\nbeaux\nscratching\ngliding\nunidentified\n216\ncollaborating\ngems\njesuits\noro\naccumulation\nshaping\nmbe\nanal\n##xin\n231\nenthusiasts\nnewscast\n##egan\njanata\ndewey\nparkinson\n179\nankara\nbiennial\ntowering\ndd\ninconsistent\n950\n##chet\nthriving\nterminate\ncabins\nfuriously\neats\nadvocating\ndonkey\nmarley\nmuster\nphyllis\nleiden\n##user\ngrassland\nglittering\niucn\nloneliness\n217\nmemorandum\narmenians\n##ddle\npopularized\nrhodesia\n60s\nlame\n##illon\nsans\nbikini\nheader\norbits\n##xx\n##finger\n##ulator\nsharif\nspines\nbiotechnology\nstrolled\nnaughty\nyates\n##wire\nfremantle\nmilo\n##mour\nabducted\nremoves\n##atin\nhumming\nwonderland\n##chrome\n##ester\nhume\npivotal\n##rates\narmand\ngrams\nbelievers\nelector\nrte\napron\nbis\nscraped\n##yria\nendorsement\ninitials\n##llation\neps\ndotted\nhints\nbuzzing\nemigration\nnearer\n##tom\nindicators\n##ulu\ncoarse\nneutron\nprotectorate\n##uze\ndirectional\nexploits\npains\nloire\n1830s\nproponents\nguggenheim\nrabbits\nritchie\n305\nhectare\ninputs\nhutton\n##raz\nverify\n##ako\nboilers\nlongitude\n##lev\nskeletal\nyer\nemilia\ncitrus\ncompromised\n##gau\npokemon\nprescription\nparagraph\neduard\ncadillac\nattire\ncategorized\nkenyan\nweddings\ncharley\n##bourg\nentertain\nmonmouth\n##lles\nnutrients\ndavey\nmesh\nincentive\npractised\necosystems\nkemp\nsubdued\noverheard\n##rya\nbodily\nmaxim\n##nius\napprenticeship\nursula\n##fight\nlodged\nrug\nsilesian\nunconstitutional\npatel\ninspected\ncoyote\nunbeaten\n##hak\n34th\ndisruption\nconvict\nparcel\n##cl\n##nham\ncollier\nimplicated\nmallory\n##iac\n##lab\nsusannah\nwinkler\n##rber\nshia\nphelps\nsediments\ngraphical\nrobotic\n##sner\nadulthood\nmart\nsmoked\n##isto\nkathryn\nclarified\n##aran\ndivides\nconvictions\noppression\npausing\nburying\n##mt\nfederico\nmathias\neileen\n##tana\nkite\nhunched\n##acies\n189\n##atz\ndisadvantage\nliza\nkinetic\ngreedy\nparadox\nyokohama\
ndowager\ntrunks\nventured\n##gement\ngupta\nvilnius\nolaf\n##thest\ncrimean\nhopper\n##ej\nprogressively\narturo\nmouthed\narrondissement\n##fusion\nrubin\nsimulcast\noceania\n##orum\n##stra\n##rred\nbusiest\nintensely\nnavigator\ncary\n##vine\n##hini\n##bies\nfife\nrowe\nrowland\nposing\ninsurgents\nshafts\nlawsuits\nactivate\nconor\ninward\nculturally\ngarlic\n265\n##eering\neclectic\n##hui\n##kee\n##nl\nfurrowed\nvargas\nmeteorological\nrendezvous\n##aus\nculinary\ncommencement\n##dition\nquota\n##notes\nmommy\nsalaries\noverlapping\nmule\n##iology\n##mology\nsums\nwentworth\n##isk\n##zione\nmainline\nsubgroup\n##illy\nhack\nplaintiff\nverdi\nbulb\ndifferentiation\nengagements\nmultinational\nsupplemented\nbertrand\ncaller\nregis\n##naire\n##sler\n##arts\n##imated\nblossom\npropagation\nkilometer\nviaduct\nvineyards\n##uate\nbeckett\noptimization\ngolfer\nsongwriters\nseminal\nsemitic\nthud\nvolatile\nevolving\nridley\n##wley\ntrivial\ndistributions\nscandinavia\njiang\n##ject\nwrestled\ninsistence\n##dio\nemphasizes\nnapkin\n##ods\nadjunct\nrhyme\n##ricted\n##eti\nhopeless\nsurrounds\ntremble\n32nd\nsmoky\n##ntly\noils\nmedicinal\npadded\nsteer\nwilkes\n219\n255\nconcessions\nhue\nuniquely\nblinded\nlandon\nyahoo\n##lane\nhendrix\ncommemorating\ndex\nspecify\nchicks\n##ggio\nintercity\n1400\nmorley\n##torm\nhighlighting\n##oting\npang\noblique\nstalled\n##liner\nflirting\nnewborn\n1769\nbishopric\nshaved\n232\ncurrie\n##ush\ndharma\nspartan\n##ooped\nfavorites\nsmug\nnovella\nsirens\nabusive\ncreations\nespana\n##lage\nparadigm\nsemiconductor\nsheen\n##rdo\n##yen\n##zak\nnrl\nrenew\n##pose\n##tur\nadjutant\nmarches\nnorma\n##enity\nineffective\nweimar\ngrunt\n##gat\nlordship\nplotting\nexpenditure\ninfringement\nlbs\nrefrain\nav\nmimi\nmistakenly\npostmaster\n1771\n##bara\nras\nmotorsports\ntito\n199\nsubjective\n##zza\nbully\nstew\n##kaya\nprescott\n1a\n##raphic\n##zam\nbids\nstyling\nparanormal\nreeve\nsneaking\nexploding\nkatz\nakbar\nmigrant\nsyllables\nind
efinitely\n##ogical\ndestroys\nreplaces\napplause\n##phine\npest\n##fide\n218\narticulated\nbertie\n##thing\n##cars\n##ptic\ncourtroom\ncrowley\naesthetics\ncummings\ntehsil\nhormones\ntitanic\ndangerously\n##ibe\nstadion\njaenelle\nauguste\nciudad\n##chu\nmysore\npartisans\n##sio\nlucan\nphilipp\n##aly\ndebating\nhenley\ninteriors\n##rano\n##tious\nhomecoming\nbeyonce\nusher\nhenrietta\nprepares\nweeds\n##oman\nely\nplucked\n##pire\n##dable\nluxurious\n##aq\nartifact\npassword\npasture\njuno\nmaddy\nminsk\n##dder\n##ologies\n##rone\nassessments\nmartian\nroyalist\n1765\nexamines\n##mani\n##rge\nnino\n223\nparry\nscooped\nrelativity\n##eli\n##uting\n##cao\ncongregational\nnoisy\ntraverse\n##agawa\nstrikeouts\nnickelodeon\nobituary\ntransylvania\nbinds\ndepictions\npolk\ntrolley\n##yed\n##lard\nbreeders\n##under\ndryly\nhokkaido\n1762\nstrengths\nstacks\nbonaparte\nconnectivity\nneared\nprostitutes\nstamped\nanaheim\ngutierrez\nsinai\n##zzling\nbram\nfresno\nmadhya\n##86\nproton\n##lena\n##llum\n##phon\nreelected\nwanda\n##anus\n##lb\nample\ndistinguishing\n##yler\ngrasping\nsermons\ntomato\nbland\nstimulation\navenues\n##eux\nspreads\nscarlett\nfern\npentagon\nassert\nbaird\nchesapeake\nir\ncalmed\ndistortion\nfatalities\n##olis\ncorrectional\npricing\n##astic\n##gina\nprom\ndammit\nying\ncollaborate\n##chia\nwelterweight\n33rd\npointer\nsubstitution\nbonded\numpire\ncommunicating\nmultitude\npaddle\n##obe\nfederally\nintimacy\n##insky\nbetray\nssr\n##lett\n##lean\n##lves\n##therapy\nairbus\n##tery\nfunctioned\nud\nbearer\nbiomedical\nnetflix\n##hire\n##nca\ncondom\nbrink\nik\n##nical\nmacy\n##bet\nflap\ngma\nexperimented\njelly\nlavender\n##icles\n##ulia\nmunro\n##mian\n##tial\nrye\n##rle\n60th\ngigs\nhottest\nrotated\npredictions\nfuji\nbu\n##erence\n##omi\nbarangay\n##fulness\n##sas\nclocks\n##rwood\n##liness\ncereal\nroe\nwight\ndecker\nuttered\nbabu\nonion\nxml\nforcibly\n##df\npetra\nsarcasm\nhartley\npeeled\nstorytelling\n##42\n##xley\n##ysis\n##ffa\nfibre\nk
iel\nauditor\nfig\nharald\ngreenville\n##berries\ngeographically\nnell\nquartz\n##athic\ncemeteries\n##lr\ncrossings\nnah\nholloway\nreptiles\nchun\nsichuan\nsnowy\n660\ncorrections\n##ivo\nzheng\nambassadors\nblacksmith\nfielded\nfluids\nhardcover\nturnover\nmedications\nmelvin\nacademies\n##erton\nro\nroach\nabsorbing\nspaniards\ncolton\n##founded\noutsider\nespionage\nkelsey\n245\nedible\n##ulf\ndora\nestablishes\n##sham\n##tries\ncontracting\n##tania\ncinematic\ncostello\nnesting\n##uron\nconnolly\nduff\n##nology\nmma\n##mata\nfergus\nsexes\ngi\noptics\nspectator\nwoodstock\nbanning\n##hee\n##fle\ndifferentiate\noutfielder\nrefinery\n226\n312\ngerhard\nhorde\nlair\ndrastically\n##udi\nlandfall\n##cheng\nmotorsport\nodi\n##achi\npredominant\nquay\nskins\n##ental\nedna\nharshly\ncomplementary\nmurdering\n##aves\nwreckage\n##90\nono\noutstretched\nlennox\nmunitions\ngalen\nreconcile\n470\nscalp\nbicycles\ngillespie\nquestionable\nrosenberg\nguillermo\nhostel\njarvis\nkabul\nvolvo\nopium\nyd\n##twined\nabuses\ndecca\noutpost\n##cino\nsensible\nneutrality\n##64\nponce\nanchorage\natkins\nturrets\ninadvertently\ndisagree\nlibre\nvodka\nreassuring\nweighs\n##yal\nglide\njumper\nceilings\nrepertory\nouts\nstain\n##bial\nenvy\n##ucible\nsmashing\nheightened\npolicing\nhyun\nmixes\nlai\nprima\n##ples\nceleste\n##bina\nlucrative\nintervened\nkc\nmanually\n##rned\nstature\nstaffed\nbun\nbastards\nnairobi\npriced\n##auer\nthatcher\n##kia\ntripped\ncomune\n##ogan\n##pled\nbrasil\nincentives\nemanuel\nhereford\nmusica\n##kim\nbenedictine\nbiennale\n##lani\neureka\ngardiner\nrb\nknocks\nsha\n##ael\n##elled\n##onate\nefficacy\nventura\nmasonic\nsanford\nmaize\nleverage\n##feit\ncapacities\nsantana\n##aur\nnovelty\nvanilla\n##cter\n##tour\nbenin\n##oir\n##rain\nneptune\ndrafting\ntallinn\n##cable\nhumiliation\n##boarding\nschleswig\nfabian\nbernardo\nliturgy\nspectacle\nsweeney\npont\nroutledge\n##tment\ncosmos\nut\nhilt\nsleek\nuniversally\n##eville\n##gawa\ntyped\n##dry\nfavors
\nallegheny\nglaciers\n##rly\nrecalling\naziz\n##log\nparasite\nrequiem\nauf\n##berto\n##llin\nillumination\n##breaker\n##issa\nfestivities\nbows\ngovern\nvibe\nvp\n333\nsprawled\nlarson\npilgrim\nbwf\nleaping\n##rts\n##ssel\nalexei\ngreyhound\nhoarse\n##dler\n##oration\nseneca\n##cule\ngaping\n##ulously\n##pura\ncinnamon\n##gens\n##rricular\ncraven\nfantasies\nhoughton\nengined\nreigned\ndictator\nsupervising\n##oris\nbogota\ncommentaries\nunnatural\nfingernails\nspirituality\ntighten\n##tm\ncanadiens\nprotesting\nintentional\ncheers\nsparta\n##ytic\n##iere\n##zine\nwiden\nbelgarath\ncontrollers\ndodd\niaaf\nnavarre\n##ication\ndefect\nsquire\nsteiner\nwhisky\n##mins\n560\ninevitably\ntome\n##gold\nchew\n##uid\n##lid\nelastic\n##aby\nstreaked\nalliances\njailed\nregal\n##ined\n##phy\nczechoslovak\nnarration\nabsently\n##uld\nbluegrass\nguangdong\nquran\ncriticizing\nhose\nhari\n##liest\n##owa\nskier\nstreaks\ndeploy\n##lom\nraft\nbose\ndialed\nhuff\n##eira\nhaifa\nsimplest\nbursting\nendings\nib\nsultanate\n##titled\nfranks\nwhitman\nensures\nsven\n##ggs\ncollaborators\nforster\norganising\nui\nbanished\nnapier\ninjustice\nteller\nlayered\nthump\n##otti\nroc\nbattleships\nevidenced\nfugitive\nsadie\nrobotics\n##roud\nequatorial\ngeologist\n##iza\nyielding\n##bron\n##sr\ninternationale\nmecca\n##diment\nsbs\nskyline\ntoad\nuploaded\nreflective\nundrafted\nlal\nleafs\nbayern\n##dai\nlakshmi\nshortlisted\n##stick\n##wicz\ncamouflage\ndonate\naf\nchristi\nlau\n##acio\ndisclosed\nnemesis\n1761\nassemble\nstraining\nnorthamptonshire\ntal\n##asi\nbernardino\npremature\nheidi\n42nd\ncoefficients\ngalactic\nreproduce\nbuzzed\nsensations\nzionist\nmonsieur\nmyrtle\n##eme\narchery\nstrangled\nmusically\nviewpoint\nantiquities\nbei\ntrailers\nseahawks\ncured\npee\npreferring\ntasmanian\nlange\nsul\n##mail\n##working\ncolder\noverland\nlucivar\nmassey\ngatherings\nhaitian\n##smith\ndisapproval\nflaws\n##cco\n##enbach\n1766\nnpr\n##icular\nboroughs\ncreole\nforums\ntechno\n1755\
ndent\nabdominal\nstreetcar\n##eson\n##stream\nprocurement\ngemini\npredictable\n##tya\nacheron\nchristoph\nfeeder\nfronts\nvendor\nbernhard\njammu\ntumors\nslang\n##uber\ngoaltender\ntwists\ncurving\nmanson\nvuelta\nmer\npeanut\nconfessions\npouch\nunpredictable\nallowance\ntheodor\nvascular\n##factory\nbala\nauthenticity\nmetabolic\ncoughing\nnanjing\n##cea\npembroke\n##bard\nsplendid\n36th\nff\nhourly\n##ahu\nelmer\nhandel\n##ivate\nawarding\nthrusting\ndl\nexperimentation\n##hesion\n##46\ncaressed\nentertained\nsteak\n##rangle\nbiologist\norphans\nbaroness\noyster\nstepfather\n##dridge\nmirage\nreefs\nspeeding\n##31\nbarons\n1764\n227\ninhabit\npreached\nrepealed\n##tral\nhonoring\nboogie\ncaptives\nadminister\njohanna\n##imate\ngel\nsuspiciously\n1767\nsobs\n##dington\nbackbone\nhayward\ngarry\n##folding\n##nesia\nmaxi\n##oof\n##ppe\nellison\ngalileo\n##stand\ncrimea\nfrenzy\namour\nbumper\nmatrices\nnatalia\nbaking\ngarth\npalestinians\n##grove\nsmack\nconveyed\nensembles\ngardening\n##manship\n##rup\n##stituting\n1640\nharvesting\ntopography\njing\nshifters\ndormitory\n##carriage\n##lston\nist\nskulls\n##stadt\ndolores\njewellery\nsarawak\n##wai\n##zier\nfences\nchristy\nconfinement\ntumbling\ncredibility\nfir\nstench\n##bria\n##plication\n##nged\n##sam\nvirtues\n##belt\nmarjorie\npba\n##eem\n##made\ncelebrates\nschooner\nagitated\nbarley\nfulfilling\nanthropologist\n##pro\nrestrict\nnovi\nregulating\n##nent\npadres\n##rani\n##hesive\nloyola\ntabitha\nmilky\nolson\nproprietor\ncrambidae\nguarantees\nintercollegiate\nljubljana\nhilda\n##sko\nignorant\nhooded\n##lts\nsardinia\n##lidae\n##vation\nfrontman\nprivileged\nwitchcraft\n##gp\njammed\nlaude\npoking\n##than\nbracket\namazement\nyunnan\n##erus\nmaharaja\nlinnaeus\n264\ncommissioning\nmilano\npeacefully\n##logies\nakira\nrani\nregulator\n##36\ngrasses\n##rance\nluzon\ncrows\ncompiler\ngretchen\nseaman\nedouard\ntab\nbuccaneers\nellington\nhamlets\nwhig\nsocialists\n##anto\ndirectorial\neaston\nmythological
\n##kr\n##vary\nrhineland\nsemantic\ntaut\ndune\ninventions\nsucceeds\n##iter\nreplication\nbranched\n##pired\njul\nprosecuted\nkangaroo\npenetrated\n##avian\nmiddlesbrough\ndoses\nbleak\nmadam\npredatory\nrelentless\n##vili\nreluctance\n##vir\nhailey\ncrore\nsilvery\n1759\nmonstrous\nswimmers\ntransmissions\nhawthorn\ninforming\n##eral\ntoilets\ncaracas\ncrouch\nkb\n##sett\n295\ncartel\nhadley\n##aling\nalexia\nyvonne\n##biology\ncinderella\neton\nsuperb\nblizzard\nstabbing\nindustrialist\nmaximus\n##gm\n##orus\ngroves\nmaud\nclade\noversized\ncomedic\n##bella\nrosen\nnomadic\nfulham\nmontane\nbeverages\ngalaxies\nredundant\nswarm\n##rot\n##folia\n##llis\nbuckinghamshire\nfen\nbearings\nbahadur\n##rom\ngilles\nphased\ndynamite\nfaber\nbenoit\nvip\n##ount\n##wd\nbooking\nfractured\ntailored\nanya\nspices\nwestwood\ncairns\nauditions\ninflammation\nsteamed\n##rocity\n##acion\n##urne\nskyla\nthereof\nwatford\ntorment\narchdeacon\ntransforms\nlulu\ndemeanor\nfucked\nserge\n##sor\nmckenna\nminas\nentertainer\n##icide\ncaress\noriginate\nresidue\n##sty\n1740\n##ilised\n##org\nbeech\n##wana\nsubsidies\n##ghton\nemptied\ngladstone\nru\nfirefighters\nvoodoo\n##rcle\nhet\nnightingale\ntamara\nedmond\ningredient\nweaknesses\nsilhouette\n285\ncompatibility\nwithdrawing\nhampson\n##mona\nanguish\ngiggling\n##mber\nbookstore\n##jiang\nsouthernmost\ntilting\n##vance\nbai\neconomical\nrf\nbriefcase\ndreadful\nhinted\nprojections\nshattering\ntotaling\n##rogate\nanalogue\nindicted\nperiodical\nfullback\n##dman\nhaynes\n##tenberg\n##ffs\n##ishment\n1745\nthirst\nstumble\npenang\nvigorous\n##ddling\n##kor\n##lium\noctave\n##ove\n##enstein\n##inen\n##ones\nsiberian\n##uti\ncbn\nrepeal\nswaying\n##vington\nkhalid\ntanaka\nunicorn\notago\nplastered\nlobe\nriddle\n##rella\nperch\n##ishing\ncroydon\nfiltered\ngraeme\ntripoli\n##ossa\ncrocodile\n##chers\nsufi\nmined\n##tung\ninferno\nlsu\n##phi\nswelled\nutilizes\n£2\ncale\nperiodicals\nstyx\nhike\ninformally\ncoop\nlund\n##tidae\nala\nhen
\nqui\ntransformations\ndisposed\nsheath\nchickens\n##cade\nfitzroy\nsas\nsilesia\nunacceptable\nodisha\n1650\nsabrina\npe\nspokane\nratios\nathena\nmassage\nshen\ndilemma\n##drum\n##riz\n##hul\ncorona\ndoubtful\nniall\n##pha\n##bino\nfines\ncite\nacknowledging\nbangor\nballard\nbathurst\n##resh\nhuron\nmustered\nalzheimer\ngarments\nkinase\ntyre\nwarship\n##cp\nflashback\npulmonary\nbraun\ncheat\nkamal\ncyclists\nconstructions\ngrenades\nndp\ntraveller\nexcuses\nstomped\nsignalling\ntrimmed\nfutsal\nmosques\nrelevance\n##wine\nwta\n##23\n##vah\n##lter\nhoc\n##riding\noptimistic\n##´s\ndeco\nsim\ninteracting\nrejecting\nmoniker\nwaterways\n##ieri\n##oku\nmayors\ngdansk\noutnumbered\npearls\n##ended\n##hampton\nfairs\ntotals\ndominating\n262\nnotions\nstairway\ncompiling\npursed\ncommodities\ngrease\nyeast\n##jong\ncarthage\ngriffiths\nresidual\namc\ncontraction\nlaird\nsapphire\n##marine\n##ivated\namalgamation\ndissolve\ninclination\nlyle\npackaged\naltitudes\nsuez\ncanons\ngraded\nlurched\nnarrowing\nboasts\nguise\nwed\nenrico\n##ovsky\nrower\nscarred\nbree\ncub\niberian\nprotagonists\nbargaining\nproposing\ntrainers\nvoyages\nvans\nfishes\n##aea\n##ivist\n##verance\nencryption\nartworks\nkazan\nsabre\ncleopatra\nhepburn\nrotting\nsupremacy\nmecklenburg\n##brate\nburrows\nhazards\noutgoing\nflair\norganizes\n##ctions\nscorpion\n##usions\nboo\n234\nchevalier\ndunedin\nslapping\n##34\nineligible\npensions\n##38\n##omic\nmanufactures\nemails\nbismarck\n238\nweakening\nblackish\nding\nmcgee\nquo\n##rling\nnorthernmost\nxx\nmanpower\ngreed\nsampson\nclicking\n##ange\n##horpe\n##inations\n##roving\ntorre\n##eptive\n##moral\nsymbolism\n38th\nasshole\nmeritorious\noutfits\nsplashed\nbiographies\nsprung\nastros\n##tale\n302\n737\nfilly\nraoul\nnw\ntokugawa\nlinden\nclubhouse\n##apa\ntracts\nromano\n##pio\nputin\ntags\n##note\nchained\ndickson\ngunshot\nmoe\ngunn\nrashid\n##tails\nzipper\n##bas\n##nea\ncontrasted\n##ply\n##udes\nplum\npharaoh\n##pile\naw\ncomedies\ningrid\n
sandwiches\nsubdivisions\n1100\nmariana\nnokia\nkamen\nhz\ndelaney\nveto\nherring\n##words\npossessive\noutlines\n##roup\nsiemens\nstairwell\nrc\ngallantry\nmessiah\npalais\nyells\n233\nzeppelin\n##dm\nbolivar\n##cede\nsmackdown\nmckinley\n##mora\n##yt\nmuted\ngeologic\nfinely\nunitary\navatar\nhamas\nmaynard\nrees\nbog\ncontrasting\n##rut\nliv\nchico\ndisposition\npixel\n##erate\nbecca\ndmitry\nyeshiva\nnarratives\n##lva\n##ulton\nmercenary\nsharpe\ntempered\nnavigate\nstealth\namassed\nkeynes\n##lini\nuntouched\n##rrie\nhavoc\nlithium\n##fighting\nabyss\ngraf\nsouthward\nwolverine\nballoons\nimplements\nngos\ntransitions\n##icum\nambushed\nconcacaf\ndormant\neconomists\n##dim\ncosting\ncsi\nrana\nuniversite\nboulders\nverity\n##llon\ncollin\nmellon\nmisses\ncypress\nfluorescent\nlifeless\nspence\n##ulla\ncrewe\nshepard\npak\nrevelations\n##م\njolly\ngibbons\npaw\n##dro\n##quel\nfreeing\n##test\nshack\nfries\npalatine\n##51\n##hiko\naccompaniment\ncruising\nrecycled\n##aver\nerwin\nsorting\nsynthesizers\ndyke\nrealities\nsg\nstrides\nenslaved\nwetland\n##ghan\ncompetence\ngunpowder\ngrassy\nmaroon\nreactors\nobjection\n##oms\ncarlson\ngearbox\nmacintosh\nradios\nshelton\n##sho\nclergyman\nprakash\n254\nmongols\ntrophies\noricon\n228\nstimuli\ntwenty20\ncantonese\ncortes\nmirrored\n##saurus\nbhp\ncristina\nmelancholy\n##lating\nenjoyable\nnuevo\n##wny\ndownfall\nschumacher\n##ind\nbanging\nlausanne\nrumbled\nparamilitary\nreflex\nax\namplitude\nmigratory\n##gall\n##ups\nmidi\nbarnard\nlastly\nsherry\n##hp\n##nall\nkeystone\n##kra\ncarleton\nslippery\n##53\ncoloring\nfoe\nsocket\notter\n##rgos\nmats\n##tose\nconsultants\nbafta\nbison\ntopping\n##km\n490\nprimal\nabandonment\ntransplant\natoll\nhideous\nmort\npained\nreproduced\ntae\nhowling\n##turn\nunlawful\nbillionaire\nhotter\npoised\nlansing\n##chang\ndinamo\nretro\nmessing\nnfc\ndomesday\n##mina\nblitz\ntimed\n##athing\n##kley\nascending\ngesturing\n##izations\nsignaled\ntis\nchinatown\nmermaid\nsavanna\njameson
\n##aint\ncatalina\n##pet\n##hers\ncochrane\ncy\nchatting\n##kus\nalerted\ncomputation\nmused\nnoelle\nmajestic\nmohawk\ncampo\noctagonal\n##sant\n##hend\n241\naspiring\n##mart\ncomprehend\niona\nparalyzed\nshimmering\nswindon\nrhone\n##eley\nreputed\nconfigurations\npitchfork\nagitation\nfrancais\ngillian\nlipstick\n##ilo\noutsiders\npontifical\nresisting\nbitterness\nsewer\nrockies\n##edd\n##ucher\nmisleading\n1756\nexiting\ngalloway\n##nging\nrisked\n##heart\n246\ncommemoration\nschultz\n##rka\nintegrating\n##rsa\nposes\nshrieked\n##weiler\nguineas\ngladys\njerking\nowls\ngoldsmith\nnightly\npenetrating\n##unced\nlia\n##33\nignited\nbetsy\n##aring\n##thorpe\nfollower\nvigorously\n##rave\ncoded\nkiran\nknit\nzoology\ntbilisi\n##28\n##bered\nrepository\ngovt\ndeciduous\ndino\ngrowling\n##bba\nenhancement\nunleashed\nchanting\npussy\nbiochemistry\n##eric\nkettle\nrepression\ntoxicity\nnrhp\n##arth\n##kko\n##bush\nernesto\ncommended\noutspoken\n242\nmca\nparchment\nsms\nkristen\n##aton\nbisexual\nraked\nglamour\nnavajo\na2\nconditioned\nshowcased\n##hma\nspacious\nyouthful\n##esa\nusl\nappliances\njunta\nbrest\nlayne\nconglomerate\nenchanted\nchao\nloosened\npicasso\ncirculating\ninspect\nmontevideo\n##centric\n##kti\npiazza\nspurred\n##aith\nbari\nfreedoms\npoultry\nstamford\nlieu\n##ect\nindigo\nsarcastic\nbahia\nstump\nattach\ndvds\nfrankenstein\nlille\napprox\nscriptures\npollen\n##script\nnmi\noverseen\n##ivism\ntides\nproponent\nnewmarket\ninherit\nmilling\n##erland\ncentralized\n##rou\ndistributors\ncredentials\ndrawers\nabbreviation\n##lco\n##xon\ndowning\nuncomfortably\nripe\n##oes\nerase\nfranchises\n##ever\npopulace\n##bery\n##khar\ndecomposition\npleas\n##tet\ndaryl\nsabah\n##stle\n##wide\nfearless\ngenie\nlesions\nannette\n##ogist\noboe\nappendix\nnair\ndripped\npetitioned\nmaclean\nmosquito\nparrot\nrpg\nhampered\n1648\noperatic\nreservoirs\n##tham\nirrelevant\njolt\nsummarized\n##fp\nmedallion\n##taff\n##−\nclawed\nharlow\nnarrower\ngoddard\nmarcia\nbo
died\nfremont\nsuarez\naltering\ntempest\nmussolini\nporn\n##isms\nsweetly\noversees\nwalkers\nsolitude\ngrimly\nshrines\nhk\nich\nsupervisors\nhostess\ndietrich\nlegitimacy\nbrushes\nexpressive\n##yp\ndissipated\n##rse\nlocalized\nsystemic\n##nikov\ngettysburg\n##js\n##uaries\ndialogues\nmuttering\n251\nhousekeeper\nsicilian\ndiscouraged\n##frey\nbeamed\nkaladin\nhalftime\nkidnap\n##amo\n##llet\n1754\nsynonymous\ndepleted\ninstituto\ninsulin\nreprised\n##opsis\nclashed\n##ctric\ninterrupting\nradcliffe\ninsisting\nmedici\n1715\nejected\nplayfully\nturbulent\n##47\nstarvation\n##rini\nshipment\nrebellious\npetersen\nverification\nmerits\n##rified\ncakes\n##charged\n1757\nmilford\nshortages\nspying\nfidelity\n##aker\nemitted\nstorylines\nharvested\nseismic\n##iform\ncheung\nkilda\ntheoretically\nbarbie\nlynx\n##rgy\n##tius\ngoblin\nmata\npoisonous\n##nburg\nreactive\nresidues\nobedience\n##евич\nconjecture\n##rac\n401\nhating\nsixties\nkicker\nmoaning\nmotown\n##bha\nemancipation\nneoclassical\n##hering\nconsoles\nebert\nprofessorship\n##tures\nsustaining\nassaults\nobeyed\naffluent\nincurred\ntornadoes\n##eber\n##zow\nemphasizing\nhighlanders\ncheated\nhelmets\n##ctus\ninternship\nterence\nbony\nexecutions\nlegislators\nberries\npeninsular\ntinged\n##aco\n1689\namplifier\ncorvette\nribbons\nlavish\npennant\n##lander\nworthless\n##chfield\n##forms\nmariano\npyrenees\nexpenditures\n##icides\nchesterfield\nmandir\ntailor\n39th\nsergey\nnestled\nwilled\naristocracy\ndevotees\ngoodnight\nraaf\nrumored\nweaponry\nremy\nappropriations\nharcourt\nburr\nriaa\n##lence\nlimitation\nunnoticed\nguo\nsoaking\nswamps\n##tica\ncollapsing\ntatiana\ndescriptive\nbrigham\npsalm\n##chment\nmaddox\n##lization\npatti\ncaliph\n##aja\nakron\ninjuring\nserra\n##ganj\nbasins\n##sari\nastonished\nlauncher\n##church\nhilary\nwilkins\nsewing\n##sf\nstinging\n##fia\n##ncia\nunderwood\nstartup\n##ition\ncompilations\nvibrations\nembankment\njurist\n##nity\nbard\njuventus\ngroundwater\nkern\npalac
es\nhelium\nboca\ncramped\nmarissa\nsoto\n##worm\njae\nprincely\n##ggy\nfaso\nbazaar\nwarmly\n##voking\n229\npairing\n##lite\n##grate\n##nets\nwien\nfreaked\nulysses\nrebirth\n##alia\n##rent\nmummy\nguzman\njimenez\nstilled\n##nitz\ntrajectory\ntha\nwoken\narchival\nprofessions\n##pts\n##pta\nhilly\nshadowy\nshrink\n##bolt\nnorwood\nglued\nmigrate\nstereotypes\ndevoid\n##pheus\n625\nevacuate\nhorrors\ninfancy\ngotham\nknowles\noptic\ndownloaded\nsachs\nkingsley\nparramatta\ndarryl\nmor\n##onale\nshady\ncommence\nconfesses\nkan\n##meter\n##placed\nmarlborough\nroundabout\nregents\nfrigates\nio\n##imating\ngothenburg\nrevoked\ncarvings\nclockwise\nconvertible\nintruder\n##sche\nbanged\n##ogo\nvicky\nbourgeois\n##mony\ndupont\nfooting\n##gum\npd\n##real\nbuckle\nyun\npenthouse\nsane\n720\nserviced\nstakeholders\nneumann\nbb\n##eers\ncomb\n##gam\ncatchment\npinning\nrallies\ntyping\n##elles\nforefront\nfreiburg\nsweetie\ngiacomo\nwidowed\ngoodwill\nworshipped\naspirations\nmidday\n##vat\nfishery\n##trick\nbournemouth\nturk\n243\nhearth\nethanol\nguadalajara\nmurmurs\nsl\n##uge\nafforded\nscripted\n##hta\nwah\n##jn\ncoroner\ntranslucent\n252\nmemorials\npuck\nprogresses\nclumsy\n##race\n315\ncandace\nrecounted\n##27\n##slin\n##uve\nfiltering\n##mac\nhowl\nstrata\nheron\nleveled\n##ays\ndubious\n##oja\n##т\n##wheel\ncitations\nexhibiting\n##laya\n##mics\n##pods\nturkic\n##lberg\ninjunction\n##ennial\n##mit\nantibodies\n##44\norganise\n##rigues\ncardiovascular\ncushion\ninverness\n##zquez\ndia\ncocoa\nsibling\n##tman\n##roid\nexpanse\nfeasible\ntunisian\nalgiers\n##relli\nrus\nbloomberg\ndso\nwestphalia\nbro\ntacoma\n281\ndownloads\n##ours\nkonrad\nduran\n##hdi\ncontinuum\njett\ncompares\nlegislator\nsecession\n##nable\n##gues\n##zuka\ntranslating\nreacher\n##gley\n##ła\naleppo\n##agi\ntc\norchards\ntrapping\nlinguist\nversatile\ndrumming\npostage\ncalhoun\nsuperiors\n##mx\nbarefoot\nleary\n##cis\nignacio\nalfa\nkaplan\n##rogen\nbratislava\nmori\n##vot\ndisturb\nhaas\n313\
ncartridges\ngilmore\nradiated\nsalford\ntunic\nhades\n##ulsive\narcheological\ndelilah\nmagistrates\nauditioned\nbrewster\ncharters\nempowerment\nblogs\ncappella\ndynasties\niroquois\nwhipping\n##krishna\nraceway\ntruths\nmyra\nweaken\njudah\nmcgregor\n##horse\nmic\nrefueling\n37th\nburnley\nbosses\nmarkus\npremio\nquery\n##gga\ndunbar\n##economic\ndarkest\nlyndon\nsealing\ncommendation\nreappeared\n##mun\naddicted\nezio\nslaughtered\nsatisfactory\nshuffle\n##eves\n##thic\n##uj\nfortification\nwarrington\n##otto\nresurrected\nfargo\nmane\n##utable\n##lei\n##space\nforeword\nox\n##aris\n##vern\nabrams\nhua\n##mento\nsakura\n##alo\nuv\nsentimental\n##skaya\nmidfield\n##eses\nsturdy\nscrolls\nmacleod\n##kyu\nentropy\n##lance\nmitochondrial\ncicero\nexcelled\nthinner\nconvoys\nperceive\n##oslav\n##urable\nsystematically\ngrind\nburkina\n287\n##tagram\nops\n##aman\nguantanamo\n##cloth\n##tite\nforcefully\nwavy\n##jou\npointless\n##linger\n##tze\nlayton\nportico\nsuperficial\nclerical\noutlaws\n##hism\nburials\nmuir\n##inn\ncreditors\nhauling\nrattle\n##leg\ncalais\nmonde\narchers\nreclaimed\ndwell\nwexford\nhellenic\nfalsely\nremorse\n##tek\ndough\nfurnishings\n##uttered\ngabon\nneurological\nnovice\n##igraphy\ncontemplated\npulpit\nnightstand\nsaratoga\n##istan\ndocumenting\npulsing\ntaluk\n##firmed\nbusted\nmarital\n##rien\ndisagreements\nwasps\n##yes\nhodge\nmcdonnell\nmimic\nfran\npendant\ndhabi\nmusa\n##nington\ncongratulations\nargent\ndarrell\nconcussion\nlosers\nregrets\nthessaloniki\nreversal\ndonaldson\nhardwood\nthence\nachilles\nritter\n##eran\ndemonic\njurgen\nprophets\ngoethe\neki\nclassmate\nbuff\n##cking\nyank\nirrational\n##inging\nperished\nseductive\nqur\nsourced\n##crat\n##typic\nmustard\nravine\nbarre\nhorizontally\ncharacterization\nphylogenetic\nboise\n##dit\n##runner\n##tower\nbrutally\nintercourse\nseduce\n##bbing\nfay\nferris\nogden\namar\nnik\nunarmed\n##inator\nevaluating\nkyrgyzstan\nsweetness\n##lford\n##oki\nmccormick\nmeiji\nnotoriety\nst
imulate\ndisrupt\nfiguring\ninstructional\nmcgrath\n##zoo\ngroundbreaking\n##lto\nflinch\nkhorasan\nagrarian\nbengals\nmixer\nradiating\n##sov\ningram\npitchers\nnad\ntariff\n##cript\ntata\n##codes\n##emi\n##ungen\nappellate\nlehigh\n##bled\n##giri\nbrawl\nduct\ntexans\n##ciation\n##ropolis\nskipper\nspeculative\nvomit\ndoctrines\nstresses\n253\ndavy\ngraders\nwhitehead\njozef\ntimely\ncumulative\nharyana\npaints\nappropriately\nboon\ncactus\n##ales\n##pid\ndow\nlegions\n##pit\nperceptions\n1730\npicturesque\n##yse\nperiphery\nrune\nwr\n##aha\nceltics\nsentencing\nwhoa\n##erin\nconfirms\nvariance\n425\nmoines\nmathews\nspade\nrave\nm1\nfronted\nfx\nblending\nalleging\nreared\n##gl\n237\n##paper\ngrassroots\neroded\n##free\n##physical\ndirects\nordeal\n##sław\naccelerate\nhacker\nrooftop\n##inia\nlev\nbuys\ncebu\ndevote\n##lce\nspecialising\n##ulsion\nchoreographed\nrepetition\nwarehouses\n##ryl\npaisley\ntuscany\nanalogy\nsorcerer\nhash\nhuts\nshards\ndescends\nexclude\nnix\nchaplin\ngaga\nito\nvane\n##drich\ncauseway\nmisconduct\nlimo\norchestrated\nglands\njana\n##kot\nu2\n##mple\n##sons\nbranching\ncontrasts\nscoop\nlonged\n##virus\nchattanooga\n##75\nsyrup\ncornerstone\n##tized\n##mind\n##iaceae\ncareless\nprecedence\nfrescoes\n##uet\nchilled\nconsult\nmodelled\nsnatch\npeat\n##thermal\ncaucasian\nhumane\nrelaxation\nspins\ntemperance\n##lbert\noccupations\nlambda\nhybrids\nmoons\nmp3\n##oese\n247\nrolf\nsocietal\nyerevan\nness\n##ssler\nbefriended\nmechanized\nnominate\ntrough\nboasted\ncues\nseater\n##hom\nbends\n##tangle\nconductors\nemptiness\n##lmer\neurasian\nadriatic\ntian\n##cie\nanxiously\nlark\npropellers\nchichester\njock\nev\n2a\n##holding\ncredible\nrecounts\ntori\nloyalist\nabduction\n##hoot\n##redo\nnepali\n##mite\nventral\ntempting\n##ango\n##crats\nsteered\n##wice\njavelin\ndipping\nlaborers\nprentice\nlooming\ntitanium\n##ː\nbadges\nemir\ntensor\n##ntation\negyptians\nrash\ndenies\nhawthorne\nlombard\nshowers\nwehrmacht\ndietary\ntrojan\n##reus
\nwelles\nexecuting\nhorseshoe\nlifeboat\n##lak\nelsa\ninfirmary\nnearing\nroberta\nboyer\nmutter\ntrillion\njoanne\n##fine\n##oked\nsinks\nvortex\nuruguayan\nclasp\nsirius\n##block\naccelerator\nprohibit\nsunken\nbyu\nchronological\ndiplomats\nochreous\n510\nsymmetrical\n1644\nmaia\n##tology\nsalts\nreigns\natrocities\n##ия\nhess\nbared\nissn\n##vyn\ncater\nsaturated\n##cycle\n##isse\nsable\nvoyager\ndyer\nyusuf\n##inge\nfountains\nwolff\n##39\n##nni\nengraving\nrollins\natheist\nominous\n##ault\nherr\nchariot\nmartina\nstrung\n##fell\n##farlane\nhorrific\nsahib\ngazes\nsaetan\nerased\nptolemy\n##olic\nflushing\nlauderdale\nanalytic\n##ices\n530\nnavarro\nbeak\ngorilla\nherrera\nbroom\nguadalupe\nraiding\nsykes\n311\nbsc\ndeliveries\n1720\ninvasions\ncarmichael\ntajikistan\nthematic\necumenical\nsentiments\nonstage\n##rians\n##brand\n##sume\ncatastrophic\nflanks\nmolten\n##arns\nwaller\naimee\nterminating\n##icing\nalternately\n##oche\nnehru\nprinters\noutraged\n##eving\nempires\ntemplate\nbanners\nrepetitive\nza\n##oise\nvegetarian\n##tell\nguiana\nopt\ncavendish\nlucknow\nsynthesized\n##hani\n##mada\nfinalized\n##ctable\nfictitious\nmayoral\nunreliable\n##enham\nembracing\npeppers\nrbis\n##chio\n##neo\ninhibition\nslashed\ntogo\norderly\nembroidered\nsafari\nsalty\n236\nbarron\nbenito\ntotaled\n##dak\npubs\nsimulated\ncaden\ndevin\ntolkien\nmomma\nwelding\nsesame\n##ept\ngottingen\nhardness\n630\nshaman\ntemeraire\n620\nadequately\npediatric\n##kit\nck\nassertion\nradicals\ncomposure\ncadence\nseafood\nbeaufort\nlazarus\nmani\nwarily\ncunning\nkurdistan\n249\ncantata\n##kir\nares\n##41\n##clusive\nnape\ntownland\ngeared\ninsulted\nflutter\nboating\nviolate\ndraper\ndumping\nmalmo\n##hh\n##romatic\nfirearm\nalta\nbono\nobscured\n##clave\nexceeds\npanorama\nunbelievable\n##train\npreschool\n##essed\ndisconnected\ninstalling\nrescuing\nsecretaries\naccessibility\n##castle\n##drive\n##ifice\n##film\nbouts\nslug\nwaterway\nmindanao\n##buro\n##ratic\nhalves\n##ل\ncalmi
ng\nliter\nmaternity\nadorable\nbragg\nelectrification\nmcc\n##dote\nroxy\nschizophrenia\n##body\nmunoz\nkaye\nwhaling\n239\nmil\ntingling\ntolerant\n##ago\nunconventional\nvolcanoes\n##finder\ndeportivo\n##llie\nrobson\nkaufman\nneuroscience\nwai\ndeportation\nmasovian\nscraping\nconverse\n##bh\nhacking\nbulge\n##oun\nadministratively\nyao\n580\namp\nmammoth\nbooster\nclaremont\nhooper\nnomenclature\npursuits\nmclaughlin\nmelinda\n##sul\ncatfish\nbarclay\nsubstrates\ntaxa\nzee\noriginals\nkimberly\npackets\npadma\n##ality\nborrowing\nostensibly\nsolvent\n##bri\n##genesis\n##mist\nlukas\nshreveport\nveracruz\n##ь\n##lou\n##wives\ncheney\ntt\nanatolia\nhobbs\n##zyn\ncyclic\nradiant\nalistair\ngreenish\nsiena\ndat\nindependents\n##bation\nconform\npieter\nhyper\napplicant\nbradshaw\nspores\ntelangana\nvinci\ninexpensive\nnuclei\n322\njang\nnme\nsoho\nspd\n##ign\ncradled\nreceptionist\npow\n##43\n##rika\nfascism\n##ifer\nexperimenting\n##ading\n##iec\n##region\n345\njocelyn\nmaris\nstair\nnocturnal\ntoro\nconstabulary\nelgin\n##kker\nmsc\n##giving\n##schen\n##rase\ndoherty\ndoping\nsarcastically\nbatter\nmaneuvers\n##cano\n##apple\n##gai\n##git\nintrinsic\n##nst\n##stor\n1753\nshowtime\ncafes\ngasps\nlviv\nushered\n##thed\nfours\nrestart\nastonishment\ntransmitting\nflyer\nshrugs\n##sau\nintriguing\ncones\ndictated\nmushrooms\nmedial\n##kovsky\n##elman\nescorting\ngaped\n##26\ngodfather\n##door\n##sell\ndjs\nrecaptured\ntimetable\nvila\n1710\n3a\naerodrome\nmortals\nscientology\n##orne\nangelina\nmag\nconvection\nunpaid\ninsertion\nintermittent\nlego\n##nated\nendeavor\nkota\npereira\n##lz\n304\nbwv\nglamorgan\ninsults\nagatha\nfey\n##cend\nfleetwood\nmahogany\nprotruding\nsteamship\nzeta\n##arty\nmcguire\nsuspense\n##sphere\nadvising\nurges\n##wala\nhurriedly\nmeteor\ngilded\ninline\narroyo\nstalker\n##oge\nexcitedly\nrevered\n##cure\nearle\nintroductory\n##break\n##ilde\nmutants\npuff\npulses\nreinforcement\n##haling\ncurses\nlizards\nstalk\ncorrelated\n##fixed\nfall
out\nmacquarie\n##unas\nbearded\ndenton\nheaving\n802\n##ocation\nwinery\nassign\ndortmund\n##lkirk\neverest\ninvariant\ncharismatic\nsusie\n##elling\nbled\nlesley\ntelegram\nsumner\nbk\n##ogen\n##к\nwilcox\nneedy\ncolbert\nduval\n##iferous\n##mbled\nallotted\nattends\nimperative\n##hita\nreplacements\nhawker\n##inda\ninsurgency\n##zee\n##eke\ncasts\n##yla\n680\nives\ntransitioned\n##pack\n##powering\nauthoritative\nbaylor\nflex\ncringed\nplaintiffs\nwoodrow\n##skie\ndrastic\nape\naroma\nunfolded\ncommotion\nnt\npreoccupied\ntheta\nroutines\nlasers\nprivatization\nwand\ndomino\nek\nclenching\nnsa\nstrategically\nshowered\nbile\nhandkerchief\npere\nstoring\nchristophe\ninsulting\n316\nnakamura\nromani\nasiatic\nmagdalena\npalma\ncruises\nstripping\n405\nkonstantin\nsoaring\n##berman\ncolloquially\nforerunner\nhavilland\nincarcerated\nparasites\nsincerity\n##utus\ndisks\nplank\nsaigon\n##ining\ncorbin\nhomo\nornaments\npowerhouse\n##tlement\nchong\nfastened\nfeasibility\nidf\nmorphological\nusable\n##nish\n##zuki\naqueduct\njaguars\nkeepers\n##flies\naleksandr\nfaust\nassigns\newing\nbacterium\nhurled\ntricky\nhungarians\nintegers\nwallis\n321\nyamaha\n##isha\nhushed\noblivion\naviator\nevangelist\nfriars\n##eller\nmonograph\node\n##nary\nairplanes\nlabourers\ncharms\n##nee\n1661\nhagen\ntnt\nrudder\nfiesta\ntranscript\ndorothea\nska\ninhibitor\nmaccabi\nretorted\nraining\nencompassed\nclauses\nmenacing\n1642\nlineman\n##gist\nvamps\n##ape\n##dick\ngloom\n##rera\ndealings\neasing\nseekers\n##nut\n##pment\nhelens\nunmanned\n##anu\n##isson\nbasics\n##amy\n##ckman\nadjustments\n1688\nbrutality\nhorne\n##zell\nsui\n##55\n##mable\naggregator\n##thal\nrhino\n##drick\n##vira\ncounters\nzoom\n##01\n##rting\nmn\nmontenegrin\npackard\n##unciation\n##♭\n##kki\nreclaim\nscholastic\nthugs\npulsed\n##icia\nsyriac\nquan\nsaddam\nbanda\nkobe\nblaming\nbuddies\ndissent\n##lusion\n##usia\ncorbett\njaya\ndelle\nerratic\nlexie\n##hesis\n435\namiga\nhermes\n##pressing\n##leen\nchapels\ngo
spels\njamal\n##uating\ncompute\nrevolving\nwarp\n##sso\n##thes\narmory\n##eras\n##gol\nantrim\nloki\n##kow\n##asian\n##good\n##zano\nbraid\nhandwriting\nsubdistrict\nfunky\npantheon\n##iculate\nconcurrency\nestimation\nimproper\njuliana\n##his\nnewcomers\njohnstone\nstaten\ncommunicated\n##oco\n##alle\nsausage\nstormy\n##stered\n##tters\nsuperfamily\n##grade\nacidic\ncollateral\ntabloid\n##oped\n##rza\nbladder\nausten\n##ellant\nmcgraw\n##hay\nhannibal\nmein\naquino\nlucifer\nwo\nbadger\nboar\ncher\nchristensen\ngreenberg\ninterruption\n##kken\njem\n244\nmocked\nbottoms\ncambridgeshire\n##lide\nsprawling\n##bbly\neastwood\nghent\nsynth\n##buck\nadvisers\n##bah\nnominally\nhapoel\nqu\ndaggers\nestranged\nfabricated\ntowels\nvinnie\nwcw\nmisunderstanding\nanglia\nnothin\nunmistakable\n##dust\n##lova\nchilly\nmarquette\ntruss\n##edge\n##erine\nreece\n##lty\n##chemist\n##connected\n272\n308\n41st\nbash\nraion\nwaterfalls\n##ump\n##main\nlabyrinth\nqueue\ntheorist\n##istle\nbharatiya\nflexed\nsoundtracks\nrooney\nleftist\npatrolling\nwharton\nplainly\nalleviate\neastman\nschuster\ntopographic\nengages\nimmensely\nunbearable\nfairchild\n1620\ndona\nlurking\nparisian\noliveira\nia\nindictment\nhahn\nbangladeshi\n##aster\nvivo\n##uming\n##ential\nantonia\nexpects\nindoors\nkildare\nharlan\n##logue\n##ogenic\n##sities\nforgiven\n##wat\nchildish\ntavi\n##mide\n##orra\nplausible\ngrimm\nsuccessively\nscooted\n##bola\n##dget\n##rith\nspartans\nemery\nflatly\nazure\nepilogue\n##wark\nflourish\n##iny\n##tracted\n##overs\n##oshi\nbestseller\ndistressed\nreceipt\nspitting\nhermit\ntopological\n##cot\ndrilled\nsubunit\nfrancs\n##layer\neel\n##fk\n##itas\noctopus\nfootprint\npetitions\nufo\n##say\n##foil\ninterfering\nleaking\npalo\n##metry\nthistle\nvaliant\n##pic\nnarayan\nmcpherson\n##fast\ngonzales\n##ym\n##enne\ndustin\nnovgorod\nsolos\n##zman\ndoin\n##raph\n##patient\n##meyer\nsoluble\nashland\ncuffs\ncarole\npendleton\nwhistling\nvassal\n##river\ndeviation\nrevisited\nconstit
uents\nrallied\nrotate\nloomed\n##eil\n##nting\namateurs\naugsburg\nauschwitz\ncrowns\nskeletons\n##cona\nbonnet\n257\ndummy\nglobalization\nsimeon\nsleeper\nmandal\ndifferentiated\n##crow\n##mare\nmilne\nbundled\nexasperated\ntalmud\nowes\nsegregated\n##feng\n##uary\ndentist\npiracy\nprops\n##rang\ndevlin\n##torium\nmalicious\npaws\n##laid\ndependency\n##ergy\n##fers\n##enna\n258\npistons\nrourke\njed\ngrammatical\ntres\nmaha\nwig\n512\nghostly\njayne\n##achal\n##creen\n##ilis\n##lins\n##rence\ndesignate\n##with\narrogance\ncambodian\nclones\nshowdown\nthrottle\ntwain\n##ception\nlobes\nmetz\nnagoya\n335\nbraking\n##furt\n385\nroaming\n##minster\namin\ncrippled\n##37\n##llary\nindifferent\nhoffmann\nidols\nintimidating\n1751\n261\ninfluenza\nmemo\nonions\n1748\nbandage\nconsciously\n##landa\n##rage\nclandestine\nobserves\nswiped\ntangle\n##ener\n##jected\n##trum\n##bill\n##lta\nhugs\ncongresses\njosiah\nspirited\n##dek\nhumanist\nmanagerial\nfilmmaking\ninmate\nrhymes\ndebuting\ngrimsby\nur\n##laze\nduplicate\nvigor\n##tf\nrepublished\nbolshevik\nrefurbishment\nantibiotics\nmartini\nmethane\nnewscasts\nroyale\nhorizons\nlevant\niain\nvisas\n##ischen\npaler\n##around\nmanifestation\nsnuck\nalf\nchop\nfutile\npedestal\nrehab\n##kat\nbmg\nkerman\nres\nfairbanks\njarrett\nabstraction\nsaharan\n##zek\n1746\nprocedural\nclearer\nkincaid\nsash\nluciano\n##ffey\ncrunch\nhelmut\n##vara\nrevolutionaries\n##tute\ncreamy\nleach\n##mmon\n1747\npermitting\nnes\nplight\nwendell\n##lese\ncontra\nts\nclancy\nipa\nmach\nstaples\nautopsy\ndisturbances\nnueva\nkarin\npontiac\n##uding\nproxy\nvenerable\nhaunt\nleto\nbergman\nexpands\n##helm\nwal\n##pipe\ncanning\nceline\ncords\nobesity\n##enary\nintrusion\nplanner\n##phate\nreasoned\nsequencing\n307\nharrow\n##chon\n##dora\nmarred\nmcintyre\nrepay\ntarzan\ndarting\n248\nharrisburg\nmargarita\nrepulsed\n##hur\n##lding\nbelinda\nhamburger\nnovo\ncompliant\nrunways\nbingham\nregistrar\nskyscraper\nic\ncuthbert\nimprovisation\nlivelihood\n
##corp\n##elial\nadmiring\n##dened\nsporadic\nbeliever\ncasablanca\npopcorn\n##29\nasha\nshovel\n##bek\n##dice\ncoiled\ntangible\n##dez\ncasper\nelsie\nresin\ntenderness\nrectory\n##ivision\navail\nsonar\n##mori\nboutique\n##dier\nguerre\nbathed\nupbringing\nvaulted\nsandals\nblessings\n##naut\n##utnant\n1680\n306\nfoxes\npia\ncorrosion\nhesitantly\nconfederates\ncrystalline\nfootprints\nshapiro\ntirana\nvalentin\ndrones\n45th\nmicroscope\nshipments\ntexted\ninquisition\nwry\nguernsey\nunauthorized\nresigning\n760\nripple\nschubert\nstu\nreassure\nfelony\n##ardo\nbrittle\nkoreans\n##havan\n##ives\ndun\nimplicit\ntyres\n##aldi\n##lth\nmagnolia\n##ehan\n##puri\n##poulos\naggressively\nfei\ngr\nfamiliarity\n##poo\nindicative\n##trust\nfundamentally\njimmie\noverrun\n395\nanchors\nmoans\n##opus\nbritannia\narmagh\n##ggle\npurposely\nseizing\n##vao\nbewildered\nmundane\navoidance\ncosmopolitan\ngeometridae\nquartermaster\ncaf\n415\nchatter\nengulfed\ngleam\npurge\n##icate\njuliette\njurisprudence\nguerra\nrevisions\n##bn\ncasimir\nbrew\n##jm\n1749\nclapton\ncloudy\nconde\nhermitage\n278\nsimulations\ntorches\nvincenzo\nmatteo\n##rill\nhidalgo\nbooming\nwestbound\naccomplishment\ntentacles\nunaffected\n##sius\nannabelle\nflopped\nsloping\n##litz\ndreamer\ninterceptor\nvu\n##loh\nconsecration\ncopying\nmessaging\nbreaker\nclimates\nhospitalized\n1752\ntorino\nafternoons\nwinfield\nwitnessing\n##teacher\nbreakers\nchoirs\nsawmill\ncoldly\n##ege\nsipping\nhaste\nuninhabited\nconical\nbibliography\npamphlets\nsevern\nedict\n##oca\ndeux\nillnesses\ngrips\n##pl\nrehearsals\nsis\nthinkers\ntame\n##keepers\n1690\nacacia\nreformer\n##osed\n##rys\nshuffling\n##iring\n##shima\neastbound\nionic\nrhea\nflees\nlittered\n##oum\nrocker\nvomiting\ngroaning\nchamp\noverwhelmingly\ncivilizations\npaces\nsloop\nadoptive\n##tish\nskaters\n##vres\naiding\nmango\n##joy\nnikola\nshriek\n##ignon\npharmaceuticals\n##mg\ntuna\ncalvert\ngustavo\nstocked\nyearbook\n##urai\n##mana\ncomputed\nsubsp\nri
ff\nhanoi\nkelvin\nhamid\nmoors\npastures\nsummons\njihad\nnectar\n##ctors\nbayou\nuntitled\npleasing\nvastly\nrepublics\nintellect\n##η\n##ulio\n##tou\ncrumbling\nstylistic\nsb\n##ی\nconsolation\nfrequented\nh₂o\nwalden\nwidows\n##iens\n404\n##ignment\nchunks\nimproves\n288\ngrit\nrecited\n##dev\nsnarl\nsociological\n##arte\n##gul\ninquired\n##held\nbruise\nclube\nconsultancy\nhomogeneous\nhornets\nmultiplication\npasta\nprick\nsavior\n##grin\n##kou\n##phile\nyoon\n##gara\ngrimes\nvanishing\ncheering\nreacting\nbn\ndistillery\n##quisite\n##vity\ncoe\ndockyard\nmassif\n##jord\nescorts\nvoss\n##valent\nbyte\nchopped\nhawke\nillusions\nworkings\nfloats\n##koto\n##vac\nkv\nannapolis\nmadden\n##onus\nalvaro\nnoctuidae\n##cum\n##scopic\navenge\nsteamboat\nforte\nillustrates\nerika\n##trip\n570\ndew\nnationalities\nbran\nmanifested\nthirsty\ndiversified\nmuscled\nreborn\n##standing\narson\n##lessness\n##dran\n##logram\n##boys\n##kushima\n##vious\nwilloughby\n##phobia\n286\nalsace\ndashboard\nyuki\n##chai\ngranville\nmyspace\npublicized\ntricked\n##gang\nadjective\n##ater\nrelic\nreorganisation\nenthusiastically\nindications\nsaxe\n##lassified\nconsolidate\niec\npadua\nhelplessly\nramps\nrenaming\nregulars\npedestrians\naccents\nconvicts\ninaccurate\nlowers\nmana\n##pati\nbarrie\nbjp\noutta\nsomeplace\nberwick\nflanking\ninvoked\nmarrow\nsparsely\nexcerpts\nclothed\nrei\n##ginal\nwept\n##straße\n##vish\nalexa\nexcel\n##ptive\nmembranes\naquitaine\ncreeks\ncutler\nsheppard\nimplementations\nns\n##dur\nfragrance\nbudge\nconcordia\nmagnesium\nmarcelo\n##antes\ngladly\nvibrating\n##rral\n##ggles\nmontrose\n##omba\nlew\nseamus\n1630\ncocky\n##ament\n##uen\nbjorn\n##rrick\nfielder\nfluttering\n##lase\nmethyl\nkimberley\nmcdowell\nreductions\nbarbed\n##jic\n##tonic\naeronautical\ncondensed\ndistracting\n##promising\nhuffed\n##cala\n##sle\nclaudius\ninvincible\nmissy\npious\nbalthazar\nci\n##lang\nbutte\ncombo\norson\n##dication\nmyriad\n1707\nsilenced\n##fed\n##rh\ncoco\nnetball\
nyourselves\n##oza\nclarify\nheller\npeg\ndurban\netudes\noffender\nroast\nblackmail\ncurvature\n##woods\nvile\n309\nillicit\nsuriname\n##linson\noverture\n1685\nbubbling\ngymnast\ntucking\n##mming\n##ouin\nmaldives\n##bala\ngurney\n##dda\n##eased\n##oides\nbackside\npinto\njars\nracehorse\ntending\n##rdial\nbaronetcy\nwiener\nduly\n##rke\nbarbarian\ncupping\nflawed\n##thesis\nbertha\npleistocene\npuddle\nswearing\n##nob\n##tically\nfleeting\nprostate\namulet\neducating\n##mined\n##iti\n##tler\n75th\njens\nrespondents\nanalytics\ncavaliers\npapacy\nraju\n##iente\n##ulum\n##tip\nfunnel\n271\ndisneyland\n##lley\nsociologist\n##iam\n2500\nfaulkner\nlouvre\nmenon\n##dson\n276\n##ower\nafterlife\nmannheim\npeptide\nreferees\ncomedians\nmeaningless\n##anger\n##laise\nfabrics\nhurley\nrenal\nsleeps\n##bour\n##icle\nbreakout\nkristin\nroadside\nanimator\nclover\ndisdain\nunsafe\nredesign\n##urity\nfirth\nbarnsley\nportage\nreset\nnarrows\n268\ncommandos\nexpansive\nspeechless\ntubular\n##lux\nessendon\neyelashes\nsmashwords\n##yad\n##bang\n##claim\ncraved\nsprinted\nchet\nsomme\nastor\nwrocław\norton\n266\nbane\n##erving\n##uing\nmischief\n##amps\n##sund\nscaling\nterre\n##xious\nimpairment\noffenses\nundermine\nmoi\nsoy\ncontiguous\narcadia\ninuit\nseam\n##tops\nmacbeth\nrebelled\n##icative\n##iot\n590\nelaborated\nfrs\nuniformed\n##dberg\n259\npowerless\npriscilla\nstimulated\n980\nqc\narboretum\nfrustrating\ntrieste\nbullock\n##nified\nenriched\nglistening\nintern\n##adia\nlocus\nnouvelle\nollie\nike\nlash\nstarboard\nee\ntapestry\nheadlined\nhove\nrigged\n##vite\npollock\n##yme\nthrive\nclustered\ncas\nroi\ngleamed\nolympiad\n##lino\npressured\nregimes\n##hosis\n##lick\nripley\n##ophone\nkickoff\ngallon\nrockwell\n##arable\ncrusader\nglue\nrevolutions\nscrambling\n1714\ngrover\n##jure\nenglishman\naztec\n263\ncontemplating\ncoven\nipad\npreach\ntriumphant\ntufts\n##esian\nrotational\n##phus\n328\nfalkland\n##brates\nstrewn\nclarissa\nrejoin\nenvironmentally\nglint\nband
ed\ndrenched\nmoat\nalbanians\njohor\nrr\nmaestro\nmalley\nnouveau\nshaded\ntaxonomy\nv6\nadhere\nbunk\nairfields\n##ritan\n1741\nencompass\nremington\ntran\n##erative\namelie\nmazda\nfriar\nmorals\npassions\n##zai\nbreadth\nvis\n##hae\nargus\nburnham\ncaressing\ninsider\nrudd\n##imov\n##mini\n##rso\nitalianate\nmurderous\ntextual\nwainwright\narmada\nbam\nweave\ntimer\n##taken\n##nh\nfra\n##crest\nardent\nsalazar\ntaps\ntunis\n##ntino\nallegro\ngland\nphilanthropic\n##chester\nimplication\n##optera\nesq\njudas\nnoticeably\nwynn\n##dara\ninched\nindexed\ncrises\nvilliers\nbandit\nroyalties\npatterned\ncupboard\ninterspersed\naccessory\nisla\nkendrick\nentourage\nstitches\n##esthesia\nheadwaters\n##ior\ninterlude\ndistraught\ndraught\n1727\n##basket\nbiased\nsy\ntransient\ntriad\nsubgenus\nadapting\nkidd\nshortstop\n##umatic\ndimly\nspiked\nmcleod\nreprint\nnellie\npretoria\nwindmill\n##cek\nsingled\n##mps\n273\nreunite\n##orous\n747\nbankers\noutlying\n##omp\n##ports\n##tream\napologies\ncosmetics\npatsy\n##deh\n##ocks\n##yson\nbender\nnantes\nserene\n##nad\nlucha\nmmm\n323\n##cius\n##gli\ncmll\ncoinage\nnestor\njuarez\n##rook\nsmeared\nsprayed\ntwitching\nsterile\nirina\nembodied\njuveniles\nenveloped\nmiscellaneous\ncancers\ndq\ngulped\nluisa\ncrested\nswat\ndonegal\nref\n##anov\n##acker\nhearst\nmercantile\n##lika\ndoorbell\nua\nvicki\n##alla\n##som\nbilbao\npsychologists\nstryker\nsw\nhorsemen\nturkmenistan\nwits\n##national\nanson\nmathew\nscreenings\n##umb\nrihanna\n##agne\n##nessy\naisles\n##iani\n##osphere\nhines\nkenton\nsaskatoon\ntasha\ntruncated\n##champ\n##itan\nmildred\nadvises\nfredrik\ninterpreting\ninhibitors\n##athi\nspectroscopy\n##hab\n##kong\nkarim\npanda\n##oia\n##nail\n##vc\nconqueror\nkgb\nleukemia\n##dity\narrivals\ncheered\npisa\nphosphorus\nshielded\n##riated\nmammal\nunitarian\nurgently\nchopin\nsanitary\n##mission\nspicy\ndrugged\nhinges\n##tort\ntipping\ntrier\nimpoverished\nwestchester\n##caster\n267\nepoch\nnonstop\n##gman\n##khov\nar
omatic\ncentrally\ncerro\n##tively\n##vio\nbillions\nmodulation\nsedimentary\n283\nfacilitating\noutrageous\ngoldstein\n##eak\n##kt\nld\nmaitland\npenultimate\npollard\n##dance\nfleets\nspaceship\nvertebrae\n##nig\nalcoholism\nals\nrecital\n##bham\n##ference\n##omics\nm2\n##bm\ntrois\n##tropical\n##в\ncommemorates\n##meric\nmarge\n##raction\n1643\n670\ncosmetic\nravaged\n##ige\ncatastrophe\neng\n##shida\nalbrecht\narterial\nbellamy\ndecor\nharmon\n##rde\nbulbs\nsynchronized\nvito\neasiest\nshetland\nshielding\nwnba\n##glers\n##ssar\n##riam\nbrianna\ncumbria\n##aceous\n##rard\ncores\nthayer\n##nsk\nbrood\nhilltop\nluminous\ncarts\nkeynote\nlarkin\nlogos\n##cta\n##ا\n##mund\n##quay\nlilith\ntinted\n277\nwrestle\nmobilization\n##uses\nsequential\nsiam\nbloomfield\ntakahashi\n274\n##ieving\npresenters\nringo\nblazed\nwitty\n##oven\n##ignant\ndevastation\nhaydn\nharmed\nnewt\ntherese\n##peed\ngershwin\nmolina\nrabbis\nsudanese\n001\ninnate\nrestarted\n##sack\n##fus\nslices\nwb\n##shah\nenroll\nhypothetical\nhysterical\n1743\nfabio\nindefinite\nwarped\n##hg\nexchanging\n525\nunsuitable\n##sboro\ngallo\n1603\nbret\ncobalt\nhomemade\n##hunter\nmx\noperatives\n##dhar\nterraces\ndurable\nlatch\npens\nwhorls\n##ctuated\n##eaux\nbilling\nligament\nsuccumbed\n##gly\nregulators\nspawn\n##brick\n##stead\nfilmfare\nrochelle\n##nzo\n1725\ncircumstance\nsaber\nsupplements\n##nsky\n##tson\ncrowe\nwellesley\ncarrot\n##9th\n##movable\nprimate\ndrury\nsincerely\ntopical\n##mad\n##rao\ncallahan\nkyiv\nsmarter\ntits\nundo\n##yeh\nannouncements\nanthologies\nbarrio\nnebula\n##islaus\n##shaft\n##tyn\nbodyguards\n2021\nassassinate\nbarns\nemmett\nscully\n##mah\n##yd\n##eland\n##tino\n##itarian\ndemoted\ngorman\nlashed\nprized\nadventist\nwrit\n##gui\nalla\ninvertebrates\n##ausen\n1641\namman\n1742\nalign\nhealy\nredistribution\n##gf\n##rize\ninsulation\n##drop\nadherents\nhezbollah\nvitro\nferns\nyanking\n269\nphp\nregistering\nuppsala\ncheerleading\nconfines\nmischievous\ntully\n##ross\n49th
\ndocked\nroam\nstipulated\npumpkin\n##bry\nprompt\n##ezer\nblindly\nshuddering\ncraftsmen\nfrail\nscented\nkatharine\nscramble\nshaggy\nsponge\nhelix\nzaragoza\n279\n##52\n43rd\nbacklash\nfontaine\nseizures\nposse\ncowan\nnonfiction\ntelenovela\nwwii\nhammered\nundone\n##gpur\nencircled\nirs\n##ivation\nartefacts\noneself\nsearing\nsmallpox\n##belle\n##osaurus\nshandong\nbreached\nupland\nblushing\nrankin\ninfinitely\npsyche\ntolerated\ndocking\nevicted\n##col\nunmarked\n##lving\ngnome\nlettering\nlitres\nmusique\n##oint\nbenevolent\n##jal\nblackened\n##anna\nmccall\nracers\ntingle\n##ocene\n##orestation\nintroductions\nradically\n292\n##hiff\n##باد\n1610\n1739\nmunchen\nplead\n##nka\ncondo\nscissors\n##sight\n##tens\napprehension\n##cey\n##yin\nhallmark\nwatering\nformulas\nsequels\n##llas\naggravated\nbae\ncommencing\n##building\nenfield\nprohibits\nmarne\nvedic\ncivilized\neuclidean\njagger\nbeforehand\nblasts\ndumont\n##arney\n##nem\n740\nconversions\nhierarchical\nrios\nsimulator\n##dya\n##lellan\nhedges\noleg\nthrusts\nshadowed\ndarby\nmaximize\n1744\ngregorian\n##nded\n##routed\nsham\nunspecified\n##hog\nemory\nfactual\n##smo\n##tp\nfooled\n##rger\nortega\nwellness\nmarlon\n##oton\n##urance\ncasket\nkeating\nley\nenclave\n##ayan\nchar\ninfluencing\njia\n##chenko\n412\nammonia\nerebidae\nincompatible\nviolins\ncornered\n##arat\ngrooves\nastronauts\ncolumbian\nrampant\nfabrication\nkyushu\nmahmud\nvanish\n##dern\nmesopotamia\n##lete\nict\n##rgen\ncaspian\nkenji\npitted\n##vered\n999\ngrimace\nroanoke\ntchaikovsky\ntwinned\n##analysis\n##awan\nxinjiang\narias\nclemson\nkazakh\nsizable\n1662\n##khand\n##vard\nplunge\ntatum\nvittorio\n##nden\ncholera\n##dana\n##oper\nbracing\nindifference\nprojectile\nsuperliga\n##chee\nrealises\nupgrading\n299\nporte\nretribution\n##vies\nnk\nstil\n##resses\nama\nbureaucracy\nblackberry\nbosch\ntestosterone\ncollapses\ngreer\n##pathic\nioc\nfifties\nmalls\n##erved\nbao\nbaskets\nadolescents\nsiegfried\n##osity\n##tosis\nmantra\n
detecting\nexistent\nfledgling\n##cchi\ndissatisfied\ngan\ntelecommunication\nmingled\nsobbed\n6000\ncontroversies\noutdated\ntaxis\n##raus\nfright\nslams\n##lham\n##fect\n##tten\ndetectors\nfetal\ntanned\n##uw\nfray\ngoth\nolympian\nskipping\nmandates\nscratches\nsheng\nunspoken\nhyundai\ntracey\nhotspur\nrestrictive\n##buch\namericana\nmundo\n##bari\nburroughs\ndiva\nvulcan\n##6th\ndistinctions\nthumping\n##ngen\nmikey\nsheds\nfide\nrescues\nspringsteen\nvested\nvaluation\n##ece\n##ely\npinnacle\nrake\nsylvie\n##edo\nalmond\nquivering\n##irus\nalteration\nfaltered\n##wad\n51st\nhydra\nticked\n##kato\nrecommends\n##dicated\nantigua\narjun\nstagecoach\nwilfred\ntrickle\npronouns\n##pon\naryan\nnighttime\n##anian\ngall\npea\nstitch\n##hei\nleung\nmilos\n##dini\neritrea\nnexus\nstarved\nsnowfall\nkant\nparasitic\ncot\ndiscus\nhana\nstrikers\nappleton\nkitchens\n##erina\n##partisan\n##itha\n##vius\ndisclose\nmetis\n##channel\n1701\ntesla\n##vera\nfitch\n1735\nblooded\n##tila\ndecimal\n##tang\n##bai\ncyclones\neun\nbottled\npeas\npensacola\nbasha\nbolivian\ncrabs\nboil\nlanterns\npartridge\nroofed\n1645\nnecks\n##phila\nopined\npatting\n##kla\n##lland\nchuckles\nvolta\nwhereupon\n##nche\ndevout\neuroleague\nsuicidal\n##dee\ninherently\ninvoluntary\nknitting\nnasser\n##hide\npuppets\ncolourful\ncourageous\nsouthend\nstills\nmiraculous\nhodgson\nricher\nrochdale\nethernet\ngreta\nuniting\nprism\numm\n##haya\n##itical\n##utation\ndeterioration\npointe\nprowess\n##ropriation\nlids\nscranton\nbillings\nsubcontinent\n##koff\n##scope\nbrute\nkellogg\npsalms\ndegraded\n##vez\nstanisław\n##ructured\nferreira\npun\nastonishing\ngunnar\n##yat\narya\nprc\ngottfried\n##tight\nexcursion\n##ographer\ndina\n##quil\n##nare\nhuffington\nillustrious\nwilbur\ngundam\nverandah\n##zard\nnaacp\n##odle\nconstructive\nfjord\nkade\n##naud\ngenerosity\nthrilling\nbaseline\ncayman\nfrankish\nplastics\naccommodations\nzoological\n##fting\ncedric\nqb\nmotorized\n##dome\n##otted\nsquealed\ntackled\nc
anucks\nbudgets\nsitu\nasthma\ndail\ngabled\ngrasslands\nwhimpered\nwrithing\njudgments\n##65\nminnie\npv\n##carbon\nbananas\ngrille\ndomes\nmonique\nodin\nmaguire\nmarkham\ntierney\n##estra\n##chua\nlibel\npoke\nspeedy\natrium\nlaval\nnotwithstanding\n##edly\nfai\nkala\n##sur\nrobb\n##sma\nlistings\nluz\nsupplementary\ntianjin\n##acing\nenzo\njd\nric\nscanner\ncroats\ntranscribed\n##49\narden\ncv\n##hair\n##raphy\n##lver\n##uy\n357\nseventies\nstaggering\nalam\nhorticultural\nhs\nregression\ntimbers\nblasting\n##ounded\nmontagu\nmanipulating\n##cit\ncatalytic\n1550\ntroopers\n##meo\ncondemnation\nfitzpatrick\n##oire\n##roved\ninexperienced\n1670\ncastes\n##lative\nouting\n314\ndubois\nflicking\nquarrel\nste\nlearners\n1625\niq\nwhistled\n##class\n282\nclassify\ntariffs\ntemperament\n355\nfolly\nliszt\n##yles\nimmersed\njordanian\nceasefire\napparel\nextras\nmaru\nfished\n##bio\nharta\nstockport\nassortment\ncraftsman\nparalysis\ntransmitters\n##cola\nblindness\n##wk\nfatally\nproficiency\nsolemnly\n##orno\nrepairing\namore\ngroceries\nultraviolet\n##chase\nschoolhouse\n##tua\nresurgence\nnailed\n##otype\n##×\nruse\nsaliva\ndiagrams\n##tructing\nalbans\nrann\nthirties\n1b\nantennas\nhilarious\ncougars\npaddington\nstats\n##eger\nbreakaway\nipod\nreza\nauthorship\nprohibiting\nscoffed\n##etz\n##ttle\nconscription\ndefected\ntrondheim\n##fires\nivanov\nkeenan\n##adan\n##ciful\n##fb\n##slow\nlocating\n##ials\n##tford\ncadiz\nbasalt\nblankly\ninterned\nrags\nrattling\n##tick\ncarpathian\nreassured\nsync\nbum\nguildford\niss\nstaunch\n##onga\nastronomers\nsera\nsofie\nemergencies\nsusquehanna\n##heard\nduc\nmastery\nvh1\nwilliamsburg\nbayer\nbuckled\ncraving\n##khan\n##rdes\nbloomington\n##write\nalton\nbarbecue\n##bians\njustine\n##hri\n##ndt\ndelightful\nsmartphone\nnewtown\nphoton\nretrieval\npeugeot\nhissing\n##monium\n##orough\nflavors\nlighted\nrelaunched\ntainted\n##games\n##lysis\nanarchy\nmicroscopic\nhopping\nadept\nevade\nevie\n##beau\ninhibit\nsinn\nadjustabl
e\nhurst\nintuition\nwilton\ncisco\n44th\nlawful\nlowlands\nstockings\nthierry\n##dalen\n##hila\n##nai\nfates\nprank\ntb\nmaison\nlobbied\nprovocative\n1724\n4a\nutopia\n##qual\ncarbonate\ngujarati\npurcell\n##rford\ncurtiss\n##mei\novergrown\narenas\nmediation\nswallows\n##rnik\nrespectful\nturnbull\n##hedron\n##hope\nalyssa\nozone\n##ʻi\nami\ngestapo\njohansson\nsnooker\ncanteen\ncuff\ndeclines\nempathy\nstigma\n##ags\n##iner\n##raine\ntaxpayers\ngui\nvolga\n##wright\n##copic\nlifespan\novercame\ntattooed\nenactment\ngiggles\n##ador\n##camp\nbarrington\nbribe\nobligatory\norbiting\npeng\n##enas\nelusive\nsucker\n##vating\ncong\nhardship\nempowered\nanticipating\nestrada\ncryptic\ngreasy\ndetainees\nplanck\nsudbury\nplaid\ndod\nmarriott\nkayla\n##ears\n##vb\n##zd\nmortally\n##hein\ncognition\nradha\n319\nliechtenstein\nmeade\nrichly\nargyle\nharpsichord\nliberalism\ntrumpets\nlauded\ntyrant\nsalsa\ntiled\nlear\npromoters\nreused\nslicing\ntrident\n##chuk\n##gami\n##lka\ncantor\ncheckpoint\n##points\ngaul\nleger\nmammalian\n##tov\n##aar\n##schaft\ndoha\nfrenchman\nnirvana\n##vino\ndelgado\nheadlining\n##eron\n##iography\njug\ntko\n1649\nnaga\nintersections\n##jia\nbenfica\nnawab\n##suka\nashford\ngulp\n##deck\n##vill\n##rug\nbrentford\nfrazier\npleasures\ndunne\npotsdam\nshenzhen\ndentistry\n##tec\nflanagan\n##dorff\n##hear\nchorale\ndinah\nprem\nquezon\n##rogated\nrelinquished\nsutra\nterri\n##pani\nflaps\n##rissa\npoly\n##rnet\nhomme\naback\n##eki\nlinger\nwomb\n##kson\n##lewood\ndoorstep\northodoxy\nthreaded\nwestfield\n##rval\ndioceses\nfridays\nsubsided\n##gata\nloyalists\n##biotic\n##ettes\nletterman\nlunatic\nprelate\ntenderly\ninvariably\nsouza\nthug\nwinslow\n##otide\nfurlongs\ngogh\njeopardy\n##runa\npegasus\n##umble\nhumiliated\nstandalone\ntagged\n##roller\nfreshmen\nklan\n##bright\nattaining\ninitiating\ntransatlantic\nlogged\nviz\n##uance\n1723\ncombatants\nintervening\nstephane\nchieftain\ndespised\ngrazed\n317\ncdc\ngalveston\ngodzilla\nmacro\nsimula
te\n##planes\nparades\n##esses\n960\n##ductive\n##unes\nequator\noverdose\n##cans\n##hosh\n##lifting\njoshi\nepstein\nsonora\ntreacherous\naquatics\nmanchu\nresponsive\n##sation\nsupervisory\n##christ\n##llins\n##ibar\n##balance\n##uso\nkimball\nkarlsruhe\nmab\n##emy\nignores\nphonetic\nreuters\nspaghetti\n820\nalmighty\ndanzig\nrumbling\ntombstone\ndesignations\nlured\noutset\n##felt\nsupermarkets\n##wt\ngrupo\nkei\nkraft\nsusanna\n##blood\ncomprehension\ngenealogy\n##aghan\n##verted\nredding\n##ythe\n1722\nbowing\n##pore\n##roi\nlest\nsharpened\nfulbright\nvalkyrie\nsikhs\n##unds\nswans\nbouquet\nmerritt\n##tage\n##venting\ncommuted\nredhead\nclerks\nleasing\ncesare\ndea\nhazy\n##vances\nfledged\ngreenfield\nservicemen\n##gical\narmando\nblackout\ndt\nsagged\ndownloadable\nintra\npotion\npods\n##4th\n##mism\nxp\nattendants\ngambia\nstale\n##ntine\nplump\nasteroids\nrediscovered\nbuds\nflea\nhive\n##neas\n1737\nclassifications\ndebuts\n##eles\nolympus\nscala\n##eurs\n##gno\n##mute\nhummed\nsigismund\nvisuals\nwiggled\nawait\npilasters\nclench\nsulfate\n##ances\nbellevue\nenigma\ntrainee\nsnort\n##sw\nclouded\ndenim\n##rank\n##rder\nchurning\nhartman\nlodges\nriches\nsima\n##missible\naccountable\nsocrates\nregulates\nmueller\n##cr\n1702\navoids\nsolids\nhimalayas\nnutrient\npup\n##jevic\nsquat\nfades\nnec\n##lates\n##pina\n##rona\n##ου\nprivateer\ntequila\n##gative\n##mpton\napt\nhornet\nimmortals\n##dou\nasturias\ncleansing\ndario\n##rries\n##anta\netymology\nservicing\nzhejiang\n##venor\n##nx\nhorned\nerasmus\nrayon\nrelocating\n£10\n##bags\nescalated\npromenade\nstubble\n2010s\nartisans\naxial\nliquids\nmora\nsho\nyoo\n##tsky\nbundles\noldies\n##nally\nnotification\nbastion\n##ths\nsparkle\n##lved\n1728\nleash\npathogen\nhighs\n##hmi\nimmature\n880\ngonzaga\nignatius\nmansions\nmonterrey\nsweets\nbryson\n##loe\npolled\nregatta\nbrightest\npei\nrosy\nsquid\nhatfield\npayroll\naddict\nmeath\ncornerback\nheaviest\nlodging\n##mage\ncapcom\nrippled\n##sily\nbarnet\nm
ayhem\nymca\nsnuggled\nrousseau\n##cute\nblanchard\n284\nfragmented\nleighton\nchromosomes\nrisking\n##md\n##strel\n##utter\ncorinne\ncoyotes\ncynical\nhiroshi\nyeomanry\n##ractive\nebook\ngrading\nmandela\nplume\nagustin\nmagdalene\n##rkin\nbea\nfemme\ntrafford\n##coll\n##lun\n##tance\n52nd\nfourier\nupton\n##mental\ncamilla\ngust\niihf\nislamabad\nlongevity\n##kala\nfeldman\nnetting\n##rization\nendeavour\nforaging\nmfa\norr\n##open\ngreyish\ncontradiction\ngraz\n##ruff\nhandicapped\nmarlene\ntweed\noaxaca\nspp\ncampos\nmiocene\npri\nconfigured\ncooks\npluto\ncozy\npornographic\n##entes\n70th\nfairness\nglided\njonny\nlynne\nrounding\nsired\n##emon\n##nist\nremade\nuncover\n##mack\ncomplied\nlei\nnewsweek\n##jured\n##parts\n##enting\n##pg\n293\nfiner\nguerrillas\nathenian\ndeng\ndisused\nstepmother\naccuse\ngingerly\nseduction\n521\nconfronting\n##walker\n##going\ngora\nnostalgia\nsabres\nvirginity\nwrenched\n##minated\nsyndication\nwielding\neyre\n##56\n##gnon\n##igny\nbehaved\ntaxpayer\nsweeps\n##growth\nchildless\ngallant\n##ywood\namplified\ngeraldine\nscrape\n##ffi\nbabylonian\nfresco\n##rdan\n##kney\n##position\n1718\nrestricting\ntack\nfukuoka\nosborn\nselector\npartnering\n##dlow\n318\ngnu\nkia\ntak\nwhitley\ngables\n##54\n##mania\nmri\nsoftness\nimmersion\n##bots\n##evsky\n1713\nchilling\ninsignificant\npcs\n##uis\nelites\nlina\npurported\nsupplemental\nteaming\n##americana\n##dding\n##inton\nproficient\nrouen\n##nage\n##rret\nniccolo\nselects\n##bread\nfluffy\n1621\ngruff\nknotted\nmukherjee\npolgara\nthrash\nnicholls\nsecluded\nsmoothing\nthru\ncorsica\nloaf\nwhitaker\ninquiries\n##rrier\n##kam\nindochina\n289\nmarlins\nmyles\npeking\n##tea\nextracts\npastry\nsuperhuman\nconnacht\nvogel\n##ditional\n##het\n##udged\n##lash\ngloss\nquarries\nrefit\nteaser\n##alic\n##gaon\n20s\nmaterialized\nsling\ncamped\npickering\ntung\ntracker\npursuant\n##cide\ncranes\nsoc\n##cini\n##typical\n##viere\nanhalt\noverboard\nworkout\nchores\nfares\norphaned\nstains\n##logi
e\nfenton\nsurpassing\njoyah\ntriggers\n##itte\ngrandmaster\n##lass\n##lists\nclapping\nfraudulent\nledger\nnagasaki\n##cor\n##nosis\n##tsa\neucalyptus\ntun\n##icio\n##rney\n##tara\ndax\nheroism\nina\nwrexham\nonboard\nunsigned\n##dates\nmoshe\ngalley\nwinnie\ndroplets\nexiles\npraises\nwatered\nnoodles\n##aia\nfein\nadi\nleland\nmulticultural\nstink\nbingo\ncomets\nerskine\nmodernized\ncanned\nconstraint\ndomestically\nchemotherapy\nfeatherweight\nstifled\n##mum\ndarkly\nirresistible\nrefreshing\nhasty\nisolate\n##oys\nkitchener\nplanners\n##wehr\ncages\nyarn\nimplant\ntoulon\nelects\nchildbirth\nyue\n##lind\n##lone\ncn\nrightful\nsportsman\njunctions\nremodeled\nspecifies\n##rgh\n291\n##oons\ncomplimented\n##urgent\nlister\not\n##logic\nbequeathed\ncheekbones\nfontana\ngabby\n##dial\namadeus\ncorrugated\nmaverick\nresented\ntriangles\n##hered\n##usly\nnazareth\ntyrol\n1675\nassent\npoorer\nsectional\naegean\n##cous\n296\nnylon\nghanaian\n##egorical\n##weig\ncushions\nforbid\nfusiliers\nobstruction\nsomerville\n##scia\ndime\nearrings\nelliptical\nleyte\noder\npolymers\ntimmy\natm\nmidtown\npiloted\nsettles\ncontinual\nexternally\nmayfield\n##uh\nenrichment\nhenson\nkeane\npersians\n1733\nbenji\nbraden\npep\n324\n##efe\ncontenders\npepsi\nvalet\n##isches\n298\n##asse\n##earing\ngoofy\nstroll\n##amen\nauthoritarian\noccurrences\nadversary\nahmedabad\ntangent\ntoppled\ndorchester\n1672\nmodernism\nmarxism\nislamist\ncharlemagne\nexponential\nracks\nunicode\nbrunette\nmbc\npic\nskirmish\n##bund\n##lad\n##powered\n##yst\nhoisted\nmessina\nshatter\n##ctum\njedi\nvantage\n##music\n##neil\nclemens\nmahmoud\ncorrupted\nauthentication\nlowry\nnils\n##washed\nomnibus\nwounding\njillian\n##itors\n##opped\nserialized\nnarcotics\nhandheld\n##arm\n##plicity\nintersecting\nstimulating\n##onis\ncrate\nfellowships\nhemingway\ncasinos\nclimatic\nfordham\ncopeland\ndrip\nbeatty\nleaflets\nrobber\nbrothel\nmadeira\n##hedral\nsphinx\nultrasound\n##vana\nvalor\nforbade\nleonid\nvillas\n#
#aldo\nduane\nmarquez\n##cytes\ndisadvantaged\nforearms\nkawasaki\nreacts\nconsular\nlax\nuncles\nuphold\n##hopper\nconcepcion\ndorsey\nlass\n##izan\narching\npassageway\n1708\nresearches\ntia\ninternationals\n##graphs\n##opers\ndistinguishes\njavanese\ndivert\n##uven\nplotted\n##listic\n##rwin\n##erik\n##tify\naffirmative\nsignifies\nvalidation\n##bson\nkari\nfelicity\ngeorgina\nzulu\n##eros\n##rained\n##rath\novercoming\n##dot\nargyll\n##rbin\n1734\nchiba\nratification\nwindy\nearls\nparapet\n##marks\nhunan\npristine\nastrid\npunta\n##gart\nbrodie\n##kota\n##oder\nmalaga\nminerva\nrouse\n##phonic\nbellowed\npagoda\nportals\nreclamation\n##gur\n##odies\n##⁄₄\nparentheses\nquoting\nallergic\npalette\nshowcases\nbenefactor\nheartland\nnonlinear\n##tness\nbladed\ncheerfully\nscans\n##ety\n##hone\n1666\ngirlfriends\npedersen\nhiram\nsous\n##liche\n##nator\n1683\n##nery\n##orio\n##umen\nbobo\nprimaries\nsmiley\n##cb\nunearthed\nuniformly\nfis\nmetadata\n1635\nind\n##oted\nrecoil\n##titles\n##tura\n##ια\n406\nhilbert\njamestown\nmcmillan\ntulane\nseychelles\n##frid\nantics\ncoli\nfated\nstucco\n##grants\n1654\nbulky\naccolades\narrays\ncaledonian\ncarnage\noptimism\npuebla\n##tative\n##cave\nenforcing\nrotherham\nseo\ndunlop\naeronautics\nchimed\nincline\nzoning\narchduke\nhellenistic\n##oses\n##sions\ncandi\nthong\n##ople\nmagnate\nrustic\n##rsk\nprojective\nslant\n##offs\ndanes\nhollis\nvocalists\n##ammed\ncongenital\ncontend\ngesellschaft\n##ocating\n##pressive\ndouglass\nquieter\n##cm\n##kshi\nhowled\nsalim\nspontaneously\ntownsville\nbuena\nsouthport\n##bold\nkato\n1638\nfaerie\nstiffly\n##vus\n##rled\n297\nflawless\nrealising\ntaboo\n##7th\nbytes\nstraightening\n356\njena\n##hid\n##rmin\ncartwright\nberber\nbertram\nsoloists\n411\nnoses\n417\ncoping\nfission\nhardin\ninca\n##cen\n1717\nmobilized\nvhf\n##raf\nbiscuits\ncurate\n##85\n##anial\n331\ngaunt\nneighbourhoods\n1540\n##abas\nblanca\nbypassed\nsockets\nbehold\ncoincidentally\n##bane\nnara\nshave\nsplinter\nte
rrific\n##arion\n##erian\ncommonplace\njuris\nredwood\nwaistband\nboxed\ncaitlin\nfingerprints\njennie\nnaturalized\n##ired\nbalfour\ncraters\njody\nbungalow\nhugely\nquilt\nglitter\npigeons\nundertaker\nbulging\nconstrained\ngoo\n##sil\n##akh\nassimilation\nreworked\n##person\npersuasion\n##pants\nfelicia\n##cliff\n##ulent\n1732\nexplodes\n##dun\n##inium\n##zic\nlyman\nvulture\nhog\noverlook\nbegs\nnorthwards\now\nspoil\n##urer\nfatima\nfavorably\naccumulate\nsargent\nsorority\ncorresponded\ndispersal\nkochi\ntoned\n##imi\n##lita\ninternacional\nnewfound\n##agger\n##lynn\n##rigue\nbooths\npeanuts\n##eborg\nmedicare\nmuriel\nnur\n##uram\ncrates\nmillennia\npajamas\nworsened\n##breakers\njimi\nvanuatu\nyawned\n##udeau\ncarousel\n##hony\nhurdle\n##ccus\n##mounted\n##pod\nrv\n##eche\nairship\nambiguity\ncompulsion\nrecapture\n##claiming\narthritis\n##osomal\n1667\nasserting\nngc\nsniffing\ndade\ndiscontent\nglendale\nported\n##amina\ndefamation\nrammed\n##scent\nfling\nlivingstone\n##fleet\n875\n##ppy\napocalyptic\ncomrade\nlcd\n##lowe\ncessna\neine\npersecuted\nsubsistence\ndemi\nhoop\nreliefs\n710\ncoptic\nprogressing\nstemmed\nperpetrators\n1665\npriestess\n##nio\ndobson\nebony\nrooster\nitf\ntortricidae\n##bbon\n##jian\ncleanup\n##jean\n##øy\n1721\neighties\ntaxonomic\nholiness\n##hearted\n##spar\nantilles\nshowcasing\nstabilized\n##nb\ngia\nmascara\nmichelangelo\ndawned\n##uria\n##vinsky\nextinguished\nfitz\ngrotesque\n£100\n##fera\n##loid\n##mous\nbarges\nneue\nthrobbed\ncipher\njohnnie\n##a1\n##mpt\noutburst\n##swick\nspearheaded\nadministrations\nc1\nheartbreak\npixels\npleasantly\n##enay\nlombardy\nplush\n##nsed\nbobbie\n##hly\nreapers\ntremor\nxiang\nminogue\nsubstantive\nhitch\nbarak\n##wyl\nkwan\n##encia\n910\nobscene\nelegance\nindus\nsurfer\nbribery\nconserve\n##hyllum\n##masters\nhoratio\n##fat\napes\nrebound\npsychotic\n##pour\niteration\n##mium\n##vani\nbotanic\nhorribly\nantiques\ndispose\npaxton\n##hli\n##wg\ntimeless\n1704\ndisregard\nengraver\nhoun
ds\n##bau\n##version\nlooted\nuno\nfacilitates\ngroans\nmasjid\nrutland\nantibody\ndisqualification\ndecatur\nfootballers\nquake\nslacks\n48th\nrein\nscribe\nstabilize\ncommits\nexemplary\ntho\n##hort\n##chison\npantry\ntraversed\n##hiti\ndisrepair\nidentifiable\nvibrated\nbaccalaureate\n##nnis\ncsa\ninterviewing\n##iensis\n##raße\ngreaves\nwealthiest\n343\nclassed\njogged\n£5\n##58\n##atal\nilluminating\nknicks\nrespecting\n##uno\nscrubbed\n##iji\n##dles\nkruger\nmoods\ngrowls\nraider\nsilvia\nchefs\nkam\nvr\ncree\npercival\n##terol\ngunter\ncounterattack\ndefiant\nhenan\nze\n##rasia\n##riety\nequivalence\nsubmissions\n##fra\n##thor\nbautista\nmechanically\n##heater\ncornice\nherbal\ntemplar\n##mering\noutputs\nruining\nligand\nrenumbered\nextravagant\nmika\nblockbuster\neta\ninsurrection\n##ilia\ndarkening\nferocious\npianos\nstrife\nkinship\n##aer\nmelee\n##anor\n##iste\n##may\n##oue\ndecidedly\nweep\n##jad\n##missive\n##ppel\n354\npuget\nunease\n##gnant\n1629\nhammering\nkassel\nob\nwessex\n##lga\nbromwich\negan\nparanoia\nutilization\n##atable\n##idad\ncontradictory\nprovoke\n##ols\n##ouring\n##tangled\nknesset\n##very\n##lette\nplumbing\n##sden\n##¹\ngreensboro\noccult\nsniff\n338\nzev\nbeaming\ngamer\nhaggard\nmahal\n##olt\n##pins\nmendes\nutmost\nbriefing\ngunnery\n##gut\n##pher\n##zh\n##rok\n1679\nkhalifa\nsonya\n##boot\nprincipals\nurbana\nwiring\n##liffe\n##minating\n##rrado\ndahl\nnyu\nskepticism\nnp\ntownspeople\nithaca\nlobster\nsomethin\n##fur\n##arina\n##−1\nfreighter\nzimmerman\nbiceps\ncontractual\n##herton\namend\nhurrying\nsubconscious\n##anal\n336\nmeng\nclermont\nspawning\n##eia\n##lub\ndignitaries\nimpetus\nsnacks\nspotting\ntwigs\n##bilis\n##cz\n##ouk\nlibertadores\nnic\nskylar\n##aina\n##firm\ngustave\nasean\n##anum\ndieter\nlegislatures\nflirt\nbromley\ntrolls\numar\n##bbies\n##tyle\nblah\nparc\nbridgeport\ncrank\nnegligence\n##nction\n46th\nconstantin\nmolded\nbandages\nseriousness\n00pm\nsiegel\ncarpets\ncompartments\nupbeat\nstatehood\n#
#dner\n##edging\nmarko\n730\nplatt\n##hane\npaving\n##iy\n1738\nabbess\nimpatience\nlimousine\nnbl\n##talk\n441\nlucille\nmojo\nnightfall\nrobbers\n##nais\nkarel\nbrisk\ncalves\nreplicate\nascribed\ntelescopes\n##olf\nintimidated\n##reen\nballast\nspecialization\n##sit\naerodynamic\ncaliphate\nrainer\nvisionary\n##arded\nepsilon\n##aday\n##onte\naggregation\nauditory\nboosted\nreunification\nkathmandu\nloco\nrobyn\n402\nacknowledges\nappointing\nhumanoid\nnewell\nredeveloped\nrestraints\n##tained\nbarbarians\nchopper\n1609\nitaliana\n##lez\n##lho\ninvestigates\nwrestlemania\n##anies\n##bib\n690\n##falls\ncreaked\ndragoons\ngravely\nminions\nstupidity\nvolley\n##harat\n##week\nmusik\n##eries\n##uously\nfungal\nmassimo\nsemantics\nmalvern\n##ahl\n##pee\ndiscourage\nembryo\nimperialism\n1910s\nprofoundly\n##ddled\njiangsu\nsparkled\nstat\n##holz\nsweatshirt\ntobin\n##iction\nsneered\n##cheon\n##oit\nbrit\ncausal\nsmyth\n##neuve\ndiffuse\nperrin\nsilvio\n##ipes\n##recht\ndetonated\niqbal\nselma\n##nism\n##zumi\nroasted\n##riders\ntay\n##ados\n##mament\n##mut\n##rud\n840\ncompletes\nnipples\ncfa\nflavour\nhirsch\n##laus\ncalderon\nsneakers\nmoravian\n##ksha\n1622\nrq\n294\n##imeters\nbodo\n##isance\n##pre\n##ronia\nanatomical\nexcerpt\n##lke\ndh\nkunst\n##tablished\n##scoe\nbiomass\npanted\nunharmed\ngael\nhousemates\nmontpellier\n##59\ncoa\nrodents\ntonic\nhickory\nsingleton\n##taro\n451\n1719\naldo\nbreaststroke\ndempsey\noch\nrocco\n##cuit\nmerton\ndissemination\nmidsummer\nserials\n##idi\nhaji\npolynomials\n##rdon\ngs\nenoch\nprematurely\nshutter\ntaunton\n£3\n##grating\n##inates\narchangel\nharassed\n##asco\n326\narchway\ndazzling\n##ecin\n1736\nsumo\nwat\n##kovich\n1086\nhonneur\n##ently\n##nostic\n##ttal\n##idon\n1605\n403\n1716\nblogger\nrents\n##gnan\nhires\n##ikh\n##dant\nhowie\n##rons\nhandler\nretracted\nshocks\n1632\narun\nduluth\nkepler\ntrumpeter\n##lary\npeeking\nseasoned\ntrooper\n##mara\nlaszlo\n##iciencies\n##rti\nheterosexual\n##inatory\n##ssion\nindi
ra\njogging\n##inga\n##lism\nbeit\ndissatisfaction\nmalice\n##ately\nnedra\npeeling\n##rgeon\n47th\nstadiums\n475\nvertigo\n##ains\niced\nrestroom\n##plify\n##tub\nillustrating\npear\n##chner\n##sibility\ninorganic\nrappers\nreceipts\nwatery\n##kura\nlucinda\n##oulos\nreintroduced\n##8th\n##tched\ngracefully\nsaxons\nnutritional\nwastewater\nrained\nfavourites\nbedrock\nfisted\nhallways\nlikeness\nupscale\n##lateral\n1580\nblinds\nprequel\n##pps\n##tama\ndeter\nhumiliating\nrestraining\ntn\nvents\n1659\nlaundering\nrecess\nrosary\ntractors\ncoulter\nfederer\n##ifiers\n##plin\npersistence\n##quitable\ngeschichte\npendulum\nquakers\n##beam\nbassett\npictorial\nbuffet\nkoln\n##sitor\ndrills\nreciprocal\nshooters\n##57\n##cton\n##tees\nconverge\npip\ndmitri\ndonnelly\nyamamoto\naqua\nazores\ndemographics\nhypnotic\nspitfire\nsuspend\nwryly\nroderick\n##rran\nsebastien\n##asurable\nmavericks\n##fles\n##200\nhimalayan\nprodigy\n##iance\ntransvaal\ndemonstrators\nhandcuffs\ndodged\nmcnamara\nsublime\n1726\ncrazed\n##efined\n##till\nivo\npondered\nreconciled\nshrill\nsava\n##duk\nbal\ncad\nheresy\njaipur\ngoran\n##nished\n341\nlux\nshelly\nwhitehall\n##hre\nisraelis\npeacekeeping\n##wled\n1703\ndemetrius\nousted\n##arians\n##zos\nbeale\nanwar\nbackstroke\nraged\nshrinking\ncremated\n##yck\nbenign\ntowing\nwadi\ndarmstadt\nlandfill\nparana\nsoothe\ncolleen\nsidewalks\nmayfair\ntumble\nhepatitis\nferrer\nsuperstructure\n##gingly\n##urse\n##wee\nanthropological\ntranslators\n##mies\ncloseness\nhooves\n##pw\nmondays\n##roll\n##vita\nlandscaping\n##urized\npurification\nsock\nthorns\nthwarted\njalan\ntiberius\n##taka\nsaline\n##rito\nconfidently\nkhyber\nsculptors\n##ij\nbrahms\nhammersmith\ninspectors\nbattista\nfivb\nfragmentation\nhackney\n##uls\narresting\nexercising\nantoinette\nbedfordshire\n##zily\ndyed\n##hema\n1656\nracetrack\nvariability\n##tique\n1655\naustrians\ndeteriorating\nmadman\ntheorists\naix\nlehman\nweathered\n1731\ndecreed\neruptions\n1729\nflaw\nquinlan\ns
orbonne\nflutes\nnunez\n1711\nadored\ndownwards\nfable\nrasped\n1712\nmoritz\nmouthful\nrenegade\nshivers\nstunts\ndysfunction\nrestrain\ntranslit\n327\npancakes\n##avio\n##cision\n##tray\n351\nvial\n##lden\nbain\n##maid\n##oxide\nchihuahua\nmalacca\nvimes\n##rba\n##rnier\n1664\ndonnie\nplaques\n##ually\n337\nbangs\nfloppy\nhuntsville\nloretta\nnikolay\n##otte\neater\nhandgun\nubiquitous\n##hett\neras\nzodiac\n1634\n##omorphic\n1820s\n##zog\ncochran\n##bula\n##lithic\nwarring\n##rada\ndalai\nexcused\nblazers\nmcconnell\nreeling\nbot\neste\n##abi\ngeese\nhoax\ntaxon\n##bla\nguitarists\n##icon\ncondemning\nhunts\ninversion\nmoffat\ntaekwondo\n##lvis\n1624\nstammered\n##rest\n##rzy\nsousa\nfundraiser\nmarylebone\nnavigable\nuptown\ncabbage\ndaniela\nsalman\nshitty\nwhimper\n##kian\n##utive\nprogrammers\nprotections\nrm\n##rmi\n##rued\nforceful\n##enes\nfuss\n##tao\n##wash\nbrat\noppressive\nreykjavik\nspartak\nticking\n##inkles\n##kiewicz\nadolph\nhorst\nmaui\nprotege\nstraighten\ncpc\nlandau\nconcourse\nclements\nresultant\n##ando\nimaginative\njoo\nreactivated\n##rem\n##ffled\n##uising\nconsultative\n##guide\nflop\nkaitlyn\nmergers\nparenting\nsomber\n##vron\nsupervise\nvidhan\n##imum\ncourtship\nexemplified\nharmonies\nmedallist\nrefining\n##rrow\n##ка\namara\n##hum\n780\ngoalscorer\nsited\novershadowed\nrohan\ndispleasure\nsecretive\nmultiplied\nosman\n##orth\nengravings\npadre\n##kali\n##veda\nminiatures\nmis\n##yala\nclap\npali\nrook\n##cana\n1692\n57th\nantennae\nastro\noskar\n1628\nbulldog\ncrotch\nhackett\nyucatan\n##sure\namplifiers\nbrno\nferrara\nmigrating\n##gree\nthanking\nturing\n##eza\nmccann\nting\nandersson\nonslaught\ngaines\nganga\nincense\nstandardization\n##mation\nsentai\nscuba\nstuffing\nturquoise\nwaivers\nalloys\n##vitt\nregaining\nvaults\n##clops\n##gizing\ndigger\nfurry\nmemorabilia\nprobing\n##iad\npayton\nrec\ndeutschland\nfilippo\nopaque\nseamen\nzenith\nafrikaans\n##filtration\ndisciplined\ninspirational\n##merie\nbanco\nconfuse\ngrafton
\ntod\n##dgets\nchampioned\nsimi\nanomaly\nbiplane\n##ceptive\nelectrode\n##para\n1697\ncleavage\ncrossbow\nswirl\ninformant\n##lars\n##osta\nafi\nbonfire\nspec\n##oux\nlakeside\nslump\n##culus\n##lais\n##qvist\n##rrigan\n1016\nfacades\nborg\ninwardly\ncervical\nxl\npointedly\n050\nstabilization\n##odon\nchests\n1699\nhacked\nctv\northogonal\nsuzy\n##lastic\ngaulle\njacobite\nrearview\n##cam\n##erted\nashby\n##drik\n##igate\n##mise\n##zbek\naffectionately\ncanine\ndisperse\nlatham\n##istles\n##ivar\nspielberg\n##orin\n##idium\nezekiel\ncid\n##sg\ndurga\nmiddletown\n##cina\ncustomized\nfrontiers\nharden\n##etano\n##zzy\n1604\nbolsheviks\n##66\ncoloration\nyoko\n##bedo\nbriefs\nslabs\ndebra\nliquidation\nplumage\n##oin\nblossoms\ndementia\nsubsidy\n1611\nproctor\nrelational\njerseys\nparochial\nter\n##ici\nesa\npeshawar\ncavalier\nloren\ncpi\nidiots\nshamrock\n1646\ndutton\nmalabar\nmustache\n##endez\n##ocytes\nreferencing\nterminates\nmarche\nyarmouth\n##sop\nacton\nmated\nseton\nsubtly\nbaptised\nbeige\nextremes\njolted\nkristina\ntelecast\n##actic\nsafeguard\nwaldo\n##baldi\n##bular\nendeavors\nsloppy\nsubterranean\n##ensburg\n##itung\ndelicately\npigment\ntq\n##scu\n1626\n##ound\ncollisions\ncoveted\nherds\n##personal\n##meister\n##nberger\nchopra\n##ricting\nabnormalities\ndefective\ngalician\nlucie\n##dilly\nalligator\nlikened\n##genase\nburundi\nclears\ncomplexion\nderelict\ndeafening\ndiablo\nfingered\nchampaign\ndogg\nenlist\nisotope\nlabeling\nmrna\n##erre\nbrilliance\nmarvelous\n##ayo\n1652\ncrawley\nether\nfooted\ndwellers\ndeserts\nhamish\nrubs\nwarlock\nskimmed\n##lizer\n870\nbuick\nembark\nheraldic\nirregularities\n##ajan\nkiara\n##kulam\n##ieg\nantigen\nkowalski\n##lge\noakley\nvisitation\n##mbit\nvt\n##suit\n1570\nmurderers\n##miento\n##rites\nchimneys\n##sling\ncondemn\ncuster\nexchequer\nhavre\n##ghi\nfluctuations\n##rations\ndfb\nhendricks\nvaccines\n##tarian\nnietzsche\nbiking\njuicy\n##duced\nbrooding\nscrolling\nselangor\n##ragan\n352\nannum\nbo
omed\nseminole\nsugarcane\n##dna\ndepartmental\ndismissing\ninnsbruck\narteries\nashok\nbatavia\ndaze\nkun\novertook\n##rga\n##tlan\nbeheaded\ngaddafi\nholm\nelectronically\nfaulty\ngalilee\nfractures\nkobayashi\n##lized\ngunmen\nmagma\naramaic\nmala\neastenders\ninference\nmessengers\nbf\n##qu\n407\nbathrooms\n##vere\n1658\nflashbacks\nideally\nmisunderstood\n##jali\n##weather\nmendez\n##grounds\n505\nuncanny\n##iii\n1709\nfriendships\n##nbc\nsacrament\naccommodated\nreiterated\nlogistical\npebbles\nthumped\n##escence\nadministering\ndecrees\ndrafts\n##flight\n##cased\n##tula\nfuturistic\npicket\nintimidation\nwinthrop\n##fahan\ninterfered\n339\nafar\nfrancoise\nmorally\nuta\ncochin\ncroft\ndwarfs\n##bruck\n##dents\n##nami\nbiker\n##hner\n##meral\nnano\n##isen\n##ometric\n##pres\n##ан\nbrightened\nmeek\nparcels\nsecurely\ngunners\n##jhl\n##zko\nagile\nhysteria\n##lten\n##rcus\nbukit\nchamps\nchevy\ncuckoo\nleith\nsadler\ntheologians\nwelded\n##section\n1663\njj\nplurality\nxander\n##rooms\n##formed\nshredded\ntemps\nintimately\npau\ntormented\n##lok\n##stellar\n1618\ncharred\nems\nessen\n##mmel\nalarms\nspraying\nascot\nblooms\ntwinkle\n##abia\n##apes\ninternment\nobsidian\n##chaft\nsnoop\n##dav\n##ooping\nmalibu\n##tension\nquiver\n##itia\nhays\nmcintosh\ntravers\nwalsall\n##ffie\n1623\nbeverley\nschwarz\nplunging\nstructurally\nm3\nrosenthal\nvikram\n##tsk\n770\nghz\n##onda\n##tiv\nchalmers\ngroningen\npew\nreckon\nunicef\n##rvis\n55th\n##gni\n1651\nsulawesi\navila\ncai\nmetaphysical\nscrewing\nturbulence\n##mberg\naugusto\nsamba\n56th\nbaffled\nmomentary\ntoxin\n##urian\n##wani\naachen\ncondoms\ndali\nsteppe\n##3d\n##app\n##oed\n##year\nadolescence\ndauphin\nelectrically\ninaccessible\nmicroscopy\nnikita\n##ega\natv\n##cel\n##enter\n##oles\n##oteric\n##ы\naccountants\npunishments\nwrongly\nbribes\nadventurous\nclinch\nflinders\nsouthland\n##hem\n##kata\ngough\n##ciency\nlads\nsoared\n##ה\nundergoes\ndeformation\noutlawed\nrubbish\n##arus\n##mussen\n##nidae\n##rz
burg\narcs\n##ingdon\n##tituted\n1695\nwheelbase\nwheeling\nbombardier\ncampground\nzebra\n##lices\n##oj\n##bain\nlullaby\n##ecure\ndonetsk\nwylie\ngrenada\n##arding\n##ης\nsquinting\neireann\nopposes\n##andra\nmaximal\nrunes\n##broken\n##cuting\n##iface\n##ror\n##rosis\nadditive\nbritney\nadultery\ntriggering\n##drome\ndetrimental\naarhus\ncontainment\njc\nswapped\nvichy\n##ioms\nmadly\n##oric\n##rag\nbrant\n##ckey\n##trix\n1560\n1612\nbroughton\nrustling\n##stems\n##uder\nasbestos\nmentoring\n##nivorous\nfinley\nleaps\n##isan\napical\npry\nslits\nsubstitutes\n##dict\nintuitive\nfantasia\ninsistent\nunreasonable\n##igen\n##vna\ndomed\nhannover\nmargot\nponder\n##zziness\nimpromptu\njian\nlc\nrampage\nstemming\n##eft\nandrey\ngerais\nwhichever\namnesia\nappropriated\nanzac\nclicks\nmodifying\nultimatum\ncambrian\nmaids\nverve\nyellowstone\n##mbs\nconservatoire\n##scribe\nadherence\ndinners\nspectra\nimperfect\nmysteriously\nsidekick\ntatar\ntuba\n##aks\n##ifolia\ndistrust\n##athan\n##zle\nc2\nronin\nzac\n##pse\ncelaena\ninstrumentalist\nscents\nskopje\n##mbling\ncomical\ncompensated\nvidal\ncondor\nintersect\njingle\nwavelengths\n##urrent\nmcqueen\n##izzly\ncarp\nweasel\n422\nkanye\nmilitias\npostdoctoral\neugen\ngunslinger\n##ɛ\nfaux\nhospice\n##for\nappalled\nderivation\ndwarves\n##elis\ndilapidated\n##folk\nastoria\nphilology\n##lwyn\n##otho\n##saka\ninducing\nphilanthropy\n##bf\n##itative\ngeek\nmarkedly\nsql\n##yce\nbessie\nindices\nrn\n##flict\n495\nfrowns\nresolving\nweightlifting\ntugs\ncleric\ncontentious\n1653\nmania\nrms\n##miya\n##reate\n##ruck\n##tucket\nbien\neels\nmarek\n##ayton\n##cence\ndiscreet\nunofficially\n##ife\nleaks\n##bber\n1705\n332\ndung\ncompressor\nhillsborough\npandit\nshillings\ndistal\n##skin\n381\n##tat\n##you\nnosed\n##nir\nmangrove\nundeveloped\n##idia\ntextures\n##inho\n##500\n##rise\nae\nirritating\nnay\namazingly\nbancroft\napologetic\ncompassionate\nkata\nsymphonies\n##lovic\nairspace\n##lch\n930\ngifford\nprecautions\nfulfillm
ent\nsevilla\nvulgar\nmartinique\n##urities\nlooting\npiccolo\ntidy\n##dermott\nquadrant\narmchair\nincomes\nmathematicians\nstampede\nnilsson\n##inking\n##scan\nfoo\nquarterfinal\n##ostal\nshang\nshouldered\nsquirrels\n##owe\n344\nvinegar\n##bner\n##rchy\n##systems\ndelaying\n##trics\nars\ndwyer\nrhapsody\nsponsoring\n##gration\nbipolar\ncinder\nstarters\n##olio\n##urst\n421\nsignage\n##nty\naground\nfigurative\nmons\nacquaintances\nduets\nerroneously\nsoyuz\nelliptic\nrecreated\n##cultural\n##quette\n##ssed\n##tma\n##zcz\nmoderator\nscares\n##itaire\n##stones\n##udence\njuniper\nsighting\n##just\n##nsen\nbritten\ncalabria\nry\nbop\ncramer\nforsyth\nstillness\n##л\nairmen\ngathers\nunfit\n##umber\n##upt\ntaunting\n##rip\nseeker\nstreamlined\n##bution\nholster\nschumann\ntread\nvox\n##gano\n##onzo\nstrive\ndil\nreforming\ncovent\nnewbury\npredicting\n##orro\ndecorate\ntre\n##puted\nandover\nie\nasahi\ndept\ndunkirk\ngills\n##tori\nburen\nhuskies\n##stis\n##stov\nabstracts\nbets\nloosen\n##opa\n1682\nyearning\n##glio\n##sir\nberman\neffortlessly\nenamel\nnapoli\npersist\n##peration\n##uez\nattache\nelisa\nb1\ninvitations\n##kic\naccelerating\nreindeer\nboardwalk\nclutches\nnelly\npolka\nstarbucks\n##kei\nadamant\nhuey\nlough\nunbroken\nadventurer\nembroidery\ninspecting\nstanza\n##ducted\nnaia\ntaluka\n##pone\n##roids\nchases\ndeprivation\nflorian\n##jing\n##ppet\nearthly\n##lib\n##ssee\ncolossal\nforeigner\nvet\nfreaks\npatrice\nrosewood\ntriassic\nupstate\n##pkins\ndominates\nata\nchants\nks\nvo\n##400\n##bley\n##raya\n##rmed\n555\nagra\ninfiltrate\n##ailing\n##ilation\n##tzer\n##uppe\n##werk\nbinoculars\nenthusiast\nfujian\nsqueak\n##avs\nabolitionist\nalmeida\nboredom\nhampstead\nmarsden\nrations\n##ands\ninflated\n334\nbonuses\nrosalie\npatna\n##rco\n329\ndetachments\npenitentiary\n54th\nflourishing\nwoolf\n##dion\n##etched\npapyrus\n##lster\n##nsor\n##toy\nbobbed\ndismounted\nendelle\ninhuman\nmotorola\ntbs\nwince\nwreath\n##ticus\nhideout\ninspections\nsanjay\
ndisgrace\ninfused\npudding\nstalks\n##urbed\narsenic\nleases\n##hyl\n##rrard\ncollarbone\n##waite\n##wil\ndowry\n##bant\n##edance\ngenealogical\nnitrate\nsalamanca\nscandals\nthyroid\nnecessitated\n##!\n##\"\n###\n##$\n##%\n##&\n##'\n##(\n##)\n##*\n##+\n##,\n##-\n##.\n##/\n##:\n##;\n##<\n##=\n##>\n##?\n##@\n##[\n##\\\n##]\n##^\n##_\n##`\n##{\n##|\n##}\n##~\n##¡\n##¢\n##£\n##¤\n##¥\n##¦\n##§\n##¨\n##©\n##ª\n##«\n##¬\n##®\n##±\n##´\n##µ\n##¶\n##·\n##º\n##»\n##¼\n##¾\n##¿\n##æ\n##ð\n##÷\n##þ\n##đ\n##ħ\n##ŋ\n##œ\n##ƒ\n##ɐ\n##ɑ\n##ɒ\n##ɔ\n##ɕ\n##ə\n##ɡ\n##ɣ\n##ɨ\n##ɪ\n##ɫ\n##ɬ\n##ɯ\n##ɲ\n##ɴ\n##ɹ\n##ɾ\n##ʀ\n##ʁ\n##ʂ\n##ʃ\n##ʉ\n##ʊ\n##ʋ\n##ʌ\n##ʎ\n##ʐ\n##ʑ\n##ʒ\n##ʔ\n##ʰ\n##ʲ\n##ʳ\n##ʷ\n##ʸ\n##ʻ\n##ʼ\n##ʾ\n##ʿ\n##ˈ\n##ˡ\n##ˢ\n##ˣ\n##ˤ\n##β\n##γ\n##δ\n##ε\n##ζ\n##θ\n##κ\n##λ\n##μ\n##ξ\n##ο\n##π\n##ρ\n##σ\n##τ\n##υ\n##φ\n##χ\n##ψ\n##ω\n##б\n##г\n##д\n##ж\n##з\n##м\n##п\n##с\n##у\n##ф\n##х\n##ц\n##ч\n##ш\n##щ\n##ъ\n##э\n##ю\n##ђ\n##є\n##і\n##ј\n##љ\n##њ\n##ћ\n##ӏ\n##ա\n##բ\n##գ\n##դ\n##ե\n##թ\n##ի\n##լ\n##կ\n##հ\n##մ\n##յ\n##ն\n##ո\n##պ\n##ս\n##վ\n##տ\n##ր\n##ւ\n##ք\n##־\n##א\n##ב\n##ג\n##ד\n##ו\n##ז\n##ח\n##ט\n##י\n##ך\n##כ\n##ל\n##ם\n##מ\n##ן\n##נ\n##ס\n##ע\n##ף\n##פ\n##ץ\n##צ\n##ק\n##ר\n##ש\n##ת\n##،\n##ء\n##ب\n##ت\n##ث\n##ج\n##ح\n##خ\n##ذ\n##ز\n##س\n##ش\n##ص\n##ض\n##ط\n##ظ\n##ع\n##غ\n##ـ\n##ف\n##ق\n##ك\n##و\n##ى\n##ٹ\n##پ\n##چ\n##ک\n##گ\n##ں\n##ھ\n##ہ\n##ے\n##अ\n##आ\n##उ\n##ए\n##क\n##ख\n##ग\n##च\n##ज\n##ट\n##ड\n##ण\n##त\n##थ\n##द\n##ध\n##न\n##प\n##ब\n##भ\n##म\n##य\n##र\n##ल\n##व\n##श\n##ष\n##स\n##ह\n##ा\n##ि\n##ी\n##ो\n##।\n##॥\n##ং\n##অ\n##আ\n##ই\n##উ\n##এ\n##ও\n##ক\n##খ\n##গ\n##চ\n##ছ\n##জ\n##ট\n##ড\n##ণ\n##ত\n##থ\n##দ\n##ধ\n##ন\n##প\n##ব\n##ভ\n##ম\n##য\n##র\n##ল\n##শ\n##ষ\n##স\n##হ\n##া\n##ি\n##ী\n##ে\n##க\n##ச\n##ட\n##த\n##ந\n##ன\n##ப\n##ம\n##ய\n##ர\n##ல\n##ள\n##வ\n##ா\n##ி\n##ு\n##ே\n##ை\n##ನ\n##ರ\n##ಾ\n##ක\n##ය\n##ර\n##ල\n##ව\n##ා\n##ก\n##ง\n##ต\n##ท\n##น\n##พ\n##ม\n##ย\n##ร\n##ล\n##ว\n##ส\n##อ\n##า\n##เ\n##་\n##།\n##ག\n##ང\n##ད\n##ན\n##པ\n##བ\n##མ\n##འ\n##ར\n##ལ\
n##ས\n##မ\n##ა\n##ბ\n##გ\n##დ\n##ე\n##ვ\n##თ\n##ი\n##კ\n##ლ\n##მ\n##ნ\n##ო\n##რ\n##ს\n##ტ\n##უ\n##ᄀ\n##ᄂ\n##ᄃ\n##ᄅ\n##ᄆ\n##ᄇ\n##ᄉ\n##ᄊ\n##ᄋ\n##ᄌ\n##ᄎ\n##ᄏ\n##ᄐ\n##ᄑ\n##ᄒ\n##ᅡ\n##ᅢ\n##ᅥ\n##ᅦ\n##ᅧ\n##ᅩ\n##ᅪ\n##ᅭ\n##ᅮ\n##ᅯ\n##ᅲ\n##ᅳ\n##ᅴ\n##ᅵ\n##ᆨ\n##ᆫ\n##ᆯ\n##ᆷ\n##ᆸ\n##ᆼ\n##ᴬ\n##ᴮ\n##ᴰ\n##ᴵ\n##ᴺ\n##ᵀ\n##ᵃ\n##ᵇ\n##ᵈ\n##ᵉ\n##ᵍ\n##ᵏ\n##ᵐ\n##ᵒ\n##ᵖ\n##ᵗ\n##ᵘ\n##ᵣ\n##ᵤ\n##ᵥ\n##ᶜ\n##ᶠ\n##‐\n##‑\n##‒\n##–\n##—\n##―\n##‖\n##‘\n##’\n##‚\n##“\n##”\n##„\n##†\n##‡\n##•\n##…\n##‰\n##′\n##″\n##›\n##‿\n##⁄\n##⁰\n##ⁱ\n##⁴\n##⁵\n##⁶\n##⁷\n##⁸\n##⁹\n##⁻\n##ⁿ\n##₅\n##₆\n##₇\n##₈\n##₉\n##₊\n##₍\n##₎\n##ₐ\n##ₑ\n##ₒ\n##ₓ\n##ₕ\n##ₖ\n##ₗ\n##ₘ\n##ₚ\n##ₛ\n##ₜ\n##₤\n##₩\n##€\n##₱\n##₹\n##ℓ\n##№\n##ℝ\n##™\n##⅓\n##⅔\n##←\n##↑\n##→\n##↓\n##↔\n##↦\n##⇄\n##⇌\n##⇒\n##∂\n##∅\n##∆\n##∇\n##∈\n##∗\n##∘\n##√\n##∞\n##∧\n##∨\n##∩\n##∪\n##≈\n##≡\n##≤\n##≥\n##⊂\n##⊆\n##⊕\n##⊗\n##⋅\n##─\n##│\n##■\n##▪\n##●\n##★\n##☆\n##☉\n##♠\n##♣\n##♥\n##♦\n##♯\n##⟨\n##⟩\n##ⱼ\n##⺩\n##⺼\n##⽥\n##、\n##。\n##〈\n##〉\n##《\n##》\n##「\n##」\n##『\n##』\n##〜\n##あ\n##い\n##う\n##え\n##お\n##か\n##き\n##く\n##け\n##こ\n##さ\n##し\n##す\n##せ\n##そ\n##た\n##ち\n##っ\n##つ\n##て\n##と\n##な\n##に\n##ぬ\n##ね\n##の\n##は\n##ひ\n##ふ\n##へ\n##ほ\n##ま\n##み\n##む\n##め\n##も\n##や\n##ゆ\n##よ\n##ら\n##り\n##る\n##れ\n##ろ\n##を\n##ん\n##ァ\n##ア\n##ィ\n##イ\n##ウ\n##ェ\n##エ\n##オ\n##カ\n##キ\n##ク\n##ケ\n##コ\n##サ\n##シ\n##ス\n##セ\n##タ\n##チ\n##ッ\n##ツ\n##テ\n##ト\n##ナ\n##ニ\n##ノ\n##ハ\n##ヒ\n##フ\n##ヘ\n##ホ\n##マ\n##ミ\n##ム\n##メ\n##モ\n##ャ\n##ュ\n##ョ\n##ラ\n##リ\n##ル\n##レ\n##ロ\n##ワ\n##ン\n##・\n##ー\n##一\n##三\n##上\n##下\n##不\n##世\n##中\n##主\n##久\n##之\n##也\n##事\n##二\n##五\n##井\n##京\n##人\n##亻\n##仁\n##介\n##代\n##仮\n##伊\n##会\n##佐\n##侍\n##保\n##信\n##健\n##元\n##光\n##八\n##公\n##内\n##出\n##分\n##前\n##劉\n##力\n##加\n##勝\n##北\n##区\n##十\n##千\n##南\n##博\n##原\n##口\n##古\n##史\n##司\n##合\n##吉\n##同\n##名\n##和\n##囗\n##四\n##国\n##國\n##土\n##地\n##坂\n##城\n##堂\n##場\n##士\n##夏\n##外\n##大\n##天\n##太\n##夫\n##奈\n##女\n##子\n##学\n##宀\n##宇\n##安\n##宗\n##定\n##宣\n##宮\n##家\n##宿\n##寺\n##將\n##小\n##尚\n##山\n##岡\n##島\n##崎\n##川\n##州\n##巿\n##帝\n##平\n##年\n##幸\n##广\n##弘\n##張\n##彳\
n##後\n##御\n##德\n##心\n##忄\n##志\n##忠\n##愛\n##成\n##我\n##戦\n##戸\n##手\n##扌\n##政\n##文\n##新\n##方\n##日\n##明\n##星\n##春\n##昭\n##智\n##曲\n##書\n##月\n##有\n##朝\n##木\n##本\n##李\n##村\n##東\n##松\n##林\n##森\n##楊\n##樹\n##橋\n##歌\n##止\n##正\n##武\n##比\n##氏\n##民\n##水\n##氵\n##氷\n##永\n##江\n##沢\n##河\n##治\n##法\n##海\n##清\n##漢\n##瀬\n##火\n##版\n##犬\n##王\n##生\n##田\n##男\n##疒\n##発\n##白\n##的\n##皇\n##目\n##相\n##省\n##真\n##石\n##示\n##社\n##神\n##福\n##禾\n##秀\n##秋\n##空\n##立\n##章\n##竹\n##糹\n##美\n##義\n##耳\n##良\n##艹\n##花\n##英\n##華\n##葉\n##藤\n##行\n##街\n##西\n##見\n##訁\n##語\n##谷\n##貝\n##貴\n##車\n##軍\n##辶\n##道\n##郎\n##郡\n##部\n##都\n##里\n##野\n##金\n##鈴\n##镇\n##長\n##門\n##間\n##阝\n##阿\n##陳\n##陽\n##雄\n##青\n##面\n##風\n##食\n##香\n##馬\n##高\n##龍\n##龸\n##ﬁ\n##ﬂ\n##！\n##（\n##）\n##，\n##－\n##．\n##／\n##：\n##？\n##～\n"
  },
  {
    "path": "data_utils/uppercase_vocab.txt",
    "content": "[PAD]\n[unused1]\n[unused2]\n[unused3]\n[unused4]\n[unused5]\n[unused6]\n[unused7]\n[unused8]\n[unused9]\n[unused10]\n[unused11]\n[unused12]\n[unused13]\n[unused14]\n[unused15]\n[unused16]\n[unused17]\n[unused18]\n[unused19]\n[unused20]\n[unused21]\n[unused22]\n[unused23]\n[unused24]\n[unused25]\n[unused26]\n[unused27]\n[unused28]\n[unused29]\n[unused30]\n[unused31]\n[unused32]\n[unused33]\n[unused34]\n[unused35]\n[unused36]\n[unused37]\n[unused38]\n[unused39]\n[unused40]\n[unused41]\n[unused42]\n[unused43]\n[unused44]\n[unused45]\n[unused46]\n[unused47]\n[unused48]\n[unused49]\n[unused50]\n[unused51]\n[unused52]\n[unused53]\n[unused54]\n[unused55]\n[unused56]\n[unused57]\n[unused58]\n[unused59]\n[unused60]\n[unused61]\n[unused62]\n[unused63]\n[unused64]\n[unused65]\n[unused66]\n[unused67]\n[unused68]\n[unused69]\n[unused70]\n[unused71]\n[unused72]\n[unused73]\n[unused74]\n[unused75]\n[unused76]\n[unused77]\n[unused78]\n[unused79]\n[unused80]\n[unused81]\n[unused82]\n[unused83]\n[unused84]\n[unused85]\n[unused86]\n[unused87]\n[unused88]\n[unused89]\n[unused90]\n[unused91]\n[unused92]\n[unused93]\n[unused94]\n[unused95]\n[unused96]\n[unused97]\n[unused98]\n[unused99]\n[UNK]\n[CLS]\n[SEP]\n[MASK]\n[unused100]\n[unused101]\n!\n\"\n#\n$\n%\n&\n'\n(\n)\n*\n+\n,\n-\n.\n/\n0\n1\n2\n3\n4\n5\n6\n7\n8\n9\n:\n;\n<\n=\n>\n?\n@\nA\nB\nC\nD\nE\nF\nG\nH\nI\nJ\nK\nL\nM\nN\nO\nP\nQ\nR\nS\nT\nU\nV\nW\nX\nY\nZ\n[\n\\\n]\n^\n_\n`\na\nb\nc\nd\ne\nf\ng\nh\ni\nj\nk\nl\nm\nn\no\np\nq\nr\ns\nt\nu\nv\nw\nx\ny\nz\n{\n|\n}\n~\n¡\n¢\n£\n¥\n§\n¨\n©\nª\n«\n¬\n®\n°\n±\n²\n³\n´\nµ\n¶\n·\n¹\nº\n»\n¼\n½\n¾\n¿\nÀ\nÁ\nÂ\nÄ\nÅ\nÆ\nÇ\nÈ\nÉ\nÍ\nÎ\nÑ\nÓ\nÖ\n×\nØ\nÚ\nÜ\nÞ\nß\nà\ná\nâ\nã\nä\nå\næ\nç\nè\né\nê\në\nì\ní\nî\nï\nð\nñ\nò\nó\nô\nõ\nö\n÷\nø\nù\nú\nû\nü\ný\nþ\nÿ\nĀ\nā\nă\ną\nĆ\nć\nČ\nč\nď\nĐ\nđ\nē\nė\nę\ně\nğ\nġ\nĦ\nħ\nĩ\nĪ\nī\nİ\nı\nļ\nĽ\nľ\nŁ\nł\nń\nņ\nň\nŋ\nŌ\nō\nŏ\nő\nŒ\nœ\nř\nŚ\nś\nŞ\nş\nŠ\nš\nŢ\nţ\nť\nũ\nū\nŭ\nů\nű\nų\nŵ\nŷ\nź\nŻ\nż\nŽ\nž\nƏ\nƒ\nơ\nư\nǎ\nǐ\nǒ\nǔ\nǫ\nȘ\nș\nȚ\nț\n
ɐ\nɑ\nɔ\nɕ\nə\nɛ\nɡ\nɣ\nɨ\nɪ\nɲ\nɾ\nʀ\nʁ\nʂ\nʃ\nʊ\nʋ\nʌ\nʐ\nʑ\nʒ\nʔ\nʰ\nʲ\nʳ\nʷ\nʻ\nʼ\nʾ\nʿ\nˈ\nː\nˡ\nˢ\nˣ\ń\ñ\n̍\n̯\n͡\nΑ\nΒ\nΓ\nΔ\nΕ\nΗ\nΘ\nΙ\nΚ\nΛ\nΜ\nΝ\nΟ\nΠ\nΣ\nΤ\nΦ\nΧ\nΨ\nΩ\nά\nέ\nή\nί\nα\nβ\nγ\nδ\nε\nζ\nη\nθ\nι\nκ\nλ\nμ\nν\nξ\nο\nπ\nρ\nς\nσ\nτ\nυ\nφ\nχ\nψ\nω\nό\nύ\nώ\nІ\nЈ\nА\nБ\nВ\nГ\nД\nЕ\nЖ\nЗ\nИ\nК\nЛ\nМ\nН\nО\nП\nР\nС\nТ\nУ\nФ\nХ\nЦ\nЧ\nШ\nЭ\nЮ\nЯ\nа\nб\nв\nг\nд\nе\nж\nз\nи\nй\nк\nл\nм\nн\nо\nп\nр\nс\nт\nу\nф\nх\nц\nч\nш\nщ\nъ\nы\nь\nэ\nю\nя\nё\nі\nї\nј\nњ\nћ\nԱ\nՀ\nա\nե\nի\nկ\nմ\nյ\nն\nո\nս\nտ\nր\nւ\nְ\nִ\nֵ\nֶ\nַ\nָ\nֹ\nּ\nא\nב\nג\nד\nה\nו\nז\nח\nט\nי\nכ\nל\nם\nמ\nן\nנ\nס\nע\nפ\nצ\nק\nר\nש\nת\n،\nء\nآ\nأ\nإ\nئ\nا\nب\nة\nت\nث\nج\nح\nخ\nد\nذ\nر\nز\nس\nش\nص\nض\nط\nظ\nع\nغ\nف\nق\nك\nل\nم\nن\nه\nو\nى\nي\nَ\nِ\nٹ\nپ\nچ\nک\nگ\nہ\nی\nے\nं\nआ\nक\nग\nच\nज\nण\nत\nद\nध\nन\nप\nब\nभ\nम\nय\nर\nल\nव\nश\nष\nस\nह\nा\nि\nी\nु\nे\nो\n्\n।\n॥\nআ\nই\nএ\nও\nক\nখ\nগ\nচ\nছ\nজ\nট\nত\nথ\nদ\nধ\nন\nপ\nব\nম\nয\nর\nল\nশ\nস\nহ\n়\nা\nি\nী\nু\nে\nো\n্\nয়\nக\nத\nப\nம\nய\nர\nல\nவ\nா\nி\nு\n்\nร\n་\nག\nང\nད\nན\nབ\nམ\nར\nལ\nས\nི\nུ\nེ\nོ\nა\nე\nი\nლ\nნ\nო\nრ\nს\nᴬ\nᴵ\nᵀ\nᵃ\nᵇ\nᵈ\nᵉ\nᵍ\nᵏ\nᵐ\nᵒ\nᵖ\nᵗ\nᵘ\nᵢ\nᵣ\nᵤ\nᵥ\nᶜ\nᶠ\nḍ\nḤ\nḥ\nḨ\nḩ\nḳ\nṃ\nṅ\nṇ\nṛ\nṣ\nṭ\nạ\nả\nấ\nầ\nẩ\nậ\nắ\nế\nề\nể\nễ\nệ\nị\nọ\nố\nồ\nổ\nộ\nớ\nờ\nợ\nụ\nủ\nứ\nừ\nử\nữ\nự\nỳ\nỹ\nἀ\nἐ\nὁ\nὐ\nὰ\nὶ\nὸ\nῆ\nῖ\nῦ\nῶ\n‐\n‑\n‒\n–\n—\n―\n‖\n‘\n’\n‚\n“\n”\n„\n†\n‡\n•\n…\n‰\n′\n″\n⁄\n⁰\nⁱ\n⁴\n⁵\n⁶\n⁷\n⁸\n⁹\n⁺\n⁻\nⁿ\n₀\n₁\n₂\n₃\n₄\n₅\n₆\n₇\n₈\n₉\n₊\n₍\n₎\nₐ\nₑ\nₒ\nₓ\nₕ\nₖ\nₘ\nₙ\nₚ\nₛ\nₜ\n₤\n€\n₱\n₹\nℓ\n№\nℝ\n⅓\n←\n↑\n→\n↔\n⇌\n⇒\n∂\n∈\n−\n∗\n∘\n√\n∞\n∧\n∨\n∩\n∪\n≈\n≠\n≡\n≤\n≥\n⊂\n⊆\n⊕\n⋅\n─\n│\n■\n●\n★\n☆\n☉\n♠\n♣\n♥\n♦\n♭\n♯\n⟨\n⟩\nⱼ\n、\n。\n《\n》\n「\n」\n『\n』\n〜\nい\nう\nえ\nお\nか\nき\nく\nけ\nこ\nさ\nし\nす\nせ\nそ\nた\nち\nつ\nて\nと\nな\nに\nの\nは\nひ\nま\nみ\nむ\nめ\nも\nや\nゆ\nよ\nら\nり\nる\nれ\nん\nア\nィ\nイ\nウ\nエ\nオ\nカ\nガ\nキ\nク\nグ\nコ\nサ\nシ\nジ\nス\nズ\nタ\nダ\nッ\nテ\nデ\nト\nド\nナ\nニ\nハ\nバ\nパ\nフ\nブ\nプ\nマ\nミ\nム\nャ\nュ\nラ\nリ\nル\nレ\nロ\nン\n・\nー\n一\n三\n上\n下\n中\n事\n二\n井\n京\n人\n亻\n仁\n佐\n侍\n光\n公\n力\n北\n十\n南\n原\n口\n史\n司\n吉\n同\n和\n囗\n国\n國\n土\n城\n士\n大\n天\n太\n夫\n女\n子\n宀\n安\n宮\n宿\n小\
n尚\n山\n島\n川\n州\n平\n年\n心\n愛\n戸\n文\n新\n方\n日\n明\n星\n書\n月\n木\n本\n李\n村\n東\n松\n林\n正\n武\n氏\n水\n氵\n江\n河\n海\n版\n犬\n王\n生\n田\n白\n皇\n省\n真\n石\n社\n神\n竹\n美\n義\n花\n藤\n西\n谷\n車\n辶\n道\n郎\n郡\n部\n野\n金\n長\n門\n陽\n青\n食\n馬\n高\n龍\n龸\n사\n씨\n의\n이\n한\nﬁ\nﬂ\n！\n（\n）\n，\n－\n／\n：\nthe\nof\nand\nto\nin\nwas\nThe\nis\nfor\nas\non\nwith\nthat\n##s\nhis\nby\nhe\nat\nfrom\nit\nher\nHe\nhad\nan\nwere\nyou\nbe\nIn\nshe\nare\nbut\nwhich\nIt\nnot\nor\nhave\nmy\nhim\none\nthis\nme\nhas\nalso\nup\ntheir\nfirst\nout\nwho\nbeen\nthey\nShe\ninto\nall\nwould\nits\n##ing\ntime\ntwo\n##a\n##e\nsaid\nabout\nwhen\nover\nmore\nother\ncan\nafter\nback\nthem\nthen\n##ed\nthere\nlike\nso\nonly\n##n\ncould\n##d\n##i\n##y\nwhat\nno\n##o\nwhere\nThis\nmade\nthan\nif\nYou\n##ly\nthrough\nwe\nbefore\n##r\njust\nsome\n##er\nyears\ndo\nNew\n##t\ndown\nbetween\nnew\nnow\nwill\nthree\nmost\nOn\naround\nyear\nused\nsuch\nbeing\nwell\nduring\nThey\nknow\nagainst\nunder\nlater\ndid\npart\nknown\noff\nwhile\nHis\nre\n...\n##l\npeople\nuntil\nway\nAmerican\ndidn\nUniversity\nyour\nboth\nmany\nget\nUnited\nbecame\nhead\nThere\nsecond\nAs\nwork\nany\nBut\nstill\nagain\nborn\neven\neyes\nAfter\nincluding\nde\ntook\nAnd\nlong\nteam\nseason\nfamily\nsee\nright\nsame\ncalled\nname\nbecause\nfilm\ndon\n10\nfound\nmuch\nschool\n##es\ngoing\nwon\nplace\naway\nWe\nday\nleft\nJohn\n000\nhand\nsince\nWorld\nthese\nhow\nmake\nnumber\neach\nlife\narea\nman\nfour\ngo\nNo\nhere\nvery\nNational\n##m\nplayed\nreleased\nnever\nbegan\nStates\nalbum\nhome\nlast\ntoo\nheld\nseveral\nMay\nown\n##on\ntake\nend\nSchool\n##h\nll\nseries\nWhat\nwant\nuse\nanother\ncity\nWhen\n2010\nside\nAt\nmay\nThat\ncame\nface\nJune\nthink\ngame\nthose\nhigh\nMarch\nearly\nSeptember\n##al\n2011\nlooked\nJuly\nstate\nsmall\nthought\nwent\nJanuary\nOctober\n##u\nbased\nAugust\n##us\nworld\ngood\nApril\nYork\nus\n12\n2012\n2008\nFor\n2009\ngroup\nalong\nfew\nSouth\nlittle\n##k\nfollowing\nNovember\nsomething\n2013\nDecember\nset\n2007\nold\n2006\n2014\nlocated\n##an\nmusic\nCo
unty\nCity\nformer\n##in\nroom\nve\nnext\nAll\n##man\ngot\nfather\nhouse\n##g\nbody\n15\n20\n18\nstarted\nIf\n2015\ntown\nour\nline\nWar\nlarge\npopulation\nnamed\nBritish\ncompany\nmember\nfive\nMy\nsingle\n##en\nage\nState\nmoved\nFebruary\n11\nHer\nshould\ncentury\ngovernment\nbuilt\ncome\nbest\nshow\nHowever\nwithin\nlook\nmen\ndoor\nwithout\nneed\nwasn\n2016\nwater\nOne\nsystem\nknew\nevery\ndied\nLeague\nturned\nasked\nNorth\nSt\nwanted\nbuilding\nreceived\nsong\nserved\nthough\nfelt\n##ia\nstation\nband\n##ers\nlocal\npublic\nhimself\ndifferent\ndeath\nsay\n##1\n30\n##2\n2005\n16\nnight\nbehind\nchildren\nEnglish\nmembers\nnear\nsaw\ntogether\nson\n14\nvoice\nvillage\n13\nhands\nhelp\n##3\ndue\nFrench\nLondon\ntop\ntold\nopen\npublished\nthird\n2017\nplay\nacross\nDuring\nput\nfinal\noften\ninclude\n25\n##le\nmain\nhaving\n2004\nonce\never\nlet\nbook\nled\ngave\nlate\nfront\nfind\nclub\n##4\nGerman\nincluded\nspecies\nCollege\nform\nopened\nmother\nwomen\nenough\nWest\nmust\n2000\npower\nreally\n17\nmaking\nhalf\n##6\norder\nmight\n##is\ngiven\nmillion\ntimes\ndays\npoint\nfull\nservice\nWith\nkm\nmajor\n##7\noriginal\nbecome\nseen\nII\nnorth\nsix\n##te\nlove\n##0\nnational\nInternational\n##5\n24\nSo\nDistrict\nlost\nrun\ncouldn\ncareer\nalways\n##9\n2003\n##th\ncountry\n##z\nHouse\nair\ntell\nsouth\nworked\nwoman\nplayer\n##A\nalmost\nwar\nRiver\n##ic\nmarried\ncontinued\nThen\nJames\nclose\nblack\nshort\n##8\n##na\nusing\nhistory\nreturned\nlight\ncar\n##ra\nsure\nWilliam\nthings\nGeneral\n##ry\n2002\nbetter\nsupport\n100\namong\nFrom\nfeet\nKing\nanything\n21\n19\nestablished\ndistrict\n2001\nfeel\ngreat\n##ton\nlevel\nCup\nThese\nwritten\ngames\nothers\nalready\ntitle\nstory\n##p\nlaw\nthing\nUS\nrecord\nrole\nhowever\nBy\nstudents\nEngland\nwhite\ncontrol\nleast\ninside\nland\n##C\n22\ngive\ncommunity\nhard\n##ie\nnon\n##c\nproduced\nGeorge\nround\nperiod\nPark\nbusiness\nvarious\n##ne\ndoes\npresent\nwife\nfar\ntaken\nper\nreached\nDavid\nable\nversion
\nworking\nyoung\nlive\ncreated\njoined\nEast\nliving\nappeared\ncase\nHigh\ndone\n23\nimportant\nPresident\nAward\nFrance\nposition\noffice\nlooking\ntotal\ngeneral\nclass\nTo\nproduction\n##S\nfootball\nparty\nbrother\nkeep\nmind\nfree\nStreet\nhair\nannounced\ndevelopment\neither\nnothing\nmoment\nChurch\nfollowed\nwrote\nwhy\nIndia\nSan\nelection\n1999\nlead\nHow\n##ch\n##rs\nwords\nEuropean\ncourse\nconsidered\nAmerica\narms\nArmy\npolitical\n##la\n28\n26\nwest\neast\nground\nfurther\nchurch\nless\nsite\nFirst\nNot\nAustralia\ntoward\nCalifornia\n##ness\ndescribed\nworks\nAn\nCouncil\nheart\npast\nmilitary\n27\n##or\nheard\nfield\nhuman\nsoon\nfounded\n1998\nplaying\ntrying\n##x\n##ist\n##ta\ntelevision\nmouth\nalthough\ntaking\nwin\nfire\nDivision\n##ity\nParty\nRoyal\nprogram\nSome\nDon\nAssociation\nAccording\ntried\nTV\nPaul\noutside\ndaughter\nBest\nWhile\nsomeone\nmatch\nrecorded\nCanada\nclosed\nregion\nAir\nabove\nmonths\nelected\n##da\n##ian\nroad\n##ar\nbrought\nmove\n1997\nleave\n##um\nThomas\n1996\nam\nlow\nRobert\nformed\nperson\nservices\npoints\nMr\nmiles\n##b\nstop\nrest\ndoing\nneeded\ninternational\nrelease\nfloor\nstart\nsound\ncall\nkilled\nreal\ndark\nresearch\nfinished\nlanguage\nMichael\nprofessional\nchange\nsent\n50\nupon\n29\ntrack\nhit\nevent\n2018\nterm\nexample\nGermany\nsimilar\nreturn\n##ism\nfact\npulled\nstood\nsays\nran\ninformation\nyet\nresult\ndeveloped\ngirl\n##re\nGod\n1995\nareas\nsigned\ndecided\n##ment\nCompany\nseemed\n##el\nco\nturn\nrace\ncommon\nvideo\nCharles\nIndian\n##ation\nblood\nart\nred\n##able\nadded\nrather\n1994\nmet\ndirector\naddition\ndesign\naverage\nminutes\n##ies\n##ted\navailable\nbed\ncoming\nfriend\nidea\nkind\nUnion\nRoad\nremained\n##ting\neverything\n##ma\nrunning\ncare\nfinally\nChinese\nappointed\n1992\nAustralian\n##ley\npopular\nmean\nteams\nprobably\n##land\nusually\nproject\nsocial\nChampionship\npossible\nword\nRussian\ninstead\nmi\nherself\n##T\nPeter\nHall\nCenter\nseat\nstyle\nmoney\n
1993\nelse\nDepartment\ntable\nMusic\ncurrent\n31\nfeatures\nspecial\nevents\ncharacter\nTwo\nsquare\nsold\ndebut\n##v\nprocess\nAlthough\nSince\n##ka\n40\nCentral\ncurrently\neducation\nplaced\nlot\nChina\nquickly\nforward\nseven\n##ling\nEurope\narm\nperformed\nJapanese\n1991\nHenry\nNow\nDr\n##ion\nweek\nGroup\nmyself\nbig\nUK\nWashington\nten\ndeep\n1990\nClub\nJapan\nspace\nLa\ndirected\nsmile\nepisode\nhours\nwhole\n##de\n##less\nWhy\nwouldn\ndesigned\nstrong\ntraining\nchanged\nSociety\nstage\ninvolved\nhadn\ntowards\nleading\npolice\neight\nkept\nInstitute\nstudy\nlargest\nchild\neventually\nprivate\nmodern\nCourt\nthroughout\ngetting\noriginally\nattack\n##E\ntalk\nGreat\nlonger\nsongs\nalone\n##ine\nwide\ndead\nwalked\nshot\n##ri\nOh\nforce\n##st\nArt\ntoday\nfriends\nIsland\nRichard\n1989\ncenter\nconstruction\nbelieve\nsize\nWhite\nship\ncompleted\n##B\ngone\nJust\nrock\nsat\n##R\nradio\nbelow\nentire\nfamilies\nleague\nincludes\ntype\nlived\nofficial\nrange\nhold\nfeatured\nMost\n##ter\npresident\npassed\nmeans\n##f\nforces\nlips\nMary\nDo\nguitar\n##ce\nfood\nwall\nOf\nspent\nIts\nperformance\nhear\n##P\nWestern\nreported\nsister\n##et\nmorning\n##M\nespecially\n##ive\nMinister\nitself\npost\nbit\ngroups\n1988\n##tion\nBlack\n##ng\nWell\nraised\nsometimes\nCanadian\nParis\nSpanish\nreplaced\nschools\nAcademy\nleaving\ncentral\nfemale\nChristian\nJack\nwhose\ncollege\nonto\nprovided\n##D\n##ville\nplayers\nactually\nstopped\n##son\nMuseum\ndoesn\n##ts\nbooks\nfight\nallowed\n##ur\nbeginning\nRecords\nawarded\nparents\ncoach\n##os\nRed\nsaying\n##ck\nSmith\nYes\nLake\n##L\naircraft\n1987\n##ble\nprevious\nft\naction\nItalian\nAfrican\nhappened\nvocals\nAct\nfuture\ncourt\n##ge\n1986\ndegree\nphone\n##ro\nIs\ncountries\nwinning\nbreath\nLove\nriver\nmatter\nLord\nOther\nlist\nself\nparts\n##ate\nprovide\ncut\nshows\nplan\n1st\ninterest\n##ized\nAfrica\nstated\nSir\nfell\nowned\nearlier\nended\ncompetition\nattention\n1985\nlower\nnearly\nbad\nolder\nstay\
nSaint\n##se\ncertain\n1984\nfingers\nblue\ntry\nfourth\nGrand\n##as\nking\n##nt\nmakes\nchest\nmovement\nstates\nmoving\ndata\nintroduced\nmodel\ndate\nsection\nLos\ndeal\n##I\nskin\nentered\nmiddle\nsuccess\nTexas\n##w\nsummer\nisland\n##N\nRepublic\nlength\nhusband\n1980\n##ey\nreason\nanyone\nforced\nvia\nbase\n500\njob\ncovered\nFestival\nRoman\nsuccessful\nrights\ncover\nMan\nwriting\nIreland\n##F\nrelated\ngoal\ntakes\nbuildings\ntrue\nweeks\n1983\nBecause\nopening\nnovel\nISBN\nmeet\ngold\n##ous\nmid\nkm²\nstanding\nFootball\nChicago\nshook\nwhom\n##ki\n1982\nDay\nfeeling\nscored\nboy\nhigher\nForce\nleader\nheavy\nfall\nquestion\nsense\narmy\nSecond\nenergy\nmeeting\nthemselves\nkill\n##am\nboard\ncensus\n##ya\n##ns\nmine\nmeant\nmarket\nrequired\nbattle\ncampaign\nattended\napproximately\nKingdom\nruns\nactive\n##ha\ncontract\nclear\npreviously\nhealth\n1979\nArts\ncomplete\nCatholic\ncouple\nunits\n##ll\n##ty\nCommittee\nshoulder\nsea\nsystems\nlisted\n##O\ncaught\ntournament\n##G\nnorthern\nauthor\nFilm\nYour\n##men\nholding\noffered\npersonal\n1981\nsouthern\nartist\ntraditional\nstudio\n200\ncapital\n##ful\nregular\nask\ngiving\norganization\nmonth\nnews\nAre\nread\nmanaged\nhelped\nstudied\nstudent\ndefeated\nnatural\nindustry\nYear\nnoted\ndecision\nGovernment\nquite\n##id\nsmiled\n1972\nMaybe\ntracks\n##ke\nMark\nal\nmedia\nengine\nhour\nTheir\nrelationship\nplays\nproperty\nstructure\n1976\nago\nHill\nMartin\n1978\nready\nMany\nLike\nBay\nimmediately\ngenerally\nItaly\nGreek\npractice\ncaused\ndivision\nsignificant\nJoseph\nspeed\nLet\nthinking\ncompletely\n1974\nprimary\nmostly\n##field\n##K\n1975\n##to\nEven\nwriter\n##led\ndropped\nmagazine\ncollection\nunderstand\nroute\nhighest\nparticular\nfilms\nlines\nnetwork\nScience\nloss\ncarried\ndirection\ngreen\n1977\nlocation\nproducer\naccording\nWomen\nQueen\nneck\nthus\nindependent\nview\n1970\nAngeles\nSoviet\ndistance\nproblem\nBoard\ntour\nwestern\nincome\nappearance\naccess\nMexico\nnodded\nst
reet\nsurface\narrived\nbelieved\nOld\n1968\n1973\nbecoming\nwhether\n1945\nfigure\nsinger\nstand\nFollowing\nissue\nwindow\nwrong\npain\neveryone\nlives\nissues\npark\nslowly\nla\nact\n##va\nbring\nLee\noperations\nkey\ncomes\nfine\ncold\nfamous\nNavy\n1971\nMe\nadditional\nindividual\n##ner\nZealand\ngoals\ncounty\ncontains\nService\nminute\n2nd\nreach\ntalking\nparticularly\n##ham\nmovie\nDirector\nglass\npaper\nstudies\n##co\nrailway\nstandard\nEducation\n45\nrepresented\nChief\nLouis\nlaunched\nStar\nterms\n60\n1969\nexperience\nwatched\nAnother\nPress\nTom\nstaff\nstarting\nsubject\nbreak\nVirginia\nnine\neye\n##age\nevidence\nfoot\n##est\ncompanies\nPrince\n##V\ngun\ncreate\nBig\nPeople\nguy\nGreen\nsimply\nnumerous\n##line\nincreased\ntwenty\n##ga\n##do\n1967\naward\nofficer\nstone\nBefore\nmaterial\nNorthern\ngrew\nmale\nplant\nLife\nlegs\nstep\nAl\nunit\n35\nexcept\nanswer\n##U\nreport\nresponse\nEdward\ncommercial\nedition\ntrade\nscience\n##ca\nIrish\nLaw\nshown\nrate\nfailed\n##ni\nremains\nchanges\nmm\nlimited\nlarger\nLater\ncause\nwaiting\nTime\n##wood\ncost\nBill\nmanager\nactivities\nlikely\nallow\noperated\nretired\n##ping\n65\ndirectly\nWho\nassociated\neffect\nhell\nFlorida\nstraight\nhot\nValley\nmanagement\ngirls\nexpected\neastern\nMike\nchance\ncast\ncentre\nchair\nhurt\nproblems\n##li\nwalk\nprograms\nTeam\ncharacters\nBattle\nedge\npay\nmaybe\ncorner\nmajority\nmedical\nJoe\nSummer\n##io\nattempt\nPacific\ncommand\nRadio\n##by\nnames\nmunicipality\n1964\ntrain\neconomic\nBrown\nfeature\nsex\nsource\nagreed\nremember\nThree\n1966\n1965\nPennsylvania\nvictory\nsenior\nannual\nIII\nSouthern\nresults\nSam\nserving\nreligious\nJones\nappears\n##der\ndespite\nclaimed\nBoth\nmusical\nmatches\nfast\nsecurity\nselected\nYoung\ndouble\ncomplex\nhospital\nchief\nTimes\n##ve\nChampionships\nfilled\nPublic\nDespite\nbeautiful\nResearch\nplans\nProvince\n##ally\nWales\n##ko\nartists\nmetal\nnearby\nSpain\n##il\n32\nhouses\nsupported\npiece\n##no\nstared
\nrecording\nnature\nlegal\nRussia\n##ization\nremaining\nlooks\n##sh\nbridge\ncloser\ncases\nscene\nmarriage\nLittle\n##é\nuses\nEarth\nspecific\nFrank\ntheory\nGood\ndiscovered\nreferred\nbass\nculture\nuniversity\npresented\nCongress\n##go\nmetres\ncontinue\n1960\nisn\nAwards\nmeaning\ncell\ncomposed\nseparate\nSeries\nforms\nBlue\ncross\n##tor\nincrease\ntest\ncomputer\nslightly\nWhere\nJewish\nTown\ntree\nstatus\n1944\nvariety\nresponsible\npretty\ninitially\n##way\nrealized\npass\nprovides\nCaptain\nAlexander\nrecent\nscore\nbroke\nScott\ndrive\nfinancial\nshowed\nLine\nstories\nordered\nsoldiers\ngenus\noperation\ngaze\nsitting\nsociety\nOnly\nhope\nactor\nfollow\nEmpire\nYeah\ntechnology\nhappy\nfocus\npolicy\nspread\nsituation\n##ford\n##ba\nMrs\nwatch\nCan\n1963\nCommission\ntouch\nearned\ntroops\nUnder\n1962\nindividuals\ncannot\n19th\n##lin\nmile\nexpression\nexactly\nsuddenly\nweight\ndance\nstepped\nplaces\nappear\ndifficult\nRailway\nanti\nnumbers\nkilometres\nstar\n##ier\ndepartment\nice\nBritain\nremoved\nOnce\n##lo\nBoston\nvalue\n##ant\nmission\ntrees\nOrder\nsports\njoin\nserve\nMajor\npoor\nPoland\nmainly\nTheatre\npushed\nStation\n##it\nLady\nfederal\nsilver\n##ler\nforeign\n##ard\nEastern\n##den\nbox\nhall\nsubsequently\nlies\nacquired\n1942\nancient\nCD\nHistory\nJean\nbeyond\n##ger\nEl\n##les\ngrowing\nchampionship\nnative\nParliament\nWilliams\nwatching\ndirect\noverall\noffer\nAlso\n80\nSecretary\nspoke\nLatin\nability\n##ated\nsafe\npresence\n##ial\nheaded\nregional\nplanned\n1961\nJohnson\nthroat\nconsists\n##W\nextended\nOr\nbar\nwalls\nChris\nstations\npolitician\nOlympics\ninfluence\nshare\nfighting\nspeak\nhundred\nCarolina\ndie\nstars\n##tic\ncolor\nChapter\n##ish\nfear\nsleep\ngoes\nFrancisco\noil\nBank\nsign\nphysical\n##berg\nDutch\nseasons\n##rd\nGames\nGovernor\nsorry\nlack\nCentre\nmemory\nbaby\nsmaller\ncharge\nDid\nmultiple\nships\nshirt\nAssembly\namount\nleaves\n3rd\nFoundation\nconditions\n1943\nRock\nDemocratic\nDaniel\n
##at\nwinner\nproducts\n##ina\nstore\nlatter\nProfessor\ncivil\nprior\nhost\n1956\nsoft\nvote\nneeds\nEach\nrules\n1958\npressure\nletter\nnormal\nproposed\nlevels\nrecords\n1959\npaid\nintended\nVictoria\npurpose\nokay\nhistorical\nissued\n1980s\nbroadcast\nrule\nsimple\npicked\nfirm\nSea\n1941\nElizabeth\n1940\nserious\nfeaturing\nhighly\ngraduated\nmentioned\nchoice\n1948\nreplied\npercent\nScotland\n##hi\nfemales\nconstructed\n1957\nsettled\nSteve\nrecognized\ncities\ncrew\nglanced\nkiss\ncompeted\nflight\nknowledge\neditor\nMore\nConference\n##H\nfifth\nelements\n##ee\n##tes\nfunction\nnewspaper\nrecently\nMiss\ncultural\nbrown\ntwice\nOffice\n1939\ntruth\nCreek\n1946\nhouseholds\nUSA\n1950\nquality\n##tt\nborder\nseconds\ndestroyed\npre\nwait\nahead\nbuild\nimage\n90\ncars\n##mi\n33\npromoted\nprofessor\net\nbank\nmedal\ntext\nbroken\nMiddle\nrevealed\nsides\nwing\nseems\nchannel\n1970s\nBen\nloved\neffort\nofficers\nWill\n##ff\n70\nIsrael\nJim\nupper\nfully\nlabel\nJr\nassistant\npowerful\npair\npositive\n##ary\ngives\n1955\n20th\nraces\nremain\nkitchen\nprimarily\n##ti\nSydney\neasy\nTour\nwhispered\nburied\n300\nNews\nPolish\n1952\nDuke\nColumbia\nproduce\naccepted\n00\napproach\nminor\n1947\nSpecial\n44\nAsian\nbasis\nvisit\nFort\nCivil\nfinish\nformerly\nbeside\nleaned\n##ite\nmedian\nrose\ncoast\neffects\nsupposed\nCross\n##hip\nCorps\nresidents\nJackson\n##ir\nBob\nbasketball\n36\nAsia\nseem\nBishop\nBook\n##ber\nring\n##ze\nowner\nBBC\n##ja\ntransferred\nacting\nDe\nappearances\nwalking\nLe\npress\ngrabbed\n1954\nofficially\n1953\n##pe\nrisk\ntaught\nreview\n##X\nlay\n##well\ncouncil\nAvenue\nseeing\nlosing\nOhio\nSuper\nprovince\nones\ntravel\n##sa\nprojects\nequipment\nspot\nBerlin\nadministrative\nheat\npotential\nshut\ncapacity\nelections\ngrowth\nfought\nRepublican\nmixed\nAndrew\nteacher\nturning\nstrength\nshoulders\nbeat\nwind\n1949\nHealth\nfollows\ncamp\nsuggested\nperhaps\nAlex\nmountain\ncontact\ndivided\ncandidate\nfellow\n34\nShow\nnecess
ary\nworkers\nball\nhorse\nways\nquestions\nprotect\ngas\nactivity\nyounger\nbottom\nfounder\nScottish\nscreen\ntreatment\neasily\ncom\n##house\ndedicated\nMaster\nwarm\nNight\nGeorgia\nLong\nvon\n##me\nperfect\nwebsite\n1960s\npiano\nefforts\n##ide\nTony\nsort\noffers\nDevelopment\nSimon\nexecutive\n##nd\nsave\nOver\nSenate\n1951\n1990s\ndraw\nmaster\nPolice\n##ius\nrenamed\nboys\ninitial\nprominent\ndamage\nCo\n##ov\n##za\nonline\nbegin\noccurred\ncaptured\nyouth\nTop\naccount\ntells\nJustice\nconducted\nforest\n##town\nbought\nteeth\nJersey\n##di\npurchased\nagreement\nMichigan\n##ure\ncampus\nprison\nbecomes\nproduct\nsecret\nguess\nRoute\nhuge\ntypes\ndrums\n64\nsplit\ndefeat\nestate\nhousing\n##ot\nbrothers\nCoast\ndeclared\nhappen\ntitled\ntherefore\nsun\ncommonly\nalongside\nStadium\nlibrary\nHome\narticle\nsteps\ntelling\nslow\nassigned\nrefused\nlaughed\nwants\nNick\nwearing\nRome\nOpen\n##ah\nHospital\npointed\nTaylor\nlifted\nescape\nparticipated\n##j\ndrama\nparish\nSanta\n##per\norganized\nmass\npick\nAirport\ngets\nLibrary\nunable\npull\nLive\n##ging\nsurrounding\n##ries\nfocused\nAdam\nfacilities\n##ning\n##ny\n38\n##ring\nnotable\nera\nconnected\ngained\noperating\nlaid\nRegiment\nbranch\ndefined\nChristmas\nmachine\nFour\nacademic\nIran\nadopted\nconcept\nMen\ncompared\nsearch\ntraffic\nMax\nMaria\ngreater\n##ding\nwidely\n##burg\nserves\n1938\n37\nGo\nhotel\nshared\ntypically\nscale\n1936\nleg\nsuffered\nyards\npieces\nMinistry\nWilson\nepisodes\nempty\n1918\nsafety\ncontinues\nyellow\nhistoric\nsettlement\n400\nCome\nCorporation\nenemy\ncontent\npicture\nevening\nterritory\nmethod\ntrial\nsolo\ndriver\nHere\n##ls\nentrance\nPrize\nspring\nwhatever\n##ent\n75\n##ji\nreading\nArthur\n##cy\nOur\nclothes\nPrime\nIllinois\nKong\ncode\n##ria\nsit\nHarry\nFederal\nchosen\nadministration\nbodies\nbegins\nstomach\nThough\nseats\nHong\ndensity\nSun\nleaders\nField\nmuseum\nchart\nplatform\nlanguages\n##ron\nbirth\nholds\nGold\n##un\nfish\ncombined\n##ps\n4
th\n1937\nlargely\ncaptain\ntrust\nGame\nvan\nboat\nOxford\nbasic\nbeneath\nIslands\npainting\nnice\nToronto\npath\nmales\nsources\nblock\nconference\nparties\nmurder\nclubs\ncrowd\ncalling\nAbout\nBusiness\npeace\nknows\nlake\nspeaking\nstayed\nBrazil\nallowing\nBorn\nunique\nthick\nTechnology\n##que\nreceive\ndes\nsemi\nalive\nnoticed\nformat\n##ped\ncoffee\ndigital\n##ned\nhanded\nguard\ntall\nfaced\nsetting\nplants\npartner\nclaim\nreduced\ntemple\nanimals\ndetermined\nclasses\n##out\nestimated\n##ad\nOlympic\nproviding\nMassachusetts\nlearned\nInc\nPhiladelphia\nSocial\ncarry\n42\npossibly\nhosted\ntonight\nrespectively\nToday\nshape\nMount\nroles\ndesignated\nbrain\netc\nKorea\nthoughts\nBrian\nHighway\ndoors\nbackground\ndrew\nmodels\nfootballer\ntone\nturns\n1935\nquiet\ntower\nwood\nbus\nwrite\nsoftware\nweapons\nflat\nmarked\n1920\nnewly\ntight\nEric\nfinger\nJournal\nFC\nVan\nrise\ncritical\nAtlantic\ngranted\nreturning\ncommunities\nhumans\nquick\n39\n48\nranked\nsight\npop\nSwedish\nStephen\ncard\nanalysis\nattacked\n##wa\nSunday\nidentified\nJason\nchampion\nsituated\n1930\nexpanded\ntears\n##nce\nreaching\nDavis\nprotection\nEmperor\npositions\nnominated\nBridge\ntax\ndress\nallows\navoid\nleadership\nkilling\nactress\nguest\nsteel\nknowing\nelectric\ncells\ndisease\ngrade\nunknown\n##ium\nresulted\nPakistan\nconfirmed\n##ged\ntongue\ncovers\n##Y\nroof\nentirely\napplied\nvotes\ndrink\ninterview\nexchange\nTownship\nreasons\n##ised\npage\ncalls\ndog\nagent\nnose\nteaching\n##ds\n##ists\nadvanced\nwish\nGolden\nexisting\nvehicle\ndel\n1919\ndevelop\nattacks\npressed\nSports\nplanning\nresulting\nfacility\nSarah\nnotes\n1933\nClass\nHistoric\nwinter\n##mo\naudience\nCommunity\nhousehold\nNetherlands\ncreation\n##ize\nkeeping\n1914\nclaims\ndry\nguys\nopposite\n##ak\nexplained\nOntario\nsecondary\ndifference\nFrancis\nactions\norganizations\nyard\nanimal\nUp\nLewis\ntitles\nSeveral\n1934\nRyan\n55\nSupreme\nrolled\n1917\ndistribution\nfigures\nafraid\nru
ral\nyourself\n##rt\nsets\nbarely\nInstead\npassing\nawards\n41\nsilence\nauthority\noccupied\nenvironment\nwindows\nengineering\nsurprised\nflying\ncrime\nreports\nMountain\npowers\ndriving\nsucceeded\nreviews\n1929\nHead\nmissing\nSong\nJesus\nopportunity\ninspired\nends\nalbums\nconversation\nimpact\ninjury\nsurprise\nbillion\nlearning\nheavily\noldest\nunion\ncreating\n##ky\nfestival\nliterature\nletters\nsexual\n##tte\napartment\nFinal\ncomedy\nnation\norders\n##sen\ncontemporary\nPower\ndrawn\nexistence\nconnection\n##ating\nPost\nJunior\nremembered\nmessage\nMedal\ncastle\nnote\nengineer\nsounds\nBeach\ncrossed\n##dy\near\nscientific\nsales\n##ai\ntheme\nstarts\nclearly\n##ut\ntrouble\n##gan\nbag\n##han\nBC\nsons\n1928\nsilent\nversions\ndaily\nStudies\nending\nRose\nguns\n1932\nheadquarters\nreference\nobtained\nSquadron\nconcert\nnone\ndu\nAmong\n##don\nprevent\nMember\nanswered\nstaring\nBetween\n##lla\nportion\ndrug\nliked\nassociation\nperformances\nNations\nformation\nCastle\nlose\nlearn\nscoring\nrelatively\nquarter\n47\nPremier\n##ors\nSweden\nbaseball\nattempted\ntrip\nworth\nperform\nairport\nfields\nenter\nhonor\nMedical\nrear\ncommander\nofficials\ncondition\nsupply\nmaterials\n52\nAnna\nvolume\nthrew\nPersian\n43\ninterested\nGallery\nachieved\nvisited\nlaws\nrelief\nArea\nMatt\nsingles\nLieutenant\nCountry\nfans\nCambridge\nsky\nMiller\neffective\ntradition\nPort\n##ana\nminister\nextra\nentitled\nSystem\nsites\nauthorities\nacres\ncommittee\nracing\n1931\ndesk\ntrains\nass\nweren\nFamily\nfarm\n##ance\nindustrial\n##head\niron\n49\nabandoned\nOut\nHoly\nchairman\nwaited\nfrequently\ndisplay\nLight\ntransport\nstarring\nPatrick\nEngineering\neat\nFM\njudge\nreaction\ncenturies\nprice\n##tive\nKorean\ndefense\nGet\narrested\n1927\nsend\nurban\n##ss\npilot\nOkay\nMedia\nreality\narts\nsoul\nthirty\n##be\ncatch\ngeneration\n##nes\napart\nAnne\ndrop\nSee\n##ving\nsixth\ntrained\nManagement\nmagic\ncm\nheight\nFox\nIan\nresources\nvampire\nprincipal\
nWas\nhaven\n##au\nWalter\nAlbert\nrich\n1922\ncausing\nentry\n##ell\nshortly\n46\nworry\ndoctor\ncomposer\nrank\nNetwork\nbright\nshowing\nregions\n1924\nwave\ncarrying\nkissed\nfinding\nmissed\nEarl\nlying\ntarget\nvehicles\nMilitary\ncontrolled\ndinner\n##board\nbriefly\nlyrics\nmotion\nduty\nstrange\nattempts\ninvited\nkg\nvillages\n5th\nLand\n##mer\nChrist\nprepared\ntwelve\ncheck\nthousand\nearth\ncopies\nen\ntransfer\ncitizens\nAmericans\npolitics\nnor\ntheatre\nProject\n##bo\nclean\nrooms\nlaugh\n##ran\napplication\ncontained\nanyway\ncontaining\nSciences\n1925\nrare\nspeech\nexist\n1950s\nfalling\npassenger\n##im\nstands\n51\n##ol\n##ow\nphase\ngovernor\nkids\ndetails\nmethods\nVice\nemployed\nperforming\ncounter\nJane\nheads\nChannel\nwine\nopposition\naged\n1912\nEvery\n1926\nhighway\n##ura\n1921\naired\n978\npermanent\nForest\nfinds\njoint\napproved\n##pur\nbrief\ndoubt\nacts\nbrand\nwild\nclosely\nFord\nKevin\nchose\nshall\nport\nsweet\nfun\nasking\nBe\n##bury\nsought\nDave\nMexican\nmom\nRight\nHoward\nMoscow\nCharlie\nStone\n##mann\nadmitted\n##ver\nwooden\n1923\nOfficer\nrelations\nHot\ncombat\npublication\nchain\nshop\ninhabitants\nproved\nideas\naddress\n1915\nMemorial\nexplain\nincreasing\nconflict\nAnthony\nMelbourne\nnarrow\ntemperature\nslid\n1916\nworse\nselling\ndocumentary\nAli\nRay\nopposed\nvision\ndad\nextensive\nInfantry\ncommissioned\nDoctor\noffices\nprogramming\ncore\nrespect\nstorm\n##pa\n##ay\n##om\npromotion\nder\nstruck\nanymore\nshit\nRegion\nreceiving\nDVD\nalternative\n##ue\nride\nmaximum\n1910\n##ious\nThird\nAffairs\ncancer\nExecutive\n##op\ndream\n18th\nDue\n##ker\n##worth\neconomy\nIV\nBillboard\nidentity\nsubsequent\nstatement\nskills\n##back\nfunding\n##ons\nRound\nForeign\ntruck\nPlease\nlights\nwondered\n##ms\nframe\nyes\nStill\ndistricts\nfiction\nColonel\nconverted\n150\ngrown\naccident\ncritics\nfit\nInformation\narchitecture\nPoint\nFive\narmed\nBilly\npoet\nfunctions\nconsisted\nsuit\nTurkish\nBand\nobject\ndesire\
n##ities\nsounded\nflow\nNorwegian\narticles\nMarie\npulling\nthin\nsinging\nHunter\nHuman\nBattalion\nFederation\nKim\norigin\nrepresent\ndangerous\nweather\nfuel\nex\n##sing\nLast\nbedroom\naid\nknees\nAlan\nangry\nassumed\nplane\nSomething\nfounding\nconcerned\nglobal\nFire\ndi\nplease\nPortuguese\ntouched\nRoger\nnuclear\nRegister\nJeff\nfixed\nroyal\nlie\nfinals\nNFL\nManchester\ntowns\nhandle\nshaped\nChairman\nDean\nlaunch\nunderstanding\nChildren\nviolence\nfailure\nsector\nBrigade\nwrapped\nfired\nsharp\ntiny\ndeveloping\nexpansion\nFree\ninstitutions\ntechnical\nNothing\notherwise\nMain\ninch\nSaturday\nwore\nSenior\nattached\ncheek\nrepresenting\nKansas\n##chi\n##kin\nactual\nadvantage\nDan\nAustria\n##dale\nhoped\nmulti\nsquad\nNorway\nstreets\n1913\nServices\nhired\ngrow\npp\nwear\npainted\nMinnesota\nstuff\nBuilding\n54\nPhilippines\n1900\n##ties\neducational\nKhan\nMagazine\n##port\nCape\nsignal\nGordon\nsword\nAnderson\ncool\nengaged\nCommander\nimages\nUpon\ntied\nSecurity\ncup\nrail\nVietnam\nsuccessfully\n##red\nMuslim\ngain\nbringing\nNative\nhers\noccurs\nnegative\nPhilip\nKelly\nColorado\ncategory\n##lan\n600\nHave\nsupporting\nwet\n56\nstairs\nGrace\nobserved\n##ung\nfunds\nrestaurant\n1911\nJews\n##ments\n##che\nJake\nBack\n53\nasks\njournalist\naccept\nbands\nbronze\nhelping\n##ice\ndecades\nmayor\nsurvived\nusual\ninfluenced\nDouglas\nHey\n##izing\nsurrounded\nretirement\nTemple\nderived\nPope\nregistered\nproducing\n##ral\nstructures\nJohnny\ncontributed\nfinishing\nbuy\nspecifically\n##king\npatients\nJordan\ninternal\nregarding\nSamuel\nClark\n##q\nafternoon\nFinally\nscenes\nnotice\nrefers\nquietly\nthreat\nWater\nThose\nHamilton\npromise\nfreedom\nTurkey\nbreaking\nmaintained\ndevice\nlap\nultimately\nChampion\nTim\nBureau\nexpressed\ninvestigation\nextremely\ncapable\nqualified\nrecognition\nitems\n##up\nIndiana\nadult\nrain\ngreatest\narchitect\nMorgan\ndressed\nequal\nAntonio\ncollected\ndrove\noccur\nGrant\ngraduate\nanger\nSri\nwo
rried\nstandards\n##ore\ninjured\nsomewhere\ndamn\nSingapore\nJimmy\npocket\nhomes\nstock\nreligion\naware\nregarded\nWisconsin\n##tra\npasses\nfresh\n##ea\nargued\nLtd\nEP\nDiego\nimportance\nCensus\nincident\nEgypt\nMissouri\ndomestic\nleads\nceremony\nEarly\ncamera\nFather\nchallenge\nSwitzerland\nlands\nfamiliar\nhearing\nspend\neducated\nTennessee\nThank\n##ram\nThus\nconcern\nputting\ninches\nmap\nclassical\nAllen\ncrazy\nvalley\nSpace\nsoftly\n##my\npool\nworldwide\nclimate\nexperienced\nneighborhood\nscheduled\nneither\nfleet\n1908\nGirl\n##J\nPart\nengines\nlocations\ndarkness\nRevolution\nestablishment\nlawyer\nobjects\napparently\nQueensland\nEntertainment\nbill\nmark\nTelevision\n##ong\npale\ndemand\nHotel\nselection\n##rn\n##ino\nLabour\nLiberal\nburned\nMom\nmerged\nArizona\nrequest\n##lia\n##light\nhole\nemployees\n##ical\nincorporated\n95\nindependence\nWalker\ncovering\njoining\n##ica\ntask\npapers\nbacking\nsell\nbiggest\n6th\nstrike\nestablish\n##ō\ngently\n59\nOrchestra\nWinter\nprotein\nJuan\nlocked\ndates\nBoy\naren\nshooting\nLuke\nsolid\ncharged\nPrior\nresigned\ninterior\ngarden\nspoken\nimprove\nwonder\npromote\nhidden\n##med\ncombination\nHollywood\nSwiss\nconsider\n##ks\nLincoln\nliterary\ndrawing\nMarine\nweapon\nVictor\nTrust\nMaryland\nproperties\n##ara\nexhibition\nunderstood\nhung\nTell\ninstalled\nloud\nfashion\naffected\njunior\nlanding\nflowers\n##he\nInternet\nbeach\nHeart\ntries\nMayor\nprogramme\n800\nwins\nnoise\n##ster\n##ory\n58\ncontain\nfair\ndelivered\n##ul\nwedding\nSquare\nadvance\nbehavior\nProgram\nOregon\n##rk\nresidence\nrealize\ncertainly\nhill\nHouston\n57\nindicated\n##water\nwounded\nVillage\nmassive\nMoore\nthousands\npersonnel\ndating\nopera\npoetry\n##her\ncauses\nfeelings\nFrederick\napplications\npush\napproached\nfoundation\npleasure\nsale\nfly\ngotten\nnortheast\ncosts\nraise\npaintings\n##ney\nviews\nhorses\nformal\nArab\nhockey\ntypical\nrepresentative\nrising\n##des\nclock\nstadium\nshifted\nDad\npeak\
nFame\nvice\ndisappeared\nusers\nWay\nNaval\nprize\nhoping\nvalues\nevil\nBell\nconsisting\n##ón\nRegional\n##ics\nimproved\ncircle\ncarefully\nbroad\n##ini\nFine\nmaintain\noperate\noffering\nmention\nDeath\nstupid\nThrough\nPrincess\nattend\ninterests\nruled\nsomewhat\nwings\nroads\ngrounds\n##ual\nGreece\nChampions\nfacing\nhide\nvoted\nrequire\nDark\nMatthew\ncredit\nsighed\nseparated\nmanner\n##ile\nBoys\n1905\ncommitted\nimpossible\nlip\ncandidates\n7th\nBruce\narranged\nIslamic\ncourses\ncriminal\n##ened\nsmell\n##bed\n08\nconsecutive\n##ening\nproper\npurchase\nweak\nPrix\n1906\naside\nintroduction\nLook\n##ku\nchanging\nbudget\nresistance\nfactory\nForces\nagency\n##tone\nnorthwest\nuser\n1907\nstating\n##one\nsport\nDesign\nenvironmental\ncards\nconcluded\nCarl\n250\naccused\n##ology\nGirls\nsick\nintelligence\nMargaret\nresponsibility\nGuard\n##tus\n17th\nsq\ngoods\n1909\nhate\n##ek\ncapture\nstores\nGray\ncomic\nModern\nSilver\nAndy\nelectronic\nwheel\n##ied\nDeputy\n##bs\nCzech\nzone\nchoose\nconstant\nreserve\n##lle\nTokyo\nspirit\nsub\ndegrees\nflew\npattern\ncompete\nDance\n##ik\nsecretary\nImperial\n99\nreduce\nHungarian\nconfused\n##rin\nPierre\ndescribes\nregularly\nRachel\n85\nlanded\npassengers\n##ise\n##sis\nhistorian\nmeters\nYouth\n##ud\nparticipate\n##cing\narrival\ntired\nMother\n##gy\njumped\nKentucky\nfaces\nfeed\nIsraeli\nOcean\n##Q\n##án\nplus\nsnow\ntechniques\nplate\nsections\nfalls\njazz\n##ris\ntank\nloan\nrepeated\nopinion\n##res\nunless\nrugby\njournal\nLawrence\nmoments\nshock\ndistributed\n##ded\nadjacent\nArgentina\ncrossing\nuncle\n##ric\nDetroit\ncommunication\nmental\ntomorrow\nsession\nEmma\nWithout\n##gen\nMiami\ncharges\nAdministration\nhits\ncoat\nprotected\nCole\ninvasion\npriest\n09\nGary\nenjoyed\nplot\nmeasure\nbound\nfriendly\nthrow\nmusician\n##lon\n##ins\nAge\nknife\ndamaged\nbirds\ndriven\nlit\nears\nbreathing\nArabic\nJan\nfaster\nJonathan\n##gate\nIndependent\nstarred\nHarris\nteachers\nAlice\nsequence\nmph\nfi
le\ntranslated\ndecide\ndetermine\nReview\ndocuments\nsudden\nthreatened\n##ft\nbear\ndistinct\ndecade\nburning\n##sky\n1930s\nreplace\nbegun\nextension\n##time\n1904\nequivalent\naccompanied\nChristopher\nDanish\n##ye\nBesides\n##more\npersons\nfallen\nRural\nroughly\nsaved\nwilling\nensure\nBelgium\n05\nmusicians\n##ang\ngiant\nSix\nRetrieved\nworst\npurposes\n##bly\nmountains\nseventh\nslipped\nbrick\n07\n##py\nsomehow\nCarter\nIraq\ncousin\nfavor\nislands\njourney\nFIFA\ncontrast\nplanet\nvs\ncalm\n##ings\nconcrete\nbranches\ngray\nprofit\nRussell\n##ae\n##ux\n##ens\nphilosophy\nbusinesses\ntalked\nparking\n##ming\nowners\nPlace\n##tle\nagricultural\nKate\n06\nsoutheast\ndraft\nEddie\nearliest\nforget\nDallas\nCommonwealth\nedited\n66\ninner\ned\noperates\n16th\nHarvard\nassistance\n##si\ndesigns\nTake\nbathroom\nindicate\nCEO\nCommand\nLouisiana\n1902\nDublin\nBooks\n1901\ntropical\n1903\n##tors\nPlaces\ntie\nprogress\nforming\nsolution\n62\nletting\n##ery\nstudying\n##jo\nduties\nBaseball\ntaste\nReserve\n##ru\nAnn\n##gh\nvisible\n##vi\nnotably\nlink\nNCAA\nsouthwest\nNever\nstorage\nmobile\nwriters\nfavorite\nPro\npages\ntruly\ncount\n##tta\nstring\nkid\n98\nRoss\nrow\n##idae\nKennedy\n##tan\nHockey\nhip\nwaist\ngrandfather\nlisten\n##ho\nfeels\nbusy\n72\nstream\nobvious\ncycle\nshaking\nKnight\n##ren\nCarlos\npainter\ntrail\nweb\nlinked\n04\nPalace\nexisted\n##ira\nresponded\nclosing\nEnd\nexamples\nMarshall\nweekend\njaw\nDenmark\nlady\ntownship\nmedium\nchin\nStory\noption\nfifteen\nMoon\nrepresents\nmakeup\ninvestment\njump\nchildhood\nOklahoma\nroll\nnormally\nTen\nOperation\nGraham\nSeattle\nAtlanta\npaused\npromised\nrejected\ntreated\nreturns\nflag\n##ita\nHungary\ndanger\nglad\nmovements\nvisual\nsubjects\ncredited\nsoldier\nNorman\nill\ntranslation\nJosé\nQuebec\nmedicine\nwarning\ntheater\npraised\nmunicipal\n01\ncommune\nchurches\nacid\nfolk\n8th\ntesting\nadd\nsurvive\nSound\ndevices\nresidential\nsevere\npresidential\nMississippi\nAustin\nPerhap
s\nCharlotte\nhanging\nMontreal\ngrin\n##ten\nracial\npartnership\nshoot\nshift\n##nie\nLes\ndowntown\nBrothers\nGarden\nmatters\nrestored\nmirror\nforever\nwinners\nrapidly\npoverty\n##ible\nUntil\nDC\nfaith\nhundreds\nReal\nUkraine\nNelson\nbalance\nAdams\ncontest\nrelative\nethnic\nEdinburgh\ncomposition\n##nts\nemergency\n##van\nmarine\nreputation\nDown\npack\n12th\nCommunist\nMountains\npro\nstages\nmeasures\n##ld\nABC\nLi\nvictims\nbenefit\nIowa\nBroadway\ngathered\nrating\nDefense\nclassic\n##ily\nceiling\n##ions\nsnapped\nEverything\nconstituency\nFranklin\nThompson\nStewart\nentering\nJudge\nforth\n##sk\nwanting\nsmiling\nmoves\ntunnel\npremiered\ngrass\nunusual\nUkrainian\nbird\nFriday\ntail\nPortugal\ncoal\nelement\nFred\nguards\nSenator\ncollaboration\nbeauty\nWood\nchemical\nbeer\njustice\nsigns\n##Z\nsees\n##zi\nPuerto\n##zed\n96\nsmooth\nBowl\ngift\nlimit\n97\nheading\nSource\nwake\nrequires\nEd\nConstitution\nfactor\nLane\nfactors\nadding\nNote\ncleared\npictures\npink\n##ola\nKent\nLocal\nSingh\nmoth\nTy\n##ture\ncourts\nSeven\ntemporary\ninvolving\nVienna\nemerged\nfishing\nagree\ndefensive\nstuck\nsecure\nTamil\n##ick\nbottle\n03\nPlayer\ninstruments\nSpring\npatient\nflesh\ncontributions\ncry\nMalaysia\n120\nGlobal\nda\nAlabama\nWithin\n##work\ndebuted\nexpect\nCleveland\nconcerns\nretained\nhorror\n10th\nspending\nPeace\nTransport\ngrand\nCrown\ninstance\ninstitution\nacted\nHills\nmounted\nCampbell\nshouldn\n1898\n##ably\nchamber\nsoil\n88\nEthan\nsand\ncheeks\n##gi\nmarry\n61\nweekly\nclassification\nDNA\nElementary\nRoy\ndefinitely\nSoon\nRights\ngate\nsuggests\naspects\nimagine\ngolden\nbeating\nStudios\nWarren\ndifferences\nsignificantly\nglance\noccasionally\n##od\nclothing\nAssistant\ndepth\nsending\npossibility\nmode\nprisoners\nrequirements\ndaughters\ndated\nRepresentatives\nprove\nguilty\ninteresting\nsmoke\ncricket\n93\n##ates\nrescue\nConnecticut\nunderground\nOpera\n13th\nreign\n##ski\nthanks\nleather\nequipped\nroutes\nfan\n##ans\
nscript\nWright\nbishop\nWelsh\njobs\nfaculty\neleven\nRailroad\nappearing\nanniversary\nUpper\n##down\nanywhere\nRugby\nMetropolitan\nMeanwhile\nNicholas\nchampions\nforehead\nmining\ndrinking\n76\nJerry\nmembership\nBrazilian\nWild\nRio\nscheme\nUnlike\nstrongly\n##bility\nfill\n##rian\neasier\nMP\nHell\n##sha\nStanley\nbanks\nBaron\n##ique\nRobinson\n67\nGabriel\nAustrian\nWayne\nexposed\n##wan\nAlfred\n1899\nmanage\nmix\nvisitors\neating\n##rate\nSean\ncommission\nCemetery\npolicies\nCamp\nparallel\ntraveled\nguitarist\n02\nsupplies\ncouples\npoem\nblocks\nRick\nTraining\nEnergy\nachieve\nappointment\nWing\nJamie\n63\nnovels\n##em\n1890\nsongwriter\nBase\nJay\n##gar\nnaval\nscared\nmiss\nlabor\ntechnique\ncrisis\nAdditionally\nbacked\ndestroy\nseriously\ntools\ntennis\n91\ngod\n##ington\ncontinuing\nsteam\nobviously\nBobby\nadapted\nfifty\nenjoy\nJacob\npublishing\ncolumn\n##ular\nBaltimore\nDonald\nLiverpool\n92\ndrugs\nmovies\n##ock\nHeritage\n##je\n##istic\nvocal\nstrategy\ngene\nadvice\n##bi\nOttoman\nriding\n##side\nAgency\nIndonesia\n11th\nlaughing\nsleeping\nund\nmuttered\nlistening\ndeck\ntip\n77\nownership\ngrey\nClaire\ndeeply\nprovincial\npopularity\nCooper\n##á\nEmily\n##sed\ndesigner\nMurray\ndescribe\nDanny\nAround\nParker\n##dae\n68\nrates\nsuffering\nconsiderable\n78\nnervous\npowered\ntons\ncircumstances\nwished\nbelonged\nPittsburgh\nflows\n9th\n##use\nbelt\n81\nuseful\n15th\ncontext\nList\nDead\nIron\nseek\nSeason\nworn\nfrequency\nlegislation\nreplacement\nmemories\nTournament\nAgain\nBarry\norganisation\ncopy\nGulf\nwaters\nmeets\nstruggle\nOliver\n1895\nSusan\nprotest\nkick\nAlliance\ncomponents\n1896\nTower\nWindows\ndemanded\nregiment\nsentence\nWoman\nLogan\nReferee\nhosts\ndebate\nknee\nBlood\n##oo\nuniversities\npractices\nWard\nranking\ncorrect\nhappening\nVincent\nattracted\nclassified\n##stic\nprocesses\nimmediate\nwaste\nincreasingly\nHelen\n##po\nLucas\nPhil\norgan\n1897\ntea\nsuicide\nactors\nlb\ncrash\napproval\nwaves\n##ered\nh
ated\ngrip\n700\namongst\n69\n74\nhunting\ndying\nlasted\nillegal\n##rum\nstare\ndefeating\n##gs\nshrugged\n°C\nJon\nCount\nOrleans\n94\naffairs\nformally\n##and\n##ves\ncriticized\nDisney\nVol\nsuccessor\ntests\nscholars\npalace\nWould\ncelebrated\nrounds\ngrant\nSchools\nSuch\ncommanded\ndemon\nRomania\n##all\nKarl\n71\n##yn\n84\nDaily\ntotally\nMedicine\nfruit\nDie\nupset\nLower\nConservative\n14th\nMitchell\nescaped\nshoes\nMorris\n##tz\nqueen\nharder\nprime\nThanks\nindeed\nSky\nauthors\nrocks\ndefinition\nNazi\naccounts\nprinted\nexperiences\n##ters\ndivisions\nCathedral\ndenied\ndepending\nExpress\n##let\n73\nappeal\nloose\ncolors\nfiled\n##isation\ngender\n##ew\nthrone\nforests\nFinland\ndomain\nboats\nBaker\nsquadron\nshore\nremove\n##ification\ncareful\nwound\nrailroad\n82\nseeking\nagents\n##ved\nBlues\n##off\ncustomers\nignored\nnet\n##ction\nhiding\nOriginally\ndeclined\n##ess\nfranchise\neliminated\nNBA\nmerely\npure\nappropriate\nvisiting\nforty\nmarkets\noffensive\ncoverage\ncave\n##nia\nspell\n##lar\nBenjamin\n##ire\nConvention\nfilmed\nTrade\n##sy\n##ct\nHaving\npalm\n1889\nEvans\nintense\nplastic\nJulia\ndocument\njeans\nvessel\nSR\n##fully\nproposal\nBirmingham\nle\n##ative\nassembly\n89\nfund\nlock\n1893\nAD\nmeetings\noccupation\nmodified\nYears\nodd\naimed\nreform\nMission\nWorks\nshake\ncat\nexception\nconvinced\nexecuted\npushing\ndollars\nreplacing\nsoccer\nmanufacturing\n##ros\nexpensive\nkicked\nminimum\nJosh\ncoastal\nChase\nha\nThailand\npublications\ndeputy\nSometimes\nAngel\neffectively\n##illa\ncriticism\nconduct\nSerbian\nlandscape\nNY\nabsence\npassage\n##ula\nBlake\nIndians\n1892\nadmit\nTrophy\n##ball\nNext\n##rated\n##ians\ncharts\nkW\norchestra\n79\nheritage\n1894\nrough\nexists\nboundary\nBible\nLegislative\nmoon\nmedieval\n##over\ncutting\nprint\n##ett\nbirthday\n##hood\ndestruction\nJulian\ninjuries\ninfluential\nsisters\nraising\nstatue\ncolour\ndancing\ncharacteristics\norange\n##ok\n##aries\nKen\ncolonial\ntwin\nLarry\nsu
rviving\n##shi\nBarbara\npersonality\nentertainment\nassault\n##ering\ntalent\nhappens\nlicense\n86\ncouch\nCentury\nsoundtrack\nshower\nswimming\ncash\nStaff\nbent\n1885\nbay\nlunch\n##lus\ndozen\nvessels\nCBS\ngreatly\ncritic\nTest\nsymbol\npanel\nshell\noutput\nreaches\n87\nFront\nmotor\nocean\n##era\n##ala\nmaintenance\nviolent\nscent\nLimited\nLas\nHope\nTheater\nWhich\nsurvey\nRobin\nrecordings\ncompilation\n##ward\nbomb\ninsurance\nAuthority\nsponsored\nsatellite\nJazz\nrefer\nstronger\nblow\nwhilst\nWrestling\nsuggest\n##rie\nclimbed\n##els\nvoices\nshopping\n1891\nNeil\ndiscovery\n##vo\n##ations\nburst\nBaby\npeaked\nBrooklyn\nknocked\nlift\n##try\nfalse\nnations\nHugh\nCatherine\npreserved\ndistinguished\nterminal\nresolution\nratio\npants\ncited\ncompetitions\ncompletion\nDJ\nbone\nuniform\nschedule\nshouted\n83\n1920s\nrarely\nBasketball\nTaiwan\nartistic\nbare\nvampires\narrest\nUtah\nMarcus\nassist\ngradually\nqualifying\nVictorian\nvast\nrival\nWarner\nTerry\nEconomic\n##cia\nlosses\nboss\nversus\naudio\nrunner\napply\nsurgery\nPlay\ntwisted\ncomfortable\n##cs\nEveryone\nguests\n##lt\nHarrison\nUEFA\nlowered\noccasions\n##lly\n##cher\nchapter\nyoungest\neighth\nCulture\n##room\n##stone\n1888\nSongs\nSeth\nDigital\ninvolvement\nexpedition\nrelationships\nsigning\n1000\nfault\nannually\ncircuit\nafterwards\nmeat\ncreature\n##ou\ncable\nBush\n##net\nHispanic\nrapid\ngonna\nfigured\nextent\nconsidering\ncried\n##tin\nsigh\ndynasty\n##ration\ncabinet\nRichmond\nstable\n##zo\n1864\nAdmiral\nUnit\noccasion\nshares\nbadly\nlongest\n##ify\nConnor\nextreme\nwondering\ngirlfriend\nStudio\n##tions\n1865\ntribe\nexact\nmuscles\nhat\nLuis\nOrthodox\ndecisions\namateur\ndescription\n##lis\nhips\nkingdom\n##ute\nPortland\nwhereas\nBachelor\nouter\ndiscussion\npartly\nArkansas\n1880\ndreams\nperfectly\nLloyd\n##bridge\nasleep\n##tti\nGreg\npermission\ntrading\npitch\nmill\nStage\nliquid\nKeith\n##tal\nwolf\nprocessing\nstick\nJerusalem\nprofile\nrushed\nspiritual\narg
ument\nIce\nGuy\ntill\nDelhi\nroots\nSection\nmissions\nGlasgow\npenalty\nNBC\nencouraged\nidentify\nkeyboards\n##zing\n##ston\ndisc\nplain\ninformed\nBernard\nthinks\nfled\nJustin\n##day\nnewspapers\n##wick\nRalph\n##zer\nunlike\nStars\nartillery\n##ified\nrecovered\narrangement\nsearching\n##pers\n##tory\n##rus\ndeaths\nEgyptian\ndiameter\n##í\nmarketing\ncorporate\nteach\nmarks\nTurner\nstaying\nhallway\nSebastian\nchapel\nnaked\nmistake\npossession\n1887\ndominated\njacket\ncreative\nFellow\nFalls\nDefence\nsuspended\nemployment\n##rry\nHebrew\nHudson\nWeek\nWars\nrecognize\nNatural\ncontroversial\nTommy\nthank\nAthletic\nbenefits\ndecline\nintention\n##ets\nLost\nWall\nparticipation\nelevation\nsupports\nparliament\n1861\nconcentration\nMovement\n##IS\ncompeting\nstops\nbehalf\n##mm\nlimits\nfunded\ndiscuss\nCollins\ndeparture\nobtain\nwoods\nlatest\nuniverse\nalcohol\nLaura\nrush\nblade\nfunny\nDennis\nforgotten\nAmy\nSymphony\napparent\ngraduating\n1862\nRob\nGrey\ncollections\nMason\nemotions\n##ugh\nliterally\nAny\ncounties\n1863\nnomination\nfighter\nhabitat\nrespond\nexternal\nCapital\nexit\nVideo\ncarbon\nsharing\nBad\nopportunities\nPerry\nphoto\n##mus\nOrange\nposted\nremainder\ntransportation\nportrayed\nLabor\nrecommended\npercussion\nrated\nGrade\nrivers\npartially\nsuspected\nstrip\nadults\nbutton\nstruggled\nintersection\nCanal\n##ability\npoems\nclaiming\nMadrid\n1886\nTogether\n##our\nMuch\nVancouver\ninstrument\ninstrumental\n1870\nmad\nangle\nControl\nPhoenix\nLeo\nCommunications\nmail\n##ette\n##ev\npreferred\nadaptation\nalleged\ndiscussed\ndeeper\n##ane\nYet\nMonday\nvolumes\nthrown\nZane\n##logy\ndisplayed\nrolling\ndogs\nAlong\nTodd\n##ivity\nwithdrew\nrepresentation\nbelief\n##sia\ncrown\nLate\nShort\nhardly\ngrinned\nromantic\nPete\n##ken\nnetworks\nenemies\nColin\nEventually\nSide\ndonated\n##su\nsteady\ngrab\nguide\nFinnish\nMilan\npregnant\ncontroversy\nreminded\n1884\nStuart\n##bach\n##ade\nRace\nBelgian\nLP\nProduction\nZone\nlieut
enant\ninfantry\nChild\nconfusion\nsang\nresident\n##ez\nvictim\n1881\nchannels\nRon\nbusinessman\n##gle\nDick\ncolony\npace\nproducers\n##ese\nagencies\nCraig\nLucy\nVery\ncenters\nYorkshire\nphotography\n##ched\nAlbum\nchampionships\nMetro\nsubstantial\nStandard\nterrible\ndirectors\ncontribution\nadvertising\nemotional\n##its\nlayer\nsegment\nsir\nfolded\nRoberts\nceased\nHampshire\n##ray\ndetailed\npartners\nm²\n##pt\nBeth\ngenre\ncommented\ngenerated\nremote\naim\nHans\ncredits\nconcerts\nperiods\nbreakfast\ngay\nshadow\ndefence\nToo\nHad\ntransition\nAfghanistan\n##book\neggs\ndefend\n##lli\nwrites\nSystems\nbones\nmess\nseed\nscientists\nShortly\nRomanian\n##zy\nFreedom\nmuscle\nhero\nparent\nagriculture\nchecked\nIslam\nBristol\nFreyja\nArena\ncabin\nGermans\nelectricity\nranks\nviewed\nmedals\nWolf\nassociate\nMadison\nSorry\nfort\nChile\ndetail\nwidespread\nattorney\nboyfriend\n##nan\nStudents\nSpencer\n##ig\nbite\nMaine\ndemolished\nLisa\nerected\nSomeone\noperational\nCommissioner\nNHL\nCoach\nBar\nforcing\nDream\nRico\ncargo\nMurphy\n##fish\n##ase\ndistant\n##master\n##ora\nOrganization\ndoorway\nSteven\ntraded\nelectrical\nfrequent\n##wn\nBranch\nSure\n1882\nplacing\nManhattan\nattending\nattributed\nexcellent\npounds\nruling\nprinciples\ncomponent\nMediterranean\nVegas\nmachines\npercentage\ninfrastructure\nthrowing\naffiliated\nKings\nsecured\nCaribbean\nTrack\nTed\nhonour\nopponent\nVirgin\nConstruction\ngrave\nproduces\nChallenge\nstretched\npaying\nmurmured\n##ata\nintegrated\nwaved\nNathan\n##ator\ntransmission\nvideos\n##yan\n##hu\nNova\ndescent\nAM\nHarold\nconservative\nTherefore\nvenue\ncompetitive\n##ui\nconclusion\nfuneral\nconfidence\nreleases\nscholar\n##sson\nTreaty\nstress\nmood\n##sm\nMac\nresiding\nAction\nFund\n##ship\nanimated\nfitted\n##kar\ndefending\nvoting\ntend\n##berry\nanswers\nbelieves\n##ci\nhelps\nAaron\n##tis\nthemes\n##lay\npopulations\nPlayers\nstroke\nTrinity\nelectoral\npaint\nabroad\ncharity\nkeys\nFair\n##pes\ninter
rupted\nparticipants\nmurdered\nDays\nsupporters\n##ab\nexpert\nborders\nmate\n##llo\nsolar\narchitectural\ntension\n##bling\nParish\ntape\noperator\nCultural\nClinton\nindicates\npublisher\nordinary\nsugar\narrive\nrifle\nacoustic\n##uring\nassets\n##shire\nSS\nsufficient\noptions\nHMS\nClassic\nbars\nrebuilt\ngovernments\nBeijing\nreporter\nscreamed\nAbbey\ncrying\nmechanical\ninstantly\ncommunications\nPolitical\ncemetery\nCameron\nStop\nrepresentatives\nUSS\ntexts\nmathematics\ninnings\ncivilian\nSerbia\n##hill\npractical\npatterns\ndust\nFaculty\ndebt\n##end\n##cus\njunction\nsuppose\nexperimental\nComputer\nFood\nwrist\nabuse\ndealing\nbigger\ncap\nprinciple\n##pin\nMuhammad\nFleet\nCollection\nattempting\ndismissed\n##burn\nregime\nHerbert\n##ua\nshadows\n1883\nEve\nLanka\n1878\nPerformance\nfictional\n##lock\nNoah\nRun\nVoivodeship\nexercise\nbroadcasting\n##fer\nRAF\nMagic\nBangladesh\nsuitable\n##low\n##del\nstyles\ntoured\nCode\nidentical\nlinks\ninsisted\n110\nflash\nModel\nslave\nDerek\nRev\nfairly\nGreater\nsole\n##lands\nconnecting\nzero\nbench\n##ome\nswitched\nFall\nOwen\nyours\nElectric\nshocked\nconvention\n##bra\nclimb\nmemorial\nswept\nRacing\ndecides\nbelong\n##nk\nparliamentary\n##und\nages\nproof\n##dan\ndelivery\n1860\n##ów\nsad\npublicly\nleaning\nArchbishop\ndirt\n##ose\ncategories\n1876\nburn\n##bing\nrequested\nGuinea\nHistorical\nrhythm\nrelation\n##heim\nye\npursue\nmerchant\n##mes\nlists\ncontinuous\nfrowned\ncolored\ntool\ngods\ninvolves\nDuncan\nphotographs\nCricket\nslight\nGregory\natmosphere\nwider\nCook\n##tar\nessential\nBeing\nFA\nemperor\nwealthy\nnights\n##bar\nlicensed\nHawaii\nviewers\nLanguage\nload\nnearest\nmilk\nkilometers\nplatforms\n##ys\nterritories\nRogers\nsheet\nRangers\ncontested\n##lation\nisolated\nassisted\nswallowed\nSmall\nContemporary\nTechnical\nEdwards\nexpress\nVolume\nendemic\n##ei\ntightly\nWhatever\nindigenous\nColombia\n##ulation\nhp\ncharacterized\n##ida\nNigeria\nProfessional\nduo\nSoccer\nslaves\
nFarm\nsmart\nAttorney\nAttendance\nCommon\nsalt\n##vin\ntribes\nnod\nsentenced\nbid\nsample\nDrive\nswitch\ninstant\n21st\nCuba\ndrunk\nAlaska\nproud\nawareness\nhitting\nsessions\nThai\nlocally\nelsewhere\nDragon\ngentle\ntouching\n##lee\nSprings\nUniversal\nLatino\nspin\n1871\nChart\nrecalled\nType\npointing\n##ii\nlowest\n##ser\ngrandmother\nAdelaide\nJacques\nspotted\nBuffalo\nrestoration\nSon\nJoan\nfarmers\nLily\n1879\nlucky\n##dal\nluck\neldest\n##rant\nMarket\ndrummer\ndeployed\nwarned\nprince\nsing\namazing\nsailed\n##oon\n1875\nPrimary\ntraveling\nMasters\nSara\ncattle\nTrail\ngang\nFurther\ndesert\nrelocated\n##tch\n##ord\nFlight\nillness\nMunich\nninth\nrepair\nSingles\n##lated\nTyler\ntossed\nboots\nWork\nsized\nearning\nshoved\nmagazines\nhoused\ndam\nresearchers\nFormer\nspun\npremiere\nspaces\norganised\nwealth\ncrimes\ndevoted\nstones\nUrban\nautomatic\nhop\naffect\noutstanding\ntanks\nmechanism\nMuslims\nMs\nshots\nargue\nJeremy\nconnections\nArmenian\nincreases\nrubbed\n1867\nretail\ngear\nPan\nbonus\njurisdiction\nweird\nconcerning\nwhisper\n##gal\nMicrosoft\ntenure\nhills\nwww\nGmina\nporch\nfiles\nreportedly\nventure\nStorm\n##ence\nNature\nkiller\npanic\nfate\nSecret\nWang\nscream\ndrivers\nbelongs\nChamber\nclan\nmonument\nmixing\nPeru\nbet\nRiley\nFriends\nIsaac\nsubmarine\n1877\n130\njudges\nharm\nranging\naffair\nprepare\npupils\nhouseholder\nPolicy\ndecorated\nNation\nslammed\nactivist\nimplemented\nRoom\nqualify\nPublishing\nestablishing\nBaptist\ntouring\nsubsidiary\n##nal\nlegend\n1872\nlaughter\nPC\nAthens\nsettlers\nties\ndual\ndear\nDraft\nstrategic\nIvan\nreveal\nclosest\ndominant\nAh\n##ult\nDenver\nbond\nboundaries\ndrafted\ntables\n##TV\neyed\nEdition\n##ena\n1868\nbelonging\n1874\nIndustrial\ncream\nRidge\nHindu\nscholarship\nMa\nopens\ninitiated\n##ith\nyelled\ncompound\nrandom\nThroughout\ngrades\nphysics\nsank\ngrows\nexclusively\nsettle\nSaints\nbrings\nAmsterdam\nMake\nHart\nwalks\nbattery\nviolin\n##born\nexplanation\n##
ware\n1873\n##har\nprovinces\nthrust\nexclusive\nsculpture\nshops\n##fire\nVI\nconstitution\nBarcelona\nmonster\nDevon\nJefferson\nSullivan\nbow\n##din\ndesperate\n##ć\nJulie\n##mon\n##ising\nterminus\nJesse\nabilities\ngolf\n##ple\n##via\n##away\nRaymond\nmeasured\njury\nfiring\nrevenue\nsuburb\nBulgarian\n1866\n##cha\ntimber\nThings\n##weight\nMorning\nspots\nAlberta\nData\nexplains\nKyle\nfriendship\nraw\ntube\ndemonstrated\naboard\nimmigrants\nreply\nbreathe\nManager\nease\n##ban\n##dia\nDiocese\n##vy\n##ía\npit\nongoing\n##lie\nGilbert\nCosta\n1940s\nReport\nvoters\ncloud\ntraditions\n##MS\ngallery\nJennifer\nswung\nBroadcasting\nDoes\ndiverse\nreveals\narriving\ninitiative\n##ani\nGive\nAllied\nPat\nOutstanding\nmonastery\nblind\nCurrently\n##war\nbloody\nstopping\nfocuses\nmanaging\nFlorence\nHarvey\ncreatures\n900\nbreast\ninternet\nArtillery\npurple\n##mate\nalliance\nexcited\nfee\nBrisbane\nlifetime\nPrivate\n##aw\n##nis\n##gue\n##ika\nphrase\nregulations\nreflected\nmanufactured\nconventional\npleased\nclient\n##ix\n##ncy\nPedro\nreduction\n##con\nwelcome\njail\ncomfort\nIranian\nNorfolk\nDakota\n##tein\nevolution\neverywhere\nInitially\nsensitive\nOlivia\nOscar\nimplementation\nsits\nstolen\ndemands\nslide\ngrandson\n##ich\nmerger\n##mic\nSpirit\n##°\nticket\nroot\ndifficulty\nNevada\n##als\nlined\nDylan\nOriginal\nCall\nbiological\nEU\ndramatic\n##hn\nOperations\ntreaty\ngap\n##list\nAm\nRomanized\nmoral\nButler\nperspective\nFurthermore\nManuel\nabsolutely\nunsuccessful\ndisaster\ndispute\npreparation\ntested\ndiscover\n##ach\nshield\nsqueezed\nbrushed\nbattalion\nArnold\n##ras\nsuperior\ntreat\nclinical\n##so\nApple\nSyria\nCincinnati\npackage\nflights\neditions\nLeader\nminority\nwonderful\nhang\nPop\nPhilippine\ntelephone\nbell\nhonorary\n##mar\nballs\nDemocrat\ndirty\nthereafter\ncollapsed\nInside\nslip\nwrestling\n##ín\nlistened\nregard\nbowl\nNone\nSport\ncompleting\ntrapped\n##view\ncopper\nWallace\nHonor\nblame\nPeninsula\n##ert\n##oy\nAnglo\nb
earing\nsimultaneously\nhonest\n##ias\nMix\nGot\nspeaker\nvoiced\nimpressed\nprices\nerror\n1869\n##feld\ntrials\nNine\nIndustry\nsubstitute\nMunicipal\ndeparted\nslept\n##ama\nJunction\nSocialist\nflower\ndropping\ncomment\nfantasy\n##ress\narrangements\ntravelled\nfurniture\nfist\nrelieved\n##tics\nLeonard\nlinear\nearn\nexpand\nSoul\nPlan\nLeeds\nSierra\naccessible\ninnocent\nWinner\nFighter\nRange\nwinds\nvertical\nPictures\n101\ncharter\ncooperation\nprisoner\ninterviews\nrecognised\nsung\nmanufacturer\nexposure\nsubmitted\nMars\nleaf\ngauge\nscreaming\nlikes\neligible\n##ac\ngathering\ncolumns\n##dra\nbelly\nUN\nmaps\nmessages\nspeakers\n##ants\ngarage\nunincorporated\nNumber\nWatson\nsixteen\nlots\nbeaten\nCould\nMunicipality\n##ano\nHorse\ntalks\nDrake\nscores\nVenice\ngenetic\n##mal\n##ère\nCold\nJose\nnurse\ntraditionally\n##bus\nTerritory\nKey\nNancy\n##win\nthumb\nSão\nindex\ndependent\ncarries\ncontrols\nComics\ncoalition\nphysician\nreferring\nRuth\nBased\nrestricted\ninherited\ninternationally\nstretch\nTHE\nplates\nmargin\nHolland\nknock\nsignificance\nvaluable\nKenya\ncarved\nemotion\nconservation\nmunicipalities\noverseas\nresumed\nFinance\ngraduation\nblinked\ntemperatures\nconstantly\nproductions\nscientist\nghost\ncuts\npermitted\n##ches\nfirmly\n##bert\npatrol\n##yo\nCroatian\nattacking\n1850\nportrait\npromoting\nsink\nconversion\n##kov\nlocomotives\nGuide\n##val\nnephew\nrelevant\nMarc\ndrum\noriginated\nChair\nvisits\ndragged\nPrice\nfavour\ncorridor\nproperly\nrespective\nCaroline\nreporting\ninaugural\n1848\nindustries\n##ching\nedges\nChristianity\nMaurice\nTrent\nEconomics\ncarrier\nReed\n##gon\ntribute\nPradesh\n##ale\nextend\nattitude\nYale\n##lu\nsettlements\nglasses\ntaxes\ntargets\n##ids\nquarters\n##ological\nconnect\nhence\nmetre\ncollapse\nunderneath\nbanned\nFuture\nclients\nalternate\nexplosion\nkinds\nCommons\nhungry\ndragon\nChapel\nBuddhist\nlover\ndepression\npulls\n##ges\n##uk\norigins\ncomputers\ncrosses\nkissing\nassume\
nemphasis\nlighting\n##ites\npersonally\ncrashed\nbeam\ntouchdown\nlane\ncomparison\n##mont\nHitler\n##las\nexecution\n##ene\nacre\nsum\nPearl\nray\n##point\nessentially\nworker\nconvicted\ntear\nClay\nrecovery\nLiterature\nUnfortunately\n##row\npartial\nPetersburg\nBulgaria\ncoaching\nevolved\nreception\nenters\nnarrowed\nelevator\ntherapy\ndefended\npairs\n##lam\nbreaks\nBennett\nUncle\ncylinder\n##ison\npassion\nbases\nActor\ncancelled\nbattles\nextensively\noxygen\nAncient\nspecialized\nnegotiations\n##rat\nacquisition\nconvince\ninterpretation\n##00\nphotos\naspect\ncolleges\nArtist\nkeeps\n##wing\nCroatia\n##ona\nHughes\nOtto\ncomments\n##du\nPh\nSweet\nadventure\ndescribing\nStudent\nShakespeare\nscattered\nobjective\nAviation\nPhillips\nFourth\nathletes\n##hal\n##tered\nGuitar\nintensity\nnée\ndining\ncurve\nObama\ntopics\nlegislative\nMill\nCruz\n##ars\nMembers\nrecipient\nDerby\ninspiration\ncorresponding\nfed\nYouTube\ncoins\npressing\nintent\nKaren\ncinema\nDelta\ndestination\nshorter\nChristians\nimagined\ncanal\nNewcastle\nShah\nAdrian\nsuper\nMales\n160\nliberal\nlord\nbat\nsupplied\nClaude\nmeal\nworship\n##atic\nHan\nwire\n°F\n##tha\npunishment\nthirteen\nfighters\n##ibility\n1859\nBall\ngardens\n##ari\nOttawa\npole\nindicating\nTwenty\nHigher\nBass\nIvy\nfarming\n##urs\ncertified\nSaudi\nplenty\n##ces\nrestaurants\nRepresentative\nMiles\npayment\n##inger\n##rit\nConfederate\nfestivals\nreferences\n##ić\nMario\nPhD\nplayoffs\nwitness\nrice\nmask\nsaving\nopponents\nenforcement\nautomatically\nrelegated\n##oe\nradar\nwhenever\nFinancial\nimperial\nuncredited\ninfluences\nAbraham\nskull\nGuardian\nHaven\nBengal\nimpressive\ninput\nmixture\nWarsaw\naltitude\ndistinction\n1857\ncollective\nAnnie\n##ean\n##bal\ndirections\nFlying\n##nic\nfaded\n##ella\ncontributing\n##ó\nemployee\n##lum\n##yl\nruler\noriented\nconductor\nfocusing\n##die\nGiants\nMills\nmines\nDeep\ncurled\nJessica\nguitars\nLouise\nprocedure\nMachine\nfailing\nattendance\nNepal\nBrad\nLi
am\ntourist\nexhibited\nSophie\ndepicted\nShaw\nChuck\n##can\nexpecting\nchallenges\n##nda\nequally\nresignation\n##logical\nTigers\nloop\npitched\noutdoor\nreviewed\nhopes\nTrue\ntemporarily\nBorough\ntorn\njerked\ncollect\nBerkeley\nIndependence\ncotton\nretreat\ncampaigns\nparticipating\nIntelligence\nHeaven\n##ked\nsituations\nborough\nDemocrats\nHarbor\n##len\nLiga\nserial\ncircles\nfourteen\n##lot\nseized\nfilling\ndepartments\nfinance\nabsolute\nRoland\nNate\nfloors\nraced\nstruggling\ndeliver\nprotests\n##tel\nExchange\nefficient\nexperiments\n##dar\nfaint\n3D\nbinding\nLions\nlightly\nskill\nproteins\ndifficulties\n##cal\nmonthly\ncamps\nflood\nloves\nAmanda\nCommerce\n##oid\n##lies\nelementary\n##tre\norganic\n##stein\n##ph\nreceives\nTech\nenormous\ndistinctive\nJoint\nexperiment\nCircuit\ncitizen\n##hy\nshelter\nideal\npractically\nformula\naddressed\nFoster\nProductions\n##ax\nvariable\npunk\nVoice\nfastest\nconcentrated\n##oma\n##yer\nstored\nsurrender\nvary\nSergeant\nWells\nward\nWait\n##ven\nplayoff\nreducing\ncavalry\n##dle\nVenezuela\ntissue\namounts\nsweat\n##we\nNon\n##nik\nbeetle\n##bu\n##tu\nJared\nHunt\n##₂\nfat\nSultan\nLiving\nCircle\nSecondary\nSuddenly\nreverse\n##min\nTravel\n##bin\nLebanon\n##mas\nvirus\nWind\ndissolved\nenrolled\nholiday\nKeep\nhelicopter\nClarke\nconstitutional\ntechnologies\ndoubles\ninstructions\n##ace\nAzerbaijan\n##ill\noccasional\nfrozen\ntrick\nwiped\nwritings\nShanghai\npreparing\nchallenged\nmainstream\nsummit\n180\n##arian\n##rating\ndesignation\n##ada\nrevenge\nfilming\ntightened\nMiguel\nMontana\nreflect\ncelebration\nbitch\nflashed\nsignals\nrounded\npeoples\n##tation\nrenowned\nGoogle\ncharacteristic\nCampaign\nsliding\n##rman\nusage\nRecord\nUsing\nwoke\nsolutions\nholes\ntheories\nlogo\nProtestant\nrelaxed\nbrow\nnickname\nReading\nmarble\n##tro\nsymptoms\nOverall\ncapita\n##ila\noutbreak\nrevolution\ndeemed\nPrincipal\nHannah\napproaches\ninducted\nWellington\nvulnerable\nEnvironmental\nDrama\nincumben
t\nDame\n1854\ntravels\nsamples\naccurate\nphysically\nSony\nNashville\n##sville\n##lic\n##og\nProducer\nLucky\ntough\nStanford\nresort\nrepeatedly\neyebrows\nFar\nchoir\ncommenced\n##ep\n##ridge\nrage\nswing\nsequel\nheir\nbuses\nad\nGrove\n##late\n##rick\nupdated\n##SA\nDelaware\n##fa\nAthletics\nwarmth\nOff\nexcitement\nverse\nProtection\nVilla\ncorruption\nintellectual\nJenny\n##lyn\nmystery\nprayer\nhealthy\n##ologist\nBear\nlab\nErnest\nRemix\nregister\nbasement\nMontgomery\nconsistent\ntier\n1855\nPreston\nBrooks\n##maker\nvocalist\nlaboratory\ndelayed\nwheels\nrope\nbachelor\npitcher\nBlock\nNevertheless\nsuspect\nefficiency\nNebraska\nsiege\nFBI\nplanted\n##AC\nNewton\nbreeding\n##ain\neighteen\nArgentine\nencounter\nservant\n1858\nelder\nShadow\nEpisode\nfabric\ndoctors\nsurvival\nremoval\nchemistry\nvolunteers\nKane\nvariant\narrives\nEagle\nLeft\n##fe\nJo\ndivorce\n##ret\nyesterday\nBryan\nhandling\ndiseases\ncustomer\nSheriff\nTiger\nHarper\n##oi\nresting\nLinda\nSheffield\ngasped\nsexy\neconomics\nalien\ntale\nfootage\nLiberty\nyeah\nfundamental\nGround\nflames\nActress\nphotographer\nMaggie\nAdditional\njoke\ncustom\nSurvey\nAbu\nsilk\nconsumption\nEllis\nbread\n##uous\nengagement\nputs\nDog\n##hr\npoured\nguilt\nCDP\nboxes\nhardware\nclenched\n##cio\nstem\narena\nextending\n##com\nexamination\nSteel\nencountered\nrevised\n140\npicking\nCar\nhasn\nMinor\npride\nRoosevelt\nboards\n##mia\nblocked\ncurious\ndrag\nnarrative\nbrigade\nPrefecture\nmysterious\nnamely\nconnects\nDevil\nhistorians\nCHAPTER\nquit\ninstallation\nGolf\nempire\nelevated\n##eo\nreleasing\nBond\n##uri\nharsh\nban\n##BA\ncontracts\ncloth\npresents\nstake\nchorus\n##eau\nswear\n##mp\nallies\ngenerations\nMotor\nmeter\npen\nwarrior\nveteran\n##EC\ncomprehensive\nmissile\ninteraction\ninstruction\nRenaissance\nrested\nDale\nfix\nfluid\nles\ninvestigate\nloaded\nwidow\nexhibit\nartificial\nselect\nrushing\ntasks\nsignature\nnowhere\nEngineer\nfeared\nPrague\nbother\nextinct\ngates\nBird\
nclimbing\nheels\nstriking\nartwork\nhunt\nawake\n##hin\nFormula\nthereby\ncommitment\nimprisoned\nBeyond\n##MA\ntransformed\nAgriculture\nLow\nMovie\nradical\ncomplicated\nYellow\nAuckland\nmansion\ntenth\nTrevor\npredecessor\n##eer\ndisbanded\nsucked\ncircular\nwitch\ngaining\nlean\nBehind\nillustrated\nrang\ncelebrate\nbike\nconsist\nframework\n##cent\nShane\nowns\n350\ncomprises\ncollaborated\ncolleagues\n##cast\nengage\nfewer\n##ave\n1856\nobservation\ndiplomatic\nlegislature\nimprovements\nInterstate\ncraft\nMTV\nmartial\nadministered\njet\napproaching\npermanently\nattraction\nmanuscript\nnumbered\nHappy\nAndrea\nshallow\nGothic\nAnti\n##bad\nimprovement\ntrace\npreserve\nregardless\nrode\ndies\nachievement\nmaintaining\nHamburg\nspine\n##air\nflowing\nencourage\nwidened\nposts\n##bound\n125\nSoutheast\nSantiago\n##bles\nimpression\nreceiver\nSingle\nclosure\n##unt\ncommunist\nhonors\nNorthwest\n105\n##ulated\ncared\nun\nhug\nmagnetic\nseeds\ntopic\nperceived\nprey\nprevented\nMarvel\nEight\nMichel\nTransportation\nrings\nGate\n##gne\nByzantine\naccommodate\nfloating\n##dor\nequation\nministry\n##ito\n##gled\nRules\nearthquake\nrevealing\nBrother\nCeltic\nblew\nchairs\nPanama\nLeon\nattractive\ndescendants\nCare\nAmbassador\ntours\nbreathed\nthreatening\n##cho\nsmiles\nLt\nBeginning\n##iness\nfake\nassists\nfame\nstrings\nMobile\nLiu\nparks\nhttp\n1852\nbrush\nAunt\nbullet\nconsciousness\n##sta\n##ther\nconsequences\ngather\ndug\n1851\nbridges\nDoug\n##sion\nArtists\nignore\nCarol\nbrilliant\nradiation\ntemples\nbasin\nclouds\n##cted\nStevens\nspite\nsoap\nconsumer\nDamn\nSnow\nrecruited\n##craft\nAdvanced\ntournaments\nQuinn\nundergraduate\nquestioned\nPalmer\nAnnual\nOthers\nfeeding\nSpider\nprinting\n##orn\ncameras\nfunctional\nChester\nreaders\nAlpha\nuniversal\nFaith\nBrandon\nFrançois\nauthored\nRing\nel\naims\nathletic\npossessed\nVermont\nprogrammes\n##uck\nbore\nFisher\nstatements\nshed\nsaxophone\nneighboring\npronounced\nbarrel\nbags\n##dge\norgani
sations\npilots\ncasualties\nKenneth\n##brook\nsilently\nMalcolm\nspan\nEssex\nanchor\n##hl\nvirtual\nlessons\nHenri\nTrump\nPage\npile\nlocomotive\nwounds\nuncomfortable\nsustained\nDiana\nEagles\n##pi\n2000s\ndocumented\n##bel\nCassie\ndelay\nkisses\n##ines\nvariation\n##ag\ngrowled\n##mark\n##ways\nLeslie\nstudios\nFriedrich\naunt\nactively\narmor\neaten\nhistorically\nBetter\npurse\nhoney\nratings\n##ée\nnaturally\n1840\npeer\nKenny\nCardinal\ndatabase\nLooking\nrunners\nhandsome\nDouble\nPA\n##boat\n##sted\nprotecting\n##jan\nDiamond\nconcepts\ninterface\n##aki\nWatch\nArticle\nColumbus\ndialogue\npause\n##rio\nextends\nblanket\npulse\n1853\naffiliate\nladies\nRonald\ncounted\nkills\ndemons\n##zation\nAirlines\nMarco\nCat\ncompanion\nmere\nYugoslavia\nForum\nAllan\npioneer\nCompetition\nMethodist\npatent\nnobody\nStockholm\n##ien\nregulation\n##ois\naccomplished\n##itive\nwashed\nsake\nVladimir\ncrops\nprestigious\nhumor\nSally\nlabour\ntributary\ntrap\naltered\nexamined\nMumbai\nbombing\nAsh\nnoble\nsuspension\nruins\n##bank\nspare\ndisplays\nguided\ndimensional\nIraqi\n##hon\nsciences\nFranz\nrelating\nfence\nfollowers\nPalestine\ninvented\nproceeded\nBatman\nBradley\n##yard\n##ova\ncrystal\nKerala\n##ima\nshipping\nhandled\nWant\nabolished\nDrew\n##tter\nPowell\nHalf\n##table\n##cker\nexhibitions\nWere\nassignment\nassured\n##rine\nIndonesian\nGrammy\nacknowledged\nKylie\ncoaches\nstructural\nclearing\nstationed\nSay\nTotal\nRail\nbesides\nglow\nthreats\nafford\nTree\nMusical\n##pp\nelite\ncentered\nexplore\nEngineers\nStakes\nHello\ntourism\nseverely\nassessment\n##tly\ncrack\npoliticians\n##rrow\nsheets\nvolunteer\n##borough\n##hold\nannouncement\nrecover\ncontribute\nlungs\n##ille\nmainland\npresentation\nJohann\nWriting\n1849\n##bird\nStudy\nBoulevard\ncoached\nfail\nairline\nCongo\nPlus\nSyrian\nintroduce\nridge\nCasey\nmanages\n##fi\nsearched\nSupport\nsuccession\nprogressive\ncoup\ncultures\n##lessly\nsensation\nCork\nElena\nSofia\nPhilosophy\nmini\nt
runk\nacademy\nMass\nLiz\npracticed\nReid\n##ule\nsatisfied\nexperts\nWilhelm\nWoods\ninvitation\nAngels\ncalendar\njoy\nSr\nDam\npacked\n##uan\nbastard\nWorkers\nbroadcasts\nlogic\ncooking\nbackward\n##ack\nChen\ncreates\nenzyme\n##xi\nDavies\naviation\nVII\nConservation\nfucking\nKnights\n##kan\nrequiring\nhectares\nwars\nate\n##box\nMind\ndesired\noak\nabsorbed\nReally\nVietnamese\nPaulo\nathlete\n##car\n##eth\nTalk\nWu\n##cks\nsurvivors\nYang\nJoel\nAlmost\nHolmes\nArmed\nJoshua\npriests\ndiscontinued\n##sey\nblond\nRolling\nsuggesting\nCA\nclay\nexterior\nScientific\n##sive\nGiovanni\nHi\nfarther\ncontents\nWinners\nanimation\nneutral\nmall\nNotes\nlayers\nprofessionals\nArmstrong\nAgainst\nPiano\ninvolve\nmonitor\nangel\nparked\nbears\nseated\nfeat\nbeliefs\n##kers\nVersion\nsuffer\n##ceae\nguidance\n##eur\nhonored\nraid\nalarm\nGlen\nEllen\nJamaica\ntrio\nenabled\n##ils\nprocedures\n##hus\nmoderate\nupstairs\n##ses\ntorture\nGeorgian\nrebellion\nFernando\nNice\n##are\nAires\nCampus\nbeast\n##hing\n1847\n##FA\nIsle\n##logist\nPrinceton\ncathedral\nOakland\nSolomon\n##tto\nMilwaukee\nupcoming\nmidfielder\nNeither\nsacred\nEyes\nappreciate\nBrunswick\nsecrets\nRice\nSomerset\nChancellor\nCurtis\n##gel\nRich\nseparation\ngrid\n##los\n##bon\nurge\n##ees\n##ree\nfreight\ntowers\npsychology\nrequirement\ndollar\n##fall\n##sman\nexile\ntomb\nSalt\nStefan\nBuenos\nRevival\nPorter\ntender\ndiesel\nchocolate\nEugene\nLegion\nLaboratory\nsheep\narched\nhospitals\norbit\nFull\n##hall\ndrinks\nripped\n##RS\ntense\nHank\nleagues\n##nberg\nPlayStation\nfool\nPunjab\nrelatives\nComedy\nsur\n1846\nTonight\nSox\n##if\nRabbi\norg\nspeaks\ninstitute\ndefender\npainful\nwishes\nWeekly\nliteracy\nportions\nsnake\nitem\ndeals\n##tum\nautumn\nsharply\nreforms\nthighs\nprototype\n##ition\nargues\ndisorder\nPhysics\nterror\nprovisions\nrefugees\npredominantly\nindependently\nmarch\n##graphy\nArabia\nAndrews\nBus\nMoney\ndrops\n##zar\npistol\nmatrix\nrevolutionary\n##ust\nStarting\n##pt
ic\nOak\nMonica\n##ides\nservants\n##hed\narchaeological\ndivorced\nrocket\nenjoying\nfires\n##nel\nassembled\nqualification\nretiring\n##fied\nDistinguished\nhandful\ninfection\nDurham\n##itz\nfortune\nrenewed\nChelsea\n##sley\ncurved\ngesture\nretain\nexhausted\n##ifying\nPerth\njumping\nPalestinian\nSimpson\ncolonies\nsteal\n##chy\ncorners\nFinn\narguing\nMartha\n##var\nBetty\nemerging\nHeights\nHindi\nManila\npianist\nfounders\nregret\nNapoleon\nelbow\noverhead\nbold\npraise\nhumanity\n##ori\nRevolutionary\n##ere\nfur\n##ole\nAshley\nOfficial\n##rm\nlovely\nArchitecture\n##sch\nBaronet\nvirtually\n##OS\ndescended\nimmigration\n##das\n##kes\nHolly\nWednesday\nmaintains\ntheatrical\nEvan\nGardens\nciting\n##gia\nsegments\nBailey\nGhost\n##city\ngoverning\ngraphics\n##ined\nprivately\npotentially\ntransformation\nCrystal\nCabinet\nsacrifice\nhesitated\nmud\nApollo\nDesert\nbin\nvictories\nEditor\nRailways\nWeb\nCase\ntourists\nBrussels\nFranco\ncompiled\ntopped\nGene\nengineers\ncommentary\negg\nescort\nnerve\narch\nnecessarily\nfrustration\nMichelle\ndemocracy\ngenes\nFacebook\nhalfway\n##ient\n102\nflipped\nWon\n##mit\nNASA\nLynn\nProvincial\nambassador\nInspector\nglared\nChange\nMcDonald\ndevelopments\ntucked\nnoting\nGibson\ncirculation\ndubbed\narmies\nresource\nHeadquarters\n##iest\nMia\nAlbanian\nOil\nAlbums\nexcuse\nintervention\nGrande\nHugo\nintegration\ncivilians\ndepends\nreserves\nDee\ncompositions\nidentification\nrestrictions\nquarterback\nMiranda\nUniverse\nfavourite\nranges\nhint\nloyal\nOp\nentity\nManual\nquoted\ndealt\nspecialist\nZhang\ndownload\nWestminster\nRebecca\nstreams\nAnglican\nvariations\nMine\ndetective\nFilms\nreserved\n##oke\n##key\nsailing\n##gger\nexpanding\nrecall\ndiscovers\nparticles\nbehaviour\nGavin\nblank\npermit\nJava\nFraser\nPass\n##non\n##TA\npanels\nstatistics\nnotion\ncourage\ndare\nvenues\n##roy\nBox\nNewport\ntravelling\nThursday\nwarriors\nGlenn\ncriteria\n360\nmutual\nrestore\nvaried\nbitter\nKatherine\n##lant\nr
itual\nbits\n##à\nHenderson\ntrips\nRichardson\nDetective\ncurse\npsychological\nIl\nmidnight\nstreak\nfacts\nDawn\nIndies\nEdmund\nroster\nGen\n##nation\n1830\ncongregation\nshaft\n##ically\n##mination\nIndianapolis\nSussex\nloving\n##bit\nsounding\nhorrible\nContinental\nGriffin\nadvised\nmagical\nmillions\n##date\n1845\nSafety\nlifting\ndetermination\nvalid\ndialect\nPenn\nKnow\ntriple\navoided\ndancer\njudgment\nsixty\nfarmer\nlakes\nblast\naggressive\nAbby\ntag\nchains\ninscription\n##nn\nconducting\nScout\nbuying\n##wich\nspreading\n##OC\narray\nhurried\nEnvironment\nimproving\nprompted\nfierce\nTaking\nAway\ntune\npissed\nBull\ncatching\n##ying\neyebrow\nmetropolitan\nterrain\n##rel\nLodge\nmanufacturers\ncreator\n##etic\nhappiness\nports\n##ners\nRelations\nfortress\ntargeted\n##ST\nallegedly\nblues\n##osa\nBosnia\n##dom\nburial\nsimilarly\nstranger\npursued\nsymbols\nrebels\nreflection\nroutine\ntraced\nindoor\neventual\n##ska\n##ão\n##una\nMD\n##phone\noh\ngrants\nReynolds\nrid\noperators\n##nus\nJoey\nvital\nsiblings\nkeyboard\nbr\nremoving\nsocieties\ndrives\nsolely\nprincess\nlighter\nVarious\nCavalry\nbelieving\nSC\nunderwent\nrelay\nsmelled\nsyndrome\nwelfare\nauthorized\nseemingly\nHard\nchicken\n##rina\nAges\nBo\ndemocratic\nbarn\nEye\nshorts\n##coming\n##hand\ndisappointed\nunexpected\ncentres\nExhibition\nStories\nSite\nbanking\naccidentally\nAgent\nconjunction\nAndré\nChloe\nresist\nwidth\nQueens\nprovision\n##art\nMelissa\nHonorary\nDel\nprefer\nabruptly\nduration\n##vis\nGlass\nenlisted\n##ado\ndiscipline\nSisters\ncarriage\n##ctor\n##sburg\nLancashire\nlog\nfuck\n##iz\ncloset\ncollecting\nholy\nrape\ntrusted\ncleaning\ninhabited\nRocky\n104\neditorial\n##yu\n##ju\nsucceed\nstrict\nCuban\n##iya\nBronze\noutcome\n##ifies\n##set\ncorps\nHero\nbarrier\nKumar\ngroaned\nNina\nBurton\nenable\nstability\nMilton\nknots\n##ination\nslavery\n##borg\ncurriculum\ntrailer\nwarfare\nDante\nEdgar\nrevival\nCopenhagen\ndefine\nadvocate\nGarrett\nLuther\noverco
me\npipe\n750\nconstruct\nScotia\nkings\nflooding\n##hard\nFerdinand\nFelix\nforgot\nFish\nKurt\nelaborate\n##BC\ngraphic\ngripped\ncolonel\nSophia\nAdvisory\nSelf\n##uff\n##lio\nmonitoring\nseal\nsenses\nrises\npeaceful\njournals\n1837\nchecking\nlegendary\nGhana\n##power\nammunition\nRosa\nRichards\nnineteenth\nferry\naggregate\nTroy\ninter\n##wall\nTriple\nsteep\ntent\nCyprus\n1844\n##woman\ncommanding\nfarms\ndoi\nnavy\nspecified\nna\ncricketer\ntransported\nThink\ncomprising\ngrateful\nsolve\n##core\nbeings\nclerk\ngrain\nvector\ndiscrimination\n##TC\nKatie\nreasonable\ndrawings\nveins\nconsideration\nMonroe\nrepeat\nbreed\ndried\nwitnessed\nordained\nCurrent\nspirits\nremarkable\nconsultant\nurged\nRemember\nanime\nsingers\nphenomenon\nRhode\nCarlo\ndemanding\nfindings\nmanual\nvarying\nFellowship\ngenerate\nsafely\nheated\nwithdrawn\n##ao\nheadquartered\n##zon\n##lav\n##ency\nCol\nMemphis\nimposed\nrivals\nPlanet\nhealing\n##hs\nensemble\nWarriors\n##bone\ncult\nFrankfurt\n##HL\ndiversity\nGerald\nintermediate\n##izes\nreactions\nSister\n##ously\n##lica\nquantum\nawkward\nmentions\npursuit\n##ography\nvaries\nprofession\nmolecular\nconsequence\nlectures\ncracked\n103\nslowed\n##tsu\ncheese\nupgraded\nsuite\nsubstance\nKingston\n1800\nIdaho\nTheory\n##een\nain\nCarson\nMolly\n##OR\nconfiguration\nWhitney\nreads\naudiences\n##tie\nGeneva\nOutside\n##nen\n##had\ntransit\nvolleyball\nRandy\nChad\nrubber\nmotorcycle\nrespected\neager\nLevel\ncoin\n##lets\nneighbouring\n##wski\nconfident\n##cious\npoll\nuncertain\npunch\nthesis\nTucker\nIATA\nAlec\n##ographic\n##law\n1841\ndesperately\n1812\nLithuania\naccent\nCox\nlightning\nskirt\n##load\nBurns\nDynasty\n##ug\nchapters\nWorking\ndense\nMorocco\n##kins\ncasting\nSet\nactivated\noral\nBrien\nhorn\nHIV\ndawn\nstumbled\naltar\ntore\nconsiderably\nNicole\ninterchange\nregistration\nbiography\nHull\nStan\nbulk\nconsent\nPierce\n##ER\nFifth\nmarched\nterrorist\n##piece\n##itt\nPresidential\nHeather\nstaged\nPlant\nreleg
ation\nsporting\njoins\n##ced\nPakistani\ndynamic\nHeat\n##lf\nourselves\nExcept\nElliott\nnationally\ngoddess\ninvestors\nBurke\nJackie\n##ā\n##RA\nTristan\nAssociate\nTuesday\nscope\nNear\nbunch\n##abad\n##ben\nsunlight\n##aire\nmanga\nWillie\ntrucks\nboarding\nLion\nlawsuit\nLearning\nDer\npounding\nawful\n##mine\nIT\nLegend\nromance\nSerie\nAC\ngut\nprecious\nRobertson\nhometown\nrealm\nGuards\nTag\nbatting\n##vre\nhalt\nconscious\n1838\nacquire\ncollar\n##gg\n##ops\nHerald\nnationwide\ncitizenship\nAircraft\ndecrease\nem\nFiction\nFemale\ncorporation\nLocated\n##ip\nfights\nunconscious\nTampa\nPoetry\nlobby\nMalta\n##sar\n##bie\nlayout\nTate\nreader\nstained\n##bre\n##rst\n##ulate\nloudly\nEva\nCohen\nexploded\nMerit\nMaya\n##rable\nRovers\n##IC\nMorrison\nShould\nvinyl\n##mie\nonwards\n##gie\nvicinity\nWildlife\nprobability\nMar\nBarnes\n##ook\nspinning\nMoses\n##vie\nSurrey\nPlanning\nconferences\nprotective\nPlaza\ndeny\nCanterbury\nmanor\nEstate\ntilted\ncomics\nIBM\ndestroying\nserver\nDorothy\n##horn\nOslo\nlesser\nheaven\nMarshal\nscales\nstrikes\n##ath\nfirms\nattract\n##BS\ncontrolling\nBradford\nsoutheastern\nAmazon\nTravis\nJanet\ngoverned\n1842\nTrain\nHolden\nbleeding\ngifts\nrent\n1839\npalms\n##ū\njudicial\nHo\nFinals\nconflicts\nunlikely\ndraws\n##cies\ncompensation\nadds\nelderly\nAnton\nlasting\nNintendo\ncodes\nministers\npot\nassociations\ncapabilities\n##cht\nlibraries\n##sie\nchances\nperformers\nrunway\n##af\n##nder\nMid\nVocals\n##uch\n##eon\ninterpreted\npriority\nUganda\nruined\nMathematics\ncook\nAFL\nLutheran\nAIDS\nCapitol\nchase\naxis\nMoreover\nMaría\nSaxon\nstoryline\n##ffed\nTears\nKid\ncent\ncolours\nSex\n##long\npm\nblonde\nEdwin\nCE\ndiocese\n##ents\n##boy\nInn\n##ller\nSaskatchewan\n##kh\nstepping\nWindsor\n##oka\n##eri\nXavier\nResources\n1843\n##top\n##rad\n##lls\nTestament\npoorly\n1836\ndrifted\nslope\nCIA\nremix\nLords\nmature\nhosting\ndiamond\nbeds\n##ncies\nluxury\ntrigger\n##lier\npreliminary\nhybrid\njournalists\nE
nterprise\nproven\nexpelled\ninsects\nBeautiful\nlifestyle\nvanished\n##ake\n##ander\nmatching\nsurfaces\nDominican\nKids\nreferendum\nOrlando\nTruth\nSandy\nprivacy\nCalgary\nSpeaker\nsts\nNobody\nshifting\n##gers\nRoll\nArmenia\nHand\n##ES\n106\n##ont\nGuild\nlarvae\nStock\nflame\ngravity\nenhanced\nMarion\nsurely\n##tering\nTales\nalgorithm\nEmmy\ndarker\nVIII\n##lash\nhamlet\ndeliberately\noccurring\nchoices\nGage\nfees\nsettling\nridiculous\n##ela\nSons\ncop\ncustody\n##ID\nproclaimed\nCardinals\n##pm\nMetal\nAna\n1835\nclue\nCardiff\nriders\nobservations\nMA\nsometime\n##och\nperformer\nintact\nPoints\nallegations\nrotation\nTennis\ntenor\nDirectors\n##ats\nTransit\nthigh\nComplex\n##works\ntwentieth\nFactory\ndoctrine\nDaddy\n##ished\npretend\nWinston\ncigarette\n##IA\nspecimens\nhydrogen\nsmoking\nmathematical\narguments\nopenly\ndeveloper\n##iro\nfists\nsomebody\n##san\nStanding\nCaleb\nintelligent\nStay\nInterior\nechoed\nValentine\nvarieties\nBrady\ncluster\nEver\nvoyage\n##of\ndeposits\nultimate\nHayes\nhorizontal\nproximity\n##ás\nestates\nexploration\nNATO\nClassical\n##most\nbills\ncondemned\n1832\nhunger\n##ato\nplanes\ndeserve\noffense\nsequences\nrendered\nacceptance\n##ony\nmanufacture\nPlymouth\ninnovative\npredicted\n##RC\nFantasy\n##une\nsupporter\nabsent\nPicture\nbassist\nrescued\n##MC\nAhmed\nMonte\n##sts\n##rius\ninsane\nnovelist\n##és\nagrees\nAntarctic\nLancaster\nHopkins\ncalculated\nstartled\n##star\ntribal\nAmendment\n##hoe\ninvisible\npatron\ndeer\nWalk\ntracking\nLyon\ntickets\n##ED\nphilosopher\ncompounds\nchuckled\n##wi\npound\nloyalty\nAcademic\npetition\nrefuses\nmarking\nMercury\nnortheastern\ndimensions\nscandal\nCanyon\npatch\npublish\n##oning\nPeak\nminds\n##boro\nPresbyterian\nHardy\ntheoretical\nmagnitude\nbombs\ncage\n##ders\n##kai\nmeasuring\nexplaining\navoiding\ntouchdowns\nCard\ntheology\n##ured\nPopular\nexport\nsuspicious\nProbably\nphotograph\nLou\nParks\nArms\ncompact\nApparently\nexcess\nBanks\nlied\nstunned\nterr
itorial\nFilipino\nspectrum\nlearns\nwash\nimprisonment\nugly\n##rose\nAlbany\nErik\nsends\n##hara\n##rid\nconsumed\n##gling\nBelgrade\nDa\nopposing\nMagnus\nfootsteps\nglowing\ndelicate\nAlexandria\nLudwig\ngorgeous\nBros\nIndex\n##PA\ncustoms\npreservation\nbonds\n##mond\nenvironments\n##nto\ninstructed\nparted\nadoption\nlocality\nworkshops\ngoalkeeper\n##rik\n##uma\nBrighton\nSlovenia\n##ulating\n##tical\ntowel\nhugged\nstripped\nBears\nupright\nWagner\n##aux\nsecretly\nAdventures\nnest\nCourse\nLauren\nBoeing\nAbdul\nLakes\n450\n##cu\nUSSR\ncaps\nChan\n##nna\nconceived\nActually\nBelfast\nLithuanian\nconcentrate\npossess\nmilitia\npine\nprotagonist\nHelena\n##PS\n##band\nBelle\nClara\nReform\ncurrency\npregnancy\n1500\n##rim\nIsabella\nhull\nName\ntrend\njournalism\ndiet\n##mel\nRecording\nacclaimed\nTang\nJace\nsteering\nvacant\nsuggestion\ncostume\nlaser\n##š\n##ink\n##pan\n##vić\nintegral\nachievements\nwise\nclassroom\nunions\nsouthwestern\n##uer\nGarcia\ntoss\nTara\nLarge\n##tate\nevident\nresponsibilities\npopulated\nsatisfaction\n##bia\ncasual\nEcuador\n##ght\narose\n##ović\nCornwall\nembrace\nrefuse\nHeavyweight\nXI\nEden\nactivists\n##uation\nbiology\n##shan\nfraud\nFuck\nmatched\nlegacy\nRivers\nmissionary\nextraordinary\nDidn\nholder\nwickets\ncrucial\nWriters\nHurricane\nIceland\ngross\ntrumpet\naccordance\nhurry\nflooded\ndoctorate\nAlbania\n##yi\nunited\ndeceased\njealous\ngrief\nflute\nportraits\n##а\npleasant\nFounded\nFace\ncrowned\nRaja\nadvisor\nSalem\n##ec\nAchievement\nadmission\nfreely\nminimal\nSudan\ndevelopers\nestimate\ndisabled\n##lane\ndownstairs\nBruno\n##pus\npinyin\n##ude\nlecture\ndeadly\nunderlying\noptical\nwitnesses\nCombat\nJulius\ntapped\nvariants\n##like\nColonial\nCritics\nSimilarly\nmouse\nvoltage\nsculptor\nConcert\nsalary\nFrances\n##ground\nhook\npremises\nSoftware\ninstructor\nnominee\n##ited\nfog\nslopes\n##zu\nvegetation\nsail\n##rch\nBody\nApart\natop\nView\nutility\nribs\ncab\nmigration\n##wyn\nbounded\n2019\npill
ow\ntrails\n##ub\nHalifax\nshade\nRush\n##lah\n##dian\nNotre\ninterviewed\nAlexandra\nSpringfield\nIndeed\nrubbing\ndozens\namusement\nlegally\n##lers\nJill\nCinema\nignoring\nChoice\n##ures\npockets\n##nell\nlaying\nBlair\ntackles\nseparately\n##teen\nCriminal\nperforms\ntheorem\nCommunication\nsuburbs\n##iel\ncompetitors\nrows\n##hai\nManitoba\nEleanor\ninteractions\nnominations\nassassination\n##dis\nEdmonton\ndiving\n##dine\nessay\n##tas\nAFC\nEdge\ndirecting\nimagination\nsunk\nimplement\nTheodore\ntrembling\nsealed\n##rock\nNobel\n##ancy\n##dorf\n##chen\ngenuine\napartments\nNicolas\nAA\nBach\nGlobe\nStore\n220\n##10\nRochester\n##ño\nalert\n107\nBeck\n##nin\nNaples\nBasin\nCrawford\nfears\nTracy\n##hen\ndisk\n##pped\nseventeen\nLead\nbackup\nreconstruction\n##lines\nterrified\nsleeve\nnicknamed\npopped\n##making\n##ern\nHoliday\nGospel\nibn\n##ime\nconvert\ndivine\nresolved\n##quet\nski\nrealizing\n##RT\nLegislature\nreservoir\nRain\nsinking\nrainfall\nelimination\nchallenging\ntobacco\n##outs\nGiven\nsmallest\nCommercial\npin\nrebel\ncomedian\nexchanged\nairing\ndish\nSalvador\npromising\n##wl\nrelax\npresenter\ntoll\naerial\n##eh\nFletcher\nbrass\ndisappear\nzones\nadjusted\ncontacts\n##lk\nsensed\nWalt\nmild\ntoes\nflies\nshame\nconsiders\nwildlife\nHanna\nArsenal\nLadies\nnaming\n##ishing\nanxiety\ndiscussions\ncute\nundertaken\nCash\nstrain\nWyoming\ndishes\nprecise\nAngela\n##ided\nhostile\ntwins\n115\nBuilt\n##pel\nOnline\ntactics\nNewman\n##bourne\nunclear\nrepairs\nembarrassed\nlisting\ntugged\nVale\n##gin\nMeredith\nbout\n##cle\nvelocity\ntips\nfroze\nevaluation\ndemonstrate\n##card\ncriticised\nNash\nlineup\nRao\nmonks\nbacteria\nlease\n##lish\nfrightened\nden\nrevived\nfinale\n##rance\nflee\nLetters\ndecreased\n##oh\nSounds\nwrap\nSharon\nincidents\nrenovated\neverybody\nstole\nBath\nboxing\n1815\nwithdraw\nbacks\ninterim\nreact\nmurders\nRhodes\nCopa\nframed\nflown\nEstonia\nHeavy\nexplored\n##rra\n##GA\n##ali\nIstanbul\n1834\n##rite\n##aging\n##
ues\nEpiscopal\narc\norientation\nMaxwell\ninfected\n##rot\nBCE\nBrook\ngrasp\nRoberto\nExcellence\n108\nwithdrawal\nMarines\nrider\nLo\n##sin\n##run\nSubsequently\ngarrison\nhurricane\nfacade\nPrussia\ncrushed\nenterprise\n##mber\nTwitter\nGeneration\nPhysical\nSugar\nediting\ncommunicate\nEllie\n##hurst\nErnst\nwagon\npromotional\nconquest\nParliamentary\ncourtyard\nlawyers\nSuperman\nemail\nPrussian\nlately\nlecturer\nSinger\nMajesty\nParadise\nsooner\nHeath\nslot\ncurves\nconvoy\n##vian\ninduced\nsynonym\nbreeze\n##plane\n##ox\npeered\nCoalition\n##hia\nodds\n##esh\n##lina\nTomorrow\nNadu\n##ico\n##rah\ndamp\nautonomous\nconsole\nVictory\ncounts\nLuxembourg\nintimate\nArchived\nCarroll\nspy\nZero\nhabit\nAlways\nfaction\nteenager\nJohnston\nchaos\nruin\ncommerce\nblog\n##shed\n##the\nreliable\nWord\nYu\nNorton\nparade\nCatholics\ndamned\n##iling\nsurgeon\n##tia\nAllison\nJonas\nremarked\n##ès\nidiot\nMaking\nproposals\nIndustries\nstrategies\nartifacts\nbatteries\nreward\n##vers\nAgricultural\ndistinguish\nlengths\nJeffrey\nProgressive\nkicking\nPatricia\n##gio\nballot\n##ios\nskilled\n##gation\nColt\nlimestone\n##AS\npeninsula\n##itis\nLA\nhotels\nshapes\nCrime\ndepicting\nnorthwestern\nHD\nsilly\nDas\n##²\n##ws\n##ash\n##matic\nthermal\nHas\nforgive\nsurrendered\nPalm\nNacional\ndrank\nhaired\nMercedes\n##foot\nloading\nTimothy\n##roll\nmechanisms\ntraces\ndigging\ndiscussing\nNatalie\n##zhou\nForbes\nlandmark\nAnyway\nManor\nconspiracy\ngym\nknocking\nviewing\nFormation\nPink\nBeauty\nlimbs\nPhillip\nsponsor\nJoy\ngranite\nHarbour\n##ero\npayments\nBallet\nconviction\n##dam\nHood\nestimates\nlacked\nMad\nJorge\n##wen\nrefuge\n##LA\ninvaded\nKat\nsuburban\n##fold\ninvestigated\nAri\ncomplained\ncreek\nGeorges\n##uts\npowder\naccepting\ndeserved\ncarpet\nThunder\nmolecules\nLegal\ncliff\nstrictly\nenrollment\nranch\n##rg\n##mba\nproportion\nrenovation\ncrop\ngrabbing\n##liga\nfinest\nentries\nreceptor\nhelmet\nblown\nListen\nflagship\nworkshop\nresolve\nnails\n
Shannon\nportal\njointly\nshining\nViolet\noverwhelming\nupward\nMick\nproceedings\n##dies\n##aring\nLaurence\nChurchill\n##rice\ncommit\n170\ninclusion\nExamples\n##verse\n##rma\nfury\npaths\n##SC\nankle\nnerves\nChemistry\nrectangular\nsworn\nscreenplay\ncake\nMann\nSeoul\nAnimal\nsizes\nSpeed\nvol\nPopulation\nSouthwest\nHold\ncontinuously\nQualified\nwishing\nFighting\nMade\ndisappointment\nPortsmouth\nThirty\n##beck\nAhmad\nteammate\nMLB\ngraph\nCharleston\nrealizes\n##dium\nexhibits\npreventing\n##int\nfever\nrivalry\nMale\nmentally\ndull\n##lor\n##rich\nconsistently\n##igan\nMadame\ncertificate\nsuited\nKrishna\naccuracy\nWebb\nBudapest\nRex\n1831\nCornell\nOK\nsurveillance\n##gated\nhabitats\nAdventure\nConrad\nSuperior\nGay\nsofa\naka\nboot\nStatistics\nJessie\nLiberation\n##lip\n##rier\nbrands\nsaint\nHeinrich\nChristine\nbath\nRhine\nballet\nJin\nconsensus\nchess\nArctic\nstack\nfurious\ncheap\ntoy\n##yre\n##face\n##gging\ngastropod\n##nne\nRomans\nmembrane\nanswering\n25th\narchitects\nsustainable\n##yne\nHon\n1814\nBaldwin\ndome\n##awa\n##zen\ncelebrity\nenclosed\n##uit\n##mmer\nElectronic\nlocals\n##CE\nsupervision\nmineral\nChemical\nSlovakia\nalley\nhub\n##az\nheroes\nCreative\n##AM\nincredible\npolitically\nESPN\nyanked\nhalls\nAboriginal\nGreatest\nyield\n##20\ncongressional\nrobot\nKiss\nwelcomed\nMS\nspeeds\nproceed\nSherman\neased\nGreene\nWalsh\nGeoffrey\nvariables\nrocky\n##print\nacclaim\nReverend\nWonder\ntonnes\nrecurring\nDawson\ncontinent\nfinite\nAP\ncontinental\nID\nfacilitate\nessays\nRafael\nNeal\n1833\nancestors\n##met\n##gic\nEspecially\nteenage\nfrustrated\nJules\ncock\nexpense\n##oli\n##old\nblocking\nNotable\nprohibited\nca\ndock\norganize\n##wald\nBurma\nGloria\ndimension\naftermath\nchoosing\nMickey\ntorpedo\npub\n##used\nmanuscripts\nlaps\nUlster\nstaircase\nsphere\nInsurance\nContest\nlens\nrisks\ninvestigations\nERA\nglare\n##play\nGraduate\nauction\nChronicle\n##tric\n##50\nComing\nseating\nWade\nseeks\ninland\nThames\nRath
er\nbutterfly\ncontracted\npositioned\nconsumers\ncontestants\nfragments\nYankees\nSantos\nadministrator\nhypothesis\nretire\nDenis\nagreements\nWinnipeg\n##rill\n1820\ntrophy\ncrap\nshakes\nJenkins\n##rium\nya\ntwist\nlabels\nMaritime\n##lings\n##iv\n111\n##ensis\nCairo\nAnything\n##fort\nopinions\ncrowded\n##nian\nabandon\n##iff\ndrained\nimported\n##rr\ntended\n##rain\nGoing\nintroducing\nsculptures\nbankruptcy\ndanced\ndemonstration\nstance\nsettings\ngazed\nabstract\npet\nCalvin\nstiff\nstrongest\nwrestler\n##dre\nRepublicans\ngrace\nallocated\ncursed\nsnail\nadvancing\nReturn\nerrors\nMall\npresenting\neliminate\nAmateur\nInstitution\ncounting\n##wind\nwarehouse\n##nde\nEthiopia\ntrailed\nhollow\n##press\nLiterary\ncapability\nnursing\npreceding\nlamp\nThomson\nMorton\n##ctic\nCrew\nClose\ncomposers\nboom\nClare\nmissiles\n112\nhunter\nsnap\n##oni\n##tail\nUs\ndeclaration\n##cock\nrally\nhuh\nlion\nstraightened\nPhilippe\nSutton\nalpha\nvalued\nmaker\nnavigation\ndetected\nfavorable\nperception\nCharter\n##ña\nRicky\nrebounds\ntunnels\nslapped\nEmergency\nsupposedly\n##act\ndeployment\nsocialist\ntubes\nanybody\ncorn\n##NA\nSeminary\nheating\npump\n##AA\nachieving\nsouls\n##ass\nLink\n##ele\n##smith\ngreeted\nBates\nAmericas\nElder\ncure\ncontestant\n240\nfold\nRunner\nUh\nlicked\nPolitics\ncommittees\nneighbors\nfairy\nSilva\nLeipzig\ntipped\ncorrectly\nexciting\nelectronics\nfoundations\ncottage\ngovernmental\n##hat\nallied\nclaws\npresidency\ncruel\nAgreement\nslender\naccompanying\nprecisely\n##pass\ndriveway\nswim\nStand\ncrews\n##mission\nrely\neveryday\nWings\ndemo\n##hic\nrecreational\nmin\nnationality\n##duction\nEaster\n##hole\ncanvas\nKay\nLeicester\ntalented\nDiscovery\nshells\n##ech\nKerry\nFerguson\nLeave\n##place\naltogether\nadopt\nbutt\nwolves\n##nsis\n##ania\nmodest\nsoprano\nBoris\n##ught\nelectron\ndepicts\nhid\ncruise\ndiffer\ntreasure\n##nch\nGun\nMama\nBengali\ntrainer\nmerchants\ninnovation\npresumably\nShirley\nbottles\nproceeds\nFear\
ninvested\nPirates\nparticle\nDominic\nblamed\nFight\nDaisy\n##pper\n##graphic\nnods\nknight\nDoyle\ntales\nCarnegie\nEvil\nInter\nShore\nNixon\ntransform\nSavannah\n##gas\nBaltic\nstretching\nworlds\nprotocol\nPercy\nToby\nHeroes\nbrave\ndancers\n##aria\nbackwards\nresponses\nChi\nGaelic\nBerry\ncrush\nembarked\npromises\nMadonna\nresearcher\nrealised\ninaugurated\nCherry\nMikhail\nNottingham\nreinforced\nsubspecies\nrapper\n##kie\nDreams\nRe\nDamon\nMinneapolis\nmonsters\nsuspicion\nTel\nsurroundings\nafterward\ncomplaints\nOF\nsectors\nAlgeria\nlanes\nSabha\nobjectives\nDonna\nbothered\ndistracted\ndeciding\n##ives\n##CA\n##onia\nbishops\nStrange\nmachinery\nVoiced\nsynthesis\nreflects\ninterference\n##TS\n##ury\nkeen\n##ign\nfrown\nfreestyle\nton\nDixon\nSacred\nRuby\nPrison\n##ión\n1825\noutfit\n##tain\ncuriosity\n##ight\nframes\nsteadily\nemigrated\nhorizon\n##erly\nDoc\nphilosophical\nTable\nUTC\nMarina\n##DA\nsecular\n##eed\nZimbabwe\ncops\nMack\nsheriff\nSanskrit\nFrancesco\ncatches\nquestioning\nstreaming\nKill\ntestimony\nhissed\ntackle\ncountryside\ncopyright\n##IP\nBuddhism\n##rator\nladder\n##ON\nPast\nrookie\ndepths\n##yama\n##ister\n##HS\nSamantha\nDana\nEducational\nbrows\nHammond\nraids\nenvelope\n##sco\n##hart\n##ulus\nepic\ndetection\nStreets\nPotter\nstatistical\nfür\nni\naccounting\n##pot\nemployer\nSidney\nDepression\ncommands\nTracks\naveraged\nlets\nRam\nlongtime\nsuits\nbranded\nchip\nShield\nloans\nought\nSaid\nsip\n##rome\nrequests\nVernon\nbordered\nveterans\n##ament\nMarsh\nHerzegovina\nPine\n##igo\nmills\nanticipation\nreconnaissance\n##ef\nexpectations\nprotested\narrow\nguessed\ndepot\nmaternal\nweakness\n##ap\nprojected\npour\nCarmen\nprovider\nnewer\nremind\nfreed\n##rily\n##wal\n##tones\nintentions\nFiji\ntiming\nMatch\nmanagers\nKosovo\nHerman\nWesley\nChang\n135\nsemifinals\nshouting\nIndo\nJaneiro\nChess\nMacedonia\nBuck\n##onies\nrulers\nMail\n##vas\n##sel\nMHz\nProgramme\nTask\ncommercially\nsubtle\npropaganda\nspelled\nbowli
ng\nbasically\nRaven\n1828\nColony\n109\n##ingham\n##wara\nanticipated\n1829\n##iers\ngraduates\n##rton\n##fication\nendangered\nISO\ndiagnosed\n##tage\nexercises\nBattery\nbolt\npoison\ncartoon\n##ción\nhood\nbowed\nheal\nMeyer\nReagan\n##wed\nsubfamily\n##gent\nmomentum\ninfant\ndetect\n##sse\nChapman\nDarwin\nmechanics\nNSW\nCancer\nBrooke\nNuclear\ncomprised\nhire\nsanctuary\nwingspan\ncontrary\nremembering\nsurprising\nBasic\nstealing\nOS\nhatred\n##lled\nmasters\nviolation\nRule\n##nger\nassuming\nconquered\nlouder\nrobe\nBeatles\nlegitimate\n##vation\nmassacre\nRica\nunsuccessfully\npoets\n##enberg\ncareers\ndoubled\npremier\nbattalions\nDubai\nPaper\nLouisville\ngestured\ndressing\nsuccessive\nmumbled\nVic\nreferee\npupil\n##cated\n##rre\nceremonies\npicks\n##IN\ndiplomat\nalike\ngeographical\nrays\n##HA\n##read\nharbour\nfactories\npastor\nplaywright\nUltimate\nnationalist\nuniforms\nobtaining\nkit\nAmber\n##pling\nscreenwriter\nancestry\n##cott\nFields\nPR\nColeman\nrat\nBavaria\nsqueeze\nhighlighted\nAdult\nreflecting\nMel\n1824\nbicycle\norganizing\nsided\nPreviously\nUnderground\nProf\nathletics\ncoupled\nmortal\nHampton\nworthy\nimmune\nAva\n##gun\nencouraging\nsimplified\n##ssa\n##nte\n##ann\nProvidence\nentities\nPablo\nStrong\nHousing\n##ista\n##ators\nkidnapped\nmosque\nKirk\nwhispers\nfruits\nshattered\nfossil\nEmpress\nJohns\nWebster\nThing\nrefusing\ndifferently\nspecimen\nHa\n##EN\n##tina\n##elle\n##night\nHorn\nneighbourhood\nBolivia\n##rth\ngenres\nPre\n##vich\nAmelia\nswallow\nTribune\nForever\nPsychology\nUse\n##bers\nGazette\nash\n##usa\nMonster\n##cular\ndelegation\nblowing\nOblast\nretreated\nautomobile\n##ex\nprofits\nshirts\ndevil\nTreasury\n##backs\nDrums\nRonnie\ngameplay\nexpertise\nEvening\nresides\nCaesar\nunity\nCrazy\nlinking\nVision\ndonations\nIsabel\nvalve\nSue\nWWE\nlogical\navailability\nfitting\nrevolt\n##mill\nLinux\ntaxi\nAccess\npollution\nstatues\nAugustus\n##pen\ncello\n##some\nlacking\n##ati\nGwen\n##aka\n##ovich\n18
21\nWow\ninitiatives\nUruguay\nCain\nstroked\nexamine\n##ī\nmentor\nmoist\ndisorders\nbuttons\n##tica\n##anna\nSpecies\nLynch\nmuseums\nscorer\nPoor\neligibility\nop\nunveiled\ncats\nTitle\nwheat\ncritically\nSyracuse\n##osis\nmarketed\nenhance\nRyder\n##NG\n##ull\n##rna\nembedded\nthrows\nfoods\nhappily\n##ami\nlesson\nformats\npunched\n##rno\nexpressions\nqualities\n##sal\nGods\n##lity\nelect\nwives\n##lling\njungle\nToyota\nreversed\nGrammar\nCloud\nAgnes\n##ules\ndisputed\nverses\nLucien\nthreshold\n##rea\nscanned\n##bled\n##dley\n##lice\nKazakhstan\nGardner\nFreeman\n##rz\ninspection\nRita\naccommodation\nadvances\nchill\nElliot\nthriller\nConstantinople\n##mos\ndebris\nwhoever\n1810\nSanto\nCarey\nremnants\nGuatemala\n##irs\ncarriers\nequations\nmandatory\n##WA\nanxious\nmeasurement\nSummit\nTerminal\nErin\n##zes\nLLC\n##uo\nglancing\nsin\n##₃\nDowntown\nflowering\nEuro\nLeigh\nLance\nwarn\ndecent\nrecommendations\n##ote\nQuartet\n##rrell\nClarence\ncolleague\nguarantee\n230\nClayton\nBeast\naddresses\nprospect\ndestroyer\nvegetables\nLeadership\nfatal\nprints\n190\n##makers\nHyde\npersuaded\nillustrations\nSouthampton\nJoyce\nbeats\neditors\nmount\n##grave\nMalaysian\nBombay\nendorsed\n##sian\n##bee\napplying\nReligion\nnautical\nbomber\nNa\nairfield\ngravel\n##rew\nCave\nbye\ndig\ndecree\nburden\nElection\nHawk\nFe\n##iled\nreunited\n##tland\nliver\nTeams\nPut\ndelegates\nElla\n##fect\nCal\ninvention\nCastro\nbored\n##kawa\n##ail\nTrinidad\nNASCAR\npond\ndevelops\n##pton\nexpenses\nZoe\nReleased\n##rf\norgans\nbeta\nparameters\nNeill\n##lene\nlateral\nBeat\nblades\nEither\n##hale\nMitch\n##ET\n##vous\nRod\nburnt\nphones\nRising\n##front\ninvestigating\n##dent\nStephanie\n##keeper\nscreening\n##uro\nSwan\nSinclair\nmodes\nbullets\nNigerian\nmelody\n##ques\nRifle\n##12\n128\n##jin\ncharm\nVenus\n##tian\nfusion\nadvocated\nvisitor\npinned\ngenera\n3000\nFerry\nSolo\nquantity\nregained\nplatinum\nshoots\nnarrowly\npreceded\nupdate\n##ichi\nequality\nunaware\nreg
iments\nally\n##tos\ntransmitter\nlocks\nSeeing\noutlets\nfeast\nreopened\n##ows\nstruggles\nBuddy\n1826\nbark\nelegant\namused\nPretty\nthemed\nschemes\nLisbon\nTe\npatted\nterrorism\nMystery\n##croft\n##imo\nMadagascar\nJourney\ndealer\ncontacted\n##quez\nITV\nvacation\nWong\nSacramento\norganisms\n##pts\nbalcony\ncoloured\nsheer\ndefines\nMC\nabortion\nforbidden\naccredited\nNewfoundland\ntendency\nentrepreneur\nBenny\nTanzania\nneeding\nfinalist\nmythology\nweakened\ngown\nsentences\nGuest\nwebsites\nTibetan\nUFC\nvoluntary\nannoyed\nWelcome\nhonestly\ncorrespondence\ngeometry\nDeutsche\nBiology\nHelp\n##aya\nLines\nHector\n##ael\nreluctant\n##ages\nwears\ninquiry\n##dell\nHolocaust\nTourism\nWei\nvolcanic\n##mates\nVisual\nsorts\nneighborhoods\nRunning\napple\nshy\nLaws\nbend\nNortheast\nfeminist\nSpeedway\nMurder\nvisa\nstuffed\nfangs\ntransmitted\nfiscal\nAin\nenlarged\n##ndi\nCecil\nPeterson\nBenson\nBedford\nacceptable\n##CC\n##wer\npurely\ntriangle\nfoster\nAlberto\neducator\nHighland\nacute\nLGBT\nTina\nMi\nadventures\nDavidson\nHonda\ntranslator\nmonk\nenacted\nsummoned\n##ional\ncollector\nGenesis\nUn\nliner\nDi\nStatistical\n##CS\nfilter\nKnox\nReligious\nStella\nEstonian\nTurn\n##ots\nprimitive\nparishes\n##lles\ncomplexity\nautobiography\nrigid\ncannon\npursuing\nexploring\n##gram\n##mme\nfreshman\ncaves\nExpedition\nTraditional\niTunes\ncertification\ncooling\n##ort\n##gna\n##IT\n##lman\n##VA\nMotion\nexplosive\nlicence\nboxer\nshrine\nloosely\nBrigadier\nSavage\nBrett\nMVP\nheavier\n##elli\n##gged\nBuddha\nEasy\nspells\nfails\nincredibly\nGeorg\nstern\ncompatible\nPerfect\napplies\ncognitive\nexcessive\nnightmare\nneighbor\nSicily\nappealed\nstatic\n##₁\nAberdeen\n##leigh\nslipping\nbride\n##guard\nUm\nClyde\n1818\n##gible\nHal\nFrost\nSanders\ninteractive\nHour\n##vor\nhurting\nbull\ntermed\nshelf\ncapturing\n##pace\nrolls\n113\n##bor\nChilean\nteaches\n##rey\nexam\nshipped\nTwin\nborrowed\n##lift\nShit\n##hot\nLindsay\nBelow\nKiev\nLin\nleased\n#
#sto\nEli\nDiane\nVal\nsubtropical\nshoe\nBolton\nDragons\n##rification\nVatican\n##pathy\nCrisis\ndramatically\ntalents\nbabies\n##ores\nsurname\n##AP\n##cology\ncubic\nopted\nArcher\nsweep\ntends\nKarnataka\nJudy\nstint\nSimilar\n##nut\nexplicitly\n##nga\ninteract\nMae\nportfolio\nclinic\nabbreviated\nCounties\n##iko\nhearts\n##ı\nproviders\nscreams\nIndividual\n##etti\nMonument\n##iana\naccessed\nencounters\ngasp\n##rge\ndefunct\nAvery\n##rne\nnobility\nuseless\nPhase\nVince\nsenator\n##FL\n1813\nsurprisingly\n##illo\n##chin\nBoyd\nrumors\nequity\nGone\nHearts\nchassis\novernight\nTrek\nwrists\nsubmit\ncivic\ndesigners\n##rity\nprominence\ndecorative\nderives\nstarter\n##AF\nwisdom\nPowers\nreluctantly\nmeasurements\ndoctoral\nNoel\nGideon\nBaden\nCologne\nlawn\nHawaiian\nanthology\n##rov\nRaiders\nembassy\nSterling\n##pal\nTelugu\ntroubled\n##FC\n##bian\nfountain\nobserve\nore\n##uru\n##gence\nspelling\nBorder\ngrinning\nsketch\nBenedict\nXbox\ndialects\nreadily\nimmigrant\nConstitutional\naided\nnevertheless\nSE\ntragedy\n##ager\n##rden\nFlash\n##MP\nEuropa\nemissions\n##ield\npanties\nBeverly\nHomer\ncurtain\n##oto\ntoilet\nIsn\nJerome\nChiefs\nHermann\nsupernatural\njuice\nintegrity\nScots\nauto\nPatriots\nStrategic\nengaging\nprosecution\ncleaned\nByron\ninvestments\nadequate\nvacuum\nlaughs\n##inus\n##nge\nUsually\nRoth\nCities\nBrand\ncorpse\n##ffy\nGas\nrifles\nPlains\nsponsorship\nLevi\ntray\nowed\ndella\ncommanders\n##ead\ntactical\n##rion\nGarcía\nharbor\ndischarge\n##hausen\ngentleman\nendless\nhighways\n##itarian\npleaded\n##eta\narchive\nMidnight\nexceptions\ninstances\nGibraltar\ncart\n##NS\nDarren\nBonnie\n##yle\n##iva\nOCLC\nbra\nJess\n##EA\nconsulting\nArchives\nChance\ndistances\ncommissioner\n##AR\nLL\nsailors\n##sters\nenthusiasm\nLang\n##zia\nYugoslav\nconfirm\npossibilities\nSuffolk\n##eman\nbanner\n1822\nSupporting\nfingertips\ncivilization\n##gos\ntechnically\n1827\nHastings\nsidewalk\nstrained\nmonuments\nFloyd\nChennai\nElvis\nvillagers
\nCumberland\nstrode\nalbeit\nBelieve\nplanets\ncombining\nMohammad\ncontainer\n##mouth\n##tures\nverb\nBA\nTank\nMidland\nscreened\nGang\nDemocracy\nHelsinki\nscreens\nthread\ncharitable\n##version\nswiftly\nma\nrational\ncombine\n##SS\n##antly\ndragging\nCliff\nTasmania\nquest\nprofessionally\n##aj\nrap\n##lion\nlivestock\n##hua\ninformal\nspecially\nlonely\nMatthews\nDictionary\n1816\nObservatory\ncorrespondent\nconstitute\nhomeless\nwaving\nappreciated\nAnalysis\nMeeting\ndagger\n##AL\nGandhi\nflank\nGiant\nChoir\n##not\nglimpse\ntoe\nWriter\nteasing\nsprings\n##dt\nGlory\nhealthcare\nregulated\ncomplaint\nmath\nPublications\nmakers\n##hips\ncement\nNeed\napologize\ndisputes\nfinishes\nPartners\nboring\nups\ngains\n1793\nCongressional\nclergy\nFolk\n##made\n##nza\nWaters\nstays\nencoded\nspider\nbetrayed\nApplied\ninception\n##urt\n##zzo\nwards\nbells\nUCLA\nWorth\nbombers\nMo\ntrademark\nPiper\n##vel\nincorporates\n1801\n##cial\ndim\nTwelve\n##word\nAppeals\ntighter\nspacecraft\n##tine\ncoordinates\n##iac\nmistakes\nZach\nlaptop\nTeresa\n##llar\n##yr\nfavored\nNora\nsophisticated\nIrving\nhammer\nDivisión\ncorporations\nniece\n##rley\nPatterson\nUNESCO\ntrafficking\nMing\nbalanced\nplaque\nLatvia\nbroader\n##owed\nSave\nconfined\n##vable\nDalton\ntide\n##right\n##ural\n##num\nswords\ncaring\n##eg\nIX\nActing\npaved\n##moto\nlaunching\nAntoine\nsubstantially\nPride\nPhilharmonic\ngrammar\nIndoor\nEnsemble\nenabling\n114\nresided\nAngelo\npublicity\nchaired\ncrawled\nMaharashtra\nTelegraph\nlengthy\npreference\ndifferential\nanonymous\nHoney\n##itation\nwage\n##iki\nconsecrated\nBryant\nregulatory\nCarr\n##én\nfunctioning\nwatches\n##ú\nshifts\ndiagnosis\nSearch\napp\nPeters\n##SE\n##cat\nAndreas\nhonours\ntemper\ncounsel\nUrdu\nAnniversary\nmaritime\n##uka\nharmony\n##unk\nessence\nLorenzo\nchoked\nQuarter\nindie\n##oll\nloses\n##prints\namendment\nAdolf\nscenario\nsimilarities\n##rade\n##LC\ntechnological\nmetric\nRussians\nthoroughly\n##tead\ncruiser\n1806\n##
nier\n1823\nTeddy\n##psy\nau\nprogressed\nexceptional\nbroadcaster\npartnered\nfitness\nirregular\nplacement\nmothers\nunofficial\nGarion\nJohannes\n1817\nregain\nSolar\npublishes\nGates\nBroken\nthirds\nconversations\ndive\nRaj\ncontributor\nquantities\nWorcester\ngovernance\n##flow\ngenerating\npretending\nBelarus\n##voy\nradius\nskating\nMarathon\n1819\naffection\nundertook\n##wright\nlos\n##bro\nlocate\nPS\nexcluded\nrecreation\ntortured\njewelry\nmoaned\n##logue\n##cut\nComplete\n##rop\n117\n##II\nplantation\nwhipped\nslower\ncrater\n##drome\nVolunteer\nattributes\ncelebrations\nregards\nPublishers\noath\nutilized\nRobbie\nGiuseppe\nfiber\nindication\nmelted\narchives\nDamien\nstorey\naffecting\nidentifying\ndances\nalumni\ncomparable\nupgrade\nrented\nsprint\n##kle\nMarty\n##lous\ntreating\nrailways\nLebanese\nerupted\noccupy\nsympathy\nJude\nDarling\nQatar\ndrainage\nMcCarthy\nheel\nKlein\ncomputing\nwireless\nflip\nDu\nBella\n##ast\n##ssen\nnarrator\nmist\nsings\nalignment\n121\n2020\nsecuring\n##rail\nProgress\nmissionaries\nbrutal\nmercy\n##shing\nHip\n##ache\n##olo\nswitching\n##here\nMalay\n##ob\nconstituted\nMohammed\nOften\nstandings\nsurge\nteachings\nink\ndetached\nsystematic\nTrial\nMyanmar\n##wo\noffs\nReyes\ndecoration\ntranslations\nwherever\nreviewer\nspeculation\nBangkok\nterminated\n##ester\nbeard\nRCA\nAidan\nAssociated\nEmerson\nCharity\n1803\ngenerous\nDudley\nATP\n##haven\nprizes\ntoxic\ngloves\n##iles\n##dos\nTurning\nmyth\nParade\n##building\nHits\n##eva\nteamed\nAbove\nDuchess\nHolt\n##oth\nSub\nAce\natomic\ninform\nShip\ndepend\nJun\n##bes\nNorwich\nglobe\nBaroque\nChristina\nCotton\nTunnel\nkidding\nConcerto\nBrittany\ntasted\nphases\nstems\nangles\n##TE\n##nam\n##40\ncharted\nAlison\nintensive\nWillis\nglory\n##lit\nBergen\nest\ntaller\n##dicate\nlabeled\n##ido\ncommentator\nWarrior\nViscount\nshortened\naisle\nAria\nSpike\nspectators\ngoodbye\noverlooking\nmammals\n##lude\nwholly\nBarrett\n##gus\naccompany\nseventy\nemploy\n##mb\nam
bitious\nbeloved\nbasket\n##mma\n##lding\nhalted\ndescendant\npad\nexclaimed\ncloak\n##pet\nStrait\nBang\nAviv\nsadness\n##ffer\nDonovan\n1880s\nagenda\nswinging\n##quin\njerk\nBoat\n##rist\nnervously\nSilence\nEcho\nshout\nimplies\n##iser\n##cking\nShiva\nWeston\ndamages\n##tist\neffectiveness\nHorace\ncycling\nRey\nache\nPhotography\nPDF\nDear\nleans\nLea\n##vision\nbooth\nattained\ndisbelief\n##eus\n##ution\nHop\npension\ntoys\nEurovision\nfaithful\n##heads\nAndre\nowe\ndefault\nAtlas\nMegan\nhighlights\nlovers\nConstantine\nSixth\nmasses\n##garh\nemerge\nAuto\nSlovak\n##oa\n##vert\nSuperintendent\nflicked\ninventor\nChambers\nFrankie\nRomeo\npottery\ncompanions\nRudolf\n##liers\ndiary\nUnless\ntap\nalter\nRandall\n##ddle\n##eal\nlimitations\n##boards\nutterly\nknelt\nguaranteed\nCowboys\nIslander\nhorns\n##ike\nWendy\nsexually\nSmart\nbreasts\n##cian\ncompromise\nDuchy\nAT\nGalaxy\nanalog\nStyle\n##aking\nweighed\nNigel\noptional\nCzechoslovakia\npracticing\nHam\n##0s\nfeedback\nbatted\nuprising\noperative\napplicable\ncriminals\nclassrooms\nSomehow\n##ode\n##OM\nNaomi\nWinchester\n##pping\nBart\nRegina\ncompetitor\nRecorded\nYuan\nVera\nlust\nConfederation\n##test\nsuck\n1809\nLambert\n175\nFriend\n##ppa\nSlowly\n##⁺\nWake\nDec\n##aneous\nchambers\nColor\nGus\n##site\nAlternative\n##world\nExeter\nOmaha\ncelebrities\nstriker\n210\ndwarf\nmeals\nOriental\nPearson\nfinancing\nrevenues\nunderwater\nSteele\nscrew\nFeeling\nMt\nacids\nbadge\nswore\ntheaters\nMoving\nadmired\nlung\nknot\npenalties\n116\nfork\n##cribed\nAfghan\noutskirts\nCambodia\noval\nwool\nfossils\nNed\nCountess\nDarkness\ndelicious\n##nica\nEvelyn\nRecordings\nguidelines\n##CP\nSandra\nmeantime\nAntarctica\nmodeling\ngranddaughter\n##rial\nRoma\nSeventh\nSunshine\nGabe\n##nton\nShop\nTurks\nprolific\nsoup\nparody\n##nta\nJudith\ndisciplines\nresign\nCompanies\nLibya\nJets\ninserted\nMile\nretrieve\nfilmmaker\n##rand\nrealistic\nunhappy\n##30\nsandstone\n##nas\n##lent\n##ush\n##rous\nBrent\ntrash\
nRescue\n##unted\nAutumn\ndisgust\nflexible\ninfinite\nsideways\n##oss\n##vik\ntrailing\ndisturbed\n50th\nNewark\nposthumously\n##rol\nSchmidt\nJosef\n##eous\ndetermining\nmenu\nPole\nAnita\nLuc\npeaks\n118\nYard\nwarrant\ngeneric\ndeserted\nWalking\nstamp\ntracked\n##berger\npaired\nsurveyed\nsued\nRainbow\n##isk\nCarpenter\nsubmarines\nrealization\ntouches\nsweeping\nFritz\nmodule\nWhether\nresembles\n##form\n##lop\nunsure\nhunters\nZagreb\nunemployment\nSenators\nGeorgetown\n##onic\nBarker\nfoul\ncommercials\nDresden\nWords\ncollision\nCarlton\nFashion\ndoubted\n##ril\nprecision\nMIT\nJacobs\nmob\nMonk\nretaining\ngotta\n##rod\nremake\nFast\nchips\n##pled\nsufficiently\n##lights\ndelivering\n##enburg\nDancing\nBarton\nOfficers\nmetals\n##lake\nreligions\n##ré\nmotivated\ndiffers\ndorsal\n##birds\n##rts\nPriest\npolished\n##aling\nSaxony\nWyatt\nknockout\n##hor\nLopez\nRNA\n##link\nmetallic\n##kas\ndaylight\nMontenegro\n##lining\nwrapping\nresemble\nJam\nViking\nuncertainty\nangels\nenables\n##fy\nStuttgart\ntricks\ntattoo\n127\nwicked\nasset\nbreach\n##yman\nMW\nbreaths\nJung\nim\n1798\nnoon\nvowel\n##qua\ncalmly\nseasonal\nchat\ningredients\ncooled\nRandolph\nensuring\n##ib\n##idal\nflashing\n1808\nMacedonian\nCool\ncouncils\n##lick\nadvantages\nImmediately\nMadras\n##cked\nPain\nfancy\nchronic\nMalayalam\nbegged\n##nese\nInner\nfeathers\n##vey\nNames\ndedication\nSing\npan\nFischer\nnurses\nSharp\ninning\nstamps\nMeg\n##ello\nedged\nmotioned\nJacksonville\n##ffle\n##dic\n##US\ndivide\ngarnered\nRanking\nchasing\nmodifications\n##oc\nclever\nmidst\nflushed\n##DP\nvoid\n##sby\nambulance\nbeaches\ngroan\nisolation\nstrengthen\nprevention\n##ffs\nScouts\nreformed\ngeographic\nsquadrons\nFiona\nKai\nConsequently\n##uss\novertime\n##yas\nFr\n##BL\nPapua\nMixed\nglances\nHaiti\nSporting\nsandy\nconfronted\nRené\nTanner\n1811\n##IM\nadvisory\ntrim\n##ibe\nGonzález\ngambling\nJupiter\n##ility\n##owski\n##nar\n122\napology\nteased\nPool\nfeminine\nwicket\neagle\nshiny\n#
#lator\nblend\npeaking\nnasty\nnodding\nfraction\ntech\nNoble\nKuwait\nbrushing\nItalia\nCanberra\nduet\nJohan\n1805\nWritten\ncameo\nStalin\npig\ncord\n##zio\nSurely\nSA\nowing\nholidays\n123\nRanger\nlighthouse\n##ige\nminers\n1804\n##ë\n##gren\n##ried\ncrashing\n##atory\nwartime\nhighlight\ninclined\nTorres\nTax\n##zel\n##oud\nOwn\n##corn\nDivine\nEMI\nRelief\nNorthwestern\nethics\nBMW\nclick\nplasma\nChristie\ncoordinator\nShepherd\nwashing\ncooked\n##dio\n##eat\nCerambycidae\nalgebra\nEngine\ncostumes\nVampire\nvault\nsubmission\nvirtue\nassumption\n##rell\nToledo\n##oting\n##rva\ncrept\nemphasized\n##lton\n##ood\nGreeks\nsurgical\ncrest\nPatrol\nBeta\nTessa\n##GS\npizza\ntraits\nrats\nIris\nspray\n##GC\nLightning\nbinary\nescapes\n##take\nClary\ncrowds\n##zong\nhauled\nmaid\n##fen\nManning\n##yang\nNielsen\naesthetic\nsympathetic\naffiliation\nsoaked\nMozart\npersonalities\nbegging\n##iga\nclip\nRaphael\nyearly\nLima\nabundant\n##lm\n1794\nstrips\nInitiative\nreporters\n##vsky\nconsolidated\n##itated\nCivic\nrankings\nmandate\nsymbolic\n##ively\n1807\nrental\nduck\nnave\ncomplications\n##nor\nIrene\nNazis\nhaunted\nscholarly\nPratt\nGran\nEmbassy\nWave\npity\ngenius\nbats\ncanton\nTropical\nmarker\n##cos\nescorted\nClimate\n##posed\nappreciation\nfreezing\npuzzle\nInternal\npools\nShawn\npathway\nDaniels\nFitzgerald\nextant\nolive\nVanessa\nmarriages\ncocked\n##dging\nprone\nchemicals\ndoll\ndrawer\n##HF\nStark\nProperty\n##tai\nflowed\nSheridan\n##uated\nLess\nOmar\nremarks\ncatalogue\nSeymour\nwreck\nCarrie\n##bby\nMercer\ndisplaced\nsovereignty\nrip\nFlynn\nArchie\nQuarterfinals\nHassan\n##ards\nvein\nOsaka\npouring\nwages\nRomance\n##cript\n##phere\n550\n##eil\n##stown\nDocumentary\nancestor\nCNN\nPanthers\npublishers\nRise\n##mu\nbiting\nBright\nString\nsucceeding\n119\nloaned\nWarwick\nSheikh\nVon\nAfterwards\nJax\nCamden\nhelicopters\nHence\nLaurel\n##ddy\ntransaction\nCorp\nclause\n##owing\n##kel\nInvestment\ncups\nLucia\nMoss\nGiles\nchef\nLópez\ndeci
sive\n30th\ndistress\nlinguistic\nsurveys\nReady\nmaiden\nTouch\nfrontier\nincorporate\nexotic\nmollusk\nLeopold\nRide\n##wain\n##ndo\nteammates\ntones\ndrift\nordering\nFeb\nPenny\nNormandy\nPresent\nFlag\npipes\n##rro\ndelight\nmotto\nTibet\nleap\nEliza\nProduced\nteenagers\nsitcom\nTry\nHansen\nCody\nwandered\nterrestrial\nfrog\nscare\nresisted\nemployers\ncoined\n##DS\nresistant\nFly\ncaptive\ndissolution\njudged\nassociates\ndefining\n##court\nHale\n##mbo\nraises\nclusters\ntwelfth\n##metric\nRoads\n##itude\nsatisfy\nAndroid\nReds\nGloucester\nCategory\nValencia\nDaemon\nstabbed\nLuna\nChurches\nCanton\n##eller\nAttack\nKashmir\nannexed\ngrabs\nasteroid\nHartford\nrecommendation\nRodriguez\nhanding\nstressed\nfrequencies\ndelegate\nBones\nErie\nWeber\nHands\nActs\nmillimetres\n24th\nFat\nHowe\ncasually\n##SL\nconvent\n1790\nIF\n##sity\n1795\nyelling\n##ises\ndrain\naddressing\namino\nMarcel\nSylvia\nParamount\nGerard\nVolleyball\nbutter\n124\nAlbion\n##GB\ntriggered\n1792\nfolding\naccepts\n##ße\npreparations\nWimbledon\ndose\n##grass\nescaping\n##tling\nimport\ncharging\n##dation\n280\nNolan\n##fried\nCalcutta\n##pool\nCove\nexamining\nminded\nheartbeat\ntwisting\ndomains\nbush\nTunisia\nPurple\nLeone\n##code\nevacuated\nbattlefield\ntiger\nElectrical\n##ared\nchased\n##cre\ncultivated\nJet\nsolved\nshrug\nringing\nImpact\n##iant\nkilometre\n##log\ncommemorate\nmigrated\nsingular\ndesigning\npromptly\nHiggins\n##own\n##aves\nfreshwater\nMarketing\nPayne\nbeg\nlocker\npray\nimplied\nAAA\ncorrected\nTrans\nEuropeans\nAshe\nacknowledge\nIntroduction\n##writer\n##llen\nMunster\nauxiliary\ngrowl\nHours\nPoems\n##AT\nreduces\nPlain\nplague\ncanceled\ndetention\npolite\nnecklace\nGustav\n##gu\n##lance\nEn\nAngola\n##bb\ndwelling\n##hea\n5000\nQing\nDodgers\nrim\n##ored\n##haus\nspilled\nElisabeth\nViktor\nbackpack\n1802\namended\n##worthy\nPhantom\n##ctive\nkeeper\n##loom\nVikings\n##gua\nemploys\nTehran\nspecialty\n##bate\nMarx\nMirror\nJenna\nrides\nneedle\nprayers
\nclarinet\nforewings\n##walk\nMidlands\nconvincing\nadvocacy\nCao\nBirds\ncycles\nClement\nGil\nbubble\nMaximum\nhumanitarian\nTan\ncries\n##SI\nParsons\nTrio\noffshore\nInnovation\nclutched\n260\n##mund\n##duct\nPrairie\nrelied\nFalcon\n##ste\nKolkata\nGill\nSwift\nNegro\nZoo\nvalleys\n##OL\nOpening\nbeams\nMPs\noutline\nBermuda\nPersonal\nexceed\nproductive\n##MT\nrepublic\nforum\n##sty\ntornado\nKnown\ndipped\nEdith\nfolks\nmathematician\nwatershed\nRicardo\nsynthetic\n##dication\ndeity\n##₄\ngaming\nsubjected\nsuspects\nFoot\nswollen\nMotors\n##tty\n##ý\naloud\nceremonial\nes\nnuts\nintend\nCarlisle\ntasked\nhesitation\nsponsors\nunified\ninmates\n##ctions\n##stan\ntiles\njokes\nwhereby\noutcomes\nLights\nscary\nStoke\nPortrait\nBlind\nsergeant\nviolations\ncultivation\nfuselage\nMister\nAlfonso\ncandy\nsticks\nteen\nagony\nEnough\ninvite\nPerkins\nAppeal\nmapping\nundergo\nGlacier\nMelanie\naffects\nincomplete\n##dd\nColombian\n##nate\nCBC\npurchasing\nbypass\nDrug\nElectronics\nFrontier\nCoventry\n##aan\nautonomy\nscrambled\nRecent\nbounced\ncow\nexperiencing\nRouge\ncuisine\nElite\ndisability\nJi\ninheritance\nwildly\nInto\n##wig\nconfrontation\nWheeler\nshiver\nPerforming\naligned\nconsequently\nAlexis\nSin\nwoodland\nexecutives\nStevenson\nFerrari\ninevitable\n##cist\n##dha\n##base\nCorner\ncomeback\nLeón\n##eck\n##urus\nMacDonald\npioneering\nbreakdown\nlandscapes\nVeterans\nRican\nTheological\nstirred\nparticipant\nCredit\nHyderabad\nsnails\nClaudia\n##ocene\ncompliance\n##MI\nFlags\nMiddlesex\nstorms\nwinding\nasserted\ner\n##ault\n##kal\nwaking\n##rates\nabbey\nAugusta\ntooth\ntrustees\nCommodore\n##uded\nCunningham\nNC\nWitch\nmarching\nSword\nSame\nspiral\nHarley\n##ahan\nZack\nAudio\n1890s\n##fit\nSimmons\nKara\nVeronica\nnegotiated\nSpeaking\nFIBA\nConservatory\nformations\nconstituencies\nexplicit\nfacial\neleventh\n##ilt\nvillain\n##dog\n##case\n##hol\narmored\ntin\nhairs\n##umi\n##rai\nmattress\nAngus\ncease\nverbal\nRecreation\nsavings\nAurora\
npeers\nMonastery\nAirways\ndrowned\nadditions\ndownstream\nsticking\nShi\nmice\nskiing\n##CD\nRaw\nRiverside\nwarming\nhooked\nboost\nmemorable\nposed\ntreatments\n320\n##dai\ncelebrating\nblink\nhelpless\ncirca\nFlowers\nPM\nuncommon\nOct\nHawks\noverwhelmed\nSparhawk\nrepaired\nMercy\npose\ncounterpart\ncompare\nsurvives\n##½\n##eum\ncoordinate\nLil\ngrandchildren\nnotorious\nYi\nJudaism\nJuliet\naccusations\n1789\nfloated\nmarathon\nroar\nfortified\nreunion\n145\nNov\nPaula\n##fare\n##toria\ntearing\nCedar\ndisappearance\nSi\ngifted\nscar\n270\nPBS\nTechnologies\nMarvin\n650\nroller\ncupped\nnegotiate\n##erman\npassport\ntram\nmiracle\nstyled\n##tier\nnecessity\nDes\nrehabilitation\nLara\nUSD\npsychic\nwipe\n##lem\nmistaken\n##lov\ncharming\nRider\npageant\ndynamics\nCassidy\n##icus\ndefenses\n##tadt\n##vant\naging\n##inal\ndeclare\nmistress\nsupervised\n##alis\n##rest\nAshton\nsubmerged\nsack\nDodge\ngrocery\nramp\nTeacher\nlineage\nimagery\narrange\ninscriptions\nOrganisation\nSiege\ncombines\npounded\nFleming\nlegends\ncolumnist\nApostolic\nprose\ninsight\nArabian\nexpired\n##uses\n##nos\nAlone\nelbows\n##asis\n##adi\n##combe\nStep\nWaterloo\nAlternate\ninterval\nSonny\nplains\nGoals\nincorporating\nrecruit\nadjoining\nCheshire\nexcluding\nmarrying\nducked\nCherokee\npar\n##inate\nhiking\nCoal\n##bow\nnatives\nribbon\nAllies\ncon\ndescriptions\npositively\n##lal\ndefendant\n22nd\nVivian\n##beat\nWeather\npossessions\nDate\nsweetheart\ninability\nSalisbury\nadviser\nideology\nNordic\n##eu\nCubs\nIP\nAdministrative\n##nick\nfacto\nliberation\nBurnett\nJavier\nfashioned\nElectoral\nTurin\ntheft\nunanimous\nPer\n1799\nClan\nHawkins\nTeachers\n##wes\nCameroon\nParkway\n##gment\ndemolition\natoms\nnucleus\n##thi\nrecovering\n##yte\n##vice\nlifts\nMust\ndeposit\nHancock\nSemi\ndarkened\nDeclaration\nmoan\nmuscular\nMyers\nattractions\nsauce\nsimulation\n##weed\nAlps\nbarriers\n##baum\nBarack\ngalleries\nMin\nholders\nGreenwich\ndonation\nEverybody\nWolfgang\nsandwic
h\nKendra\nCollegiate\ncasino\nSlavic\nensuing\nPorto\n##grapher\nJesuit\nsuppressed\ntires\nIbrahim\nprotesters\nIbn\nAmos\n1796\nphenomena\nHayden\nParaguay\nSquad\nReilly\ncomplement\naluminum\n##eers\ndoubts\ndecay\ndemise\nPractice\npatience\nfireplace\ntransparent\nmonarchy\n##person\nRodney\nmattered\nrotating\nClifford\ndisposal\nStandards\npaced\n##llie\narise\ntallest\ntug\ndocumentation\nnode\nfreeway\nNikolai\n##cite\nclicked\nimaging\nLorraine\nTactical\nDifferent\nRegular\nHolding\n165\nPilot\nguarded\n##polis\nClassics\nMongolia\nBrock\nmonarch\ncellular\nreceptors\nMini\nChandler\nfinanced\nfinancially\nLives\nerection\nFuller\nunnamed\nKannada\ncc\npassive\nplateau\n##arity\nfreak\n##rde\nretrieved\ntransactions\n##sus\n23rd\nswimmer\nbeef\nfulfill\nArlington\noffspring\nreasoning\nRhys\nsaves\npseudonym\ncentimetres\nshivered\nshuddered\n##ME\nFeel\n##otic\nprofessors\nBlackburn\n##eng\n##life\n##haw\ninterred\nlodge\nfragile\nDella\nguardian\n##bbled\ncatalog\nclad\nobserver\ntract\ndeclaring\n##headed\nLok\ndean\nIsabelle\n1776\nirrigation\nspectacular\nshuttle\nmastering\n##aro\nNathaniel\nRetired\n##lves\nBrennan\n##kha\ndick\n##dated\n##hler\nRookie\nleapt\ntelevised\nweekends\nBaghdad\nYemen\n##fo\nfactions\nion\nLab\nmortality\npassionate\nHammer\nencompasses\nconfluence\ndemonstrations\nKi\nderivative\nsoils\n##unch\nRanch\nUniversities\nconventions\noutright\naiming\nhierarchy\nreside\nillusion\ngraves\nrituals\n126\nAntwerp\nDover\n##ema\ncampuses\nHobart\nlifelong\naliens\n##vity\nMemory\ncoordination\nalphabet\n##mina\nTitans\npushes\nFlanders\n##holder\nNormal\nexcellence\ncapped\nprofound\nTaipei\nportrayal\nsparked\nscratch\nse\n##eas\n##hir\nMackenzie\n##cation\nNeo\nShin\n##lined\nmagnificent\nposter\nbatsman\n##rgent\npersuade\n##ement\nIcelandic\nmiserable\ncollegiate\nFeature\ngeography\n##mura\nComic\nCircus\nprocessor\nbarracks\nTale\n##11\nBulls\n##rap\nstrengthened\n##bell\ninjection\nminiature\nbroadly\nLetter\nfare\nhostag
e\ntraders\n##nium\n##mere\nFortune\nRivera\nLu\ntriumph\nBrowns\nBangalore\ncooperative\nBasel\nannouncing\nSawyer\n##him\n##cco\n##kara\ndarted\n##AD\n##nova\nsucking\n##position\nperimeter\nflung\nHoldings\n##NP\nBasque\nsketches\nAugustine\nSilk\nElijah\nanalyst\narmour\nriots\nacquiring\nghosts\n##ems\n132\nPioneer\nColleges\nSimone\nEconomy\nAuthor\nsemester\nSoldier\nil\n##unting\n##bid\nfreaking\nVista\ntumor\n##bat\nmurderer\n##eda\nunreleased\n##grove\n##sser\n##té\nedit\nstatute\nsovereign\n##gawa\nKiller\nstares\nFury\ncomply\n##lord\n##nant\nbarrels\nAndhra\nMaple\ngenerator\nmascot\nunusually\neds\n##ante\n##runner\nrod\n##tles\nHistorically\nJennings\ndumped\nEstablished\nresemblance\n##lium\n##cise\n##body\n##voke\nLydia\n##hou\n##iring\nnonetheless\n1797\ncorrupt\npatrons\nphysicist\nsneak\nLivingston\nCitizens\nArchitects\nWerner\ntrends\nMelody\neighty\nmarkings\nbrakes\n##titled\noversaw\nprocessed\nmock\nMidwest\nintervals\n##EF\nstretches\nwerewolf\n##MG\nPack\ncontroller\n##dition\nHonours\ncane\nGriffith\nvague\nrepertoire\nCourtney\norgasm\nAbdullah\ndominance\noccupies\nYa\nintroduces\nLester\ninstinct\ncollaborative\nIndigenous\nrefusal\n##rank\noutlet\ndebts\nspear\n155\n##keeping\n##ulu\nCatalan\n##osh\ntensions\n##OT\nbred\ncrude\nDunn\nabdomen\naccurately\n##fu\n##lough\naccidents\nRow\nAudrey\nrude\nGetting\npromotes\nreplies\nPaolo\nmerge\n##nock\ntrans\nEvangelical\nautomated\nCanon\n##wear\n##ggy\n##gma\nBroncos\nfoolish\nicy\nVoices\nknives\nAside\ndreamed\ngenerals\nmolecule\nAG\nrejection\ninsufficient\n##nagar\ndeposited\nsacked\nLanding\narches\nhelpful\ndevotion\nintake\nFlower\nPGA\ndragons\nevolutionary\n##mail\n330\nGM\ntissues\n##tree\narcade\ncomposite\nlid\nAcross\nimplications\nlacks\ntheological\nassessed\nconcentrations\nDen\n##mans\n##ulous\nFu\nhomeland\n##stream\nHarriet\necclesiastical\ntroop\necological\nwinked\n##xed\neighteenth\nCasino\nspecializing\n##sworth\nunlocked\nsupreme\ndevastated\nsnatched\ntrauma\nG
DP\nNord\nsaddle\nWes\nconvenient\ncompetes\n##nu\n##iss\nMarian\nsubway\n##rri\nsuccesses\numbrella\n##far\n##ually\nDundee\n##cence\nspark\n##rix\n##я\nQuality\nGeological\ncockpit\nrpm\nCam\nBucharest\nriot\n##PM\nLeah\n##dad\n##pose\nKa\nm³\nBundesliga\nWolfe\ngrim\ntextile\nquartet\nexpressing\nfantastic\ndestroyers\neternal\npicnic\n##oro\ncontractor\n1775\nspanning\ndeclining\n##cating\nLowe\nSutherland\nEmirates\ndownward\nnineteen\nviolently\nscout\nviral\nmelting\nenterprises\n##cer\nCrosby\nJubilee\nantenna\nurgent\nRory\n##uin\n##sure\nwandering\n##gler\n##vent\nSuzuki\nLifetime\nDirty\noccupying\n##quent\nDisc\nGuru\nmound\nLennon\nHumanities\nlisteners\nWalton\nuh\nBraves\nBologna\n##bis\n##gra\nDwight\ncrawl\nflags\nmemoir\nThorne\nArchdiocese\ndairy\n##uz\n##tery\nroared\nadjust\npatches\ninn\nKnowing\n##bbed\n##zan\nscan\nPapa\nprecipitation\nangrily\npassages\npostal\nPhi\nembraced\nblacks\neconomist\ntriangular\nSen\nshooter\npunished\nMillennium\nSwimming\nconfessed\nAston\ndefeats\nEra\ncousins\nWilliamson\n##rer\ndaytime\ndumb\n##rek\nunderway\nspecification\nBuchanan\nprayed\nconcealed\nactivation\n##issa\ncanon\nawesome\nStarr\nplural\nsummers\n##fields\nSlam\nunnecessary\n1791\nresume\ntrilogy\ncompression\n##rough\nselective\ndignity\nYan\n##xton\nimmense\n##yun\nlone\nseeded\nhiatus\nlightweight\nsummary\nYo\napprove\nGalway\nrejoined\nElise\ngarbage\nburns\nspeeches\n129\nHonduras\n##liness\ninventory\njersey\nFK\nassure\nslumped\nLionel\nSuite\n##sbury\nLena\ncontinuation\n##AN\nbrightly\n##nti\nGT\nKnowledge\n##park\n##lius\nlethal\n##tribution\n##sions\nCertificate\nMara\n##lby\nalgorithms\nJade\nblows\npirates\nfleeing\nwheelchair\nStein\nsophomore\nAlt\nTerritorial\ndiploma\nsnakes\n##olic\n##tham\nTiffany\nPius\nflush\nurging\nHanover\nReich\n##olate\nUnity\nPike\ncollectively\nTheme\nballad\nkindergarten\nrocked\nzoo\n##page\nwhip\nRodríguez\nstrokes\nchecks\nBecky\nStern\nupstream\n##uta\nSilent\nvolunteered\nSigma\n##ingen\n##tra
ct\n##ede\nGujarat\nscrewed\nentertaining\n##action\n##ryn\ndefenders\ninnocence\nlesbian\nque\nRichie\nnodes\nLie\njuvenile\nJakarta\nsafer\nconfront\nBert\nbreakthrough\ngospel\nCable\n##zie\ninstitutional\nArchive\nbrake\nliquor\nfeeds\n##iate\nchancellor\nEncyclopedia\nAnimation\nscanning\nteens\n##mother\nCore\nRear\nWine\n##flower\nreactor\nAve\ncardinal\nsodium\nstrands\nOlivier\ncrouched\nVaughan\nSammy\nImage\nscars\nEmmanuel\nflour\nbias\nnipple\nrevelation\n##ucci\nDenny\n##ssy\nForm\nRunners\nadmits\nRama\nviolated\nBurmese\nfeud\nunderwear\nMohamed\nNamed\nswift\nstatewide\nDoor\nRecently\ncomparing\nHundred\n##idge\n##nity\n##rds\nRally\nReginald\nAuburn\nsolving\nwaitress\nTreasurer\n##ilization\nHalloween\nMinisters\nBoss\nShut\n##listic\nRahman\ndemonstrating\n##pies\nGaza\nYuri\ninstallations\nMath\nschooling\n##bble\nBronx\nexiled\ngasoline\n133\nbundle\nhumid\nFCC\nproportional\nrelate\nVFL\n##dez\ncontinuity\n##cene\nsyndicated\natmospheric\narrows\nWanderers\nreinforcements\nWillow\nLexington\nRotten\n##yon\ndiscovering\nSerena\nportable\n##lysis\ntargeting\n£1\nGoodman\nSteam\nsensors\ndetachment\nMalik\n##erie\nattitudes\nGoes\nKendall\nRead\nSleep\nbeans\nNikki\nmodification\nJeanne\nknuckles\nEleven\n##iously\nGross\nJaime\ndioxide\nmoisture\nStones\nUCI\ndisplacement\nMetacritic\nJury\nlace\nrendering\nelephant\nSergei\n##quire\nGP\nAbbott\n##type\nprojection\nMouse\nBishops\nwhispering\nKathleen\nRams\n##jar\nwhites\n##oran\nassess\ndispatched\n##hire\nkin\n##mir\nNursing\nadvocates\ntremendous\nsweater\nassisting\n##bil\nFarmer\nprominently\nreddish\nHague\ncyclone\n##SD\nSage\nLawson\nSanctuary\ndischarged\nretains\n##ube\nshotgun\nwilderness\nReformed\nsimilarity\nEntry\nWatts\nBahá\nQuest\nLooks\nvisions\nReservoir\nArabs\ncurls\nBlu\ndripping\naccomplish\nVerlag\ndrill\nsensor\nDillon\nphysicians\nsmashed\n##dir\npainters\nRenault\nstraw\nfading\nDirectorate\nlounge\ncommissions\nBrain\n##graph\nneo\n##urg\nplug\ncoordinated\n##house
s\nCritical\nlamps\nillustrator\nReturning\nerosion\nCrow\n##ciation\nblessing\nThought\nWife\nmedalist\nsynthesizer\nPam\nThornton\nEsther\nHBO\nfond\nAssociates\n##raz\npirate\npermits\nWide\ntire\n##PC\nErnie\nNassau\ntransferring\nRFC\n##ntly\num\nspit\nAS\n##mps\nMining\npolar\nvilla\nanchored\n##zzi\nembarrassment\nrelates\n##ă\nRupert\ncounterparts\n131\nBaxter\n##18\nIgor\nrecognizes\nClive\n##hane\n##eries\n##ibly\noccurrence\n##scope\nfin\ncolorful\nRapids\nbanker\ntile\n##rative\n##dus\ndelays\ndestinations\n##llis\nPond\nDane\ngrandparents\nrewarded\nsocially\nmotorway\n##hof\n##lying\n##human\nmodeled\nDayton\nForward\nconscience\nSharma\nwhistle\nMayer\nSasha\n##pical\ncircuits\nZhou\n##ça\nLatvian\nfinalists\npredators\nLafayette\ncloses\nobligations\nResolution\n##vier\nTrustees\nreminiscent\n##hos\nHighlands\nProtected\nasylum\nevacuation\n##acy\nChevrolet\nconfession\nSomalia\nemergence\nseparating\n##rica\nalright\ncalcium\nLaurent\nWelfare\nLeonardo\nashes\ndental\nDeal\nminerals\n##lump\n##mount\naccounted\nstaggered\nslogan\nphotographic\nbuilder\n##imes\n##raft\ntragic\n144\nSEC\nHit\ntailed\n##ples\n##rring\n##rson\nethical\nwrestlers\nconcludes\nlunar\n##ept\nnitrogen\nAid\ncyclist\nquarterfinals\n##ه\nharvest\n##hem\nPasha\nIL\n##mis\ncontinually\n##forth\nIntel\nbucket\n##ended\nwitches\npretended\ndresses\nviewer\npeculiar\nlowering\nvolcano\nMarilyn\nQualifier\nclung\n##sher\nCut\nmodules\nBowie\n##lded\nonset\ntranscription\nresidences\n##pie\n##itor\nscrapped\n##bic\nMonaco\nMayo\neternity\nStrike\nuncovered\nskeleton\n##wicz\nIsles\nbug\nPromoted\n##rush\nMechanical\nXII\n##ivo\ngripping\nstubborn\nvelvet\nTD\ndecommissioned\noperas\nspatial\nunstable\nCongressman\nwasted\n##aga\n##ume\nadvertisements\n##nya\nobliged\nCannes\nConway\nbricks\n##gnant\n##mity\n##uise\njumps\nClear\n##cine\n##sche\nchord\nutter\nSu\npodium\nspokesman\nRoyce\nassassin\nconfirmation\nlicensing\nliberty\n##rata\nGeographic\nindividually\ndetained\n##ffe\nSa
turn\ncrushing\nairplane\nbushes\nknights\n##PD\nLilly\nhurts\nunexpectedly\nConservatives\npumping\nForty\ncandle\nPérez\npeasants\nsupplement\nSundays\n##ggs\n##rries\nrisen\nenthusiastic\ncorresponds\npending\n##IF\nOwens\nfloods\nPainter\ninflation\npresumed\ninscribed\nChamberlain\nbizarre\n1200\nliability\nreacted\ntub\nLegacy\n##eds\n##pted\nshone\n##litz\n##NC\nTiny\ngenome\nbays\nEduardo\nrobbery\nstall\nhatch\nDepot\nVariety\nFlora\nreprinted\ntrembled\noutlined\nCR\nTheresa\nspans\n##plication\nJensen\n##eering\nposting\n##rky\npays\n##ost\nMarcos\nfortifications\ninferior\n##ential\nDevi\ndespair\nTalbot\n##chus\nupdates\nego\nBooth\nDarius\ntops\n##lau\nScene\n##DC\nHarlem\nTrey\nGenerally\ncandles\n##α\nNeville\nAdmiralty\n##hong\niconic\nvictorious\n1600\nRowan\nabundance\nminiseries\nclutching\nsanctioned\n##words\nobscure\n##ision\n##rle\n##EM\ndisappearing\nResort\nObviously\n##eb\nexceeded\n1870s\nAdults\n##cts\nCry\nKerr\nragged\nselfish\n##lson\ncircled\npillars\ngalaxy\n##asco\n##mental\nrebuild\ncaution\nResistance\nStart\nbind\nsplitting\nBaba\nHogan\nps\npartnerships\nslam\nPeggy\ncourthouse\n##OD\norganizational\npackages\nAngie\n##nds\npossesses\n##rp\nExpressway\nGould\nTerror\nHim\nGeoff\nnobles\n##ope\nshark\n##nh\nidentifies\n##oor\ntestified\nPlaying\n##ump\n##isa\nstool\nIdol\n##pice\n##tana\nByrne\nGerry\ngrunted\n26th\nobserving\nhabits\nprivilege\nimmortal\nwagons\n##thy\ndot\nBring\n##lian\n##witz\nnewest\n##uga\nconstraints\nScreen\nIssue\n##RNA\n##vil\nreminder\n##gles\naddiction\npiercing\nstunning\nvar\n##rita\nSignal\naccumulated\n##wide\nfloat\ndevastating\nviable\ncartoons\nUttar\nflared\n##encies\nTheology\npatents\n##bahn\nprivileges\n##ava\n##CO\n137\n##oped\n##NT\norchestral\nmedication\n225\nerect\nNadia\nÉcole\nfried\nSales\nscripts\n##rease\nairs\nCage\ninadequate\nstructured\ncountless\nAvengers\nKathy\ndisguise\nmirrors\nInvestigation\nreservation\n##nson\nLegends\nhumorous\nMona\ndecorations\nattachment\nVia\nmot
ivation\nBrowne\nstrangers\n##ński\nShadows\nTwins\n##pressed\nAlma\nNominated\n##ott\nSergio\ncanopy\n152\nSemifinals\ndevised\n##irk\nupwards\nTraffic\nGoddess\nMove\nbeetles\n138\nspat\n##anne\nholdings\n##SP\ntangled\nWhilst\nFowler\nanthem\n##ING\n##ogy\nsnarled\nmoonlight\nsongwriting\ntolerance\nWorlds\nexams\n##pia\nnotices\nsensitivity\npoetic\nStephens\nBoone\ninsect\nreconstructed\nFresh\n27th\nballoon\n##ables\nBrendan\nmug\n##gee\n1780\napex\nexports\nslides\nLahore\nhiring\nShell\nelectorate\nsexuality\npoker\nnonprofit\n##imate\ncone\n##uce\nOkinawa\nsuperintendent\n##HC\nreferenced\nturret\nSprint\nCitizen\nequilibrium\nStafford\ncurb\nDriver\nValerie\n##rona\naching\nimpacts\n##bol\nobservers\nDowns\nShri\n##uth\nairports\n##uda\nassignments\ncurtains\nsolitary\nicon\npatrols\nsubstances\nJasper\nmountainous\nPublished\nached\n##ingly\nannounce\ndove\ndamaging\n##tism\nPrimera\nDexter\nlimiting\nbatch\n##uli\nundergoing\nrefugee\nYe\nadmiral\npavement\n##WR\n##reed\npipeline\ndesires\nRamsey\nSheila\nthickness\nBrotherhood\nTea\ninstituted\nBelt\nBreak\nplots\n##ais\nmasculine\n##where\nTheo\n##aged\n##mined\nExperience\nscratched\nEthiopian\nTeaching\n##nov\nAiden\nAbe\nSamoa\nconditioning\n##mous\nOtherwise\nfade\nJenks\n##encing\nNat\n##lain\nAnyone\n##kis\nsmirk\nRiding\n##nny\nBavarian\nblessed\npotatoes\nHook\n##wise\nlikewise\nhardened\nMerry\namid\npersecution\n##sten\nElections\nHoffman\nPitt\n##vering\ndistraction\nexploitation\ninfamous\nquote\naveraging\nhealed\nRhythm\nGermanic\nMormon\nilluminated\nguides\n##ische\ninterfere\n##ilized\nrector\nperennial\n##ival\nEverett\ncourtesy\n##nham\nKirby\nMk\n##vic\nMedieval\n##tale\nLuigi\nlimp\n##diction\nAlive\ngreeting\nshove\n##force\n##fly\nJasmine\nBend\nCapt\nSuzanne\nditch\n134\n##nning\nHost\nfathers\nrebuilding\nVocal\nwires\n##manship\ntan\nFactor\nfixture\n##LS\nMāori\nPlate\npyramid\n##umble\nslap\nSchneider\nyell\n##ulture\n##tional\nGoodbye\nsore\n##pher\ndepressed\n##dox\npitchi
ng\nFind\nLotus\n##wang\nstrand\nTeen\ndebates\nprevalent\n##bilities\nexposing\nhears\nbilled\n##rse\nreorganized\ncompelled\ndisturbing\ndisplaying\n##tock\nClinical\nemotionally\n##iah\nDerbyshire\ngrouped\n##quel\nBahrain\nJournalism\nIN\npersistent\nblankets\nCrane\ncamping\nDirect\nproving\nLola\n##dding\nCorporate\nbirthplace\n##boats\n##ender\nFigure\ndared\nAssam\nprecursor\n##nched\nTribe\nRestoration\nslate\nMeyrick\nhunted\nstroking\nEarlier\nKind\npolls\nappeals\nmonetary\n##reate\nKira\nLangdon\nexplores\nGPS\nextensions\nsquares\nResults\ndraped\nannouncer\nmerit\n##ennial\n##tral\n##roved\n##cion\nrobots\nsupervisor\nsnorted\n##group\nCannon\nprocession\nmonkey\nfreeze\nsleeves\nNile\nverdict\nropes\nfirearms\nextraction\ntensed\nEC\nSaunders\n##tches\ndiamonds\nMarriage\n##amble\ncurling\nAmazing\n##haling\nunrelated\n##roads\nDaughter\ncum\ndiscarded\nkidney\ncliffs\nforested\nCandy\n##lap\nauthentic\ntablet\nnotation\n##nburg\nBulldogs\nCallum\nMeet\nmouths\ncoated\n##xe\nTruman\ncombinations\n##mation\nSteelers\nFan\nThan\npaternal\n##father\n##uti\nRebellion\ninviting\nFun\ntheatres\n##ي\n##rom\ncurator\n##cision\nnetworking\nOz\ndrought\n##ssel\ngranting\nMBA\nShelby\nElaine\njealousy\nKyoto\nshores\nsignaling\ntenants\ndebated\nIntermediate\nWise\n##hes\n##pu\nHavana\nduke\nvicious\nexited\nservers\nNonetheless\nReports\nexplode\n##beth\nNationals\nofferings\nOval\nconferred\neponymous\nfolklore\n##NR\nShire\nplanting\n1783\nZeus\naccelerated\nConstable\nconsuming\ntroubles\nMcCartney\ntexture\nbust\nImmigration\nexcavated\nhopefully\n##cession\n##coe\n##name\n##ully\nlining\nEinstein\nVenezuelan\nreissued\nminorities\nBeatrice\ncrystals\n##nies\ncircus\nlava\nBeirut\nextinction\n##shu\nBecker\n##uke\nissuing\nZurich\nextract\n##esta\n##rred\nregulate\nprogression\nhut\nalcoholic\nplea\nAB\nNorse\nHubert\nMansfield\nashamed\n##put\nBombardment\nstripes\nelectrons\nDenise\nhorrified\nNor\narranger\nHay\nKoch\n##ddling\n##iner\nBirthday\nJosie\n
deliberate\nexplorer\n##jiang\n##signed\nArrow\nwiping\nsatellites\nbaritone\nmobility\n##rals\nDorset\nturbine\nCoffee\n185\n##lder\nCara\nColts\npits\nCrossing\ncoral\n##birth\nTai\nzombie\nsmoothly\n##hp\nmates\n##ady\nMarguerite\n##tary\npuzzled\ntapes\noverly\nSonic\nPrayer\nThinking\n##uf\nIEEE\nobligation\n##cliffe\nBasil\nredesignated\n##mmy\nnostrils\nBarney\nXIII\n##phones\nvacated\nunused\nBerg\n##roid\nTowards\nviola\n136\nEvent\nsubdivided\nrabbit\nrecruiting\n##nery\nNamibia\n##16\n##ilation\nrecruits\nFamous\nFrancesca\n##hari\nGoa\n##lat\nKarachi\nhaul\nbiblical\n##cible\nMGM\n##rta\nhorsepower\nprofitable\nGrandma\nimportantly\nMartinez\nincoming\n##kill\nbeneficial\nnominal\npraying\n##isch\ngable\nnail\nnoises\n##ttle\nPolytechnic\nrub\n##cope\nThor\naudition\nerotic\n##ending\n##iano\nUltimately\narmoured\n##mum\npresently\npedestrian\n##tled\nIpswich\noffence\n##ffin\n##borne\nFlemish\n##hman\necho\n##cting\nauditorium\ngentlemen\nwinged\n##tched\nNicaragua\nUnknown\nprosperity\nexhaust\npie\nPeruvian\ncompartment\nheights\ndisabilities\n##pole\nHarding\nHumphrey\npostponed\nmoths\nMathematical\nMets\nposters\naxe\n##nett\nNights\nTypically\nchuckle\ncouncillors\nalternating\n141\nNorris\n##ately\n##etus\ndeficit\ndreaming\ncooler\noppose\nBeethoven\n##esis\nMarquis\nflashlight\nheadache\ninvestor\nresponding\nappointments\n##shore\nElias\nideals\nshades\ntorch\nlingering\n##real\npier\nfertile\nDiploma\ncurrents\nSnake\n##horse\n##15\nBriggs\n##ota\n##hima\n##romatic\nCoastal\nKuala\nankles\nRae\nslice\nHilton\nlocking\nApproximately\nWorkshop\nNiagara\nstrangely\n##scence\nfunctionality\nadvertisement\nRapid\nAnders\nho\nSoviets\npacking\nbasal\nSunderland\nPermanent\n##fting\nrack\ntying\nLowell\n##ncing\nWizard\nmighty\ntertiary\npencil\ndismissal\ntorso\ngrasped\n##yev\nSand\ngossip\n##nae\nBeer\nimplementing\n##19\n##riya\nFork\nBee\n##eria\nWin\n##cid\nsailor\npressures\n##oping\nspeculated\nFreddie\noriginating\n##DF\n##SR\n##outh\n28th\
nmelt\nBrenda\nlump\nBurlington\nUSC\nmarginal\n##bine\nDogs\nswamp\ncu\nEx\nuranium\nmetro\nspill\nPietro\nseize\nChorus\npartition\n##dock\n##media\nengineered\n##oria\nconclusions\nsubdivision\n##uid\nIllustrated\nLeading\n##hora\nBerkshire\ndefinite\n##books\n##cin\n##suke\nnoun\nwinced\nDoris\ndissertation\nWilderness\n##quest\nbraced\narbitrary\nkidnapping\nKurdish\n##but\nclearance\nexcavations\nwanna\nAllmusic\ninsult\npresided\nyacht\n##SM\nHonour\nTin\nattracting\nexplosives\nGore\nBride\n##ience\nPackers\nDevils\nObserver\n##course\nLoser\n##erry\n##hardt\n##mble\nCyrillic\nundefeated\n##stra\nsubordinate\n##ame\nWigan\ncompulsory\nPauline\nCruise\nOpposition\n##ods\nPeriod\ndispersed\nexpose\n##60\n##has\nCertain\nClerk\nWolves\n##hibition\napparatus\nallegiance\norbital\njustified\nthanked\n##ević\nBiblical\nCarolyn\nGraves\n##tton\nHercules\nbackgrounds\nreplica\n1788\naquatic\nMega\nStirling\nobstacles\nfiling\nFounder\nvowels\nDeborah\nRotterdam\nsurpassed\nBelarusian\n##ologists\nZambia\nRen\nOlga\nAlpine\nbi\ncouncillor\nOaks\nAnimals\neliminating\ndigit\nManaging\n##GE\nlaundry\n##rdo\npresses\nslamming\nTudor\nthief\nposterior\n##bas\nRodgers\nsmells\n##ining\nHole\nSUV\ntrombone\nnumbering\nrepresentations\nDomingo\nParalympics\ncartridge\n##rash\nCombined\nshelves\nKraków\nrevision\n##frame\nSánchez\n##tracted\n##bler\nAlain\ntownships\nsic\ntrousers\nGibbs\nanterior\nsymmetry\nvaguely\nCastile\nIRA\nresembling\nPenguin\n##ulent\ninfections\n##stant\nraped\n##pressive\nworrying\nbrains\nbending\nJR\nEvidence\nVenetian\ncomplexes\nJonah\n850\nexported\nAmbrose\nGap\nphilanthropist\n##atus\nMarxist\nweighing\n##KO\n##nath\nSoldiers\nchiefs\nreject\nrepeating\nshaky\nZürich\npreserving\n##xin\ncigarettes\n##break\nmortar\n##fin\nAlready\nreproduction\nsocks\nWaiting\namazed\n##aca\ndash\n##path\nAirborne\n##harf\n##get\ndescending\nOBE\nSant\nTess\nLucius\nenjoys\n##ttered\n##ivation\n##ete\nLeinster\nPhillies\nexecute\ngeological\nunfinished\nCou
rts\nSP\nBeaver\nDuck\nmotions\nPlatinum\nfriction\n##aud\n##bet\nParts\nStade\nentirety\nsprang\nSmithsonian\ncoffin\nprolonged\nBorneo\n##vise\nunanimously\n##uchi\nCars\nCassandra\nAustralians\n##CT\n##rgen\nLouisa\nspur\nConstance\n##lities\nPatent\nracism\ntempo\n##ssion\n##chard\n##nology\n##claim\nMillion\nNichols\n##dah\nNumerous\ning\nPure\nplantations\ndonor\n##EP\n##rip\nconvenience\n##plate\ndots\nindirect\n##written\nDong\nfailures\nadapt\nwizard\nunfortunately\n##gion\npractitioners\neconomically\nEnrique\nunchanged\nkingdoms\nrefined\ndefinitions\nlazy\nworries\nrailing\n##nay\nKaiser\n##lug\ncracks\nsells\nninety\n##WC\nDirected\ndenotes\ndevelopmental\npapal\nunfortunate\ndisappointing\nsixteenth\nJen\n##urier\nNWA\ndrifting\nHorror\n##chemical\nbehaviors\nbury\nsurfaced\nforeigners\nslick\nAND\n##rene\n##ditions\n##teral\nscrap\nkicks\ncomprise\nbuddy\n##anda\nMental\n##ype\nDom\nwines\nLimerick\nLuca\nRand\n##won\nTomatoes\nhomage\ngeometric\n##nted\ntelescope\nShelley\npoles\n##fan\nshareholders\nAutonomous\ncope\nintensified\nGenoa\nReformation\ngrazing\n##tern\nZhao\nprovisional\n##bies\nCon\n##riel\nCynthia\nRaleigh\nvivid\nthreaten\nLength\nsubscription\nroses\nMüller\n##isms\nrobin\n##tial\nLaos\nStanton\nnationalism\n##clave\n##ND\n##17\n##zz\nstaging\nBusch\nCindy\nrelieve\n##spective\npacks\nneglected\nCBE\nalpine\nEvolution\nuneasy\ncoastline\nDestiny\nBarber\nJulio\n##tted\ninforms\nunprecedented\nPavilion\n##bei\n##ference\nbetrayal\nawaiting\nleaked\nV8\npuppet\nadverse\nBourne\nSunset\ncollectors\n##glass\n##sque\ncopied\nDemon\nconceded\nresembled\nRafe\nLevy\nprosecutor\n##ject\nflora\nmanned\ndeaf\nMosque\nreminds\nLizzie\nProducts\nFunny\ncassette\ncongress\n##rong\nRover\ntossing\nprompting\nchooses\nSatellite\ncautiously\nReese\n##UT\nHuang\nGloucestershire\ngiggled\nKitty\n##å\nPleasant\nAye\n##ond\njudging\n1860s\nintentionally\nHurling\naggression\n##xy\ntransfers\nemploying\n##fies\n##oda\nArchibald\nBlessed\nSki\nflavor\nR
osie\n##burgh\nsunset\nScholarship\nWC\nsurround\nranged\n##jay\nDegree\nHouses\nsqueezing\nlimb\npremium\nLeningrad\nsteals\n##inated\n##ssie\nmadness\nvacancy\nhydraulic\nNorthampton\n##prise\nMarks\nBoxing\n##fying\nacademics\n##lich\n##TY\nCDs\n##lma\nhardcore\nmonitors\npaperback\ncables\nDimitri\nupside\nadvent\nRa\n##clusive\nAug\nChristchurch\nobjected\nstalked\nSimple\ncolonists\n##laid\nCT\ndiscusses\nfellowship\nCarnival\ncares\nMiracle\npastoral\nrooted\nshortage\nborne\nQuentin\nmeditation\ntapping\nNovel\n##ades\nAlicia\nBurn\nfamed\nresidency\nFernández\nJohannesburg\nZhu\noffended\nMao\noutward\n##inas\nXV\ndenial\nnoticing\n##ís\nquarry\n##hound\n##amo\nBernie\nBentley\nJoanna\nmortgage\n##rdi\n##sumption\nlenses\nextracted\ndepiction\n##RE\nNetworks\nBroad\nRevenue\nflickered\nvirgin\nflanked\n##о\nEnterprises\nprobable\nLiberals\nFalcons\ndrowning\nphrases\nloads\nassumes\ninhaled\nawe\nlogs\nslightest\nspiders\nwaterfall\n##pate\nrocking\nshrub\n##uil\nroofs\n##gard\nprehistoric\nwary\n##rak\nTO\nclips\nsustain\ntreason\nmicrophone\nvoter\nLamb\npsychologist\nwrinkled\n##ères\nmating\nCarrier\n340\n##lbert\nsensing\n##rino\ndestiny\ndistract\nweaker\nUC\nNearly\nneurons\nspends\nApache\n##rem\ngenuinely\nwells\n##lanted\nstereo\n##girl\nLois\nLeaving\nconsul\nfungi\nPier\nCyril\n80s\nJungle\n##tani\nillustration\nSplit\n##hana\nAbigail\n##patrick\n1787\ndiminished\nSelected\npackaging\n##EG\nMartínez\ncommunal\nManufacturing\nsentiment\n143\nunwilling\npraising\nCitation\npills\n##iti\n##rax\nmuffled\nneatly\nworkforce\nYep\nleisure\nTu\n##nding\nWakefield\nancestral\n##uki\ndestructive\nseas\nPassion\nshowcase\n##ceptive\nheroic\n142\nexhaustion\nCustoms\n##aker\nScholar\nsliced\n##inian\nDirection\n##OW\nSwansea\naluminium\n##eep\nceramic\nMcCoy\nCareer\nSector\nchartered\nDamascus\npictured\nInterest\nstiffened\nPlateau\nobsolete\n##tant\nirritated\ninappropriate\novers\n##nko\nbail\nTalent\nSur\nours\n##nah\nbarred\nlegged\nsociology\nBud\ndi
ctionary\n##luk\nCover\nobey\n##oring\nannoying\n##dong\napprentice\nCyrus\nRole\n##GP\n##uns\n##bag\nGreenland\nPorsche\nRocket\n##32\norganism\n##ntary\nreliability\n##vocation\n##й\nFound\n##hine\nmotors\npromoter\nunfair\n##oms\n##note\ndistribute\neminent\nrails\nappealing\nchiefly\nmeaningful\nStephan\n##rehension\nConsumer\npsychiatric\nbowler\nsaints\n##iful\n##н\n1777\nPol\nDorian\nTownsend\nhastily\n##jima\nQuincy\nSol\nfascinated\nScarlet\nalto\nAvon\ncertainty\n##eding\nKeys\n##chu\nChu\n##VE\nions\ntributaries\nThanksgiving\n##fusion\nastronomer\noxide\npavilion\nSupply\nCasa\nBollywood\nsadly\nmutations\nKeller\n##wave\nnationals\n##rgo\n##ym\npredict\nCatholicism\nVega\n##eration\n##ums\nMali\ntuned\nLankan\nPlans\nradial\nBosnian\nLexi\n##14\n##ü\nsacks\nunpleasant\nEmpty\nhandles\n##taking\nBon\nswitches\nintently\ntuition\nantique\n##jk\nfraternity\nnotebook\nDesmond\n##sei\nprostitution\n##how\ndeed\n##OP\n501\nSomewhere\nRocks\n##mons\ncampaigned\nfrigate\ngases\nsuppress\n##hang\nMerlin\nNorthumberland\ndominate\nexpeditions\nthunder\n##ups\n##rical\nCap\nthorough\nAriel\n##kind\nrenewable\nconstructing\npacing\nterrorists\nBowen\ndocumentaries\nwestward\n##lass\n##nage\nMerchant\n##ued\nBeaumont\nDin\n##hian\nDanube\npeasant\nGarrison\nencourages\ngratitude\nreminding\nstormed\n##ouse\npronunciation\n##ailed\nWeekend\nsuggestions\n##ffing\n##DI\nActive\nColombo\n##logists\nMerrill\n##cens\nArchaeological\nMedina\ncaptained\n##yk\nduel\ncracking\nWilkinson\nGuam\npickup\nrenovations\n##ël\n##izer\ndelighted\n##iri\nWeaver\n##ctional\ntens\n##hab\nClint\n##usion\n##each\npetals\nFarrell\n##sable\ncaste\n##will\nEzra\n##qi\n##standing\nthrilled\nambush\nexhaled\n##SU\nResource\nblur\nforearm\nspecifications\ncontingent\ncafe\n##iology\nAntony\nfundraising\ngrape\n##rgy\nturnout\n##udi\nClifton\nlaboratories\nIrvine\n##opus\n##lid\nMonthly\nBihar\nstatutory\nRoses\nEmil\n##rig\nlumber\noptimal\n##DR\npumps\nplaster\nMozambique\n##aco\nnightclub\npr
opelled\n##hun\nked\nsurplus\nwax\n##urai\npioneered\nSunny\nimprint\nForget\nEliot\napproximate\npatronage\n##bek\n##ely\n##mbe\nPartnership\ncurl\nsnapping\n29th\nPatriarch\n##jord\nseldom\n##ature\nastronomy\nBremen\nXIV\nairborne\n205\n1778\nrecognizing\nstranded\narrogant\nbombardment\ndestined\nensured\n146\nrobust\nDavenport\nInteractive\nOffensive\nFi\nprevents\nprobe\npropeller\nsorrow\nBlade\nmounting\nautomotive\n##dged\nwallet\n201\nlashes\nForrest\n##ift\nCell\nYounger\nshouts\n##cki\nfolds\n##chet\nEpic\nyields\nhomosexual\ntunes\n##minate\n##text\nManny\nchemist\nhindwings\n##urn\npilgrimage\n##sfield\n##riff\nMLS\n##rive\nHuntington\ntranslates\nPath\nslim\n##ndra\n##oz\nclimax\ncommuter\ndesperation\n##reet\ndenying\n##rious\ndaring\nseminary\npolo\n##clamation\nTeatro\nTorah\nCats\nidentities\nPoles\nphotographed\nfiery\npopularly\n##cross\nwinters\nHesse\n##vio\nNurse\nSenegal\nSalon\nprescribed\njustify\n##gues\n##и\n##orted\nHQ\n##hiro\nevaluated\nmomentarily\n##unts\nDebbie\n##licity\n##TP\nMighty\nRabbit\n##chal\nEvents\nSavoy\n##ht\nBrandenburg\nBordeaux\n##laus\nRelease\n##IE\n##kowski\n1900s\nSK\nStrauss\n##aly\nSonia\nUpdated\nsynagogue\nMcKay\nflattened\n370\nclutch\ncontests\ntoast\nevaluate\npope\nheirs\njam\ntutor\nreverted\n##ading\nnonsense\nhesitate\nLars\nCeylon\nLaurie\n##guchi\naccordingly\ncustomary\n148\nEthics\nMultiple\ninstincts\nIGN\n##ä\nbullshit\n##hit\n##par\ndesirable\n##ducing\n##yam\nalias\nashore\nlicenses\n##lification\nmisery\n147\nCola\nassassinated\nfiercely\n##aft\nlas\ngoat\nsubstrate\nlords\nCass\nBridges\nICC\nlasts\nsights\nreproductive\n##asi\nIvory\nClean\nfixing\n##lace\nseeming\naide\n1850s\nharassment\n##FF\n##LE\nreasonably\n##coat\n##cano\nNYC\n1784\nFifty\nimmunity\nCanadians\nCheng\ncomforting\nmeanwhile\n##tera\n##blin\nbreeds\nglowed\n##vour\nAden\n##verted\n##aded\n##oral\nneat\nenforced\npoisoning\n##ews\n##hone\nenforce\npredecessors\nsurvivor\nMonth\nunfamiliar\npierced\nwaived\ndump\nresponds
\nMai\nDeclan\nangular\nDoesn\ninterpretations\n##yar\ninvest\nDhaka\npoliceman\nCongregation\nEighth\npainfully\n##este\n##vior\nWürttemberg\n##cles\nblockade\nencouragement\n##fie\nCaucasus\nMalone\nUniversidad\nutilize\nNissan\ninherent\n151\nagreeing\nsyllable\ndetermines\nProtocol\nconclude\n##gara\n40th\nXu\nTaiwanese\n##ather\nboiler\nprinter\nLacey\ntitular\nKlaus\nFallon\nWembley\nfox\nChandra\nGovernorate\nobsessed\n##Ps\nmicro\n##25\nCooke\ngymnasium\nweaving\nShall\nHussein\nglaring\nsoftball\nReader\nDominion\nTrouble\nvarsity\nCooperation\nChaos\nKang\nKramer\nEisenhower\nproves\nConnie\nconsortium\ngovernors\nBethany\nopener\nNormally\nWilly\nlinebacker\nRegent\nUsed\nAllMusic\nTwilight\n##shaw\nCompanion\nTribunal\nsimpler\n##gam\nExperimental\nSlovenian\ncellar\ndeadline\ntrout\nHubbard\nads\nidol\n##hetto\nGranada\nclues\nsalmon\n1700\nOmega\nCaldwell\nsoftened\nBills\nHonolulu\n##gn\nTerrace\nsuitcase\n##IL\nfrantic\n##oons\nAbbot\nSitting\nFortress\nRiders\nsickness\nenzymes\ntrustee\nBern\nforged\n##13\n##ruff\n##rl\n##versity\ninspector\nchampagne\n##held\n##FI\nhereditary\nTaliban\nhandball\n##wine\nSioux\n##dicated\nhonoured\n139\n##tude\nSkye\nmeanings\n##rkin\ncardiac\nanalyzed\nvegetable\n##FS\nRoyals\ndial\nfreelance\n##fest\npartisan\npetroleum\nridden\nLincolnshire\npanting\n##comb\npresidents\nHaley\n##chs\ncontributes\nJew\ndiscoveries\npanicked\nWoody\neyelids\nFate\nTulsa\nmg\nwhiskey\nzombies\nWii\n##udge\ninvestigators\n##bull\ncentred\n##screen\nBone\nLana\n##oise\nforts\n##ske\nConan\nLyons\n##writing\nSH\n##ride\nrhythmic\n154\n##llah\npioneers\n##bright\ncaptivity\nSanchez\nOman\n##mith\nFlint\nPlatform\n##ioned\nemission\npacket\nPersia\n##formed\ntakeover\ntempted\nVance\nFew\nToni\nreceptions\n##ن\nexchanges\nCamille\nwhale\nChronicles\n##rent\n##ushing\n##rift\nAlto\nGenus\n##asing\nonward\nforemost\nlonging\nRockefeller\ncontainers\n##cribe\nintercepted\n##olt\npleading\nBye\nbee\n##umbling\n153\nundertake\nIzzy\ncheaper\
nUltra\nvalidity\n##pse\nSa\nhovering\n##pert\nvintage\nengraved\n##rise\nfarmland\n##ever\n##ifier\nAtlantis\npropose\nCatalonia\nplunged\n##edly\ndemonstrates\ngig\n##cover\n156\nOsborne\ncowboy\nherd\ninvestigator\nloops\nBurning\nrests\nInstrumental\nembarrassing\nfocal\ninstall\nreadings\nswirling\nChatham\nparameter\n##zin\n##holders\nMandarin\nMoody\nconverting\nEscape\nwarnings\n##chester\nincarnation\n##ophone\nadopting\n##lins\nCromwell\n##laws\nAxis\nVerde\nKappa\nSchwartz\nSerbs\ncaliber\nWanna\nChung\n##ality\nnursery\nprincipally\nBulletin\nlikelihood\nlogging\n##erty\nBoyle\nsupportive\ntwitched\n##usive\nbuilds\nMarseille\nomitted\nmotif\nLands\n##lusion\n##ssed\nBarrow\nAirfield\nHarmony\nWWF\nendured\nmerging\nconvey\nbranding\nexaminations\n167\nItalians\n##dh\ndude\n1781\n##teau\ncrawling\nthoughtful\nclasped\nconcluding\nbrewery\nMoldova\nWan\nTowers\nHeidelberg\n202\n##ict\nLagos\nimposing\n##eval\n##serve\nBacon\nfrowning\nthirteenth\nconception\ncalculations\n##ович\n##mile\n##ivated\nmutation\nstrap\n##lund\ndemographic\nnude\nperfection\nstocks\n##renched\n##dit\nAlejandro\nbites\nfragment\n##hack\n##rchy\nGB\nSurgery\nBerger\npunish\nboiling\nconsume\nElle\nSid\nDome\nrelies\nCrescent\ntreasurer\nBloody\n1758\nupheld\nGuess\nRestaurant\nsignatures\nfont\nmillennium\nmural\nstakes\nAbel\nhailed\ninsists\nAlumni\nBreton\n##jun\ndigits\n##FM\n##thal\nTalking\nmotive\nreigning\nbabe\nmasks\n##ø\nShaun\npotato\nsour\nwhitish\nSomali\n##derman\n##rab\n##wy\nchancel\ntelecommunications\nNoise\nmessenger\ntidal\ngrinding\n##ogenic\nRebel\nconstituent\nperipheral\nrecruitment\n##ograph\n##tler\npumped\nRavi\npoked\n##gley\nOlive\ndiabetes\ndiscs\nliking\nsting\nfits\nstir\nMari\nSega\ncreativity\nweights\nMacau\nmandated\nBohemia\ndisastrous\nKatrina\nBaku\nRajasthan\nwaiter\n##psis\nSiberia\nverbs\n##truction\npatented\n1782\n##ndon\nRelegated\nHunters\nGreenwood\nShock\naccusing\nskipped\nSessions\nmarkers\nsubset\nmonumental\nViola\ncomparative\
nAlright\nBarbados\nsetup\nSession\nstandardized\n##ík\n##sket\nappoint\nAFB\nNationalist\n##WS\nTroop\nleaped\nTreasure\ngoodness\nweary\noriginates\n100th\ncompassion\nexpresses\nrecommend\n168\ncomposing\nseventeenth\nTex\nAtlético\nbald\nFinding\nPresidency\nSharks\nfavoured\ninactive\n##lter\nsuffix\nprinces\nbrighter\n##ctus\nclassics\ndefendants\nculminated\nterribly\nStrategy\nevenings\n##ção\n##iver\n##urance\nabsorb\n##rner\nTerritories\nRBI\nsoothing\nMartín\nconcurrently\n##tr\nNicholson\nfibers\nswam\n##oney\nAllie\nAlgerian\nDartmouth\nMafia\n##bos\n##tts\nCouncillor\nvocabulary\n##bla\n##lé\nintending\n##dler\nGuerrero\nsunshine\npedal\n##TO\nadministrators\nperiodic\nscholarships\nLoop\nMadeline\nexaggerated\n##ressed\nRegan\n##cellular\nExplorer\n##oids\nAlexandre\nvows\nReporter\nUnable\nAverage\nabsorption\n##bedience\nFortunately\nAuxiliary\nGrandpa\n##HP\n##ovo\npotent\ntemporal\nadrenaline\n##udo\nconfusing\nguiding\nDry\nqualifications\njoking\nwherein\nheavyweight\n##ices\nnightmares\npharmaceutical\nCommanding\n##aled\n##ove\nGregor\n##UP\ncensorship\ndegradation\nglorious\nAustro\n##rench\n380\nMiriam\nsped\n##orous\noffset\n##KA\nfined\nspecialists\nPune\nJoão\n##dina\npropped\nfungus\n##ς\nfrantically\nGabrielle\nHare\ncommitting\n##plied\nAsk\nWilmington\nstunt\nnumb\nwarmer\npreacher\nearnings\n##lating\ninteger\n##ija\nfederation\nhomosexuality\n##cademia\nepidemic\ngrumbled\nshoving\nMilk\nSatan\nTobias\ninnovations\n##dington\ngeology\nmemoirs\n##IR\nspared\nculminating\nDaphne\nFocus\nsevered\nstricken\nPaige\nMans\nflats\nRusso\ncommunes\nlitigation\nstrengthening\n##powered\nStaffordshire\nWiltshire\nPainting\nWatkins\n##د\nspecializes\nSelect\n##rane\n##aver\nFulton\nplayable\n##VN\nopenings\nsampling\n##coon\n##21\nAllah\ntravelers\nallocation\n##arily\nLoch\n##hm\ncommentators\nfulfilled\n##troke\nEmeritus\nVanderbilt\nVijay\npledged\n##tative\ndiagram\ndrilling\n##MD\n##plain\nEdison\nproductivity\n31st\n##rying\n##ption\n##ga
no\n##oration\n##bara\nposture\nbothering\nplatoon\npolitely\n##inating\nredevelopment\nJob\n##vale\nstark\nincorrect\nMansion\nrenewal\nthreatens\nBahamas\nfridge\n##tata\nUzbekistan\n##edia\nSainte\n##mio\ngaps\nneural\n##storm\noverturned\nPreservation\nshields\n##ngo\n##physics\nah\ngradual\nkillings\n##anza\nconsultation\npremiership\nFelipe\ncoincidence\n##ène\n##any\nHandbook\n##loaded\nEdit\nGuns\narguably\n##ş\ncompressed\ndepict\nseller\n##qui\nKilkenny\n##kling\nOlympia\nlibrarian\n##acles\ndramas\nJP\nKit\nMaj\n##lists\nproprietary\n##nged\n##ettes\n##tok\nexceeding\nLock\ninduction\nnumerical\n##vist\nStraight\nfoyer\nimaginary\n##pop\nviolinist\nCarla\nbouncing\n##ashi\nabolition\n##uction\nrestoring\nscenic\n##č\nDoom\noverthrow\npara\n##vid\n##ughty\nConcord\nHC\ncocaine\ndeputies\n##aul\nvisibility\n##wart\nKapoor\nHutchinson\n##agan\nflashes\nkn\ndecreasing\n##ronology\nquotes\nvain\nsatisfying\n##iam\n##linger\n310\nHanson\nfauna\n##zawa\n##rrel\nTrenton\n##VB\nEmployment\nvocational\nExactly\nbartender\nbutterflies\ntow\n##chers\n##ocks\npigs\nmerchandise\n##game\n##pine\nShea\n##gration\nConnell\nJosephine\nmonopoly\n##dled\nCobb\nwarships\ncancellation\nsomeday\nstove\n##Cs\ncandidacy\nsuperhero\nunrest\nToulouse\nadmiration\nundergone\nwhirled\nReconnaissance\ncostly\n##ships\n290\nCafe\namber\nTory\n##mpt\ndefinitive\n##dress\nproposes\nredesigned\nacceleration\n##asa\n##raphy\nPresley\nexits\nLanguages\n##cel\nMode\nspokesperson\n##tius\nBan\nforthcoming\ngrounded\nACC\ncompelling\nlogistics\nretailers\nabused\n##gating\nsoda\n##yland\n##lution\nLandmark\nXVI\nblush\n##tem\nhurling\ndread\nTobago\nFoley\n##uad\nscenarios\n##mentation\n##rks\nScore\nfatigue\nhairy\ncorrespond\n##iard\ndefences\nconfiscated\n##rudence\n1785\nFormerly\nShot\nadvertised\n460\nText\nridges\nPromise\nDev\nexclusion\nNHS\ntuberculosis\nrockets\n##offs\nsparkling\n256\ndisappears\nmankind\n##hore\nHP\n##omo\ntaxation\nMulti\nDS\nVirgil\n##ams\nDell\nstacked\nguessin
g\nJump\nNope\ncheer\nhates\nballots\noverlooked\nanalyses\nPrevention\nmaturity\ndos\n##cards\n##lect\nMare\n##yssa\nPetty\n##wning\ndiffering\niOS\n##ior\nJoachim\nSentinel\n##nstein\n90s\nPamela\n480\nAsher\n##lary\nVicente\nlandings\nportray\n##rda\n##xley\nVirtual\n##uary\nfinances\nJain\nSomebody\nTri\nbehave\nMichele\n##ider\ndwellings\nFAA\nGallagher\n##lide\nMonkey\n195\naforementioned\n##rism\n##bey\n##kim\n##puted\nMesa\nhopped\nunopposed\nrecipients\nReality\nBeen\ngritted\n149\nplayground\npillar\n##rone\nGuinness\n##tad\nThéâtre\ndepended\nTipperary\nReuben\nfrightening\nwooded\nTarget\nglobally\n##uted\nMorales\nBaptiste\ndrunken\nInstitut\ncharacterised\n##chemistry\nStrip\ndiscrete\nPremiership\n##zzling\ngazing\nOuter\n##quisition\nSikh\nBooker\n##yal\ncontemporaries\nJericho\n##chan\n##physical\n##witch\nMilitia\n##rez\n##zard\ndangers\n##utter\n##₀\nPrograms\ndarling\nparticipates\nrailroads\n##ienne\nbehavioral\nbureau\n##rook\n161\nHicks\n##rises\nComes\ninflicted\nbees\nkindness\nnorm\n##ković\ngenerators\n##pard\n##omy\n##ili\nmethodology\nAlvin\nfaçade\nlatitude\n##plified\nDE\nMorse\n##mered\neducate\nintersects\n##MF\n##cz\n##vated\nAL\n##graded\n##fill\nconstitutes\nartery\nfeudal\navant\ncautious\n##ogue\nimmigrated\n##chenko\nSaul\nClinic\nFang\nchoke\nCornelius\nflexibility\ntemperate\npins\n##erson\noddly\ninequality\n157\nNatasha\nSal\n##uter\n215\naft\nblinking\n##ntino\nnorthward\nExposition\ncookies\nWedding\nimpulse\nOverseas\nterrifying\n##ough\nMortimer\n##see\n440\nhttps\nog\nimagining\n##cars\nNicola\nexceptionally\nthreads\n##cup\nOswald\nProvisional\ndismantled\ndeserves\n1786\nFairy\ndiscourse\nCounsel\ndeparting\nArc\nguarding\n##orse\n420\nalterations\nvibrant\nEm\nsquinted\nterrace\nrowing\nLed\naccessories\nSF\nSgt\ncheating\nAtomic\n##raj\nBlackpool\n##iary\nboarded\nsubstituted\nbestowed\nlime\nkernel\n##jah\nBelmont\nshaken\nsticky\nretrospective\nLouie\nmigrants\nweigh\nsunglasses\nthumbs\n##hoff\nexcavation\n##nks
\nExtra\nPolo\nmotives\nDrum\ninfrared\ntastes\nberth\nverge\n##stand\nprogrammed\nwarmed\nShankar\nTitan\nchromosome\ncafeteria\ndividing\npepper\nCPU\nStevie\nsatirical\nNagar\nscowled\nDied\nbackyard\n##gata\n##reath\n##bir\nGovernors\nportraying\n##yah\nRevenge\n##acing\n1772\nmargins\nBahn\nOH\nlowland\n##razed\ncatcher\nreplay\n##yoshi\nSeriously\n##licit\nAristotle\n##ald\nHabsburg\nweekday\nSecretariat\nCO\n##dly\n##joy\n##stad\nlitre\nultra\n##cke\nMongol\nTucson\ncorrelation\ncompose\ntraps\nGroups\nHai\nSalvatore\n##dea\ncents\n##eese\nconcession\nclash\nTrip\nPanzer\nMoroccan\ncruisers\ntorque\nBa\ngrossed\n##arate\nrestriction\nconcentrating\nFDA\n##Leod\n##ones\nScholars\n##esi\nthrobbing\nspecialised\n##heses\nChicken\n##fia\n##ificant\nErich\nResidence\n##trate\nmanipulation\nnamesake\n##tom\nHoover\ncue\nLindsey\nLonely\n275\n##HT\ncombustion\nsubscribers\nPunjabi\nrespects\nJeremiah\npenned\n##gor\n##rilla\nsuppression\n##tration\nCrimson\npiston\nDerry\ncrimson\nlyrical\noversee\nportrays\nCF\nDistricts\nLenin\nCora\nsearches\nclans\nVHS\n##hel\nJacqueline\nRedskins\nClubs\ndesktop\nindirectly\nalternatives\nmarijuana\nsuffrage\n##smos\nIrwin\n##liff\nProcess\n##hawks\nSloane\n##bson\nSonata\nyielded\nFlores\n##ares\narmament\nadaptations\nintegrate\nneighbours\nshelters\n##tour\nSkinner\n##jet\n##tations\n1774\nPeterborough\n##elles\nripping\nLiang\nDickinson\ncharities\nRwanda\nmonasteries\ncrossover\nracist\nbarked\nguerrilla\n##ivate\nGrayson\n##iques\n##vious\n##got\nRolls\ndenominations\natom\naffinity\n##delity\nWish\n##inted\n##inae\ninterrogation\n##cey\n##erina\n##lifting\n192\nSands\n1779\nmast\nLikewise\n##hyl\n##oft\ncontempt\n##por\nassaulted\nfills\nestablishments\nMal\nconsulted\n##omi\n##sight\ngreet\n##roma\n##egan\nPulitzer\n##rried\n##dius\n##ractical\n##voked\nHasan\nCB\n##zzy\nRomanesque\nPanic\nwheeled\nrecorder\n##tters\n##warm\n##gly\nbotanist\nBalkan\nLockheed\nPolly\nfarewell\nsuffers\npurchases\nEaton\n##80\nQuick\ncomm
enting\nSaga\nbeasts\nhides\nmotifs\n##icks\nAlonso\nSpringer\nWikipedia\ncirculated\nencoding\njurisdictions\nsnout\nUAE\nIntegrated\nunmarried\nHeinz\n##lein\n##figured\ndeleted\n##tley\nZen\nCycling\nFuel\nScandinavian\n##rants\nConner\nreef\nMarino\ncuriously\nlingered\nGina\nmanners\nactivism\nMines\nExpo\nMicah\npromotions\nServer\nbooked\nderivatives\neastward\ndetailing\nreelection\n##chase\n182\nCampeonato\nPo\n158\nPeel\nwinger\n##itch\ncanyon\n##pit\nLDS\nA1\n##shin\nGiorgio\npathetic\n##rga\n##mist\nAren\n##lag\nconfronts\nmotel\ntextbook\nshine\nturbines\n1770\nDarcy\n##cot\nSoutheastern\n##lessness\nBanner\nrecognise\nstray\nKitchen\npaperwork\nrealism\nChrysler\nfilmmakers\nfishermen\n##hetic\nvariously\nVishnu\nfiddle\nEddy\nOrigin\n##tec\n##ulin\nFlames\nRs\nbankrupt\nExtreme\nPomeranian\n##emption\nratified\n##iu\njockey\nStratford\n##ivating\n##oire\nBabylon\npardon\nAI\naffordable\ndeities\ndisturbance\nTrying\n##sai\nIda\nPapers\nadvancement\n70s\narchbishop\nLuftwaffe\nannounces\ntugging\n##lphin\n##sistence\n##eel\n##ishes\nambition\naura\n##fled\n##lected\n##vue\nPrasad\nboiled\nclarity\nViolin\ninvestigative\nrouting\nYankee\n##uckle\nMcMahon\nbugs\neruption\n##rooms\nMinutes\nrelics\n##ckle\n##nse\nsipped\nvalves\nweakly\n##ital\nMiddleton\ncollided\n##quer\nbamboo\ninsignia\nTyne\nexercised\nNinth\nechoing\npolynomial\nconsiderations\nlunged\n##bius\nobjections\ncomplain\ndisguised\nplaza\n##VC\ninstitutes\nJudicial\nascent\nimminent\nWaterford\nhello\nLumpur\nNiger\nGoldman\nvendors\nKensington\nWren\nbrowser\n##bner\n##tri\n##mize\n##pis\n##lea\nCheyenne\nBold\nSettlement\nHollow\nParalympic\naxle\n##toire\n##actic\nimpose\nperched\nutilizing\nslips\nBenz\nMichaels\nmanipulate\nChiang\n##mian\nDolphins\nprohibition\nattacker\necology\nEstadio\n##SB\n##uild\nattracts\nrecalls\nglacier\nlad\n##rima\nBarlow\nkHz\nmelodic\n##aby\n##iracy\nassumptions\nCornish\n##aru\nDOS\nMaddie\n##mers\nlyric\nLuton\nnm\n##tron\nReno\nFin\nYOU\nBroadcast\nF
inch\nsensory\n##bent\nJeep\n##uman\nadditionally\nBuildings\nbusinessmen\ntreaties\n235\nStranger\ngateway\nCharlton\naccomplishments\nDiary\napologized\nzinc\nhistories\nsupplier\n##tting\n162\nasphalt\nTreatment\nAbbas\n##pating\n##yres\nBloom\nsedan\nsoloist\n##cum\nantagonist\ndenounced\nFairfax\n##aving\n##enko\nnoticeable\nBudget\nBuckingham\nSnyder\nretreating\nJai\nspoon\ninvading\ngiggle\nwoven\ngunfire\narrests\n##vered\n##come\nrespiratory\nviolet\n##aws\nByrd\nshocking\ntenant\nJamaican\nOttomans\nSeal\ntheirs\n##isse\n##48\ncooperate\npeering\n##nius\n163\nComposer\norganist\nMongolian\nBauer\nSpy\ncollects\nprophecy\ncongregations\n##moor\nBrick\ncalculation\nfixtures\nexempt\n##dden\nAda\nThousand\n##lue\ntracing\n##achi\nbodyguard\nvicar\nsupplying\nŁódź\ninterception\nmonitored\n##heart\nPaso\noverlap\nannoyance\n##dice\nyellowish\nstables\nelders\nillegally\nhonesty\n##oar\nskinny\nspinal\n##puram\nBourbon\n##cor\nflourished\nMedium\n##stics\n##aba\nFollow\n##ckey\nstationary\n##scription\ndresser\nscrutiny\nBuckley\nClearly\n##SF\nLyrics\n##heimer\ndrying\nOracle\ninternally\nrains\n##last\nEnemy\n##oes\nMcLean\nOle\nphosphate\nRosario\nRifles\n##mium\nbattered\nPepper\nPresidents\nconquer\nChâteau\ncastles\n##aldo\n##ulf\nDepending\nLesser\nBoom\ntrades\nPeyton\n164\nemphasize\naccustomed\nSM\nAi\nClassification\n##mins\n##35\n##rons\nleak\npiled\ndeeds\nlush\n##self\nbeginnings\nbreathless\n1660\nMcGill\n##ago\n##chaft\n##gies\nhumour\nBomb\nsecurities\nMight\n##zone\n##eves\nMatthias\nMovies\nLevine\nvengeance\n##ads\nChallenger\nMisty\nTraditionally\nconstellation\n##rass\ndeepest\nworkplace\n##oof\n##vina\nimpatient\n##ML\nMughal\nAlessandro\nscenery\nSlater\npostseason\ntroupe\n##ń\nVolunteers\nFacility\nmilitants\nReggie\nsanctions\nExpeditionary\nNam\ncountered\ninterpret\nBasilica\ncoding\nexpectation\nDuffy\ndef\nTong\nwakes\nBowling\nVehicle\nAdler\nsalad\nintricate\nstronghold\nmedley\n##uries\n##bur\njoints\n##rac\n##yx\n##IO\nOrdnan
ce\nWelch\ndistributor\nArk\ncavern\ntrench\nWeiss\nMauritius\ndecreases\ndocks\neagerly\nirritation\nMatilda\nbiographer\nVisiting\n##marked\n##iter\n##ear\n##gong\nMoreno\nattendant\nBury\ninstrumentation\ntheologian\nclit\nnuns\nsymphony\ntranslate\n375\nloser\n##user\n##VR\n##meter\n##orious\nharmful\n##yuki\nCommissioners\nMendoza\nsniffed\nHulk\n##dded\n##ulator\n##nz\nDonnell\n##eka\ndeported\nMet\nSD\nAerospace\n##cultural\n##odes\nFantastic\ncavity\nremark\nemblem\nfearing\n##iance\nICAO\nLiberia\nstab\n##yd\nPac\nGymnasium\nIS\nEverton\n##vanna\nmantle\n##ief\nRamon\n##genic\nShooting\nSmoke\nRandom\nAfricans\nMB\ntavern\nbargain\nvoluntarily\nIon\nPeoples\nRusty\nattackers\nPatton\nsins\n##cake\nHat\nmoderately\n##hala\n##alia\nrequesting\nmechanic\n##eae\nSeine\nRobbins\n##ulum\nsusceptible\nBravo\nSlade\nStrasbourg\nrubble\nentrusted\nCreation\n##amp\nsmoothed\n##uintet\nevenly\nreviewers\nskip\nSculpture\n177\nRough\n##rrie\nReeves\n##cede\nAdministrator\ngarde\nminus\ncarriages\ngrenade\nNinja\nfuscous\n##kley\nPunk\ncontributors\nAragon\nTottenham\n##cca\n##sir\nVA\nlaced\ndealers\n##sonic\ncrisp\nharmonica\nArtistic\nButch\nAndes\nFarmers\ncorridors\nunseen\n##tium\nCountries\nLone\nenvisioned\nKaty\n##lang\n##cc\nQuarterly\n##neck\nconsort\n##aceae\nbidding\nCorey\nconcurrent\n##acts\n##gum\nHighness\n##lient\n##rators\narising\n##unta\npathways\n49ers\nbolted\ncomplaining\necosystem\nlibretto\nSer\nnarrated\n212\nSoft\ninflux\n##dder\nincorporation\nplagued\ntents\n##ddled\n1750\nRisk\ncitation\nTomas\nhostilities\nseals\nBruins\nDominique\nattic\ncompetent\n##UR\n##cci\nhugging\nBreuning\nbacterial\nShrewsbury\nvowed\neh\nelongated\nhangs\nrender\ncentimeters\n##ficient\nMu\nturtle\nbesieged\n##gaard\ngrapes\nbravery\ncollaborations\ndeprived\n##amine\n##using\n##gins\narid\n##uve\ncoats\nhanged\n##sting\nPa\nprefix\n##ranged\nExit\nChain\nFlood\nMaterials\nsuspicions\n##ö\nhovered\nHidden\n##state\nMalawi\n##24\nMandy\nnorms\nfascinating\nairlin
es\ndelivers\n##rust\nCretaceous\nspanned\npillows\n##onomy\njar\n##kka\nregent\nfireworks\nmorality\ndiscomfort\nlure\nuneven\n##jack\nLucian\n171\narchaeology\n##til\nmornings\nBillie\nMarquess\nimpending\nspilling\ntombs\n##volved\nCelia\nCoke\nunderside\n##bation\nVaughn\nDaytona\nGodfrey\nPascal\nAlien\n##sign\n172\n##lage\niPhone\nGonna\ngenocide\n##rber\noven\nendure\ndashed\nsimultaneous\n##phism\nWally\n##rō\nants\npredator\nreissue\n##aper\nSpeech\nfunk\nRudy\nclaw\nHindus\nNumbers\nBing\nlantern\n##aurus\nscattering\npoisoned\n##active\nAndrei\nalgebraic\nbaseman\n##ritz\nGregg\n##cola\nselections\n##putation\nlick\nLaguna\n##IX\nSumatra\nWarning\nturf\nbuyers\nBurgess\nOldham\nexploit\nworm\ninitiate\nstrapped\ntuning\nfilters\nhaze\n##е\n##ledge\n##ydro\n##culture\namendments\nPromotion\n##union\nClair\n##uria\npetty\nshutting\n##eveloped\nPhoebe\nZeke\nconducts\ngrains\nclashes\n##latter\nillegitimate\nwillingly\nDeer\nLakers\nReference\nchaplain\ncommitments\ninterrupt\nsalvation\nPanther\nQualifying\nAssessment\ncancel\nefficiently\nattorneys\nDynamo\nimpress\naccession\nclinging\nrandomly\nreviewing\nRomero\nCathy\ncharting\nclapped\nrebranded\nAzerbaijani\ncoma\nindicator\npunches\n##tons\nSami\nmonastic\nprospects\nPastor\n##rville\nelectrified\n##CI\n##utical\ntumbled\nChef\nmuzzle\nselecting\nUP\nWheel\nprotocols\n##tat\nExtended\nbeautifully\nnests\n##stal\nAndersen\n##anu\n##³\n##rini\nkneeling\n##reis\n##xia\nanatomy\ndusty\nSafe\nturmoil\nBianca\n##elo\nanalyze\n##ر\n##eran\npodcast\nSlovene\nLocke\nRue\n##retta\n##uni\nPerson\nProphet\ncrooked\ndisagreed\nVersailles\nSarajevo\nUtrecht\n##ogen\nchewing\n##ception\n##iidae\nMissile\nattribute\nmajors\nArch\nintellectuals\n##andra\nideological\nCory\nSalzburg\n##fair\nLot\nelectromagnetic\nDistribution\n##oper\n##pered\nRuss\nTerra\nrepeats\nfluttered\nRiga\n##ific\n##gt\ncows\nHair\nlabelled\nprotects\nGale\nPersonnel\nDüsseldorf\nMoran\nrematch\n##OE\nSlow\nforgiveness\n##ssi\nproudly\nMacmi
llan\ninsist\nundoubtedly\nQuébec\nViolence\n##yuan\n##aine\nmourning\nlinen\naccidental\n##iol\n##arium\ngrossing\nlattice\nmaneuver\n##marine\nprestige\npetrol\ngradient\ninvasive\nmilitant\nGalerie\nwidening\n##aman\n##quist\ndisagreement\n##ales\ncreepy\nremembers\nbuzz\n##erial\nExempt\nDirk\nmon\nAddison\n##inen\ndeposed\n##agon\nfifteenth\nHang\nornate\nslab\n##lades\nFountain\ncontractors\ndas\nWarwickshire\n1763\n##rc\nCarly\nEssays\nIndy\nLigue\ngreenhouse\nslit\n##sea\nchewed\nwink\n##azi\nPlayhouse\n##kon\nGram\nKo\nSamson\ncreators\nrevive\n##rians\nspawned\nseminars\nCraft\nTall\ndiverted\nassistants\ncomputational\nenclosure\n##acity\nCoca\n##eve\ndatabases\nDrop\n##loading\n##hage\nGreco\nPrivy\nentrances\npork\nprospective\nMemories\nrobes\n##market\ntransporting\n##lik\nRudolph\nHorton\nvisually\n##uay\n##nja\nCentro\nTor\nHowell\n##rsey\nadmitting\npostgraduate\nherbs\n##att\nChin\nRutherford\n##bot\n##etta\nSeasons\nexplanations\n##bery\nFriedman\nheap\n##ryl\n##sberg\njaws\n##agh\nChoi\nKilling\nFanny\n##suming\n##hawk\nhopeful\n##aid\nMonty\ngum\nremarkably\nSecrets\ndisco\nharp\nadvise\n##avia\nMarathi\n##cycle\nTruck\nabbot\nsincere\nurine\n##mology\nmasked\nbathing\n##tun\nFellows\n##TM\n##gnetic\nowl\n##jon\nhymn\n##leton\n208\nhostility\n##cée\nbaked\nBottom\n##AB\nshudder\n##ater\n##von\n##hee\nreorganization\nCycle\n##phs\nLex\n##style\n##rms\nTranslation\n##erick\n##imeter\n##ière\nattested\nHillary\n##DM\ngal\nwander\nSalle\n##laming\nPerez\nPit\n##LP\nUSAF\ncontexts\nDisease\nblazing\naroused\nrazor\nwalled\nDanielle\nMont\nFunk\nroyalty\nthee\n203\ndonors\n##erton\nfamously\nprocessors\nreassigned\nwelcoming\nGoldberg\n##quities\nundisclosed\nOrient\nPatty\nvaccine\nrefrigerator\nCypriot\nconsonant\n##waters\n176\nsober\n##lement\nRacecourse\n##uate\nLuckily\nSelection\nconceptual\nvines\nBreaking\nwa\nlions\noversight\nsheltered\nDancer\nponds\nborrow\n##BB\n##pulsion\nDaly\n##eek\nfertility\nspontaneous\nWorldwide\ngasping\n##tino\
n169\nABS\nVickers\nambient\nenergetic\nprisons\n##eson\nStacy\n##roach\nGmbH\nAfro\nMarin\nfarmhouse\npinched\n##cursion\n##sp\nSabine\n##pire\n181\nnak\nswelling\nhumble\nperfume\n##balls\nRai\ncannons\n##taker\nMarried\nMaltese\ncanals\ninterceptions\nhats\nlever\nslowing\n##ppy\nNike\nSilas\nScarborough\nskirts\n166\ninauguration\nShuttle\nalloy\nbeads\nbelts\nCompton\nCause\nbattling\ncritique\nsurf\nDock\nroommate\n##ulet\ninvade\nGarland\n##slow\nnutrition\npersona\n##zam\nWichita\nacquaintance\ncoincided\n##cate\nDracula\nclamped\n##gau\noverhaul\n##broken\n##rrier\nmelodies\nventures\nPaz\nconvex\nRoots\n##holding\nTribute\ntransgender\n##ò\nchimney\n##riad\nAjax\nThereafter\nmessed\nnowadays\npH\n##100\n##alog\nPomerania\n##yra\nRossi\nglove\n##TL\nRaces\n##asily\ntablets\nJase\n##ttes\ndiner\n##rns\nHu\nMohan\nanytime\nweighted\nremixes\nDove\ncherry\nimports\n##urity\nGA\n##TT\n##iated\n##sford\nClarkson\nevidently\nrugged\nDust\nsiding\n##ometer\nacquitted\nchoral\n##mite\ninfants\nDomenico\ngallons\nAtkinson\ngestures\nslated\n##xa\nArchaeology\nunwanted\n##ibes\n##duced\npremise\nColby\nGeelong\ndisqualified\n##pf\n##voking\nsimplicity\nWalkover\nQaeda\nWarden\n##bourg\n##ān\nInvasion\nBabe\nharness\n183\n##tated\nmaze\nBurt\nbedrooms\n##nsley\nHorizon\n##oast\nminimize\npeeked\nMLA\nTrains\ntractor\nnudged\n##iform\nGrowth\nBenton\nseparates\n##about\n##kari\nbuffer\nanthropology\nbrigades\nfoil\n##wu\nDomain\nlicking\nwhore\n##rage\n##sham\nInitial\nCourthouse\nRutgers\ndams\nvillains\nsupermarket\n##brush\nBrunei\nPalermo\narises\nPassenger\noutreach\n##gill\nLabrador\nMcLaren\n##uy\nLori\n##fires\nHeads\nmagistrate\n¹⁄₂\nWeapons\n##wai\n##roke\nprojecting\n##ulates\nbordering\nMcKenzie\nPavel\nmidway\nGuangzhou\nstreamed\nracer\n##lished\neccentric\nspectral\n206\n##mism\nWilde\nGrange\npreparatory\nlent\n##tam\nstarving\nGertrude\n##cea\n##ricted\nBreakfast\nMira\nblurted\nderive\n##lair\nblunt\nsob\nCheltenham\nHenrik\nreinstated\nintends\n##ist
an\nunite\n##ector\nplayful\nsparks\nmapped\nCadet\nluggage\nprosperous\n##ein\nsalon\n##utes\nBiological\n##rland\nTyrone\nbuyer\n##lose\namounted\nSaw\nsmirked\nRonan\nReviews\nAdele\ntrait\n##proof\nBhutan\nGinger\n##junct\ndigitally\nstirring\n##isted\ncoconut\nHamlet\nDinner\nScale\npledge\n##RP\nWrong\nGoal\nPanel\ntherapeutic\nelevations\ninfectious\npriesthood\n##inda\nGuyana\ndiagnostic\n##mbre\nBlackwell\nsails\n##arm\nliteral\nperiodically\ngleaming\nRobot\nRector\n##abulous\n##tres\nReaching\nRomantic\nCP\nWonderful\n##tur\nornamental\n##nges\ntraitor\n##zilla\ngenetics\nmentioning\n##eim\nresonance\nAreas\nShopping\n##nard\nGail\nSolid\n##rito\n##mara\nWillem\nChip\nMatches\nVolkswagen\nobstacle\nOrgan\ninvites\nCoral\nattain\n##anus\n##dates\nMidway\nshuffled\nCecilia\ndessert\nGateway\nCh\nNapoleonic\nPetroleum\njets\ngoose\nstriped\nbowls\nvibration\nSims\nnickel\nThirteen\nproblematic\nintervene\n##grading\n##unds\nMum\nsemifinal\nRadical\n##izations\nrefurbished\n##sation\n##harine\nMaximilian\ncites\nAdvocate\nPotomac\nsurged\npreserves\nCurry\nangled\nordination\n##pad\nCade\n##DE\n##sko\nresearched\ntorpedoes\nResident\nwetlands\nhay\napplicants\ndepart\nBernstein\n##pic\n##ario\n##rae\nfavourable\n##wari\n##р\nmetabolism\nnobleman\nDefaulted\ncalculate\nignition\nCelebrity\nBelize\nsulfur\nFlat\nSc\nUSB\nflicker\nHertfordshire\nSept\nCFL\nPasadena\nSaturdays\nTitus\n##nir\nCanary\nComputing\nIsaiah\n##mler\nformidable\npulp\norchid\nCalled\nSolutions\nkilograms\nsteamer\n##hil\nDoncaster\nsuccessors\nStokes\nHolstein\n##sius\nsperm\nAPI\nRogue\ninstability\nAcoustic\n##rag\n159\nundercover\nWouldn\n##pra\n##medical\nEliminated\nhonorable\n##chel\ndenomination\nabrupt\nBuffy\nblouse\nfi\nRegardless\nSubsequent\n##rdes\nLover\n##tford\nbacon\n##emia\ncarving\n##cripts\nMassacre\nRamos\nLatter\n##ulp\nballroom\n##gement\nrichest\nbruises\nRest\nWiley\n##aster\nexplosions\n##lastic\nEdo\n##LD\nMir\nchoking\ndisgusted\nfaintly\nBarracks\nblasted\nhe
adlights\nTours\nensued\npresentations\n##cale\nwrought\n##oat\n##coa\nQuaker\n##sdale\nrecipe\n##gny\ncorpses\n##liance\ncomfortably\n##wat\nLandscape\nniche\ncatalyst\n##leader\nSecurities\nmessy\n##RL\nRodrigo\nbackdrop\n##opping\ntreats\nEmilio\nAnand\nbilateral\nmeadow\nVC\nsocialism\n##grad\nclinics\n##itating\n##ppe\n##ymphonic\nseniors\nAdvisor\nArmoured\nMethod\nAlley\n##orio\nSad\nfueled\nraided\nAxel\nNH\nrushes\nDixie\nOtis\nwrecked\n##22\ncapitalism\ncafé\n##bbe\n##pion\n##forcing\nAubrey\nLublin\nWhenever\nSears\nScheme\n##lana\nMeadows\ntreatise\n##RI\n##ustic\nsacrifices\nsustainability\nBiography\nmystical\nWanted\nmultiplayer\nApplications\ndisliked\n##tisfied\nimpaired\nempirical\nforgetting\nFairfield\nSunni\nblurred\nGrowing\nAvalon\ncoil\nCamera\nSkin\nbruised\nterminals\n##fted\n##roving\nCommando\n##hya\n##sper\nreservations\nneedles\ndangling\n##rsch\n##rsten\n##spect\n##mbs\nyoga\nregretted\nBliss\nOrion\nRufus\nglucose\nOlsen\nautobiographical\n##dened\n222\nhumidity\nShan\n##ifiable\nsupper\n##rou\nflare\n##MO\ncampaigning\ndescend\nsocio\ndeclares\nMounted\nGracie\nArte\nendurance\n##ety\nCopper\ncosta\nairplay\n##MB\nProceedings\ndislike\ngrimaced\noccupants\nbirths\nglacial\noblivious\ncans\ninstallment\nmuddy\n##ł\ncaptains\npneumonia\nQuiet\nSloan\nExcuse\n##nine\nGeography\ngymnastics\nmultimedia\ndrains\nAnthology\nGear\ncylindrical\nFry\nundertaking\n##pler\n##tility\nNan\n##recht\nDub\nphilosophers\npiss\nAtari\n##pha\nGalicia\nMéxico\n##nking\nContinuing\nbump\ngraveyard\npersisted\nShrine\n##erapy\ndefects\nAdvance\nBomber\n##oil\n##ffling\ncheerful\n##lix\nscrub\n##eto\nawkwardly\ncollaborator\nfencing\n##alo\nprophet\nCroix\ncoughed\n##lication\nroadway\nslaughter\nelephants\n##erated\nSimpsons\nvulnerability\nivory\nBirth\nlizard\nscarce\ncylinders\nfortunes\n##NL\nHate\nPriory\n##lai\nMcBride\n##copy\nLenny\nliaison\nTriangle\ncoronation\nsampled\nsavage\namidst\nGrady\nwhatsoever\ninstinctively\nReconstruction\ninsides\nse
izure\nDrawing\n##rlin\nAntioch\nGao\nDíaz\n1760\nSparks\n##tien\n##bidae\nrehearsal\n##bbs\nbotanical\n##hers\ncompensate\nwholesale\nSeville\nshareholder\nprediction\nastronomical\nReddy\nhardest\ncircling\nwhereabouts\ntermination\nRep\nAssistance\nDramatic\nHerb\n##ghter\nclimbs\n188\nPoole\n301\n##pable\nwit\n##istice\nWalters\nrelying\nJakob\n##redo\nproceeding\nLangley\naffiliates\nou\n##allo\n##holm\nSamsung\n##ishi\nMissing\nXi\nvertices\nClaus\nfoam\nrestless\n##uating\n##sso\n##ttering\nPhilips\ndelta\nbombed\nCatalogue\ncoaster\nLing\nWillard\nsatire\n410\nComposition\nNet\nOrioles\n##ldon\nfins\nPalatinate\nWoodward\ntease\ntilt\nbrightness\n##70\n##bbling\n##loss\n##dhi\n##uilt\nWhoever\n##yers\nhitter\nElton\nExtension\nace\nAffair\nrestructuring\n##loping\nPaterson\nhi\n##rya\nspouse\nShay\nHimself\npiles\npreaching\n##gical\nbikes\nBrave\nexpulsion\nMirza\nstride\nTrees\ncommemorated\nfamine\nmasonry\nSelena\nWatt\nBanking\nRancho\nStockton\ndip\ntattoos\nVlad\nacquainted\nFlyers\nruthless\nfourteenth\nillustrate\n##akes\nEPA\n##rows\n##uiz\nbumped\nDesigned\nLeaders\nmastered\nManfred\nswirled\nMcCain\n##rout\nArtemis\nrabbi\nflinched\nupgrades\npenetrate\nshipyard\ntransforming\ncaretaker\n##eiro\nMaureen\ntightening\n##founded\nRAM\n##icular\n##mper\n##rung\nFifteen\nexploited\nconsistency\ninterstate\n##ynn\nBridget\ncontamination\nMistress\n##rup\ncoating\n##FP\n##jective\nLibyan\n211\nGemma\ndependence\nshrubs\n##ggled\nGermain\nretaliation\ntraction\n##PP\nDangerous\nterminology\npsychiatrist\n##garten\nhurdles\nNatal\nwasting\nWeir\nrevolves\nstripe\n##reased\npreferences\n##entation\n##lde\n##áil\n##otherapy\nFlame\n##ologies\nviruses\nLabel\nPandora\nveil\n##ogical\nColiseum\nCottage\ncreeping\nJong\nlectured\n##çaise\nshoreline\n##fference\n##hra\nShade\nClock\nFaye\nbilingual\nHumboldt\nOperating\n##fter\n##was\nalgae\ntowed\namphibious\nParma\nimpacted\nsmacked\nPiedmont\nMonsters\n##omb\nMoor\n##lberg\nsinister\nPostal\n178\nDrummond\n
Sign\ntextbooks\nhazardous\nBrass\nRosemary\nPick\nSit\nArchitect\ntransverse\nCentennial\nconfess\npolling\n##aia\nJulien\n##mand\nconsolidation\nEthel\n##ulse\nseverity\nYorker\nchoreographer\n1840s\n##ltry\nsofter\nversa\n##geny\n##quila\n##jō\nCaledonia\nFriendship\nVisa\nrogue\n##zzle\nbait\nfeather\nincidence\nFoods\nShips\n##uto\n##stead\narousal\n##rote\nHazel\n##bolic\nSwing\n##ej\n##cule\n##jana\n##metry\n##uity\nValuable\n##ₙ\nShropshire\n##nect\n365\nOnes\nrealise\nCafé\nAlbuquerque\n##grown\n##stadt\n209\n##ᵢ\nprefers\nwithstand\nLillian\nMacArthur\nHara\n##fulness\ndomination\n##VO\n##school\nFreddy\nethnicity\n##while\nadorned\nhormone\nCalder\nDomestic\nFreud\nShields\n##phus\n##rgan\nBP\nSegunda\nMustang\n##GI\nBonn\npatiently\nremarried\n##umbria\nCrete\nElephant\nNuremberg\ntolerate\nTyson\n##evich\nProgramming\n##lander\nBethlehem\nsegregation\nConstituency\nquarterly\nblushed\nphotographers\nSheldon\nporcelain\nBlanche\ngoddamn\nlively\n##fused\nbumps\n##eli\ncurated\ncoherent\nprovoked\n##vet\nMadeleine\n##isco\nrainy\nBethel\naccusation\nponytail\ngag\n##lington\nquicker\nscroll\n##vate\nBow\nGender\nIra\ncrashes\nACT\nMaintenance\n##aton\n##ieu\nbitterly\nstrains\nrattled\nvectors\n##arina\n##ishly\n173\nparole\n##nx\namusing\nGonzalez\n##erative\nCaucus\nsensual\nPenelope\ncoefficient\nMateo\n##mani\nproposition\nDuty\nlacrosse\nproportions\nPlato\nprofiles\nBotswana\nBrandt\nreins\nmandolin\nencompassing\n##gens\nKahn\nprop\nsummon\n##MR\n##yrian\n##zaki\nFalling\nconditional\nthy\n##bao\n##ych\nradioactive\n##nics\nNewspaper\n##people\n##nded\nGaming\nsunny\n##look\nSherwood\ncrafted\nNJ\nawoke\n187\ntimeline\ngiants\npossessing\n##ycle\nCheryl\nng\nRuiz\npolymer\npotassium\nRamsay\nrelocation\n##leen\nSociology\n##bana\nFranciscan\npropulsion\ndenote\n##erjee\nregisters\nheadline\nTests\nemerges\nArticles\nMint\nlivery\nbreakup\nkits\nRap\nBrowning\nBunny\n##mington\n##watch\nAnastasia\nZachary\narranging\nbiographical\nErica\nNippon\n##m
embrance\nCarmel\n##sport\n##xes\nPaddy\n##holes\nIssues\nSpears\ncompliment\n##stro\n##graphs\nCastillo\n##MU\n##space\nCorporal\n##nent\n174\nGentlemen\n##ilize\n##vage\nconvinces\nCarmine\nCrash\n##hashi\nFiles\nDoctors\nbrownish\nsweating\ngoats\n##conductor\nrendition\n##bt\nNL\n##spiration\ngenerates\n##cans\nobsession\n##noy\nDanger\nDiaz\nheats\nRealm\npriorities\n##phon\n1300\ninitiation\npagan\nbursts\narchipelago\nchloride\nScreenplay\nHewitt\nKhmer\nbang\njudgement\nnegotiating\n##ait\nMabel\ndensely\nBoulder\nknob\n430\nAlfredo\n##kt\npitches\n##ées\n##ان\nMacdonald\n##llum\nimply\n##mot\nSmile\nspherical\n##tura\nDerrick\nKelley\nNico\ncortex\nlaunches\ndiffered\nparallels\nNavigation\n##child\n##rming\ncanoe\nforestry\nreinforce\n##mote\nconfirming\ntasting\nscaled\n##resh\n##eting\nUnderstanding\nprevailing\nPearce\nCW\nearnest\nGaius\nasserts\ndenoted\nlandmarks\nChargers\nwarns\n##flies\nJudges\njagged\n##dain\ntails\nHistorian\nMillie\n##sler\n221\n##uard\nabsurd\nDion\n##ially\nmakeshift\nSpecifically\nignorance\nEat\n##ieri\ncomparisons\nforensic\n186\nGiro\nskeptical\ndisciplinary\nbattleship\n##45\nLibby\n520\nOdyssey\nledge\n##post\nEternal\nMissionary\ndeficiency\nsettler\nwonders\n##gai\nraging\n##cis\nRomney\nUlrich\nannexation\nboxers\nsect\n204\nARIA\ndei\nHitchcock\nte\nVarsity\n##fic\nCC\nlending\n##nial\n##tag\n##rdy\n##obe\nDefensive\n##dson\n##pore\nstellar\nLam\nTrials\ncontention\nSung\n##uminous\nPoe\nsuperiority\n##plicate\n325\nbitten\nconspicuous\n##olly\nLila\nPub\nPetit\ndistorted\nISIL\ndistinctly\n##family\nCowboy\nmutant\n##cats\n##week\nChanges\nSinatra\nepithet\nneglect\nInnocent\ngamma\nthrill\nreggae\n##adia\n##ational\n##due\nlandlord\n##leaf\nvisibly\n##ì\nDarlington\nGomez\n##iting\nscarf\n##lade\nHinduism\nFever\nscouts\n##roi\nconvened\n##oki\n184\nLao\nboycott\nunemployed\n##lore\n##ß\n##hammer\nCurran\ndisciples\nodor\n##ygiene\nLighthouse\nPlayed\nwhales\ndiscretion\nYves\n##ceived\npauses\ncoincide\n##nji\ndi
zzy\n##scopic\nrouted\nGuardians\nKellan\ncarnival\nnasal\n224\n##awed\nMitsubishi\n640\nCast\nsilky\nProjects\njoked\nHuddersfield\nRothschild\nzu\n##olar\nDivisions\nmildly\n##eni\n##lge\nAppalachian\nSahara\npinch\n##roon\nwardrobe\n##dham\n##etal\nBubba\n##lini\n##rumbling\nCommunities\nPoznań\nunification\nBeau\nKris\nSV\nRowing\nMinh\nreconciliation\n##saki\n##sor\ntaped\n##reck\ncertificates\ngubernatorial\nrainbow\n##uing\nlitter\n##lique\n##oted\nButterfly\nbenefited\nImages\ninduce\nBalkans\nVelvet\n##90\n##xon\nBowman\n##breaker\npenis\n##nitz\n##oint\n##otive\ncrust\n##pps\norganizers\nOutdoor\nnominees\n##rika\nTX\n##ucks\nProtestants\n##imation\nappetite\nBaja\nawaited\n##points\nwindshield\n##igh\n##zled\nBrody\nBuster\nstylized\nBryce\n##sz\nDollar\nvest\nmold\nounce\nok\nreceivers\n##uza\nPurdue\nHarrington\nHodges\ncaptures\n##ggio\nReservation\n##ssin\n##tman\ncosmic\nstraightforward\nflipping\nremixed\n##athed\nGómez\nLim\nmotorcycles\neconomies\nowning\nDani\n##rosis\nmyths\nsire\nkindly\n1768\nBean\ngraphs\n##mee\n##RO\n##geon\npuppy\nStephenson\nnotified\n##jer\nWatching\n##rama\nSino\nurgency\nIslanders\n##mash\nPlata\nfumble\n##chev\n##stance\n##rack\n##she\nfacilitated\nswings\nakin\nenduring\npayload\n##phine\nDeputies\nmurals\n##tooth\n610\nJays\neyeing\n##quito\ntransparency\n##cote\nTimor\nnegatively\n##isan\nbattled\n##fected\nthankful\nRage\nhospitality\nincorrectly\n207\nentrepreneurs\n##cula\n##wley\nhedge\n##cratic\nCorpus\nOdessa\nWhereas\n##ln\nfetch\nhappier\nAmherst\nbullying\ngraceful\nHeight\nBartholomew\nwillingness\nqualifier\n191\nSyed\nWesleyan\nLayla\n##rrence\nWebber\n##hum\nRat\n##cket\n##herence\nMonterey\ncontaminated\nBeside\nMustafa\nNana\n213\n##pruce\nReason\n##spense\nspike\n##gé\nAU\ndisciple\ncharcoal\n##lean\nformulated\nDiesel\nMariners\naccreditation\nglossy\n1800s\n##ih\nMainz\nunison\nMarianne\nshear\noverseeing\nvernacular\nbowled\n##lett\nunpopular\n##ckoned\n##monia\nGaston\n##TI\n##oters\nCups\n##bone
s\n##ports\nMuseo\nminors\n1773\nDickens\n##EL\n##NBC\nPresents\nambitions\naxes\nRío\nYukon\nbedside\nRibbon\nUnits\nfaults\nconceal\n##lani\nprevailed\n214\nGoodwin\nJaguar\ncrumpled\nCullen\nWireless\nceded\nremotely\nBin\nmocking\nstraps\nceramics\n##avi\n##uding\n##ader\nTaft\ntwenties\n##aked\nProblem\nquasi\nLamar\n##ntes\n##avan\nBarr\n##eral\nhooks\nsa\n##ône\n194\n##ross\nNero\nCaine\ntrance\nHomeland\nbenches\nGuthrie\ndismiss\n##lex\nCésar\nfoliage\n##oot\n##alty\nAssyrian\nAhead\nMurdoch\ndictatorship\nwraps\n##ntal\nCorridor\nMackay\nrespectable\njewels\nunderstands\n##pathic\nBryn\n##tep\nON\ncapsule\nintrigued\nSleeping\ncommunists\n##chayat\n##current\n##vez\ndoubling\nbooklet\n##uche\nCreed\n##NU\nspies\n##sef\nadjusting\n197\nImam\nheaved\nTanya\ncanonical\nrestraint\nsenators\nstainless\n##gnate\nMatter\ncache\nrestrained\nconflicting\nstung\n##ool\nSustainable\nantiquity\n193\nheavens\ninclusive\n##ador\nfluent\n303\n911\narchaeologist\nsuperseded\n##plex\nTammy\ninspire\n##passing\n##lub\nLama\nMixing\n##activated\n##yote\nparlor\ntactic\n198\nStefano\nprostitute\nrecycling\nsorted\nbanana\nStacey\nMusée\naristocratic\ncough\n##rting\nauthorised\ngangs\nrunoff\nthoughtfully\n##nish\nFisheries\nProvence\ndetector\nhum\n##zhen\npill\n##árez\nMap\nLeaves\nPeabody\nskater\nvent\n##color\n390\ncerebral\nhostages\nmare\nJurassic\nswell\n##isans\nKnoxville\nNaked\nMalaya\nscowl\nCobra\n##anga\nSexual\n##dron\n##iae\n196\n##drick\nRavens\nBlaine\n##throp\nIsmail\nsymmetric\n##lossom\nLeicestershire\nSylvester\nglazed\n##tended\nRadar\nfused\nFamilies\nBlacks\nSale\nZion\nfoothills\nmicrowave\nslain\nCollingwood\n##pants\n##dling\nkillers\nroutinely\nJanice\nhearings\n##chanted\n##ltration\ncontinents\n##iving\n##yster\n##shot\n##yna\ninjected\nGuillaume\n##ibi\nkinda\nConfederacy\nBarnett\ndisasters\nincapable\n##grating\nrhythms\nbetting\ndraining\n##hak\nCallie\nGlover\n##iliated\nSherlock\nhearted\npunching\nWolverhampton\nLeaf\nPi\nbuilders\nfurnis
hed\nknighted\nPhoto\n##zle\nTouring\nfumbled\npads\n##ий\nBartlett\nGunner\neerie\nMarius\nBonus\npots\n##hino\n##pta\nBray\nFrey\nOrtiz\nstalls\nbelongings\nSubway\nfascination\nmetaphor\nBat\nBoer\nColchester\nsway\n##gro\nrhetoric\n##dheim\nFool\nPMID\nadmire\n##hsil\nStrand\nTNA\n##roth\nNottinghamshire\n##mat\n##yler\nOxfordshire\n##nacle\n##roner\nBS\n##nces\nstimulus\ntransports\nSabbath\n##postle\nRichter\n4000\n##grim\n##shima\n##lette\ndeteriorated\nanalogous\n##ratic\nUHF\nenergies\ninspiring\nYiddish\nActivities\n##quential\n##boe\nMelville\n##ilton\nJudd\nconsonants\nlabs\nsmuggling\n##fari\navid\n##uc\ntruce\nundead\n##raith\nMostly\nbracelet\nConnection\nHussain\nawhile\n##UC\n##vention\nliable\ngenetically\n##phic\nImportant\nWildcats\ndaddy\ntransmit\n##cas\nconserved\nYesterday\n##lite\nNicky\nGuys\nWilder\nLay\nskinned\nCommunists\nGarfield\nNearby\norganizer\nLoss\ncrafts\nwalkway\nChocolate\nSundance\nSynod\n##enham\nmodify\nswayed\nSurface\nanalysts\nbrackets\ndrone\nparachute\nsmelling\nAndrés\nfilthy\nfrogs\nvertically\n##OK\nlocalities\nmarries\nAHL\n35th\n##pian\nPalazzo\ncube\ndismay\nrelocate\n##на\nHear\n##digo\n##oxide\nprefecture\nconverts\nhangar\n##oya\n##ucking\nSpectrum\ndeepened\nspoiled\nKeeping\n##phobic\nVerona\noutrage\nImprovement\n##UI\nmasterpiece\nslung\nCalling\nchant\nHaute\nmediated\nmanipulated\naffirmed\n##hesis\nHangul\nskies\n##llan\nWorcestershire\n##kos\nmosaic\n##bage\n##wned\nPutnam\nfolder\n##LM\nguts\nnoteworthy\n##rada\nAJ\nsculpted\n##iselle\n##rang\nrecognizable\n##pent\ndolls\nlobbying\nimpatiently\nSe\nstaple\nSerb\ntandem\nHiroshima\nthieves\n##ynx\nfaculties\nNorte\n##alle\n##trusion\nchords\n##ylon\nGareth\n##lops\n##escu\nFIA\nLevin\nauspices\ngroin\nHui\nnun\nListed\nHonourable\nLarsen\nrigorous\n##erer\nTonga\n##pment\n##rave\n##track\n##aa\n##enary\n540\nclone\nsediment\nesteem\nsighted\ncruelty\n##boa\ninverse\nviolating\nAmtrak\nStatus\namalgamated\nvertex\nAR\nharmless\nAmir\nmounts\nCoronation
\ncounseling\nAudi\nCO₂\nsplits\n##eyer\nHumans\nSalmon\n##have\n##rado\n##čić\n216\ntakeoff\nclassmates\npsychedelic\n##gni\nGypsy\n231\nAnger\nGAA\nME\n##nist\n##tals\nLissa\nOdd\nbaptized\nFiat\nfringe\n##hren\n179\nelevators\nperspectives\n##TF\n##ngle\nQuestion\nfrontal\n950\nthicker\nMolecular\n##nological\nSixteen\nBaton\nHearing\ncommemorative\ndorm\nArchitectural\npurity\n##erse\nrisky\nGeorgie\nrelaxing\n##ugs\ndowned\n##rar\nSlim\n##phy\nIUCN\n##thorpe\nParkinson\n217\nMarley\nShipping\nsweaty\nJesuits\nSindh\nJanata\nimplying\nArmenians\nintercept\nAnkara\ncommissioners\nascended\nsniper\nGrass\nWalls\nsalvage\nDewey\ngeneralized\nlearnt\nPT\n##fighter\n##tech\nDR\n##itrus\n##zza\nmercenaries\nslots\n##burst\n##finger\n##nsky\nPrinces\nRhodesia\n##munication\n##strom\nFremantle\nhomework\nins\n##Os\n##hao\n##uffed\nThorpe\nXiao\nexquisite\nfirstly\nliberated\ntechnician\nOilers\nPhyllis\nherb\nsharks\nMBE\n##stock\nProduct\nbanjo\n##morandum\n##than\nVisitors\nunavailable\nunpublished\noxidation\nVogue\n##copic\n##etics\nYates\n##ppard\nLeiden\nTrading\ncottages\nPrinciples\n##Millan\n##wife\n##hiva\nVicar\nnouns\nstrolled\n##eorological\n##eton\n##science\nprecedent\nArmand\nGuido\nrewards\n##ilis\n##tise\nclipped\nchick\n##endra\naverages\ntentatively\n1830s\n##vos\nCertainly\n305\nSociété\nCommandant\n##crats\n##dified\n##nka\nmarsh\nangered\nventilation\nHutton\nRitchie\n##having\nEclipse\nflick\nmotionless\nAmor\nFest\nLoire\nlays\n##icit\n##sband\nGuggenheim\nLuck\ndisrupted\n##ncia\nDisco\n##vigator\ncriticisms\ngrins\n##lons\n##vial\n##ody\nsalute\nCoaches\njunk\nsaxophonist\n##eology\nUprising\nDiet\n##marks\nchronicles\nrobbed\n##iet\n##ahi\nBohemian\nmagician\nwavelength\nKenyan\naugmented\nfashionable\n##ogies\nLuce\nF1\nMonmouth\n##jos\n##loop\nenjoyment\nexemption\nCenters\n##visor\nSoundtrack\nblinding\npractitioner\nsolidarity\nsacrificed\n##oso\n##cture\n##riated\nblended\nAbd\nCopyright\n##nob\n34th\n##reak\nClaudio\nhectare\nrotor\ntes
tify\n##ends\n##iably\n##sume\nlandowner\n##cess\n##ckman\nEduard\nSilesian\nbackseat\nmutually\n##abe\nMallory\nbounds\nCollective\nPoet\nWinkler\npertaining\nscraped\nPhelps\ncrane\nflickering\nProto\nbubbles\npopularized\nremoves\n##86\nCadillac\nWarfare\naudible\nrites\nshivering\n##sist\n##nst\n##biotic\nMon\nfascist\nBali\nKathryn\nambiguous\nfuriously\nmorale\npatio\nSang\ninconsistent\ntopology\nGreens\nmonkeys\nKöppen\n189\nToy\nvow\n##ías\nbombings\n##culus\nimprovised\nlodged\nsubsidiaries\ngarment\nstartling\npractised\nHume\nThorn\ncategorized\nTill\nEileen\nwedge\n##64\nFederico\npatriotic\nunlock\n##oshi\nbadminton\nCompared\nVilnius\n##KE\nCrimean\nKemp\ndecks\nspaced\nresolutions\nsighs\n##mind\nImagine\nCartoon\nhuddled\npolicemen\nforwards\n##rouch\nequals\n##nter\ninspected\nCharley\nMG\n##rte\npamphlet\nArturo\ndans\nscarcely\n##ulton\n##rvin\nparental\nunconstitutional\nwatts\nSusannah\nDare\n##sitive\nRowland\nValle\ninvalid\n##ué\nDetachment\nacronym\nYokohama\nverified\n##lsson\ngroove\nLiza\nclarified\ncompromised\n265\n##rgon\n##orf\nhesitant\nFruit\nApplication\nMathias\nicons\n##cell\nQin\ninterventions\n##uron\npunt\nremnant\n##rien\nAmes\nmanifold\nspines\nfloral\n##zable\ncomrades\nFallen\norbits\nAnnals\nhobby\nAuditorium\nimplicated\nresearching\nPueblo\nTa\nterminate\n##pella\nRings\napproximation\nfuzzy\n##ús\nthriving\n##ket\nConor\nalarmed\netched\nCary\n##rdon\nAlly\n##rington\nPay\nmint\n##hasa\n##unity\n##dman\n##itate\nOceania\nfurrowed\ntrams\n##aq\nWentworth\nventured\nchoreography\nprototypes\nPatel\nmouthed\ntrenches\n##licing\n##yya\nLies\ndeception\n##erve\n##vations\nBertrand\nearthquakes\n##tography\nSouthwestern\n##aja\ntoken\nGupta\n##yō\nBeckett\ninitials\nironic\nTsar\nsubdued\nshootout\nsobbing\nliar\nScandinavia\nSouls\nch\ntherapist\ntrader\nRegulation\nKali\nbusiest\n##pation\n32nd\nTelephone\nVargas\n##moky\n##nose\n##uge\nFavorite\nabducted\nbonding\n219\n255\ncorrection\nmat\ndrown\nfl\nunbeaten\nPocket\nS
ummers\nQuite\nrods\nPercussion\n##ndy\nbuzzing\ncadet\nWilkes\nattire\ndirectory\nutilities\nnaive\npopulous\nHendrix\n##actor\ndisadvantage\n1400\nLandon\nUnderworld\n##ense\nOccasionally\nmercury\nDavey\nMorley\nspa\nwrestled\n##vender\neclipse\nSienna\nsupplemented\nthou\nStream\nliturgical\n##gall\n##berries\n##piration\n1769\nBucks\nabandoning\n##jutant\n##nac\n232\nvenom\n##31\nRoche\ndotted\nCurrie\nCórdoba\nMilo\nSharif\ndivides\njustification\nprejudice\nfortunate\n##vide\n##ābād\nRowe\ninflammatory\n##eld\navenue\nSources\n##rimal\nMessenger\nBlanco\nadvocating\nformulation\n##pute\nemphasizes\nnut\nArmored\n##ented\nnutrients\n##tment\ninsistence\nMartins\nlandowners\n##RB\ncomparatively\nheadlines\nsnaps\n##qing\nCelebration\n##mad\nrepublican\n##NE\nTrace\n##500\n1771\nproclamation\nNRL\nRubin\nBuzz\nWeimar\n##AG\n199\nposthumous\n##ental\n##deacon\nDistance\nintensely\noverheard\nArcade\ndiagonal\nhazard\nGiving\nweekdays\n##ù\nVerdi\nactresses\n##hare\nPulling\n##erries\n##pores\ncatering\nshortest\n##ctors\n##cure\n##restle\n##reta\n##runch\n##brecht\n##uddin\nMoments\nsenate\nFeng\nPrescott\n##thest\n218\ndivisional\nBertie\nsparse\nsurrounds\ncoupling\ngravitational\nwerewolves\n##lax\nRankings\n##mated\n##tries\nShia\n##mart\n##23\n##vocative\ninterfaces\nmorphology\nnewscast\n##bide\ninputs\nsolicitor\nOlaf\ncabinets\npuzzles\n##tains\nUnified\n##firmed\nWA\nsolemn\n##opy\nTito\nJaenelle\nNeolithic\nhorseback\n##ires\npharmacy\nprevalence\n##lint\nSwami\n##bush\n##tudes\nPhilipp\nmythical\ndivers\nScouting\naperture\nprogressively\n##bay\n##nio\nbounce\nFloor\n##elf\nLucan\nadulthood\nhelm\nBluff\nPassage\nSalvation\nlemon\nnapkin\nscheduling\n##gets\nElements\nMina\nNovak\nstalled\n##llister\nInfrastructure\n##nky\n##tania\n##uished\nKatz\nNorma\nsucks\ntrusting\n1765\nboilers\nAccordingly\n##hered\n223\nCrowley\n##fight\n##ulo\nHenrietta\n##hani\npounder\nsurprises\n##chor\n##glia\nDukes\n##cracy\n##zier\n##fs\nPatriot\nsilicon\n##VP\nsimulcas
t\ntelegraph\nMysore\ncardboard\nLen\n##QL\nAuguste\naccordion\nanalytical\nspecify\nineffective\nhunched\nabnormal\nTransylvania\n##dn\n##tending\nEmilia\nglittering\nMaddy\n##wana\n1762\nExternal\nLecture\nendorsement\nHernández\nAnaheim\nWare\noffences\n##phorus\nPlantation\npopping\nBonaparte\ndisgusting\nneared\n##notes\nIdentity\nheroin\nnicely\n##raverse\napron\ncongestion\n##PR\npadded\n##fts\ninvaders\n##came\nfreshly\nHalle\nendowed\nfracture\nROM\n##max\nsediments\ndiffusion\ndryly\n##tara\nTam\nDraw\nSpin\nTalon\nAnthropology\n##lify\nnausea\n##shirt\ninsert\nFresno\ncapitalist\nindefinitely\napples\nGift\nscooped\n60s\nCooperative\nmistakenly\n##lover\nmurmur\n##iger\nEquipment\nabusive\norphanage\n##9th\n##lterweight\n##unda\nBaird\nant\nsaloon\n33rd\nChesapeake\n##chair\n##sound\n##tend\nchaotic\npornography\nbrace\n##aret\nheiress\nSSR\nresentment\nArbor\nheadmaster\n##uren\nunlimited\n##with\n##jn\nBram\nEly\nPokémon\npivotal\n##guous\nDatabase\nMarta\nShine\nstumbling\n##ovsky\n##skin\nHenley\nPolk\nfunctioned\n##layer\n##pas\n##udd\n##MX\nblackness\ncadets\nferal\nDamian\n##actions\n2D\n##yla\nApocalypse\n##aic\ninactivated\n##china\n##kovic\n##bres\ndestroys\nnap\nMacy\nsums\nMadhya\nWisdom\nrejects\n##amel\n60th\nCho\nbandwidth\n##sons\n##obbing\n##orama\nMutual\nshafts\n##estone\n##rsen\naccord\nreplaces\nwaterfront\n##gonal\n##rida\nconvictions\n##ays\ncalmed\nsuppliers\nCummings\nGMA\nfearful\nScientist\nSinai\nexamines\nexperimented\nNetflix\nEnforcement\nScarlett\n##lasia\nHealthcare\n##onte\nDude\ninverted\n##36\n##regation\n##lidae\nMunro\n##angay\nAirbus\noverlapping\nDrivers\nlawsuits\nbodily\n##udder\nWanda\nEffects\nFathers\n##finery\n##islav\nRidley\nobservatory\npod\n##utrition\nElectricity\nlandslide\n##mable\n##zoic\n##imator\n##uration\nEstates\nsleepy\nNickelodeon\nsteaming\nirony\nschedules\nsnack\nspikes\nHmm\n##nesia\n##bella\n##hibit\nGreenville\nplucked\nHarald\n##ono\nGamma\ninfringement\nroaring\ndeposition\n##pol\n##orum
\n660\nseminal\npassports\nengagements\nAkbar\nrotated\n##bina\n##gart\nHartley\n##lown\n##truct\nuttered\ntraumatic\nDex\n##ôme\nHolloway\nMV\napartheid\n##nee\nCounter\nColton\nOR\n245\nSpaniards\nRegency\nSchedule\nscratching\nsquads\nverify\n##alk\nkeyboardist\nrotten\nForestry\naids\ncommemorating\n##yed\n##érie\nSting\n##elly\nDai\n##fers\n##berley\n##ducted\nMelvin\ncannabis\nglider\n##enbach\n##rban\nCostello\nSkating\ncartoonist\nAN\naudit\n##pectator\ndistributing\n226\n312\ninterpreter\nheader\nAlternatively\n##ases\nsmug\n##kumar\ncabins\nremastered\nConnolly\nKelsey\nLED\ntentative\nCheck\nSichuan\nshaved\n##42\nGerhard\nHarvest\ninward\n##rque\nHopefully\nhem\n##34\nTypical\nbinds\nwrath\nWoodstock\nforcibly\nFergus\n##charged\n##tured\nprepares\namenities\npenetration\n##ghan\ncoarse\n##oned\nenthusiasts\n##av\n##twined\nfielded\n##cky\nKiel\n##obia\n470\nbeers\ntremble\nyouths\nattendees\n##cademies\n##sex\nMacon\ncommunism\ndir\n##abi\nLennox\nWen\ndifferentiate\njewel\n##SO\nactivate\nassert\nladen\nunto\nGillespie\nGuillermo\naccumulation\n##GM\nNGO\nRosenberg\ncalculating\ndrastically\n##omorphic\npeeled\nLiège\ninsurgents\noutdoors\n##enia\nAspen\nSep\nawakened\n##eye\nConsul\nMaiden\ninsanity\n##brian\nfurnace\nColours\ndistributions\nlongitudinal\nsyllables\n##scent\nMartian\naccountant\nAtkins\nhusbands\nsewage\nzur\ncollaborate\nhighlighting\n##rites\n##PI\ncolonization\nnearer\n##XT\ndunes\npositioning\nKu\nmultitude\nluxurious\nVolvo\nlinguistics\nplotting\nsquared\n##inder\noutstretched\n##uds\nFuji\nji\n##feit\n##ahu\n##loat\n##gado\n##luster\n##oku\nAmérica\n##iza\nResidents\nvine\nPieces\nDD\nVampires\n##ová\nsmoked\nharshly\nspreads\n##turn\n##zhi\nbetray\nelectors\n##settled\nConsidering\nexploits\nstamped\nDusty\nenraged\nNairobi\n##38\nintervened\n##luck\norchestras\n##lda\nHereford\nJarvis\ncalf\n##itzer\n##CH\nsalesman\nLovers\ncigar\nAngelica\ndoomed\nheroine\n##tible\nSanford\noffenders\n##ulously\narticulated\n##oam\nEmanuel\n
Gardiner\nEdna\nShu\ngigantic\n##stable\nTallinn\ncoasts\nMaker\nale\nstalking\n##oga\n##smus\nlucrative\nsouthbound\n##changing\nReg\n##lants\nSchleswig\ndiscount\ngrouping\nphysiological\n##OH\n##sun\nGalen\nassurance\nreconcile\nrib\nscarlet\nThatcher\nanarchist\n##oom\nTurnpike\n##ceding\ncocktail\nSweeney\nAllegheny\nconcessions\noppression\nreassuring\n##poli\n##ticus\n##TR\n##VI\n##uca\n##zione\ndirectional\nstrikeouts\nBeneath\nCouldn\nKabul\n##national\nhydroelectric\n##jit\nDesire\n##riot\nenhancing\nnorthbound\n##PO\nOk\nRoutledge\nvolatile\nBernardo\nPython\n333\nample\nchestnut\nautomobiles\n##innamon\n##care\n##hering\nBWF\nsalaries\nTurbo\nacquisitions\n##stituting\nstrengths\npilgrims\nPonce\nPig\nActors\nBeard\nsanitation\n##RD\n##mett\nTelecommunications\nworms\n##idas\nJuno\nLarson\nVentura\nNortheastern\nweighs\nHoughton\ncollaborating\nlottery\n##rano\nWonderland\ngigs\n##lmer\n##zano\n##edd\n##nife\nmixtape\npredominant\ntripped\n##ruly\nAlexei\ninvesting\nBelgarath\nBrasil\nhiss\n##crat\n##xham\nCôte\n560\nkilometer\n##cological\nanalyzing\n##As\nengined\nlistener\n##cakes\nnegotiation\n##hisky\nSantana\n##lemma\nIAAF\nSeneca\nskeletal\nCovenant\nSteiner\n##lev\n##uen\nNeptune\nretention\n##upon\nClosing\nCzechoslovak\nchalk\nNavarre\nNZ\n##IG\n##hop\n##oly\n##quatorial\n##sad\nBrewery\nConflict\nThem\nrenew\nturrets\ndisagree\nPetra\nSlave\n##reole\nadjustment\n##dela\n##regard\n##sner\nframing\nstature\n##rca\n##sies\n##46\n##mata\nLogic\ninadvertently\nnaturalist\nspheres\ntowering\nheightened\nDodd\nrink\n##fle\nKeyboards\nbulb\ndiver\nul\n##tsk\nExodus\nDeacon\nEspaña\nCanadiens\noblique\nthud\nreigned\nrug\nWhitman\nDash\n##iens\nHaifa\npets\n##arland\nmanually\ndart\n##bial\nSven\ntextiles\nsubgroup\nNapier\ngraffiti\nrevolver\nhumming\nBabu\nprotector\ntyped\nProvinces\nSparta\nWills\nsubjective\n##rella\ntemptation\n##liest\nFL\nSadie\nmanifest\nGuangdong\nTransfer\nentertain\neve\nrecipes\n##33\nBenedictine\nretailer\n##dence\nestabl
ishes\n##cluded\n##rked\nUrsula\n##ltz\n##lars\n##rena\nqualifiers\n##curement\ncolt\ndepictions\n##oit\nSpiritual\ndifferentiation\nstaffed\ntransitional\n##lew\n1761\nfatalities\n##oan\nBayern\nNorthamptonshire\nWeeks\n##CU\nFife\ncapacities\nhoarse\n##latt\n##ة\nevidenced\n##HD\n##ographer\nassessing\nevolve\nhints\n42nd\nstreaked\n##lve\nYahoo\n##estive\n##rned\n##zas\nbaggage\nElected\nsecrecy\n##champ\nCharacter\nPen\nDecca\ncape\nBernardino\nvapor\nDolly\ncounselor\n##isers\nBenin\n##khar\n##CR\nnotch\n##thus\n##racy\nbounty\nlend\ngrassland\n##chtenstein\n##dating\npseudo\ngolfer\nsimplest\n##ceive\nLucivar\nTriumph\ndinosaur\ndinosaurs\n##šić\nSeahawks\n##nco\nresorts\nreelected\n1766\nreproduce\nuniversally\n##OA\nER\ntendencies\nConsolidated\nMassey\nTasmanian\nreckless\n##icz\n##ricks\n1755\nquestionable\nAudience\n##lates\npreseason\nQuran\ntrivial\nHaitian\nFreeway\ndialed\nAppointed\nHeard\necosystems\n##bula\nhormones\nCarbon\nRd\n##arney\n##working\nChristoph\npresiding\npu\n##athy\nMorrow\nDar\nensures\nposing\nremedy\nEA\ndisclosed\n##hui\n##rten\nrumours\nsurveying\n##ficiency\nAziz\nJewel\nPlays\n##smatic\nBernhard\nChristi\n##eanut\n##friend\njailed\n##dr\ngovern\nneighbour\nbutler\nAcheron\nmurdering\noils\nmac\nEditorial\ndetectives\nbolts\n##ulon\nGuitars\nmalaria\n36th\nPembroke\nOpened\n##hium\nharmonic\nserum\n##sio\nFranks\nfingernails\n##gli\nculturally\nevolving\nscalp\nVP\ndeploy\nuploaded\nmater\n##evo\nJammu\nSpa\n##icker\nflirting\n##cursions\nHeidi\nMajority\nsprawled\n##alytic\nZheng\nbunker\n##lena\nST\n##tile\nJiang\nceilings\n##ently\n##ols\nRecovery\ndire\n##good\nManson\nHonestly\nMontréal\n1764\n227\nquota\nLakshmi\nincentive\nAccounting\n##cilla\nEureka\nReaper\nbuzzed\n##uh\ncourtroom\ndub\n##mberg\nKC\nGong\nTheodor\nAcadémie\nNPR\ncriticizing\nprotesting\n##pired\n##yric\nabuses\nfisheries\n##minated\n1767\nyd\nGemini\nSubcommittee\n##fuse\nDuff\nWasn\nWight\ncleaner\n##tite\nplanetary\nSurvivor\nZionist\nmounds\n##rary
\nlandfall\ndisruption\nyielding\n##yana\nbids\nunidentified\nGarry\nEllison\nElmer\nFishing\nHayward\ndemos\nmodelling\n##anche\n##stick\ncaressed\nentertained\n##hesion\npiers\nCrimea\n##mass\nWHO\nboulder\ntrunks\n1640\nBiennale\nPalestinians\nPursuit\n##udes\nDora\ncontender\n##dridge\nNanjing\n##ezer\n##former\n##ibel\nWhole\nproliferation\n##tide\n##weiler\nfuels\npredictions\n##ente\n##onium\nFilming\nabsorbing\nRamón\nstrangled\nconveyed\ninhabit\nprostitutes\nrecession\nbonded\nclinched\n##eak\n##iji\n##edar\nPleasure\nRite\nChristy\nTherapy\nsarcasm\n##collegiate\nhilt\nprobation\nSarawak\ncoefficients\nunderworld\nbiodiversity\nSBS\ngroom\nbrewing\ndungeon\n##claiming\nHari\nturnover\n##ntina\n##omer\n##opped\northodox\nstyling\n##tars\n##ulata\npriced\nMarjorie\n##eley\n##abar\nYong\n##tically\nCrambidae\nHernandez\n##ego\n##rricular\n##ark\n##lamour\n##llin\n##augh\n##tens\nAdvancement\nLoyola\n##4th\n##hh\ngoin\nmarshes\nSardinia\n##ša\nLjubljana\nSinging\nsuspiciously\n##hesive\nFélix\nRegarding\nflap\nstimulation\n##raught\nApr\nYin\ngaping\ntighten\nskier\n##itas\n##lad\n##rani\n264\nAshes\nOlson\nProblems\nTabitha\n##rading\nbalancing\nsunrise\n##ease\n##iture\n##ritic\nFringe\n##iciency\nInspired\nLinnaeus\nPBA\ndisapproval\n##kles\n##rka\n##tails\n##urger\nDisaster\nLaboratories\napps\nparadise\nAero\nCame\nsneaking\nGee\nBeacon\nODI\ncommodity\nEllington\ngraphical\nGretchen\nspire\n##skaya\n##trine\nRTÉ\nefficacy\nplc\ntribunal\n##ytic\ndownhill\nflu\nmedications\n##kaya\nwiden\nSunrise\n##nous\ndistinguishing\npawn\n##BO\n##irn\n##ssing\n##ν\nEaston\n##vila\nRhineland\n##aque\ndefect\n##saurus\nGoose\nJu\n##classified\nMiddlesbrough\nshaping\npreached\n1759\n##erland\nEin\nHailey\nmusicals\n##altered\nGalileo\nHilda\nFighters\nLac\n##ometric\n295\nLeafs\nMilano\n##lta\n##VD\n##ivist\npenetrated\nMask\nOrchard\nplaintiff\n##icorn\nYvonne\n##fred\noutfielder\npeek\nCollier\nCaracas\nrepealed\nBois\ndell\nrestrict\nDolores\nHadley\npeacefully\n##
LL\ncondom\nGranny\nOrders\nsabotage\n##toon\n##rings\ncompass\nmarshal\ngears\nbrigadier\ndye\nYunnan\ncommunicating\ndonate\nemerald\nvitamin\nadminister\nFulham\n##classical\n##llas\nBuckinghamshire\nHeld\nlayered\ndisclosure\nAkira\nprogrammer\nshrimp\nCrusade\n##ximal\nLuzon\nbakery\n##cute\nGarth\nCitadel\nuniquely\nCurling\ninfo\nmum\nPara\n##ști\nsleek\n##ione\nhey\nLantern\nmesh\n##lacing\n##lizzard\n##gade\nprosecuted\nAlba\nGilles\ngreedy\ntwists\n##ogged\nViper\n##kata\nAppearances\nSkyla\nhymns\n##pelled\ncurving\npredictable\nGrave\nWatford\n##dford\n##liptic\n##vary\nWestwood\nfluids\nModels\nstatutes\n##ynamite\n1740\n##culate\nFramework\nJohanna\n##gression\nVuelta\nimp\n##otion\n##raga\n##thouse\nCiudad\nfestivities\n##love\nBeyoncé\nitalics\n##vance\nDB\n##haman\nouts\nSingers\n##ueva\n##urning\n##51\n##ntiary\n##mobile\n285\nMimi\nemeritus\nnesting\nKeeper\nWays\n##onal\n##oux\nEdmond\nMMA\n##bark\n##oop\nHampson\n##ñez\n##rets\nGladstone\nwreckage\nPont\nPlayboy\nreluctance\n##ná\napprenticeship\npreferring\nValue\noriginate\n##wei\n##olio\nAlexia\n##rog\nParachute\njammed\nstud\nEton\nvols\n##ganized\n1745\nstraining\ncreep\nindicators\n##mán\nhumiliation\nhinted\nalma\ntanker\n##egation\nHaynes\nPenang\namazement\nbranched\nrumble\n##ddington\narchaeologists\nparanoid\nexpenditure\nAbsolutely\nMusicians\nbanished\n##fining\nbaptism\nJoker\nPersons\nhemisphere\n##tieth\n##ück\nflock\n##xing\nlbs\nKung\ncrab\n##dak\n##tinent\nRegulations\nbarrage\nparcel\n##ós\nTanaka\n##rsa\nNatalia\nVoyage\nflaws\nstepfather\n##aven\n##eological\nBotanical\nMinsk\n##ckers\nCinderella\nFeast\nLoving\nPrevious\nShark\n##took\nbarrister\ncollaborators\n##nnes\nCroydon\nGraeme\nJuniors\n##7th\n##formation\n##ulos\n##ák\n£2\n##hwa\n##rove\n##ș\nWhig\ndemeanor\nOtago\n##TH\n##ooster\nFaber\ninstructors\n##ahl\n##bha\nemptied\n##schen\nsaga\n##lora\nexploding\n##rges\nCrusaders\n##caster\n##uations\nstreaks\nCBN\nbows\ninsights\nka\n1650\ndiversion\nLSU\nWingspan\n##
liva\nResponse\nsanity\nProducers\nimitation\n##fine\nLange\nSpokane\nsplash\nweed\nSiberian\nmagnet\n##rocodile\ncapitals\n##rgus\nswelled\nRani\nBells\nSilesia\narithmetic\nrumor\n##hampton\nfavors\nWeird\nmarketplace\n##orm\ntsunami\nunpredictable\n##citation\n##ferno\nTradition\npostwar\nstench\nsucceeds\n##roup\nAnya\nUsers\noversized\ntotaling\npouch\n##nat\nTripoli\nleverage\nsatin\n##cline\nBathurst\nLund\nNiall\nthereof\n##quid\nBangor\nbarge\nAnimated\n##53\n##alan\nBallard\nutilizes\nDone\nballistic\nNDP\ngatherings\n##elin\n##vening\nRockets\nSabrina\nTamara\nTribal\nWTA\n##citing\nblinded\nflux\nKhalid\nUna\nprescription\n##jee\nParents\n##otics\n##food\nSilicon\ncured\nelectro\nperpendicular\nintimacy\n##rified\nLots\n##ceiving\n##powder\nincentives\nMcKenna\n##arma\n##ounced\n##rinkled\nAlzheimer\n##tarian\n262\nSeas\n##cam\nNovi\n##hout\n##morphic\n##hazar\n##hul\n##nington\nHuron\nBahadur\nPirate\npursed\nGriffiths\nindicted\nswap\nrefrain\n##mulating\nLal\nstomped\n##Pad\n##mamoto\nReef\ndisposed\nplastered\nweeping\n##rato\nMinas\nhourly\ntumors\n##ruising\nLyle\n##yper\n##sol\nOdisha\ncredibility\n##Dowell\nBraun\nGraphic\nlurched\nmuster\n##nex\n##ührer\n##connected\n##iek\n##ruba\nCarthage\nPeck\nmaple\nbursting\n##lava\nEnrico\nrite\n##jak\nMoment\n##skar\nStyx\npoking\nSpartan\n##urney\nHepburn\nMart\nTitanic\nnewsletter\nwaits\nMecklenburg\nagitated\neats\n##dious\nChow\nmatrices\nMaud\n##sexual\nsermon\n234\n##sible\n##lung\nQi\ncemeteries\nmined\nsprinter\n##ckett\ncoward\n##gable\n##hell\n##thin\n##FB\nContact\n##hay\nrainforest\n238\nHemisphere\nboasts\n##nders\n##verance\n##kat\nConvent\nDunedin\nLecturer\nlyricist\n##bject\nIberian\ncomune\n##pphire\nchunk\n##boo\nthrusting\nfore\ninforming\npistols\nechoes\nTier\nbattleships\nsubstitution\n##belt\nmoniker\n##charya\n##lland\nThoroughbred\n38th\n##01\n##tah\nparting\ntongues\nCale\n##seau\nUnionist\nmodular\ncelebrates\npreview\nsteamed\nBismarck\n302\n737\nvamp\n##finity\n##nbridge\nw
eaknesses\nhusky\n##berman\nabsently\n##icide\nCraven\ntailored\nTokugawa\nVIP\nsyntax\nKazan\ncaptives\ndoses\nfiltered\noverview\nCleopatra\nConversely\nstallion\nBurger\nSuez\nRaoul\nth\n##reaves\nDickson\nNell\nRate\nanal\ncolder\n##sław\nArm\nSemitic\n##green\nreflective\n1100\nepiscopal\njourneys\n##ours\n##pository\n##dering\nresidue\nGunn\n##27\n##ntial\n##crates\n##zig\nAstros\nRenee\nEmerald\n##vili\nconnectivity\nundrafted\nSampson\ntreasures\n##kura\n##theon\n##vern\nDestroyer\n##iable\n##ener\nFrederic\nbriefcase\nconfinement\nBree\n##WD\nAthena\n233\nPadres\nThom\nspeeding\n##hali\nDental\nducks\nPutin\n##rcle\n##lou\nAsylum\n##usk\ndusk\npasture\nInstitutes\nONE\njack\n##named\ndiplomacy\nIntercontinental\nLeagues\nTowns\ncomedic\npremature\n##edic\n##mona\n##ories\ntrimmed\nCharge\nCream\nguarantees\nDmitry\nsplashed\nPhilosophical\ntramway\n##cape\nMaynard\npredatory\nredundant\n##gratory\n##wry\nsobs\nBurgundy\nedible\noutfits\nHandel\ndazed\ndangerously\nidle\nOperational\norganizes\n##sional\nblackish\nbroker\nweddings\n##halt\nBecca\nMcGee\n##gman\nprotagonists\n##pelling\nKeynes\naux\nstumble\n##ordination\nNokia\nreel\nsexes\n##woods\n##pheric\n##quished\n##voc\n##oir\n##pathian\n##ptus\n##sma\n##tating\n##ê\nfulfilling\nsheath\n##ayne\nMei\nOrdinary\nCollin\nSharpe\ngrasses\ninterdisciplinary\n##OX\nBackground\n##ignment\nAssault\ntransforms\nHamas\nSerge\nratios\n##sik\nswaying\n##rcia\nRosen\n##gant\n##versible\ncinematographer\ncurly\npenny\nKamal\nMellon\nSailor\nSpence\nphased\nBrewers\namassed\nSocieties\n##ropriations\n##buted\nmythological\n##SN\n##byss\n##ired\nSovereign\npreface\nParry\n##ife\naltitudes\ncrossings\n##28\nCrewe\nsouthernmost\ntaut\nMcKinley\n##owa\n##tore\n254\n##ckney\ncompiling\nShelton\n##hiko\n228\nPoll\nShepard\nLabs\nPace\nCarlson\ngrasping\n##ов\nDelaney\nWinning\nrobotic\nintentional\nshattering\n##boarding\n##git\n##grade\nEditions\nReserves\nignorant\nproposing\n##hanna\ncutter\nMongols\nNW\n##eux\nCodex\nC
ristina\nDaughters\nRees\nforecast\n##hita\nNGOs\nStations\nBeaux\nErwin\n##jected\n##EX\n##trom\nSchumacher\n##hrill\n##rophe\nMaharaja\nOricon\n##sul\n##dynamic\n##fighting\nCe\nIngrid\nrumbled\nProspect\nstairwell\nBarnard\napplause\ncomplementary\n##uba\ngrunt\n##mented\nBloc\nCarleton\nloft\nnoisy\n##hey\n490\ncontrasted\n##inator\n##rief\n##centric\n##fica\nCantonese\nBlanc\nLausanne\nLicense\nartifact\n##ddin\nrot\nAmongst\nPrakash\nRF\n##topia\nmilestone\n##vard\nWinters\nMead\nchurchyard\nLulu\nestuary\n##ind\nCha\nInfinity\nMeadow\nsubsidies\n##valent\nCONCACAF\nChing\nmedicinal\nnavigate\nCarver\nTwice\nabdominal\nregulating\nRB\ntoilets\nBrewer\nweakening\nambushed\n##aut\n##vignon\nLansing\nunacceptable\nreliance\nstabbing\n##mpo\n##naire\nInterview\n##ested\n##imed\nbearings\n##lts\nRashid\n##iation\nauthenticity\nvigorous\n##frey\n##uel\nbiologist\nNFC\n##rmaid\n##wash\nMakes\n##aunt\n##steries\nwithdrawing\n##qa\nBuccaneers\nbleed\ninclination\nstain\n##ilo\n##ppel\nTorre\nprivileged\ncereal\ntrailers\nalumnus\nneon\nCochrane\nMariana\ncaress\n##47\n##ients\nexperimentation\nWindow\nconvict\nsignaled\n##YP\nrower\nPharmacy\ninteracting\n241\nStrings\ndominating\nkinase\nDinamo\nWire\npains\nsensations\n##suse\nTwenty20\n##39\nspotlight\n##hend\nelemental\n##pura\nJameson\nSwindon\nhonoring\npained\n##ediatric\n##lux\nPsychological\nassemblies\ningredient\nMartial\nPenguins\nbeverage\nMonitor\nmysteries\n##ION\nemigration\nmused\n##sique\ncrore\nAMC\nFunding\nChinatown\nEstablishment\nFinalist\nenjoyable\n1756\n##mada\n##rams\nNO\nnewborn\nCS\ncomprehend\nInvisible\nSiemens\n##acon\n246\ncontraction\n##volving\n##moration\n##rok\nmontane\n##ntation\nGalloway\n##llow\nVerity\ndirectorial\npearl\nLeaning\n##rase\nFernandez\nswallowing\nAutomatic\nMadness\nhaunting\npaddle\n##UE\n##rrows\n##vies\n##zuki\n##bolt\n##iber\nFender\nemails\npaste\n##lancing\nhind\nhomestead\nhopeless\n##dles\nRockies\ngarlic\nfatty\nshrieked\n##ismic\nGillian\nInquiry\nSchult
z\nXML\n##cius\n##uld\nDomesday\ngrenades\nnorthernmost\n##igi\nTbilisi\noptimistic\n##poon\nRefuge\nstacks\nBose\nsmash\nsurreal\nNah\nStraits\nConquest\n##roo\n##weet\n##kell\nGladys\nCH\n##lim\n##vitation\nDoctorate\nNRHP\nknocks\nBey\nRomano\n##pile\n242\nDiamonds\nstrides\neclectic\nBetsy\nclade\n##hady\n##leashed\ndissolve\nmoss\nSuburban\nsilvery\n##bria\ntally\nturtles\n##uctive\nfinely\nindustrialist\n##nary\nErnesto\noz\npact\nloneliness\n##hov\nTomb\nmultinational\nrisked\nLayne\nUSL\nne\n##quiries\nAd\nMessage\nKamen\nKristen\nreefs\nimplements\n##itative\neducators\ngarments\ngunshot\n##essed\n##rve\nMontevideo\nvigorously\nStamford\nassemble\npackaged\n##same\nétat\nViva\nparagraph\n##eter\n##wire\nStick\nNavajo\nMCA\n##pressing\nensembles\nABA\n##zor\n##llus\nPartner\nraked\n##BI\nIona\nthump\nCeleste\nKiran\n##iscovered\n##rith\ninflammation\n##arel\nFeatures\nloosened\n##yclic\nDeluxe\nSpeak\neconomical\nFrankenstein\nPicasso\nshowcased\n##zad\n##eira\n##planes\n##linear\n##overs\nmonsoon\nprosecutors\nslack\nHorses\n##urers\nAngry\ncoughing\n##truder\nQuestions\n##tō\n##zak\nchallenger\nclocks\n##ieving\nNewmarket\n##acle\ncursing\nstimuli\n##mming\n##qualified\nslapping\n##vasive\nnarration\n##kini\nAdvertising\nCSI\nalliances\nmixes\n##yes\ncovert\namalgamation\nreproduced\n##ardt\n##gis\n1648\nid\nAnnette\nBoots\nChampagne\nBrest\nDaryl\n##emon\n##jou\n##llers\nMean\nadaptive\ntechnicians\n##pair\n##usal\nYoga\nfronts\nleaping\nJul\nharvesting\nkeel\n##44\npetitioned\n##lved\nyells\nEndowment\nproponent\n##spur\n##tised\n##zal\nHomes\nIncludes\n##ifer\n##oodoo\n##rvette\nawarding\nmirrored\nransom\nFlute\noutlook\n##ganj\nDVDs\nSufi\nfrontman\nGoddard\nbarren\n##astic\nSuicide\nhillside\nHarlow\nLau\nnotions\nAmnesty\nHomestead\n##irt\nGE\nhooded\numpire\nmustered\nCatch\nMasonic\n##erd\nDynamics\nEquity\nOro\nCharts\nMussolini\npopulace\nmuted\naccompaniment\n##lour\n##ndes\nignited\n##iferous\n##laced\n##atch\nanguish\nregistry\n##tub\n##hards
\n##neer\n251\nHooker\nuncomfortably\n##6th\n##ivers\nCatalina\nMiG\ngiggling\n1754\nDietrich\nKaladin\npricing\n##quence\nSabah\n##lving\n##nical\nGettysburg\nVita\nTelecom\nWorst\nPalais\nPentagon\n##brand\n##chichte\nGraf\nunnatural\n1715\nbio\n##26\nRadcliffe\n##utt\nchatting\nspices\n##aus\nuntouched\n##eper\nDoll\nturkey\nSyndicate\n##rlene\n##JP\n##roots\nComo\nclashed\nmodernization\n1757\nfantasies\n##iating\ndissipated\nSicilian\ninspect\nsensible\nreputed\n##final\nMilford\npoised\nRC\nmetabolic\nTobacco\nMecca\noptimization\n##heat\nlobe\nrabbits\nNAS\ngeologist\n##liner\nKilda\ncarpenter\nnationalists\n##brae\nsummarized\n##venge\nDesigner\nmisleading\nbeamed\n##meyer\nMatrix\nexcuses\n##aines\n##biology\n401\nMoose\ndrafting\nSai\n##ggle\nComprehensive\ndripped\nskate\n##WI\n##enan\n##ruk\nnarrower\noutgoing\n##enter\n##nounce\noverseen\n##structure\ntravellers\nbanging\nscarred\n##thing\n##arra\nEbert\nSometime\n##nated\nBAFTA\nHurricanes\nconfigurations\n##MLL\nimmortality\n##heus\ngothic\n##mpest\nclergyman\nviewpoint\nMaxim\nInstituto\nemitted\nquantitative\n1689\nConsortium\n##rsk\nMeat\nTao\nswimmers\nShaking\nTerence\nmainline\n##linity\nQuantum\n##rogate\nNair\nbanquet\n39th\nreprised\nlagoon\nsubdivisions\nsynonymous\nincurred\npassword\nsprung\n##vere\nCredits\nPetersen\nFaces\n##vu\nstatesman\nZombie\ngesturing\n##going\nSergey\ndormant\npossessive\ntotals\nsouthward\nÁngel\n##odies\nHM\nMariano\nRamirez\nWicked\nimpressions\n##Net\n##cap\n##ème\nTransformers\nPoker\nRIAA\nRedesignated\n##chuk\nHarcourt\nPeña\nspacious\ntinged\nalternatively\nnarrowing\nBrigham\nauthorization\nMembership\nZeppelin\n##amed\nHandball\nsteer\n##orium\n##rnal\n##rops\nCommittees\nendings\n##MM\n##yung\nejected\ngrams\n##relli\nBirch\nHilary\nStadion\norphan\nclawed\n##kner\nMotown\nWilkins\nballads\noutspoken\n##ancipation\n##bankment\n##cheng\nAdvances\nharvested\nnovelty\nineligible\noversees\n##´s\nobeyed\ninevitably\nKingdoms\nburying\nFabian\nrelevance\nTat
iana\n##MCA\nsarcastic\n##onda\nAkron\n229\nsandwiches\nAdobe\nMaddox\n##azar\nHunting\n##onized\nSmiling\n##tology\nJuventus\nLeroy\nPoets\nattach\nlo\n##rly\n##film\nStructure\n##igate\nolds\nprojections\nSMS\noutnumbered\n##tase\njudiciary\nparamilitary\nplayfully\n##rsing\n##tras\nChico\nVin\ninformally\nabandonment\n##russ\nBaroness\ninjuring\noctagonal\ndeciduous\n##nea\n##olm\nHz\nNorwood\nposes\nMarissa\nalerted\nwilled\n##KS\nDino\n##ddler\n##vani\nBarbie\nThankfully\n625\nbicycles\nshimmering\n##tinuum\n##wolf\nChesterfield\n##idy\n##urgency\nKnowles\nsweetly\nVentures\n##ponents\n##valence\nDarryl\nPowerplant\nRAAF\n##pec\nKingsley\nParramatta\npenetrating\nspectacle\n##inia\nMarlborough\nresidual\ncompatibility\nhike\nUnderwood\ndepleted\nministries\n##odus\n##ropriation\nrotting\nFaso\n##inn\nHappiness\nLille\nSuns\ncookie\nrift\nwarmly\n##lvin\nBugs\nGotham\nGothenburg\nProperties\n##seller\n##ubi\nCreated\nMAC\nNoelle\nRequiem\nUlysses\n##ails\nfranchises\n##icious\n##rwick\ncelestial\nkinetic\n720\nSTS\ntransmissions\namplitude\nforums\nfreeing\nreptiles\ntumbling\n##continent\n##rising\n##tropy\nphysiology\n##uster\nLoves\nbodied\nneutrality\nNeumann\nassessments\nVicky\n##hom\nhampered\n##uku\nCustom\ntimed\n##eville\n##xious\nelastic\n##section\nrig\nstilled\nshipment\n243\nartworks\nboulders\nBournemouth\n##hly\n##LF\n##linary\nrumored\n##bino\n##drum\nChun\nFreiburg\n##dges\nEquality\n252\nGuadalajara\n##sors\n##taire\nRoach\ncramped\n##ultural\nLogistics\nPunch\nfines\nLai\ncaravan\n##55\nlame\nCollector\npausing\n315\nmigrant\nhawk\nsignalling\n##erham\n##oughs\nDemons\nsurfing\nRana\ninsisting\nWien\nadolescent\n##jong\n##rera\n##umba\nRegis\nbrushes\n##iman\nresidues\nstorytelling\nConsider\ncontrasting\nregeneration\n##elling\n##hlete\nafforded\nreactors\ncosting\n##biotics\n##gat\n##евич\nchanting\nsecondly\nconfesses\n##ikos\n##uang\n##ronological\n##−\nGiacomo\n##eca\nvaudeville\nweeds\nrejecting\nrevoked\naffluent\nfullback\nprogresses\
ngeologic\nproprietor\nreplication\ngliding\nrecounted\n##bah\n##igma\nFlow\nii\nnewcomer\n##lasp\n##miya\nCandace\nfractured\ninteriors\nconfidential\nInverness\nfooting\n##robe\nCoordinator\nWestphalia\njumper\n##chism\ndormitory\n##gno\n281\nacknowledging\nleveled\n##éra\nAlgiers\nmigrate\nFrog\nRare\n##iovascular\n##urous\nDSO\nnomadic\n##iera\nwoken\nlifeless\n##graphical\n##ifications\nDot\nSachs\ncrow\nnmi\nTacoma\nWeight\nmushroom\nRS\nconditioned\n##zine\nTunisian\naltering\n##mizing\nHandicap\nPatti\nMonsieur\nclicking\ngorge\ninterrupting\n##powerment\ndrawers\nSerra\n##icides\nSpecialist\n##itte\nconnector\nworshipped\n##ask\nconsoles\ntags\n##iler\nglued\n##zac\nfences\nBratislava\nhoneymoon\n313\nA2\ndisposition\nGentleman\nGilmore\nglaciers\n##scribed\nCalhoun\nconvergence\nAleppo\nshortages\n##43\n##orax\n##worm\n##codes\n##rmal\nneutron\n##ossa\nBloomberg\nSalford\nperiodicals\n##ryan\nSlayer\n##ynasties\ncredentials\n##tista\nsurveyor\nFile\nstinging\nunnoticed\nMedici\necstasy\nespionage\nJett\nLeary\ncirculating\nbargaining\nconcerto\nserviced\n37th\nHK\n##fueling\nDelilah\nMarcia\ngraded\n##join\nKaplan\nfeasible\n##nale\n##yt\nBurnley\ndreadful\nministerial\nBrewster\nJudah\n##ngled\n##rrey\nrecycled\nIroquois\nbackstage\nparchment\n##numbered\nKern\nMotorsports\nOrganizations\n##mini\nSeems\nWarrington\nDunbar\nEzio\n##eor\nparalyzed\nAra\nyeast\n##olis\ncheated\nreappeared\nbanged\n##ymph\n##dick\nLyndon\nglide\nMat\n##natch\nHotels\nHousehold\nparasite\nirrelevant\nyouthful\n##smic\n##tero\n##anti\n2d\nIgnacio\nsquash\n##nets\nshale\n##اد\nAbrams\n##oese\nassaults\n##dier\n##otte\nSwamp\n287\nSpurs\n##economic\nFargo\nauditioned\n##mé\nHaas\nune\nabbreviation\nTurkic\n##tisfaction\nfavorites\nspecials\n##lial\nEnlightenment\nBurkina\n##vir\nComparative\nLacrosse\nelves\n##lerical\n##pear\nBorders\ncontrollers\n##villa\nexcelled\n##acher\n##varo\ncamouflage\nperpetual\n##ffles\ndevoid\nschooner\n##bered\n##oris\nGibbons\nLia\ndiscouraged\nsue
\n##gnition\nExcellent\nLayton\nnoir\nsmack\n##ivable\n##evity\n##lone\nMyra\nweaken\nweaponry\n##azza\nShake\nbackbone\nCertified\nclown\noccupational\ncaller\nenslaved\nsoaking\nWexford\nperceive\nshortlisted\n##pid\nfeminism\nBari\nIndie\n##avelin\n##ldo\nHellenic\nHundreds\nSavings\ncomedies\nHonors\nMohawk\nTold\ncoded\nIncorporated\nhideous\ntrusts\nhose\nCalais\nForster\nGabon\nInternationale\nAK\nColour\n##UM\n##heist\nMcGregor\nlocalized\n##tronomy\nDarrell\n##iara\nsquirrel\nfreaked\n##eking\n##manned\n##ungen\nradiated\n##dua\ncommence\nDonaldson\n##iddle\nMR\nSAS\nTavern\nTeenage\nadmissions\nInstruments\n##ilizer\nKonrad\ncontemplated\n##ductor\nJing\nReacher\nrecalling\nDhabi\nemphasizing\nillumination\n##tony\nlegitimacy\nGoethe\nRitter\nMcDonnell\nPolar\nSeconds\naspiring\nderby\ntunic\n##rmed\noutlines\nChanging\ndistortion\n##cter\nMechanics\n##urly\n##vana\nEgg\nWolverine\nStupid\ncentralized\nknit\n##Ms\nSaratoga\nOgden\nstorylines\n##vres\nlavish\nbeverages\n##grarian\nKyrgyzstan\nforcefully\nsuperb\nElm\nThessaloniki\nfollower\nPlants\nslang\ntrajectory\nNowadays\nBengals\nIngram\nperch\ncoloring\ncarvings\ndoubtful\n##aph\n##gratulations\n##41\nCurse\n253\nnightstand\nCampo\nMeiji\ndecomposition\n##giri\nMcCormick\nYours\n##amon\n##bang\nTexans\ninjunction\norganise\nperiodical\n##peculative\noceans\n##aley\nSuccess\nLehigh\n##guin\n1730\nDavy\nallowance\nobituary\n##tov\ntreasury\n##wayne\neuros\nreadiness\nsystematically\n##stered\n##igor\n##xen\n##cliff\n##lya\nSend\n##umatic\nCeltics\nJudiciary\n425\npropagation\nrebellious\n##ims\n##lut\nDal\n##ayman\n##cloth\nBoise\npairing\nWaltz\ntorment\nHatch\naspirations\ndiaspora\n##hame\nRank\n237\nIncluding\nMuir\nchained\ntoxicity\nUniversité\n##aroo\nMathews\nmeadows\n##bio\nEditing\nKhorasan\n##them\n##ahn\n##bari\n##umes\nevacuate\n##sium\ngram\nkidnap\npinning\n##diation\n##orms\nbeacon\norganising\nMcGrath\n##ogist\nQur\nTango\n##ceptor\n##rud\n##cend\n##cie\n##jas\n##sided\nTuscany\nVentur
e\ncreations\nexhibiting\n##rcerer\n##tten\nButcher\nDivinity\nPet\nWhitehead\nfalsely\nperished\nhandy\nMoines\ncyclists\nsynthesizers\nMortal\nnotoriety\n##ronic\nDialogue\nexpressive\nuk\nNightingale\ngrimly\nvineyards\nDriving\nrelentless\ncompiler\n##district\n##tuated\nHades\nmedicines\nobjection\nAnswer\nSoap\nChattanooga\n##gogue\nHaryana\nParties\nTurtle\n##ferred\nexplorers\nstakeholders\n##aar\n##rbonne\ntempered\nconjecture\n##tee\n##hur\nReeve\nbumper\nstew\n##church\n##generate\n##ilitating\n##chanized\n##elier\n##enne\ntranslucent\n##lows\nPublisher\nevangelical\ninherit\n##rted\n247\nSmackDown\nbitterness\nlesions\n##worked\nmosques\nwed\n##lashes\nNg\nRebels\nbooking\n##nail\nIncident\nSailing\nyo\nconfirms\nChaplin\nbaths\n##kled\nmodernist\npulsing\nCicero\nslaughtered\nboasted\n##losure\nzipper\n##hales\naristocracy\nhalftime\njolt\nunlawful\nMarching\nsustaining\nYerevan\nbracket\nram\nMarkus\n##zef\nbutcher\nmassage\n##quisite\nLeisure\nPizza\ncollapsing\n##lante\ncommentaries\nscripted\n##disciplinary\n##sused\neroded\nalleging\nvase\nChichester\nPeacock\ncommencement\ndice\nhotter\npoisonous\nexecutions\n##occo\nfrost\nfielding\nvendor\nCounts\nTroops\nmaize\nDivisional\nanalogue\nshadowy\nNuevo\nVille\nradiating\nworthless\nAdriatic\nBuy\nblaze\nbrutally\nhorizontally\nlonged\n##matical\nfederally\nRolf\nRoot\nexclude\nrag\nagitation\nLounge\nastonished\n##wirl\nImpossible\ntransformations\n##IVE\n##ceded\n##slav\ndownloaded\nfucked\nEgyptians\nWelles\n##ffington\nU2\nbefriended\nradios\n##jid\narchaic\ncompares\n##ccelerator\n##imated\n##tosis\nHung\nScientists\nThousands\ngeographically\n##LR\nMacintosh\nfluorescent\n##ipur\nWehrmacht\n##BR\n##firmary\nChao\n##ague\nBoyer\n##grounds\n##hism\n##mento\n##taining\ninfancy\n##cton\n510\nBoca\n##loy\n1644\nben\ndong\nstresses\nSweat\nexpressway\ngraders\nochreous\nnets\nLawn\nthirst\nUruguayan\nsatisfactory\n##tracts\nbaroque\nrusty\n##ław\nShen\nGdańsk\nchickens\n##graving\nHodge\nPapal\nSAT\n
bearer\n##ogo\n##rger\nmerits\nCalendar\nHighest\nSkills\n##ortex\nRoberta\nparadigm\nrecounts\nfrigates\nswamps\nunitary\n##oker\nballoons\nHawthorne\nMuse\nspurred\nadvisors\nreclaimed\nstimulate\nfibre\npat\nrepeal\n##dgson\n##iar\n##rana\nanthropologist\ndescends\nflinch\nreared\n##chang\n##eric\n##lithic\ncommissioning\n##cumenical\n##lume\n##rchen\nWolff\n##tsky\nEurasian\nNepali\nNightmare\nZIP\nplayback\n##latz\n##vington\nWarm\n##75\nMartina\nRollins\nSaetan\nVariations\nsorting\n##م\n530\nJoaquin\nPtolemy\nthinner\n##iator\n##pticism\nCebu\nHighlanders\nLinden\nVanguard\n##SV\n##mor\n##ulge\nISSN\ncartridges\nrepression\nÉtienne\n311\nLauderdale\ncommodities\nnull\n##rb\n1720\ngearbox\n##reator\nAng\nForgotten\ndubious\n##rls\n##dicative\n##phate\nGroove\nHerrera\n##çais\nCollections\nMaximus\n##published\nFell\nQualification\nfiltering\n##tized\nRoe\nhazards\n##37\n##lative\n##tröm\nGuadalupe\nTajikistan\nPreliminary\nfronted\nglands\n##paper\n##iche\n##iding\nCairns\nrallies\nLocation\nseduce\n##mple\nBYU\n##itic\n##FT\nCarmichael\nPrentice\nsongwriters\nforefront\nPhysicians\n##rille\n##zee\nPreparatory\n##cherous\nUV\n##dized\nNavarro\nmisses\n##nney\nInland\nresisting\n##sect\nHurt\n##lino\ngalaxies\n##raze\nInstitutions\ndevote\n##lamp\n##ciating\nbaron\n##bracing\nHess\noperatic\n##CL\n##ος\nChevalier\nGuiana\n##lattered\nFed\n##cuted\n##smo\nSkull\ndenies\n236\nWaller\n##mah\nSakura\nmole\nnominate\nsermons\n##bering\nwidowed\n##röm\nCavendish\n##struction\nNehru\nRevelation\ndoom\nGala\nbaking\nNr\nYourself\nbanning\nIndividuals\nSykes\norchestrated\n630\nPhone\nsteered\n620\nspecialising\nstarvation\n##AV\n##alet\n##upation\nseductive\n##jects\n##zure\nTolkien\nBenito\nWizards\nSubmarine\ndictator\nDuo\nCaden\napprox\nbasins\n##nc\nshrink\n##icles\n##sponsible\n249\nmit\noutpost\n##bayashi\n##rouse\n##tl\nJana\nLombard\nRBIs\nfinalized\nhumanities\n##function\nHonorable\ntomato\n##iot\nPie\ntee\n##pect\nBeaufort\nFerris\nbucks\n##graduate\n##ocyt
es\nDirectory\nanxiously\n##nating\nflanks\n##Ds\nvirtues\n##believable\nGrades\ncriterion\nmanufactures\nsourced\n##balt\n##dance\n##tano\nYing\n##BF\n##sett\nadequately\nblacksmith\ntotaled\ntrapping\nexpanse\nHistoria\nWorker\nSense\nascending\nhousekeeper\n##oos\nCrafts\nResurrection\n##verty\nencryption\n##aris\n##vat\n##pox\n##runk\n##iability\ngazes\nspying\n##ths\nhelmets\nwired\n##zophrenia\nCheung\nWR\ndownloads\nstereotypes\n239\nLucknow\nbleak\nBragg\nhauling\n##haft\nprohibit\n##ermined\n##castle\nbarony\n##hta\nTyphoon\nantibodies\n##ascism\nHawthorn\nKurdistan\nMinority\nGorge\nHerr\nappliances\ndisrupt\nDrugs\nLazarus\n##ilia\n##ryo\n##tany\nGotta\nMasovian\nRoxy\nchoreographed\n##rissa\nturbulent\n##listed\nAnatomy\nexiting\n##det\n##isław\n580\nKaufman\nsage\n##apa\nSymposium\n##rolls\nKaye\n##ptera\n##rocław\njerking\n##menclature\nGuo\nM1\nresurrected\ntrophies\n##lard\nGathering\nnestled\nserpent\nDow\nreservoirs\nClaremont\narbitration\nchronicle\neki\n##arded\n##zers\n##mmoth\nCongregational\nAstronomical\nNE\nRA\nRobson\nScotch\nmodelled\nslashed\n##imus\nexceeds\n##roper\n##utile\nLaughing\nvascular\nsuperficial\n##arians\nBarclay\nCaucasian\nclassmate\nsibling\nKimberly\nShreveport\n##ilde\n##liche\nCheney\nDeportivo\nVeracruz\nberries\n##lase\nBed\nMI\nAnatolia\nMindanao\nbroadband\n##olia\n##arte\n##wab\ndarts\n##immer\n##uze\nbelievers\nordinance\nviolate\n##wheel\n##ynth\nAlongside\nCoupe\nHobbs\narrondissement\nearl\ntownland\n##dote\n##lihood\n##sla\nGhosts\nmidfield\npulmonary\n##eno\ncues\n##gol\n##zda\n322\nSiena\nSultanate\nBradshaw\nPieter\n##thical\nRaceway\nbared\ncompetence\n##ssent\nBet\n##urer\n##ła\nAlistair\nGöttingen\nappropriately\nforge\n##osterone\n##ugen\nDL\n345\nconvoys\ninventions\n##resses\n##cturnal\nFay\nIntegration\nslash\n##roats\nWidow\nbarking\n##fant\n1A\nHooper\n##cona\n##runched\nunreliable\n##emont\n##esign\n##stabulary\n##stop\nJournalists\nbony\n##iba\n##trata\n##ège\nhorrific\n##bish\nJocelyn\n##rmon\
n##apon\n##cier\ntrainers\n##ulatory\n1753\nBR\ncorpus\nsynthesized\n##bidden\n##rafford\nElgin\n##entry\nDoherty\nclockwise\n##played\nspins\n##ample\n##bley\nCope\nconstructions\nseater\nwarlord\nVoyager\ndocumenting\nfairies\n##viator\nLviv\njewellery\nsuites\n##gold\nMaia\nNME\n##eavor\n##kus\nEugène\nfurnishings\n##risto\nMCC\nMetropolis\nOlder\nTelangana\n##mpus\namplifier\nsupervising\n1710\nbuffalo\ncushion\nterminating\n##powering\nsteak\nQuickly\ncontracting\ndem\nsarcastically\nElsa\n##hein\nbastards\nnarratives\nTakes\n304\ncomposure\ntyping\nvariance\n##ifice\nSoftball\n##rations\nMcLaughlin\ngaped\nshrines\n##hogany\nGlamorgan\n##icle\n##nai\n##ntin\nFleetwood\nWoodland\n##uxe\nfictitious\nshrugs\n##iper\nBWV\nconform\n##uckled\nLaunch\n##ductory\n##mized\nTad\n##stituted\n##free\nBel\nChávez\nmessing\nquartz\n##iculate\n##folia\n##lynn\nushered\n##29\n##ailing\ndictated\nPony\n##opsis\nprecinct\n802\nPlastic\n##ughter\n##uno\n##porated\nDenton\nMatters\nSPD\nhating\n##rogen\nEssential\nDeck\nDortmund\nobscured\n##maging\nEarle\n##bred\n##ittle\n##ropolis\nsaturated\n##fiction\n##ression\nPereira\nVinci\nmute\nwarehouses\n##ún\nbiographies\n##icking\nsealing\n##dered\nexecuting\npendant\n##wives\nmurmurs\n##oko\nsubstrates\nsymmetrical\nSusie\n##mare\nYusuf\nanalogy\n##urage\nLesley\nlimitation\n##rby\n##ío\ndisagreements\n##mise\nembroidered\nnape\nunarmed\nSumner\nStores\ndwell\nWilcox\ncreditors\n##rivatization\n##shes\n##amia\ndirects\nrecaptured\nscouting\nMcGuire\ncradle\n##onnell\nSato\ninsulin\nmercenary\ntolerant\nMacquarie\ntransitions\ncradled\n##berto\n##ivism\n##yotes\nFF\nKe\nReach\n##dbury\n680\n##bill\n##oja\n##sui\nprairie\n##ogan\nreactive\n##icient\n##rits\nCyclone\nSirius\nSurvival\nPak\n##coach\n##trar\nhalves\nAgatha\nOpus\ncontrasts\n##jection\nominous\n##iden\nBaylor\nWoodrow\nduct\nfortification\nintercourse\n##rois\nColbert\nenvy\n##isi\nAfterward\ngeared\n##flections\naccelerate\n##lenching\nWitness\n##rrer\nAngelina\nMateria
l\nassertion\nmisconduct\nNix\ncringed\ntingling\n##eti\n##gned\nEverest\ndisturb\nsturdy\n##keepers\n##vied\nProfile\nheavenly\n##kova\n##victed\ntranslating\n##sses\n316\nInvitational\nMention\nmartyr\n##uristic\nBarron\nhardness\nNakamura\n405\nGenevieve\nreflections\n##falls\njurist\n##LT\nPyramid\n##yme\nShoot\nheck\nlinguist\n##tower\nIves\nsuperiors\n##leo\nAchilles\n##phological\nChristophe\nPadma\nprecedence\ngrassy\nOral\nresurrection\n##itting\nclumsy\n##lten\n##rue\nhuts\n##stars\nEqual\n##queduct\nDevin\nGaga\ndiocesan\n##plating\n##upe\n##graphers\nPatch\nScream\nhail\nmoaning\ntracts\n##hdi\nExamination\noutsider\n##ergic\n##oter\nArchipelago\nHavilland\ngreenish\ntilting\nAleksandr\nKonstantin\nwarship\n##emann\n##gelist\n##ought\nbillionaire\n##blivion\n321\nHungarians\ntransplant\n##jured\n##fters\nCorbin\nautism\npitchers\nGarner\nthence\nScientology\ntransitioned\nintegrating\nrepetitive\n##dant\nRene\nvomit\n##burne\n1661\nResearchers\nWallis\ninsulted\nwavy\n##wati\nEwing\nexcitedly\n##kor\nfrescoes\ninjustice\n##achal\n##lumber\n##úl\nnovella\n##sca\nLiv\n##enstein\n##river\nmonstrous\ntopping\ndownfall\nlooming\nsinks\ntrillion\n##pont\nEffect\n##phi\n##urley\nSites\ncatchment\n##H1\nHopper\n##raiser\n1642\nMaccabi\nlance\n##chia\n##sboro\nNSA\nbranching\nretorted\ntensor\nImmaculate\ndrumming\nfeeder\n##mony\nDyer\nhomicide\nTemeraire\nfishes\nprotruding\nskins\norchards\n##nso\ninlet\nventral\n##finder\nAsiatic\nSul\n1688\nMelinda\nassigns\nparanormal\ngardening\nTau\ncalming\n##inge\n##crow\nregimental\nNik\nfastened\ncorrelated\n##gene\n##rieve\nSick\n##minster\n##politan\nhardwood\nhurled\n##ssler\nCinematography\nrhyme\nMontenegrin\nPackard\ndebating\n##itution\nHelens\nTrick\nMuseums\ndefiance\nencompassed\n##EE\n##TU\n##nees\n##uben\n##ünster\n##nosis\n435\nHagen\ncinemas\nCorbett\ncommended\n##fines\n##oman\nbosses\nripe\nscraping\n##loc\nfilly\nSaddam\npointless\nFaust\nOrléans\nSyriac\n##♭\nlongitude\n##ropic\nAlfa\nbliss\ngangster
\n##ckling\nSL\nblending\n##eptide\n##nner\nbends\nescorting\n##bloid\n##quis\nburials\n##sle\n##è\nAmbulance\ninsults\n##gth\nAntrim\nunfolded\n##missible\nsplendid\nCure\nwarily\nSaigon\nWaste\nastonishment\nboroughs\n##VS\n##dalgo\n##reshing\n##usage\nrue\nmarital\nversatile\nunpaid\nallotted\nbacterium\n##coil\n##cue\nDorothea\nIDF\n##location\n##yke\nRPG\n##tropical\ndevotees\nliter\n##pree\nJohnstone\nastronaut\nattends\npollen\nperiphery\ndoctrines\nmeta\nshowered\n##tyn\nGO\nHuh\nlaude\n244\nAmar\nChristensen\nPing\nPontifical\nAusten\nraiding\nrealities\n##dric\nurges\n##dek\nCambridgeshire\n##otype\nCascade\nGreenberg\nPact\n##cognition\n##aran\n##urion\nRiot\nmimic\nEastwood\n##imating\nreversal\n##blast\n##henian\nPitchfork\n##sunderstanding\nStaten\nWCW\nlieu\n##bard\n##sang\nexperimenting\nAquino\n##lums\nTNT\nHannibal\ncatastrophic\n##lsive\n272\n308\n##otypic\n41st\nHighways\naggregator\n##fluenza\nFeatured\nReece\ndispatch\nsimulated\n##BE\nCommunion\nVinnie\nhardcover\ninexpensive\ntil\n##adores\ngroundwater\nkicker\nblogs\nfrenzy\n##wala\ndealings\nerase\nAnglia\n##umour\nHapoel\nMarquette\n##raphic\n##tives\nconsult\natrocities\nconcussion\n##érard\nDecree\nethanol\n##aen\nRooney\n##chemist\n##hoot\n1620\nmenacing\nSchuster\n##bearable\nlaborers\nsultan\nJuliana\nerased\nonstage\n##ync\nEastman\n##tick\nhushed\n##yrinth\nLexie\nWharton\nLev\n##PL\nTesting\nBangladeshi\n##bba\n##usions\ncommunicated\nintegers\ninternship\nsocietal\n##odles\nLoki\nET\nGhent\nbroadcasters\nUnix\n##auer\nKildare\nYamaha\n##quencing\n##zman\nchilled\n##rapped\n##uant\nDuval\nsentiments\nOliveira\npackets\nHorne\n##rient\nHarlan\nMirage\ninvariant\n##anger\n##tensive\nflexed\nsweetness\n##wson\nalleviate\ninsulting\nlimo\nHahn\n##llars\n##hesia\n##lapping\nbuys\n##oaming\nmocked\npursuits\nscooted\n##conscious\n##ilian\nBallad\njackets\n##kra\nhilly\n##cane\nScenic\nMcGraw\nsilhouette\nwhipping\n##roduced\n##wark\n##chess\n##rump\nLemon\ncalculus\ndemonic\n##latine\nBh
aratiya\nGovt\nQue\nTrilogy\nDucks\nSuit\nstairway\n##ceipt\nIsa\nregulator\nAutomobile\nflatly\n##buster\n##lank\nSpartans\ntopography\nTavi\nusable\nChartered\nFairchild\n##sance\n##vyn\nDigest\nnuclei\ntyphoon\n##llon\nAlvarez\nDJs\nGrimm\nauthoritative\nfirearm\n##chschule\nOrigins\nlair\nunmistakable\n##xial\n##cribing\nMouth\n##genesis\n##shū\n##gaon\n##ulter\nJaya\nNeck\n##UN\n##oing\n##static\nrelativity\n##mott\n##utive\n##esan\n##uveau\nBT\nsalts\n##roa\nDustin\npreoccupied\nNovgorod\n##asus\nMagnum\ntempting\n##histling\n##ilated\nMusa\n##ghty\nAshland\npubs\nroutines\n##etto\nSoto\n257\nFeaturing\nAugsburg\n##alaya\nBit\nloomed\nexpects\n##abby\n##ooby\nAuschwitz\nPendleton\nvodka\n##sent\nrescuing\nsystemic\n##inet\n##leg\nYun\napplicant\nrevered\n##nacht\n##ndas\nMuller\ncharacterization\n##patient\n##roft\nCarole\n##asperated\nAmiga\ndisconnected\ngel\n##cologist\nPatriotic\nrallied\nassign\nveterinary\ninstalling\n##cedural\n258\nJang\nParisian\nincarcerated\nstalk\n##iment\nJamal\nMcPherson\nPalma\n##oken\n##viation\n512\nRourke\nirrational\n##rippled\nDevlin\nerratic\n##NI\n##payers\nNi\nengages\nPortal\naesthetics\n##rrogance\nMilne\nassassins\n##rots\n335\n385\nCambodian\nFemales\nfellows\nsi\n##block\n##otes\nJayne\nToro\nflutter\n##eera\nBurr\n##lanche\nrelaxation\n##fra\nFitzroy\n##undy\n1751\n261\ncomb\nconglomerate\nribbons\nveto\n##Es\ncasts\n##ege\n1748\nAres\nspears\nspirituality\ncomet\n##nado\n##yeh\nVeterinary\naquarium\nyer\nCouncils\n##oked\n##ynamic\nMalmö\nremorse\nauditions\ndrilled\nHoffmann\nMoe\nNagoya\nYacht\n##hakti\n##race\n##rrick\nTalmud\ncoordinating\n##EI\n##bul\n##his\n##itors\n##ligent\n##uerra\nNarayan\ngoaltender\ntaxa\n##asures\nDet\n##mage\nInfinite\nMaid\nbean\nintriguing\n##cription\ngasps\nsocket\n##mentary\n##reus\nsewing\ntransmitting\n##different\n##furbishment\n##traction\nGrimsby\nsprawling\nShipyard\n##destine\n##hropic\n##icked\ntrolley\n##agi\n##lesh\nJosiah\ninvasions\nContent\nfirefighters\nintro\nLuci
fer\nsubunit\nSahib\nMyrtle\ninhibitor\nmaneuvers\n##teca\nWrath\nslippery\n##versing\nShoes\n##dial\n##illiers\n##luded\n##mmal\n##pack\nhandkerchief\n##edestal\n##stones\nFusion\ncumulative\n##mell\n##cacia\n##rudge\n##utz\nfoe\nstoring\nswiped\n##meister\n##orra\nbatter\nstrung\n##venting\n##kker\nDoo\nTaste\nimmensely\nFairbanks\nJarrett\nBoogie\n1746\nmage\nKick\nlegislators\nmedial\n##ilon\n##logies\n##ranton\nHybrid\n##uters\nTide\ndeportation\nMetz\n##secration\n##virus\nUFO\n##fell\n##orage\n##raction\n##rrigan\n1747\nfabricated\n##BM\n##GR\n##rter\nmuttering\ntheorist\n##tamine\nBMG\nKincaid\nsolvent\n##azed\nThin\nadorable\nWendell\nta\n##viour\npulses\n##pologies\ncounters\nexposition\nsewer\nLuciano\nClancy\n##angelo\n##riars\nShowtime\nobserves\nfrankly\n##oppy\nBergman\nlobes\ntimetable\n##bri\n##uest\nFX\n##dust\n##genus\nGlad\nHelmut\nMeridian\n##besity\n##ontaine\nRevue\nmiracles\n##titis\nPP\nbluff\nsyrup\n307\nMessiah\n##erne\ninterfering\npicturesque\nunconventional\ndipping\nhurriedly\nKerman\n248\nEthnic\nToward\nacidic\nHarrisburg\n##65\nintimidating\n##aal\nJed\nPontiac\nmunitions\n##nchen\ngrowling\nmausoleum\n##ération\n##wami\nCy\naerospace\ncaucus\nDoing\n##around\n##miring\nCuthbert\n##poradic\n##rovisation\n##wth\nevaluating\n##scraper\nBelinda\nowes\n##sitic\n##thermal\n##fast\neconomists\n##lishing\n##uerre\n##ân\ncredible\n##koto\nFourteen\ncones\n##ebrates\nbookstore\ntowels\n##phony\nAppearance\nnewscasts\n##olin\nKarin\nBingham\n##elves\n1680\n306\ndisks\n##lston\n##secutor\nLevant\n##vout\nMicro\nsnuck\n##ogel\n##racker\nExploration\ndrastic\n##kening\nElsie\nendowment\n##utnant\nBlaze\n##rrosion\nleaking\n45th\n##rug\n##uernsey\n760\nShapiro\ncakes\n##ehan\n##mei\n##ité\n##kla\nrepetition\nsuccessively\nFriendly\nÎle\nKoreans\nAu\nTirana\nflourish\nSpirits\nYao\nreasoned\n##leam\nConsort\ncater\nmarred\nordeal\nsupremacy\n##ritable\nPaisley\neuro\nhealer\nportico\nwetland\n##kman\nrestart\n##habilitation\n##zuka\n##Script\nempt
iness\ncommunion\n##CF\n##inhabited\n##wamy\nCasablanca\npulsed\n##rrible\n##safe\n395\nDual\nTerrorism\n##urge\n##found\n##gnolia\nCourage\npatriarch\nsegregated\nintrinsic\n##liography\n##phe\nPD\nconvection\n##icidal\nDharma\nJimmie\ntexted\nconstituents\ntwitch\n##calated\n##mitage\n##ringing\n415\nmilling\n##geons\nArmagh\nGeometridae\nevergreen\nneedy\nreflex\ntemplate\n##pina\nSchubert\n##bruck\n##icted\n##scher\n##wildered\n1749\nJoanne\nclearer\n##narl\n278\nPrint\nautomation\nconsciously\nflashback\noccupations\n##ests\nCasimir\ndifferentiated\npolicing\nrepay\n##aks\n##gnesium\nEvaluation\ncommotion\n##CM\n##smopolitan\nClapton\nmitochondrial\nKobe\n1752\nIgnoring\nVincenzo\nWet\nbandage\n##rassed\n##unate\nMaris\n##eted\n##hetical\nfiguring\n##eit\n##nap\nleopard\nstrategically\n##reer\nFen\nIain\n##ggins\n##pipe\nMatteo\nMcIntyre\n##chord\n##feng\nRomani\nasshole\nflopped\nreassure\nFounding\nStyles\nTorino\npatrolling\n##erging\n##ibrating\n##ructural\nsincerity\n##ät\n##teacher\nJuliette\n##cé\n##hog\n##idated\n##span\nWinfield\n##fender\n##nast\n##pliant\n1690\nBai\nJe\nSaharan\nexpands\nBolshevik\nrotate\n##root\nBritannia\nSevern\n##cini\n##gering\n##say\nsly\nSteps\ninsertion\nrooftop\nPiece\ncuffs\nplausible\n##zai\nProvost\nsemantic\n##data\n##vade\n##cimal\nIPA\nindictment\nLibraries\nflaming\nhighlands\nliberties\n##pio\nElders\naggressively\n##pecific\nDecision\npigeon\nnominally\ndescriptive\nadjustments\nequestrian\nheaving\n##mour\n##dives\n##fty\n##yton\nintermittent\n##naming\n##sets\nCalvert\nCasper\nTarzan\n##kot\nRamírez\n##IB\n##erus\nGustavo\nRoller\nvaulted\n##solation\n##formatics\n##tip\nHunger\ncolloquially\nhandwriting\nhearth\nlauncher\n##idian\n##ilities\n##lind\n##locating\nMagdalena\nSoo\nclubhouse\n##kushima\n##ruit\nBogotá\nOrganic\nWorship\n##Vs\n##wold\nupbringing\n##kick\ngroundbreaking\n##urable\n##ván\nrepulsed\n##dira\n##ditional\n##ici\nmelancholy\n##bodied\n##cchi\n404\nconcurrency\nH₂O\nbouts\n##gami\n288\nLeto\n
troll\n##lak\nadvising\nbundled\n##nden\nlipstick\nlittered\n##leading\n##mogeneous\nExperiment\nNikola\ngrove\n##ogram\nMace\n##jure\ncheat\nAnnabelle\nTori\nlurking\nEmery\nWalden\n##riz\npaints\nMarkets\nbrutality\noverrun\n##agu\n##sat\ndin\nostensibly\nFielding\nflees\n##eron\nPound\nornaments\ntornadoes\n##nikov\n##organisation\n##reen\n##Works\n##ldred\n##olten\n##stillery\nsoluble\nMata\nGrimes\nLéon\n##NF\ncoldly\npermitting\n##inga\n##reaked\nAgents\nhostess\n##dl\nDyke\nKota\navail\norderly\n##saur\n##sities\nArroyo\n##ceps\n##egro\nHawke\nNoctuidae\nhtml\nseminar\n##ggles\n##wasaki\nClube\nrecited\n##sace\nAscension\nFitness\ndough\n##ixel\nNationale\n##solidate\npulpit\nvassal\n570\nAnnapolis\nbladder\nphylogenetic\n##iname\nconvertible\n##ppan\nComet\npaler\n##definite\nSpot\n##dices\nfrequented\nApostles\nslalom\n##ivision\n##mana\n##runcated\nTrojan\n##agger\n##iq\n##league\nConcept\nController\n##barian\n##curate\n##spersed\n##tring\nengulfed\ninquired\n##hmann\n286\n##dict\n##osy\n##raw\nMacKenzie\nsu\n##ienced\n##iggs\n##quitaine\nbisexual\n##noon\nrunways\nsubsp\n##!\n##\"\n###\n##$\n##%\n##&\n##'\n##(\n##)\n##*\n##+\n##,\n##-\n##.\n##/\n##:\n##;\n##<\n##=\n##>\n##?\n##@\n##[\n##\\\n##]\n##^\n##_\n##`\n##{\n##|\n##}\n##~\n##¡\n##¢\n##£\n##¥\n##§\n##¨\n##©\n##ª\n##«\n##¬\n##®\n##±\n##´\n##µ\n##¶\n##·\n##¹\n##º\n##»\n##¼\n##¾\n##¿\n##À\n##Á\n##Â\n##Ä\n##Å\n##Æ\n##Ç\n##È\n##É\n##Í\n##Î\n##Ñ\n##Ó\n##Ö\n##×\n##Ø\n##Ú\n##Ü\n##Þ\n##â\n##ã\n##æ\n##ç\n##î\n##ï\n##ð\n##ñ\n##ô\n##õ\n##÷\n##û\n##þ\n##ÿ\n##Ā\n##ą\n##Ć\n##Č\n##ď\n##Đ\n##đ\n##ē\n##ė\n##ę\n##ě\n##ğ\n##ġ\n##Ħ\n##ħ\n##ĩ\n##Ī\n##İ\n##ļ\n##Ľ\n##ľ\n##Ł\n##ņ\n##ň\n##ŋ\n##Ō\n##ŏ\n##ő\n##Œ\n##œ\n##ř\n##Ś\n##ś\n##Ş\n##Š\n##Ţ\n##ţ\n##ť\n##ũ\n##ŭ\n##ů\n##ű\n##ų\n##ŵ\n##ŷ\n##ź\n##Ż\n##ż\n##Ž\n##ž\n##Ə\n##ƒ\n##ơ\n##ư\n##ǎ\n##ǐ\n##ǒ\n##ǔ\n##ǫ\n##Ș\n##Ț\n##ț\n##ɐ\n##ɑ\n##ɔ\n##ɕ\n##ə\n##ɛ\n##ɡ\n##ɣ\n##ɨ\n##ɪ\n##ɲ\n##ɾ\n##ʀ\n##ʁ\n##ʂ\n##ʃ\n##ʊ\n##ʋ\n##ʌ\n##ʐ\n##ʑ\n##ʒ\n##ʔ\n##ʰ\n##ʲ\n##ʳ\n##ʷ\n##ʻ\n##ʼ\n##ʾ\n##ʿ
\n##ˈ\n##ː\n##ˡ\n##ˢ\n##ˣ\n##́\n##̃\n##̍\n##̯\n##͡\n##Α\n##Β\n##Γ\n##Δ\n##Ε\n##Η\n##Θ\n##Ι\n##Κ\n##Λ\n##Μ\n##Ν\n##Ο\n##Π\n##Σ\n##Τ\n##Φ\n##Χ\n##Ψ\n##Ω\n##ά\n##έ\n##ή\n##ί\n##β\n##γ\n##δ\n##ε\n##ζ\n##η\n##θ\n##ι\n##κ\n##λ\n##μ\n##ξ\n##ο\n##π\n##ρ\n##σ\n##τ\n##υ\n##φ\n##χ\n##ψ\n##ω\n##ό\n##ύ\n##ώ\n##І\n##Ј\n##А\n##Б\n##В\n##Г\n##Д\n##Е\n##Ж\n##З\n##И\n##К\n##Л\n##М\n##Н\n##О\n##П\n##Р\n##С\n##Т\n##У\n##Ф\n##Х\n##Ц\n##Ч\n##Ш\n##Э\n##Ю\n##Я\n##б\n##в\n##г\n##д\n##ж\n##з\n##к\n##л\n##м\n##п\n##с\n##т\n##у\n##ф\n##х\n##ц\n##ч\n##ш\n##щ\n##ъ\n##ы\n##ь\n##э\n##ю\n##ё\n##і\n##ї\n##ј\n##њ\n##ћ\n##Ա\n##Հ\n##ա\n##ե\n##ի\n##կ\n##մ\n##յ\n##ն\n##ո\n##ս\n##տ\n##ր\n##ւ\n##ְ\n##ִ\n##ֵ\n##ֶ\n##ַ\n##ָ\n##ֹ\n##ּ\n##א\n##ב\n##ג\n##ד\n##ה\n##ו\n##ז\n##ח\n##ט\n##י\n##כ\n##ל\n##ם\n##מ\n##ן\n##נ\n##ס\n##ע\n##פ\n##צ\n##ק\n##ר\n##ש\n##ת\n##،\n##ء\n##آ\n##أ\n##إ\n##ئ\n##ا\n##ب\n##ت\n##ث\n##ج\n##ح\n##خ\n##ذ\n##ز\n##س\n##ش\n##ص\n##ض\n##ط\n##ظ\n##ع\n##غ\n##ف\n##ق\n##ك\n##ل\n##و\n##ى\n##َ\n##ِ\n##ٹ\n##پ\n##چ\n##ک\n##گ\n##ہ\n##ی\n##ے\n##ं\n##आ\n##क\n##ग\n##च\n##ज\n##ण\n##त\n##द\n##ध\n##न\n##प\n##ब\n##भ\n##म\n##य\n##र\n##ल\n##व\n##श\n##ष\n##स\n##ह\n##ा\n##ि\n##ी\n##ु\n##े\n##ो\n##्\n##।\n##॥\n##আ\n##ই\n##এ\n##ও\n##ক\n##খ\n##গ\n##চ\n##ছ\n##জ\n##ট\n##ত\n##থ\n##দ\n##ধ\n##ন\n##প\n##ব\n##ম\n##য\n##র\n##ল\n##শ\n##স\n##হ\n##়\n##া\n##ি\n##ী\n##ু\n##ে\n##ো\n##্\n##য়\n##க\n##த\n##ப\n##ம\n##ய\n##ர\n##ல\n##வ\n##ா\n##ி\n##ு\n##்\n##ร\n##་\n##ག\n##ང\n##ད\n##ན\n##བ\n##མ\n##ར\n##ལ\n##ས\n##ི\n##ུ\n##ེ\n##ོ\n##ა\n##ე\n##ი\n##ლ\n##ნ\n##ო\n##რ\n##ს\n##ᴬ\n##ᴵ\n##ᵀ\n##ᵃ\n##ᵇ\n##ᵈ\n##ᵉ\n##ᵍ\n##ᵏ\n##ᵐ\n##ᵒ\n##ᵖ\n##ᵗ\n##ᵘ\n##ᵣ\n##ᵤ\n##ᵥ\n##ᶜ\n##ᶠ\n##ḍ\n##Ḥ\n##ḥ\n##Ḩ\n##ḩ\n##ḳ\n##ṃ\n##ṅ\n##ṇ\n##ṛ\n##ṣ\n##ṭ\n##ạ\n##ả\n##ấ\n##ầ\n##ẩ\n##ậ\n##ắ\n##ế\n##ề\n##ể\n##ễ\n##ệ\n##ị\n##ọ\n##ố\n##ồ\n##ổ\n##ộ\n##ớ\n##ờ\n##ợ\n##ụ\n##ủ\n##ứ\n##ừ\n##ử\n##ữ\n##ự\n##ỳ\n##ỹ\n##ἀ\n##ἐ\n##ὁ\n##ὐ\n##ὰ\n##ὶ\n##ὸ\n##ῆ\n##ῖ\n##ῦ\n##ῶ\n##‐\n##‑\n##‒\n##–\n##—\n##―\n##‖\n##‘\n##’\n##‚\n##“\n##”\n##„\n##†\n##‡\n##•\n##…\n##‰\n##′\n##″\n##⁄\n##⁰\n##ⁱ\n##⁴
\n##⁵\n##⁶\n##⁷\n##⁸\n##⁹\n##⁻\n##ⁿ\n##₅\n##₆\n##₇\n##₈\n##₉\n##₊\n##₍\n##₎\n##ₐ\n##ₑ\n##ₒ\n##ₓ\n##ₕ\n##ₖ\n##ₘ\n##ₚ\n##ₛ\n##ₜ\n##₤\n##€\n##₱\n##₹\n##ℓ\n##№\n##ℝ\n##⅓\n##←\n##↑\n##→\n##↔\n##⇌\n##⇒\n##∂\n##∈\n##∗\n##∘\n##√\n##∞\n##∧\n##∨\n##∩\n##∪\n##≈\n##≠\n##≡\n##≤\n##≥\n##⊂\n##⊆\n##⊕\n##⋅\n##─\n##│\n##■\n##●\n##★\n##☆\n##☉\n##♠\n##♣\n##♥\n##♦\n##♯\n##⟨\n##⟩\n##ⱼ\n##、\n##。\n##《\n##》\n##「\n##」\n##『\n##』\n##〜\n##い\n##う\n##え\n##お\n##か\n##き\n##く\n##け\n##こ\n##さ\n##し\n##す\n##せ\n##そ\n##た\n##ち\n##つ\n##て\n##と\n##な\n##に\n##の\n##は\n##ひ\n##ま\n##み\n##む\n##め\n##も\n##や\n##ゆ\n##よ\n##ら\n##り\n##る\n##れ\n##ん\n##ア\n##ィ\n##イ\n##ウ\n##エ\n##オ\n##カ\n##ガ\n##キ\n##ク\n##グ\n##コ\n##サ\n##シ\n##ジ\n##ス\n##ズ\n##タ\n##ダ\n##ッ\n##テ\n##デ\n##ト\n##ド\n##ナ\n##ニ\n##ハ\n##バ\n##パ\n##フ\n##ブ\n##プ\n##マ\n##ミ\n##ム\n##ャ\n##ュ\n##ラ\n##リ\n##ル\n##レ\n##ロ\n##ン\n##・\n##ー\n##一\n##三\n##上\n##下\n##中\n##事\n##二\n##井\n##京\n##人\n##亻\n##仁\n##佐\n##侍\n##光\n##公\n##力\n##北\n##十\n##南\n##原\n##口\n##史\n##司\n##吉\n##同\n##和\n##囗\n##国\n##國\n##土\n##城\n##士\n##大\n##天\n##太\n##夫\n##女\n##子\n##宀\n##安\n##宮\n##宿\n##小\n##尚\n##山\n##島\n##川\n##州\n##平\n##年\n##心\n##愛\n##戸\n##文\n##新\n##方\n##日\n##明\n##星\n##書\n##月\n##木\n##本\n##李\n##村\n##東\n##松\n##林\n##正\n##武\n##氏\n##水\n##氵\n##江\n##河\n##海\n##版\n##犬\n##王\n##生\n##田\n##白\n##皇\n##省\n##真\n##石\n##社\n##神\n##竹\n##美\n##義\n##花\n##藤\n##西\n##谷\n##車\n##辶\n##道\n##郎\n##郡\n##部\n##野\n##金\n##長\n##門\n##陽\n##青\n##食\n##馬\n##高\n##龍\n##龸\n##사\n##씨\n##의\n##이\n##한\n##ﬁ\n##ﬂ\n##！\n##（\n##）\n##，\n##－\n##／\n##：\n"
  },
  {
    "path": "func_builders/input_fn_builder.py",
    "content": "#!/usr/bin/env python3 \n# -*- coding: utf-8 -*- \n\n\n\n# author: xiaoy li \n# description:  \n# \n# \n\n\nimport tensorflow as tf\n\n\ndef file_based_input_fn_builder(input_file, num_window=None, window_size=None, max_num_mention=None, is_training=False, drop_remainder=True):\n    \"\"\"Creates an `input_fn` closure to be passed to TPUEstimator.\"\"\"\n    name_to_features = {\n            'sentence_map': tf.FixedLenFeature([num_window * window_size], tf.int64),\n            'text_len': tf.FixedLenFeature([num_window], tf.int64),\n            'subtoken_map': tf.FixedLenFeature([num_window *  window_size], tf.int64),\n            'speaker_ids': tf.FixedLenFeature([num_window * window_size], tf.int64),\n            'flattened_input_ids': tf.FixedLenFeature([num_window * window_size], tf.int64),\n            'flattened_input_mask': tf.FixedLenFeature([num_window * window_size], tf.int64),\n            'span_starts': tf.FixedLenFeature([max_num_mention], tf.int64),\n            'span_ends': tf.FixedLenFeature([max_num_mention], tf.int64), \n            'cluster_ids': tf.FixedLenFeature([max_num_mention], tf.int64),\n            }\n\n\n    def _decode_record(record, name_to_features):\n        \"\"\"Decodes a record to a TensorFlow example.\"\"\"\n        example = tf.io.parse_single_example(record, name_to_features)\n        # tf.Example only supports tf.int64, but the TPU only supports tf.int32.\n        # So cast all int64 to int32.\n        for name in list(example.keys()):\n            t = example[name]\n            if t.dtype == tf.int64:\n                t = tf.to_int32(t)\n            example[name] = t \n        return example\n\n\n    def input_fn_from_tfrecord(params):\n        \"\"\"The actual input function.\"\"\"\n        batch_size = params[\"batch_size\"]\n\n        # For training, we want a lot of parallel reading and shuffling.\n        # For eval, we want no shuffling and parallel reading doesn't matter.\n        d = 
tf.data.TFRecordDataset(input_file)\n        if is_training:\n            d = d.repeat()\n            d = d.shuffle(buffer_size=100)\n\n        d = d.apply(\n            tf.contrib.data.map_and_batch(\n                lambda record: _decode_record(record, name_to_features),\n                batch_size=batch_size,\n                drop_remainder=drop_remainder))\n\n        return d\n\n    return input_fn_from_tfrecord"
  },
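`file_based_input_fn_builder` above follows the standard Estimator pattern: the outer function captures the dataset configuration, and the returned closure only takes `params`. A minimal, framework-free sketch of the same pattern, mirroring the int64-to-int32 cast done in `_decode_record` (all names here are hypothetical and not part of the repository):

```python
# Illustrative builder-returns-closure sketch: the outer function captures
# configuration, the closure does the per-record work.

def make_decode_fn(int_fields):
    """Return a closure that casts the listed fields of a record to int."""
    def decode(record):
        out = dict(record)
        for name in int_fields:
            # mirrors the int64 -> int32 cast in _decode_record
            out[name] = int(out[name])
        return out
    return decode

decode = make_decode_fn(["text_len"])
example = decode({"text_len": 384.0, "doc_key": "bn/abc_0001"})
```

The real builder does the same thing one level up: `num_window`, `window_size`, and `max_num_mention` are frozen into `name_to_features`, so TPUEstimator can call the returned `input_fn` with nothing but `params["batch_size"]`.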
  {
    "path": "func_builders/model_fn_builder.py",
    "content": "#!/usr/bin/env python3 \n# -*- coding: utf-8 -*- \n\n\n\n# author: xiaoy li \n# description:\n# \n#\n\n\n\nimport tensorflow as tf\nfrom utils import util\nfrom utils.radam import RAdam\n\n\ndef model_fn_builder(config, model_sign=\"mention_proposal\"):\n\n    def mention_proposal_model_fn(features, labels, mode, params): \n        \"\"\"The `model_fn` for TPUEstimator.\"\"\"\n        input_ids = features[\"flattened_input_ids\"]\n        input_mask = features[\"flattened_input_mask\"]\n        text_len = features[\"text_len\"]\n        speaker_ids = features[\"speaker_ids\"]\n        gold_starts = features[\"span_starts\"]\n        gold_ends = features[\"span_ends\"]\n        cluster_ids = features[\"cluster_ids\"]\n        sentence_map = features[\"sentence_map\"] \n        \n        is_training = (mode == tf.estimator.ModeKeys.TRAIN)\n\n        model = util.get_model(config, model_sign=\"mention_proposal\")\n\n        if config.use_tpu:\n            def tpu_scaffold():\n                return tf.train.Scaffold()\n            scaffold_fn = tpu_scaffold\n        else:\n            scaffold_fn = None \n\n        if mode == tf.estimator.ModeKeys.TRAIN: \n            tf.logging.info(\"****************************** tf.estimator.ModeKeys.TRAIN ******************************\")\n            tf.logging.info(\"********* Features *********\")\n            for name in sorted(features.keys()):\n                tf.logging.info(\"  name = %s, shape = %s\" % (name, features[name].shape))\n\n            instance = (input_ids, input_mask, sentence_map, text_len, speaker_ids, gold_starts, gold_ends, cluster_ids)\n            total_loss, start_scores, end_scores, span_scores = model.get_mention_proposal_and_loss(instance, is_training)\n            gold_start_sequence_labels, gold_end_sequence_labels, gold_span_sequence_labels = model.get_gold_mention_sequence_labels_from_pad_index(gold_starts, gold_ends, text_len)\n\n            if config.use_tpu:\n                
optimizer = tf.train.AdamOptimizer(learning_rate=config.learning_rate, beta1=0.9, beta2=0.999, epsilon=1e-08)\n                optimizer = tf.contrib.tpu.CrossShardOptimizer(optimizer)\n                train_op = optimizer.minimize(total_loss, tf.train.get_global_step())\n                output_spec = tf.contrib.tpu.TPUEstimatorSpec(\n                    mode=mode,\n                    loss=total_loss,\n                    train_op=train_op,\n                    scaffold_fn=scaffold_fn)\n            else:\n                optimizer = RAdam(learning_rate=config.learning_rate, epsilon=1e-8, beta1=0.9, beta2=0.999)\n                train_op = optimizer.minimize(total_loss, tf.train.get_global_step())\n\n                train_logging_hook = tf.train.LoggingTensorHook({\"loss\": total_loss}, every_n_iter=1)\n                output_spec = tf.contrib.tpu.TPUEstimatorSpec(\n                    mode=mode,\n                    loss=total_loss,\n                    train_op=train_op,\n                    scaffold_fn=scaffold_fn,\n                    training_hooks=[train_logging_hook])\n\n        elif mode == tf.estimator.ModeKeys.EVAL:\n            tf.logging.info(\"****************************** tf.estimator.ModeKeys.EVAL ******************************\")\n\n            instance = (input_ids, input_mask, sentence_map, text_len, speaker_ids, gold_starts, gold_ends, cluster_ids)\n            total_loss, start_scores, end_scores, span_scores = model.get_mention_proposal_and_loss(instance, is_training)\n            gold_start_sequence_labels, gold_end_sequence_labels, gold_span_sequence_labels = model.get_gold_mention_sequence_labels_from_pad_index(gold_starts, gold_ends, text_len)\n\n            def metric_fn(start_scores, end_scores, span_scores, gold_span_label):\n                start_scores = tf.reshape(start_scores, [-1, 
config.window_size])\n                end_scores = tf.reshape(end_scores, [-1, config.window_size])\n                start_scores = tf.tile(tf.expand_dims(start_scores, 2), [1, 1, config.window_size])\n                end_scores = tf.tile(tf.expand_dims(end_scores, 2), [1, 1, config.window_size])\n                sce_span_scores = (start_scores + end_scores + span_scores) / 3\n                pred_span_label = tf.cast(tf.reshape(tf.math.greater_equal(sce_span_scores, config.mention_threshold), [-1]), tf.bool)\n\n                gold_span_label = tf.cast(tf.reshape(gold_span_label, [-1]), tf.bool)\n\n                return {\"precision\": tf.compat.v1.metrics.precision(gold_span_label, pred_span_label),\n                        \"recall\": tf.compat.v1.metrics.recall(gold_span_label, pred_span_label)}\n\n            # The gold labels must be passed through the tensor list so that\n            # metric_fn receives them as an argument (its signature takes four tensors).\n            eval_metrics = (metric_fn, [start_scores, end_scores, span_scores, gold_span_sequence_labels])\n            output_spec = tf.contrib.tpu.TPUEstimatorSpec(\n                mode=tf.estimator.ModeKeys.EVAL,\n                loss=total_loss,\n                eval_metrics=eval_metrics,\n                scaffold_fn=scaffold_fn)\n\n        elif mode == tf.estimator.ModeKeys.PREDICT:\n            tf.logging.info(\"****************************** tf.estimator.ModeKeys.PREDICT ******************************\")\n\n            instance = (input_ids, input_mask, sentence_map, text_len, speaker_ids, gold_starts, gold_ends, cluster_ids)\n            total_loss, start_scores, end_scores, span_scores = model.get_mention_proposal_and_loss(instance, is_training)\n            gold_start_sequence_labels, gold_end_sequence_labels, gold_span_sequence_labels = model.get_gold_mention_sequence_labels_from_pad_index(gold_starts, gold_ends, text_len)\n            predictions = {\n                    \"total_loss\": total_loss,\n                    \"start_scores\": start_scores,\n                    \"start_gold\": gold_starts,\n                    \"end_gold\": gold_ends,\n             
       \"end_scores\": end_scores,\n                    \"span_scores\": span_scores\n            }\n            output_spec = tf.contrib.tpu.TPUEstimatorSpec(\n                mode=tf.estimator.ModeKeys.PREDICT,\n                predictions=predictions,\n                scaffold_fn=scaffold_fn)\n        else:\n            raise ValueError(\"Please check the mode!\")\n\n        return output_spec\n\n\n    def corefqa_model_fn(features, labels, mode, params):\n        \"\"\"The `model_fn` for TPUEstimator.\"\"\"\n        input_ids = features[\"flattened_input_ids\"]\n        input_mask = features[\"flattened_input_mask\"]\n        text_len = features[\"text_len\"]\n        speaker_ids = features[\"speaker_ids\"]\n        gold_starts = features[\"span_starts\"]\n        gold_ends = features[\"span_ends\"]\n        cluster_ids = features[\"cluster_ids\"]\n        sentence_map = features[\"sentence_map\"]\n\n        is_training = (mode == tf.estimator.ModeKeys.TRAIN)\n\n        model = util.get_model(config, model_sign=\"corefqa\")\n\n        if config.use_tpu:\n            tf.logging.info(\"****************************** Training on TPU ******************************\")\n            def tpu_scaffold():\n                return tf.train.Scaffold()\n            scaffold_fn = tpu_scaffold\n        else:\n            scaffold_fn = None\n\n        if mode == tf.estimator.ModeKeys.TRAIN:\n            tf.logging.info(\"****************************** tf.estimator.ModeKeys.TRAIN ******************************\")\n            tf.logging.info(\"********* Features *********\")\n            for name in sorted(features.keys()):\n                tf.logging.info(\"  name = %s, shape = %s\" % (name, features[name].shape))\n\n            instance = (input_ids, input_mask, sentence_map, text_len, speaker_ids, gold_starts, gold_ends, cluster_ids)\n            total_loss, (topk_mention_start_indices, topk_mention_end_indices), 
(forward_topc_mention_start_indices, forward_topc_mention_end_indices), top_mention_span_linking_scores  = model.get_coreference_resolution_and_loss(instance, is_training, use_tpu=config.use_tpu)\n\n            if config.use_tpu:\n                optimizer = tf.train.AdamOptimizer(learning_rate=config.learning_rate, beta1=0.9, beta2=0.999, epsilon=1e-08)\n                optimizer = tf.contrib.tpu.CrossShardOptimizer(optimizer)\n                train_op = optimizer.minimize(total_loss, tf.train.get_global_step()) \n                output_spec = tf.contrib.tpu.TPUEstimatorSpec(\n                    mode=tf.estimator.ModeKeys.TRAIN,\n                    loss=total_loss,\n                    train_op=train_op,\n                    scaffold_fn=scaffold_fn)\n            else:\n                optimizer = RAdam(learning_rate=config.learning_rate, epsilon=1e-8, beta1=0.9, beta2=0.999)\n                train_op = optimizer.minimize(total_loss, tf.train.get_global_step())\n\n                training_logging_hook = tf.train.LoggingTensorHook({\"loss\": total_loss}, every_n_iter=1)\n                output_spec = tf.contrib.tpu.TPUEstimatorSpec(\n                    mode=tf.estimator.ModeKeys.TRAIN,\n                    loss=total_loss,\n                    train_op=train_op,\n                    scaffold_fn=scaffold_fn, \n                    training_hooks=[training_logging_hook])\n\n\n        elif mode == tf.estimator.ModeKeys.EVAL: \n            tf.logging.info(\"****************************** tf.estimator.ModeKeys.EVAL ******************************\")\n            tf.logging.info(\"@@@@@ MERELY support tf.estimator.ModeKeys.PREDICT ! @@@@@\")\n            tf.logging.info(\"@@@@@ YOU can EVAL your checkpoints after the training process. 
@@@@@\")\n            tf.logging.info(\"****************************** tf.estimator.ModeKeys.EVAL ******************************\")\n            # Without this, execution would fall through to `return output_spec`\n            # with output_spec never assigned.\n            raise NotImplementedError(\"corefqa only supports TRAIN and PREDICT; evaluate saved checkpoints with the PREDICT mode instead.\")\n\n        elif mode == tf.estimator.ModeKeys.PREDICT:\n            tf.logging.info(\"****************************** tf.estimator.ModeKeys.PREDICT ******************************\")\n\n            instance = (input_ids, input_mask, sentence_map, text_len, speaker_ids, gold_starts, gold_ends, cluster_ids)\n            total_loss, (topk_mention_start_indices, topk_mention_end_indices), (forward_topc_mention_start_indices, forward_topc_mention_end_indices), top_mention_span_linking_scores = model.get_coreference_resolution_and_loss(instance, True, use_tpu=config.use_tpu)\n\n            top_antecedent = tf.math.argmax(top_mention_span_linking_scores, axis=-1)\n            predictions = {\n                        \"total_loss\": total_loss,\n                        \"topk_span_starts\": topk_mention_start_indices,\n                        \"topk_span_ends\": topk_mention_end_indices,\n                        \"top_antecedent_scores\": top_mention_span_linking_scores,\n                        \"top_antecedent\": top_antecedent,\n                        \"cluster_ids\": cluster_ids,\n                        \"gold_starts\": gold_starts,\n                        \"gold_ends\": gold_ends}\n\n            output_spec = tf.contrib.tpu.TPUEstimatorSpec(mode=tf.estimator.ModeKeys.PREDICT,\n                predictions=predictions,\n                scaffold_fn=scaffold_fn)\n        else:\n            raise ValueError(\"Please check the mode!\")\n        return output_spec\n\n\n    if model_sign == \"mention_proposal\":\n        return mention_proposal_model_fn\n    elif model_sign == \"corefqa\":\n        return corefqa_model_fn\n    else:\n        raise ValueError(\"Please check the model sign! Only [mention_proposal] and [corefqa] are supported.\")\n"
  },
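The `metric_fn` in `model_fn_builder.py` averages the start, end, and span score streams, thresholds the average into predicted span labels, and scores them against gold labels as precision/recall. A minimal pure-Python sketch of that computation on toy lists (the function name and the example inputs are illustrative, not from the repository; the threshold stands in for `config.mention_threshold`):

```python
# Sketch of metric_fn's logic: average three score streams, threshold
# into boolean predictions, then compute precision and recall.

def span_precision_recall(start_scores, end_scores, span_scores, gold, threshold=0.5):
    # average the three score streams, as in sce_span_scores
    avg = [(s + e + p) / 3 for s, e, p in zip(start_scores, end_scores, span_scores)]
    pred = [a >= threshold for a in avg]
    tp = sum(1 for p, g in zip(pred, gold) if p and g)
    fp = sum(1 for p, g in zip(pred, gold) if p and not g)
    fn = sum(1 for p, g in zip(pred, gold) if not p and g)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

p, r = span_precision_recall(
    [0.9, 0.2, 0.8], [0.7, 0.1, 0.9], [0.8, 0.3, 0.4],
    gold=[True, False, False])  # -> (0.5, 1.0)
```

The TF version does the same thing with tensors (tiling start/end scores to the `window_size x window_size` span grid before averaging) and accumulates the counts through `tf.compat.v1.metrics.precision`/`recall` streaming metrics.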
  {
    "path": "logs/corefqa_log.txt",
    "content": "/home/xiaoyli1110/venv/lib/python3.5/site-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.\n  from ._conv import register_converters as _register_converters\n/home/xiaoyli1110/xiaoya/Coref-tf\nW0713 13:36:28.637916 139854454523648 module_wrapper.py:139] From /home/xiaoyli1110/xiaoya/Coref-tf/run/train_corefqa.py:308: The name tf.app.run is deprecated. Please use tf.compat.v1.app.run instead.\n\nloading experiments_tpu.conf ... \nW0713 13:36:28.752999 139854454523648 module_wrapper.py:139] From /home/xiaoyli1110/xiaoya/Coref-tf/utils/util.py:41: The name tf.logging.info is deprecated. Please use tf.compat.v1.logging.info instead.\nI0716 13:36:28.753216 139854454523648 util.py:41] %*%%*%%*%%*%%*%%*%%*%%*%%*%%*%%*%%*%%*%%*%%*%%*%%*%%*%%*%%*%\nI0716 13:36:28.753291 139854454523648 util.py:42] %*%%*%%*%%*%%*%%*%%*%%*%%*%%*%%*%%*%%*%%*%%*%%*%%*%%*%%*%%*%\nI0716 13:36:28.753346 139854454523648 util.py:43] %%%%%%%% Configs are showed as follows : %%%%%%%%\nI0716 13:36:28.753432 139854454523648 util.py:45] max_top_antecedents : 60\nI0716 13:36:28.753505 139854454523648 util.py:45] max_training_sentences : 3\nI0716 13:36:28.753576 139854454523648 util.py:45] top_span_ratio : 0.3\nI0716 13:36:28.753644 139854454523648 util.py:45] max_num_speakers : 20\nI0716 13:36:28.753710 139854454523648 util.py:45] max_segment_len : 384\nI0716 13:36:28.753773 139854454523648 util.py:45] max_cluster_num : 30\nI0716 13:36:28.753848 139854454523648 util.py:45] tpu : True\nI0716 13:36:28.753910 139854454523648 util.py:45] max_query_len : 150\nI0716 13:36:28.753972 139854454523648 util.py:45] max_context_len : 150\nI0716 13:36:28.754034 139854454523648 util.py:45] max_qa_len : 300\nI0716 13:36:28.754097 139854454523648 util.py:45] hidden_size : 1024\nI0716 13:36:28.754161 139854454523648 util.py:45] max_candidate_mentions 
: 60\nI0716 13:36:28.754229 139854454523648 util.py:45] learning_rate : 8e-06\nI0716 13:36:28.754301 139854454523648 util.py:45] num_docs : 5604\nI0716 13:36:28.754365 139854454523648 util.py:45] start_ratio : 0.8\nI0716 13:36:28.754428 139854454523648 util.py:45] end_ratio : 0.8\nI0716 13:36:28.754492 139854454523648 util.py:45] mention_ratio : 1.0\nI0716 13:36:28.754556 139854454523648 util.py:45] corefqa_loss_ratio : 0.9\nI0716 13:36:28.754620 139854454523648 util.py:45] score_ratio : 0.5\nI0716 13:36:28.754682 139854454523648 util.py:45] run : estimator\nI0716 13:36:28.754746 139854454523648 util.py:45] threshold : 0.5\nI0716 13:36:28.754814 139854454523648 util.py:45] dropout_rate : 0.3\nI0716 13:36:28.754945 139854454523648 util.py:45] ffnn_size : 1024\nI0716 13:36:28.755008 139854454523648 util.py:45] ffnn_depth : 1\nI0716 13:36:28.755071 139854454523648 util.py:45] num_epochs : 8\nI0716 13:36:28.755135 139854454523648 util.py:45] max_span_width : 30\nI0716 13:36:28.755199 139854454523648 util.py:45] use_segment_distance : True\nI0716 13:36:28.755261 139854454523648 util.py:45] model_heads : True\nI0716 13:36:28.755324 139854454523648 util.py:45] coref_depth : 2\nI0716 13:36:28.755383 139854454523648 util.py:45] corefqa_only_concate : False\nI0716 13:36:28.755445 139854454523648 util.py:45] train_path : gs://xiaoy-data-europe/overlap_384_3/train.128.english.tfrecord\nI0716 13:36:28.755507 139854454523648 util.py:45] eval_path : test.english.jsonlines\nI0716 13:36:28.755571 139854454523648 util.py:45] conll_eval_path : gs://corefqa-europe/spanbert_large_overlap_384_3_output_2e-5/test.english.v4_gold_conll\nI0716 13:36:28.755634 139854454523648 util.py:45] single_example : False\nI0716 13:36:28.755702 139854454523648 util.py:45] genres : ['bc', 'bn', 'mz', 'nw', 'pt', 'tc', 'wb']\nI0716 13:36:28.755765 139854454523648 util.py:45] log_root : gs://corefqa-europe/spanbert_large_overlap_384_3_output_2e-5\nI0716 13:36:28.755842 139854454523648 util.py:45] 
save_checkpoints_steps : 1000\nI0716 13:36:28.755906 139854454523648 util.py:45] dev_path : gs://xiaoy-data-europe/overlap_384_3/dev.256.english.tfrecord\nI0716 13:36:28.755968 139854454523648 util.py:45] test_path : gs://xiaoy-data-europe/overlap_384_3/test.256.english.tfrecord\nI0716 13:36:28.756030 139854454523648 util.py:45] bert_config_file : gs://xiaoy-data-europe/spanbert_large_tf/bert_config.json\nI0716 13:36:28.756093 139854454523648 util.py:45] vocab_file : gs://xiaoy-data-europe/spanbert_large_tf/vocab.txt\nI0716 13:36:28.756155 139854454523648 util.py:45] tf_checkpoint : gs://xiaoy-data-europe/spanbert_large_tf/bert_model.ckpt\nI0716 13:36:28.756217 139854454523648 util.py:45] init_checkpoint : gs://xiaoy-data-europe/spanbert_large_tf/bert_model.ckpt\nI0716 13:36:28.756279 139854454523648 util.py:45] eval_checkpoint : gs://corefqa-europe/spanbert_large_overlap_384_3_out\nput_1e-5_0.3_8/model.ckpt-20\nI0716 13:36:28.756341 139854454523648 util.py:45] output_path : gs://corefqa-europe/spanbert_large_overlap_384_3_output_\n1e-5_0.3_8\nI0716 13:36:28.756391 139854454523648 util.py:47] %*%%*%%*%%*%%*%%*%%*%%*%%*%%*%%*%%*%%*%%*%%*%%*%%*%%*%%*%%*%\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** Current ckpt path is ***** \nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] gs://corefqa-europe-europe/spanbert_large_overlap_384_3_output_8e6_0.2_8/model.ckpt-500\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON DEV SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [DEV EVAL] ***** : precision: 0.6219, recall: 0.5093, f1: 0.56\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON TEST SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [TEST EVAL] ***** : precision: 0.6413, recall: 0.5428, f1: 0.588\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** Current ckpt path is *****\nI0716 14:17:19.214575 139854454523648 
tpu_estimator.py:2307] gs://corefqa-europe-europe/spanbert_large_overlap_384_3_output_8e6_0.2_8/model.ckpt-1000\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON DEV SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [DEV EVAL] ***** : precision: 0.6227, recall: 0.5841, f1: 0.6028\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON TEST SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [TEST EVAL] ***** : precision: 0.6744, recall: 0.6126, f1: 0.6149\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** Current ckpt path is *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] gs://corefqa-europe-europe/spanbert_large_overlap_384_3_output_8e6_0.2_8/model.ckpt-1500\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON DEV SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [DEV EVAL] ***** : precision: 0.6906, recall: 0.5288, f1: 0.599\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON TEST SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [TEST EVAL] ***** : precision: 0.6966, recall: 0.506, f1: 0.5862\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** Current ckpt path is *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] gs://corefqa-europe-europe/spanbert_large_overlap_384_3_output_8e6_0.2_8/model.ckpt-2000\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON DEV SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [DEV EVAL] ***** : precision: 0.6712, recall: 0.5717, f1: 0.6174\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON TEST SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [TEST EVAL] ***** : precision: 0.6696, recall: 0.541, f1: 0.5985\nI0716 14:17:19.214575 139854454523648 
tpu_estimator.py:2307] ***** Current ckpt path is *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] gs://corefqa-europe-europe/spanbert_large_overlap_384_3_output_8e6_0.2_8/model.ckpt-2500\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON DEV SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [DEV EVAL] ***** : precision: 0.8091, recall: 0.5967, f1: 0.6868\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON TEST SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [TEST EVAL] ***** : precision: 0.8018, recall: 0.5692, f1: 0.6657\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** Current ckpt path is *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] gs://corefqa-europe-europe/spanbert_large_overlap_384_3_output_8e6_0.2_8/model.ckpt-3000\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON DEV SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [DEV EVAL] ***** : precision: 0.6675, recall: 0.6382, f1: 0.6525\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** Current ckpt path is *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] gs://corefqa-europe-europe/spanbert_large_overlap_384_3_output_8e6_0.2_8/model.ckpt-3500\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON DEV SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [DEV EVAL] ***** : precision: 0.7155, recall: 0.7515, f1: 0.7331\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON TEST SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [TEST EVAL] ***** : precision: 0.7239, recall: 0.7711, f1: 0.7468\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** Current ckpt path is *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] 
gs://corefqa-europe-europe/spanbert_large_overlap_384_3_output_8e6_0.2_8/model.ckpt-4000\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON DEV SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [DEV EVAL] ***** : precision: 0.7762, recall: 0.5819, f1: 0.6651\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** Current ckpt path is *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] gs://corefqa-europe-europe/spanbert_large_overlap_384_3_output_8e6_0.2_8/model.ckpt-4500\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON DEV SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [DEV EVAL] ***** : precision: 0.6661, recall: 0.6236, f1: 0.6442\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** Current ckpt path is *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] gs://corefqa-europe-europe/spanbert_large_overlap_384_3_output_8e6_0.2_8/model.ckpt-5000\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON DEV SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [DEV EVAL] ***** : precision: 0.7814, recall: 0.7246, f1: 0.7519\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON TEST SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [TEST EVAL] ***** : precision: 0.7155, recall: 0.7515, f1: 0.7331\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** Current ckpt path is *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] gs://corefqa-europe-europe/spanbert_large_overlap_384_3_output_8e6_0.2_8/model.ckpt-5500\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON DEV SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [DEV EVAL] ***** : precision: 0.8042, recall: 0.7417, f1: 0.7717\nI0716 14:17:19.214575 139854454523648 
tpu_estimator.py:2307] ***** EVAL ON TEST SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [TEST EVAL] ***** : precision: 0.8439, recall: 0.6328, f1: 0.7232\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** Current ckpt path is *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] gs://corefqa-europe-europe/spanbert_large_overlap_384_3_output_8e6_0.2_8/model.ckpt-6000\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON DEV SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [DEV EVAL] ***** : precision: 0.7942, recall: 0.7217, f1: 0.7417\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** Current ckpt path is *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] gs://corefqa-europe-europe/spanbert_large_overlap_384_3_output_8e6_0.2_8/model.ckpt-6500\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON DEV SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [DEV EVAL] ***** : precision: 0.7816, recall: 0.7831, f1: 0.7823\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON TEST SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [TEST EVAL] ***** : precision: 0.7893, recall: 0.8075, f1: 0.7983\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** Current ckpt path is *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] gs://corefqa-europe-europe/spanbert_large_overlap_384_3_output_8e6_0.2_8/model.ckpt-7000\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON DEV SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [DEV EVAL] ***** : precision: 0.8103, recall: 0.814, f1: 0.8121\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON TEST SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [TEST EVAL] ***** : 
precision: 0.8022, recall: 0.7838, f1: 0.7929\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** Current ckpt path is *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] gs://corefqa-europe-europe/spanbert_large_overlap_384_3_output_8e6_0.2_8/model.ckpt-7500\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON DEV SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [DEV EVAL] ***** : precision: 0.7935, recall: 0.8292, f1: 0.8109\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** Current ckpt path is *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] gs://corefqa-europe-europe/spanbert_large_overlap_384_3_output_8e6_0.2_8/model.ckpt-8000\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON DEV SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [DEV EVAL] ***** : precision: 0.8997, recall: 0.7292, f1: 0.8056\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** Current ckpt path is *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] gs://corefqa-europe-europe/spanbert_large_overlap_384_3_output_8e6_0.2_8/model.ckpt-8500\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON DEV SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [DEV EVAL] ***** : precision: 0.8737, recall: 0.7444, f1: 0.8039\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** Current ckpt path is *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] gs://corefqa-europe-europe/spanbert_large_overlap_384_3_output_8e6_0.2_8/model.ckpt-9000\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON DEV SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [DEV EVAL] ***** : precision: 0.8107, recall: 0.814, f1: 0.8124\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** Current 
ckpt path is *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] gs://corefqa-europe-europe/spanbert_large_overlap_384_3_output_8e6_0.2_8/model.ckpt-10500\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON DEV SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [DEV EVAL] ***** : precision: 0.8439, recall: 0.7952, f1: 0.8188\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON TEST SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [TEST EVAL] ***** : precision: 0.8228, recall: 0.8093, f1: 0.8107\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** Current ckpt path is *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] gs://corefqa-europe-europe/spanbert_large_overlap_384_3_output_8e6_0.2_8/model.ckpt-11000\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON DEV SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [DEV EVAL] ***** : precision: 0.8386, recall: 0.8147, f1: 0.8273\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON TEST SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [TEST EVAL] ***** : precision: 0.8239, recall: 0.8104, f1: 0.8143\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** Current ckpt path is *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] gs://corefqa-europe-europe/spanbert_large_overlap_384_3_output_8e6_0.2_8/model.ckpt-12000\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON DEV SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [DEV EVAL] ***** : precision: 0.8398, recall: 0.8201, f1: 0.8369\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** Current ckpt path is *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] 
gs://corefqa-europe-europe/spanbert_large_overlap_384_3_output_8e6_0.2_8/model.ckpt-12500\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON DEV SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [DEV EVAL] ***** : precision: 0.8666, recall: 0.7766, f1: 0.8192\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** Current ckpt path is *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] gs://corefqa-europe-europe/spanbert_large_overlap_384_3_output_8e6_0.2_8/model.ckpt-13000\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON DEV SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [DEV EVAL] ***** : precision: 0.8397, recall: 0.7892, f1: 0.8056\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** Current ckpt path is *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] gs://corefqa-europe-europe/spanbert_large_overlap_384_3_output_8e6_0.2_8/model.ckpt-13500\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON DEV SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [DEV EVAL] ***** : precision: 0.8357, recall: 0.8169, f1: 0.8262\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON TEST SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [TEST EVAL] ***** : precision: 0.8327, recall: 0.8288, f1: 0.8322\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** Current ckpt path is *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] gs://corefqa-europe-europe/spanbert_large_overlap_384_3_output_8e6_0.2_8/model.ckpt-14000\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON DEV SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [DEV EVAL] ***** : precision: 0.8269, recall: 0.8108, f1: 0.8201\nI0716 14:17:19.214575 139854454523648 
tpu_estimator.py:2307] ***** Current ckpt path is *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] gs://corefqa-europe-europe/spanbert_large_overlap_384_3_output_8e6_0.2_8/model.ckpt-14500\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON DEV SET *****\nI0716 14:17:19.214575 139854454523648 tpu_estimator.py:2307] ***** [DEV EVAL] ***** : precision: 0.8344, recall: 0.8104, f1: 0.8215\nI0716 14:46:19.214575 139854454523648 tpu_estimator.py:2307] ***** Current ckpt path is *****\nI0716 14:46:19.214575 139854454523648 tpu_estimator.py:2307] gs://corefqa-europe-europe/spanbert_large_overlap_384_3_output_8e6_0.2_8/model.ckpt-15000\nI0716 14:46:19.214575 139854454523648 tpu_estimator.py:2307] ***** EVAL ON DEV SET *****\nI0716 14:46:19.214575 139854454523648 tpu_estimator.py:2307] ***** [DEV EVAL] ***** : precision: 0.8287, recall: 0.8177, f1: 0.8223\nI0716 14:46:19.214575 139854454523648 tpu_estimator.py:2307] *************************\nI0716 14:46:19.214575 139854454523648 tpu_estimator.py:2307] - @@@@@ the path to the BEST DEV result is : gs://corefqa-europe-europe/spanbert_large_overlap_384_3_output_8e6_0.2_8/model.ckpt-13500 \nI0716 14:46:19.214575 139854454523648 tpu_estimator.py:2307] - @@@@@ BEST DEV F1 : 0.8262, Precision : 0.8357, Recall : 0.8169\nI0716 14:46:19.214575 139854454523648 tpu_estimator.py:2307] - @@@@@ TEST when DEV best F1 : 0.8322, Precision : 0.8327, Recall : 0.8288  \nI0716 14:46:19.214575 139854454523648 tpu_estimator.py:2307] - @@@@@ mention_proposal_only_concate False"
  },
  {
    "path": "models/corefqa.py",
    "content": "#!/usr/bin/env python3 \n# -*- coding: utf-8 -*- \n\n\nimport os\nimport sys \n\nrepo_path = \"/\".join(os.path.realpath(__file__).split(\"/\")[:-2])\nif repo_path not in sys.path:\n    sys.path.insert(0, repo_path)\n\nimport tensorflow as tf\nfrom bert import modeling\n\n\n\nclass CorefQAModel(object):\n    def __init__(self, config):\n        self.config = config \n        self.dropout = None\n        self.pad_idx = 0 \n        self.mention_start_idx = 37\n        self.mention_end_idx = 42\n        self.bert_config = modeling.BertConfig.from_json_file(config.bert_config_file)\n        self.bert_config.hidden_dropout_prob = config.dropout_rate\n        self.cls_in_vocab = 101\n        self.sep_in_vocab = 102\n\n\n    def get_coreference_resolution_and_loss(self, instance, is_training, use_tpu=False):\n\n\n        self.use_tpu = use_tpu \n        self.dropout = self.get_dropout(self.config.dropout_rate, is_training)\n\n        flat_window_input_ids, flat_window_input_mask, flat_doc_sentence_map, window_text_len, speaker_ids, gold_starts, gold_ends, gold_cluster_ids = instance\n        # flat_input_ids: (num_window, window_size)\n        # flat_doc_overlap_input_mask: (num_window, window_size)\n        # flat_sentence_map: (num_window, window_size)\n        # text_len: dynamic length and is padded to fix length\n        # gold_start: (max_num_mention), mention start index in the original (NON-OVERLAP) document. Pad with -1 to the fix length max_num_mention.\n        # gold_end: (max_num_mention), mention end index in the original (NON-OVERLAP) document. 
Pad with -1 to the fix length max_num_mention.\n        # cluster_ids/speaker_ids is not used in the mention proposal model.\n\n        flat_window_input_ids = tf.math.maximum(flat_window_input_ids, tf.zeros_like(flat_window_input_ids, tf.int32)) # (num_window * window_size)\n        \n        flat_doc_overlap_input_mask = tf.where(tf.math.greater_equal(flat_window_input_mask, 0), \n            x=tf.ones_like(flat_window_input_mask, tf.int32), y=tf.zeros_like(flat_window_input_mask, tf.int32)) # (num_window * window_size)\n        # flat_doc_overlap_input_mask = tf.math.maximum(flat_doc_overlap_input_mask, tf.zeros_like(flat_doc_overlap_input_mask, tf.int32))\n        flat_doc_sentence_map = tf.math.maximum(flat_doc_sentence_map, tf.zeros_like(flat_doc_sentence_map, tf.int32)) # (num_window * window_size)\n        \n        gold_start_end_mask = tf.cast(tf.math.greater_equal(gold_starts, tf.zeros_like(gold_starts, tf.int32)), tf.bool) # (max_num_mention)\n        gold_start_index_labels = self.boolean_mask_1d(gold_starts, gold_start_end_mask, name_scope=\"gold_starts\", use_tpu=self.use_tpu) # (num_of_mention)\n        gold_end_index_labels = self.boolean_mask_1d(gold_ends, gold_start_end_mask, name_scope=\"gold_ends\", use_tpu=self.use_tpu) # (num_of_mention)\n\n        gold_cluster_mask = tf.cast(tf.math.greater_equal(gold_cluster_ids, tf.zeros_like(gold_cluster_ids, tf.int32)), tf.bool) # (max_num_cluster)\n        gold_cluster_ids = self.boolean_mask_1d(gold_cluster_ids, gold_cluster_mask, name_scope=\"gold_cluster\", use_tpu=self.use_tpu)\n\n        window_text_len = tf.math.maximum(window_text_len, tf.zeros_like(window_text_len, tf.int32)) # (num_of_non_empty_window)\n        num_subtoken_in_doc = tf.math.reduce_sum(window_text_len) # the value should be num_subtoken_in_doc \n        ####################\n        ####################\n        ## mention proposal stage starts \n        mention_input_ids = tf.reshape(flat_window_input_ids, [-1, 
self.config.window_size]) # (num_window, window_size)\n        # each row of mention_input_ids is a subdocument \n        mention_input_mask = tf.ones_like(mention_input_ids, tf.int32) # (num_window, window_size)\n        mention_model = modeling.BertModel(config=self.bert_config, is_training=is_training, \n            input_ids=mention_input_ids, input_mask=mention_input_mask, use_one_hot_embeddings=False, scope='bert')\n\n        mention_doc_overlap_window_embs = mention_model.get_sequence_output() # (num_window, window_size, hidden_size)\n        # get BERT embeddings for mention_input_ids \n        doc_overlap_input_mask = tf.reshape(flat_doc_overlap_input_mask, [self.config.num_window, self.config.window_size]) # (num_window, window_size)\n\n        mention_doc_flat_embs = self.transform_overlap_sliding_windows_to_original_document(mention_doc_overlap_window_embs, doc_overlap_input_mask) \n        mention_doc_flat_embs = tf.reshape(mention_doc_flat_embs, [-1, self.config.hidden_size]) # (num_subtoken_in_doc, hidden_size) \n\n        candidate_mention_starts = tf.tile(tf.expand_dims(tf.range(num_subtoken_in_doc), 1), [1, self.config.max_span_width]) # (num_subtoken_in_doc, max_span_width)\n        # getting all eligible mentions in each subdocument\n        # the number of eligible mentions of each subdocument is config.max_span_width * num_subtoken_in_doc\n        candidate_mention_ends = tf.math.add(candidate_mention_starts, tf.expand_dims(tf.range(self.config.max_span_width), 0)) # (num_subtoken_in_doc, max_span_width)\n        \n        candidate_mention_sentence_start_idx = tf.gather(tf.reshape(flat_doc_sentence_map, [-1]), candidate_mention_starts) # (num_subtoken_in_doc, max_span_width)\n        candidate_mention_sentence_end_idx = tf.gather(tf.reshape(flat_doc_sentence_map, [-1]), candidate_mention_ends) # (num_subtoken_in_doc, max_span_width)\n        \n        candidate_mention_mask = tf.logical_and(candidate_mention_ends < num_subtoken_in_doc, 
tf.equal(candidate_mention_sentence_start_idx, candidate_mention_sentence_end_idx))\n        candidate_mention_mask = tf.reshape(candidate_mention_mask, [-1]) \n\n        candidate_mention_starts = self.boolean_mask_1d(tf.reshape(candidate_mention_starts, [-1]), candidate_mention_mask, name_scope=\"candidate_mention_starts\", use_tpu=self.use_tpu)\n        candidate_mention_ends = self.boolean_mask_1d(tf.reshape(candidate_mention_ends, [-1]), candidate_mention_mask, name_scope=\"candidate_mention_ends\", use_tpu=self.use_tpu)\n        # num_candidate_mention_in_doc is smaller than num_subtoken_in_doc\n\n        candidate_cluster_idx_labels = self.get_candidate_cluster_labels(candidate_mention_starts, candidate_mention_ends, gold_start_index_labels, gold_end_index_labels, gold_cluster_ids)\n\n        candidate_mention_span_embs, candidate_mention_start_embs, candidate_mention_end_embs = self.get_candidate_span_embedding(\n            mention_doc_flat_embs, candidate_mention_starts, candidate_mention_ends) \n        # candidate_mention_span_embs -> (num_candidate_mention_in_doc, 2 * hidden_size)\n        # candidate_mention_start_embs -> (num_candidate_mention_in_doc, hidden_size)\n        # candidate_mention_end_embs -> (num_candidate_mention_in_doc, hidden_size)\n\n        gold_label_candidate_mention_spans, gold_label_candidate_mention_starts, gold_label_candidate_mention_ends = self.get_candidate_mention_gold_sequence_label(\n            candidate_mention_starts, candidate_mention_ends, gold_start_index_labels, gold_end_index_labels, num_subtoken_in_doc)\n        # gold_label_candidate_mention_spans -> (num_candidate_mention_in_doc)\n        # gold_label_candidate_mention_starts -> (num_candidate_mention_in_doc)\n        # gold_label_candidate_mention_ends -> (num_candidate_mention_in_doc)\n\n        mention_proposal_loss, candidate_mention_start_prob, candidate_mention_end_prob, candidate_mention_span_prob, candidate_mention_span_scores = 
self.get_mention_score_and_loss(\n            candidate_mention_span_embs, candidate_mention_start_embs, candidate_mention_end_embs, gold_label_candidate_mention_spans=gold_label_candidate_mention_spans, \n            gold_label_candidate_mention_starts=gold_label_candidate_mention_starts, gold_label_candidate_mention_ends=gold_label_candidate_mention_ends, expect_length_of_labels=num_subtoken_in_doc)\n        # mention_proposal_loss -> a scalar \n        # candidate_mention_start_prob, candidate_mention_end_prob, candidate_mention_span_prob, -> (num_candidate_mention_in_doc)\n\n        self.k = tf.minimum(self.config.max_candidate_mentions, tf.to_int32(tf.floor(tf.to_float(num_subtoken_in_doc) * self.config.top_span_ratio)))\n        # self.k is a hyper-parameter. We want to select the top self.k mentions from the config.max_span_width * num_subtoken_in_doc mentions.\n\n        candidate_mention_span_scores = tf.reshape(candidate_mention_span_scores, [-1])\n        topk_mention_span_scores, topk_mention_span_indices = tf.nn.top_k(candidate_mention_span_scores, self.k, sorted=False) \n        topk_mention_span_indices = tf.reshape(topk_mention_span_indices, [-1])\n        # topk_mention_span_scores -> (k,)\n        # topk_mention_span_indices -> (k,)\n\n        topk_mention_start_indices = tf.gather(candidate_mention_starts, topk_mention_span_indices) # (k,)\n        topk_mention_end_indices = tf.gather(candidate_mention_ends, topk_mention_span_indices) # (k,)\n        topk_mention_span_cluster_ids = tf.gather(candidate_cluster_idx_labels, topk_mention_span_indices) # (k,)\n        topk_mention_span_scores = tf.gather(candidate_mention_span_scores, topk_mention_span_indices) # (k,)\n        ## mention proposal stage ends\n        ###########\n        ###########\n\n\n        ###### mention linking stage starts\n        ## forward QA score computation starts\n        ## for a given proposed mention i, we first compute the score of a span j being the coreferent 
answer to i, denoted by s(j|i) \n        i0 = tf.constant(0)\n        forward_qa_input_ids = tf.zeros((1, self.config.num_window, self.config.window_size + self.config.max_query_len + 2), dtype=tf.int32) # (1, num_window, max_query_len + window_size + 2)\n        forward_qa_input_mask = tf.zeros((1, self.config.num_window, self.config.window_size + self.config.max_query_len + 2), dtype=tf.int32) # (1, num_window, max_query_len + window_size + 2)\n        forward_qa_input_token_type_mask = tf.zeros((1, self.config.num_window, self.config.window_size + self.config.max_query_len + 2), dtype=tf.int32) # (1, num_window, max_query_len + window_size + 2)\n\n        # prepare for non-overlap input token ids \n        doc_overlap_input_mask = tf.reshape(flat_doc_overlap_input_mask, [self.config.num_window, self.config.window_size])\n        nonoverlap_doc_input_ids = self.transform_overlap_sliding_windows_to_original_document(flat_window_input_ids, doc_overlap_input_mask) # (num_subtoken_in_doc)\n        overlap_window_input_ids = tf.reshape(flat_window_input_ids, [self.config.num_window, self.config.window_size]) # (num_window, window_size)\n\n        @tf.function\n        def forward_qa_mention_linking(i, batch_qa_input_ids, batch_qa_input_mask, batch_qa_input_token_type_mask):\n            tmp_mention_start_idx = tf.gather(topk_mention_start_indices, i)\n            tmp_mention_end_idx = tf.gather(topk_mention_end_indices, i)\n\n            query_input_token_ids, mention_start_idx_in_sent, mention_end_idx_in_sent = self.get_query_token_ids(\n                nonoverlap_doc_input_ids, flat_doc_sentence_map, tmp_mention_start_idx, tmp_mention_end_idx)\n\n            query_pad_token_ids = tf.zeros([self.config.max_query_len - self.get_shape(query_input_token_ids, 0)], dtype=tf.int32)\n\n            pad_query_input_token_ids = tf.concat([query_input_token_ids, query_pad_token_ids], axis=0) # (max_query_len,)\n            pad_query_input_token_mask = 
tf.ones_like(pad_query_input_token_ids, tf.int32) # (max_query_len)\n            pad_query_input_token_type_mask = tf.zeros_like(pad_query_input_token_ids, tf.int32) # (max_query_len)\n\n\n            expand_pad_query_input_token_ids = tf.tile(tf.expand_dims(pad_query_input_token_ids, 0), [self.config.num_window, 1])  # (num_window, max_query_len)\n            expand_pad_query_input_token_mask = tf.tile(tf.expand_dims(pad_query_input_token_mask, 0), [self.config.num_window, 1]) # (num_window, max_query_len)\n            expand_pad_query_input_token_type_mask = tf.tile(tf.expand_dims(pad_query_input_token_type_mask, 0), [self.config.num_window, 1]) # (num_window, max_query_len)\n\n            sep_tokens = tf.cast(tf.fill([self.config.num_window, 1], self.sep_in_vocab), tf.int32) # (num_window, 1)\n            cls_tokens = tf.cast(tf.fill([self.config.num_window, 1], self.cls_in_vocab), tf.int32) # (num_window, 1)\n\n            query_context_input_token_ids = tf.concat([cls_tokens, expand_pad_query_input_token_ids, sep_tokens, overlap_window_input_ids], axis=1) # (1, num_window, max_query_len + window_size + 2)\n            query_context_input_token_mask = tf.concat([tf.ones_like(cls_tokens, tf.int32), expand_pad_query_input_token_mask, tf.ones_like(sep_tokens, tf.int32), tf.ones_like(overlap_window_input_ids, tf.int32)], axis=1) # (1, num_window, max_query_len + window_size + 2)\n            query_context_input_token_type_mask = tf.concat([tf.zeros_like(cls_tokens, tf.int32), expand_pad_query_input_token_type_mask, tf.zeros_like(sep_tokens, tf.int32), tf.ones_like(overlap_window_input_ids, tf.int32)], axis=1) # (1, num_window, max_query_len + window_size + 2)\n\n            query_context_input_token_ids = tf.reshape(query_context_input_token_ids, [1, self.config.num_window, self.config.max_query_len+self.config.window_size+2])\n            query_context_input_token_mask = tf.reshape(query_context_input_token_mask, [1, self.config.num_window, 
self.config.max_query_len+self.config.window_size+2])\n            query_context_input_token_type_mask = tf.reshape(query_context_input_token_type_mask, [1, self.config.num_window, self.config.max_query_len+self.config.window_size+2])\n\n\n            return [tf.math.add(i, 1), tf.concat([batch_qa_input_ids, query_context_input_token_ids], 0), \n                    tf.concat([batch_qa_input_mask, query_context_input_token_mask], 0), \n                    tf.concat([batch_qa_input_token_type_mask, query_context_input_token_type_mask], 0)]\n\n\n\n        _, stack_forward_qa_input_ids, stack_forward_qa_input_mask, stack_forward_qa_input_type_mask = tf.while_loop(\n            cond=lambda i, o1, o2, o3 : i < self.k,\n            body=forward_qa_mention_linking, \n            loop_vars=[i0, forward_qa_input_ids, forward_qa_input_mask, forward_qa_input_token_type_mask], \n            shape_invariants=[i0.get_shape(), tf.TensorShape([None, None, None]), \n                tf.TensorShape([None, None, None]), tf.TensorShape([None, None, None])])\n\n        # stack_forward_qa_input_ids, stack_forward_qa_input_mask, stack_forward_qa_input_type_mask -> (k, num_window, max_query_len + window_size + 2)\n\n        batch_forward_qa_input_ids = tf.reshape(stack_forward_qa_input_ids, [-1, self.config.max_query_len+self.config.window_size+2]) # (k * num_window, max_query_len + window_size + 2)\n        batch_forward_qa_input_mask = tf.reshape(stack_forward_qa_input_mask, [-1, self.config.max_query_len+self.config.window_size+2]) # (k * num_window, max_query_len + window_size + 2)\n        batch_forward_qa_input_type_mask = tf.reshape(stack_forward_qa_input_type_mask, [-1, self.config.max_query_len+self.config.window_size+2]) # (k * num_window, max_query_len + window_size + 2)\n\n        forward_qa_linking_model = modeling.BertModel(config=self.bert_config, is_training=is_training, \n            input_ids=batch_forward_qa_input_ids, input_mask=batch_forward_qa_input_mask, \n            
token_type_ids=batch_forward_qa_input_type_mask, use_one_hot_embeddings=False, \n            scope=\"bert\")\n\n        forward_qa_overlap_window_embs = forward_qa_linking_model.get_sequence_output() # (k * num_window, max_query_len + window_size + 2, hidden_size)\n        forward_context_overlap_window_embs = self.transform_overlap_sliding_windows_to_original_document(forward_qa_overlap_window_embs, batch_forward_qa_input_type_mask)\n        forward_context_overlap_window_embs = tf.reshape(forward_context_overlap_window_embs, [self.k*self.config.num_window, self.config.window_size])\n        # forward_context_overlap_window_embs -> (k*num_window, window_size, hidden_size)\n\n        expand_doc_overlap_input_mask = tf.tile(tf.expand_dims(doc_overlap_input_mask, 0), [self.k, 1, 1]) # (k, num_window, window_size)\n        expand_doc_overlap_input_mask = tf.reshape(expand_doc_overlap_input_mask, [-1, self.config.window_size]) # (k * num_window, window_size)\n\n        forward_context_flat_doc_embs = self.transform_overlap_sliding_windows_to_original_document(forward_context_overlap_window_embs, expand_doc_overlap_input_mask) # (k * num_subtoken_in_doc, hidden_size)\n        forward_context_flat_doc_embs = tf.reshape(forward_context_flat_doc_embs, [self.k, -1, self.config.hidden_size]) # (k, num_subtoken_in_doc, hidden_size)\n        num_candidate_mention = self.get_shape(candidate_mention_span_embs, 0) # (num_candidate_mention_in_doc)\n        forward_qa_mention_pos_offset = tf.cast(tf.tile(tf.reshape(tf.range(0, num_candidate_mention) * num_subtoken_in_doc, [1, -1]), [self.k, 1]), tf.int32) # (k, num_candidate_mention_in_doc)\n\n        forward_qa_mention_starts = tf.tile(tf.expand_dims(candidate_mention_starts, 0), [self.k, 1]) + forward_qa_mention_pos_offset # (k, num_candidate_mention_in_doc)\n        forward_qa_mention_ends = tf.tile(tf.expand_dims(candidate_mention_ends, 0), [self.k, 1]) + forward_qa_mention_pos_offset # (k, num_candidate_mention_in_doc)\n\n     
   forward_qa_mention_span_embs, forward_qa_mention_start_embs, forward_qa_mention_end_embs = self.get_candidate_span_embedding(tf.reshape(forward_context_flat_doc_embs, \n                [-1, self.config.hidden_size]), tf.reshape(forward_qa_mention_starts, [-1]), tf.reshape(forward_qa_mention_ends, [-1]))\n        # forward_qa_mention_span_embs -> (k * num_candidate_mention_in_doc, hidden_size*2)\n        # forward_qa_mention_start_embs -> (k * num_candidate_mention_in_doc, hidden_size)\n\n        self.c = tf.to_int32(tf.minimum(self.config.max_top_antecedents, self.k))\n\n        forward_qa_mention_span_scores, forward_qa_mention_start_scores, forward_qa_mention_end_scores = self.get_mention_score_and_loss(forward_qa_mention_span_embs, \n                forward_qa_mention_start_embs, forward_qa_mention_end_embs, name_scope=\"forward_qa\") \n        # forward_qa_mention_span_prob, forward_qa_mention_start_prob, forward_qa_mention_end_prob -> (k * num_candidate_mention_in_doc)\n\n        # computes the s(j|i) for all eligible span j in the document \n        if self.config.sec_qa_mention_score:\n            forward_qa_mention_span_scores = (forward_qa_mention_span_scores + forward_qa_mention_start_scores + forward_qa_mention_end_scores)/3.0\n        else:\n            forward_qa_mention_span_scores = forward_qa_mention_span_scores\n\n        forward_candidate_mention_span_scores = tf.reshape(forward_qa_mention_span_scores, [self.k, -1]) # (k, num_candidate_mention_in_doc)\n        forward_topc_mention_span_scores, local_forward_topc_mention_span_indices = tf.nn.top_k(forward_candidate_mention_span_scores, self.c, sorted=False) # (k, c)\n        # for each i, we only maintain the top self.c spans based on s(j|i)\n        local_flat_forward_topc_mention_span_indices = tf.reshape(local_forward_topc_mention_span_indices, [-1]) # (k * c)\n\n        # topk_mention_start_indices\n        forward_topc_mention_start_indices = tf.gather(candidate_mention_starts, 
local_flat_forward_topc_mention_span_indices) # (k * c)\n        forward_topc_mention_end_indices = tf.gather(candidate_mention_ends, local_flat_forward_topc_mention_span_indices) # (k * c)\n        forward_topc_mention_span_scores_in_mention_proposal = tf.gather(candidate_mention_span_scores, local_flat_forward_topc_mention_span_indices) # (k * c)\n        forward_topc_span_cluster_ids = tf.gather(candidate_cluster_idx_labels, local_flat_forward_topc_mention_span_indices)\n        ## forward QA score computation ends\n\n\n        ## backward QA score computation begins\n        ## we need to compute the backward score, i.e., the score that span i is the coreferent answer for span j, denoted by s(i|j)\n        i0 = tf.constant(0)\n        backward_qa_input_ids = tf.reshape(tf.zeros((1, self.config.max_query_len + self.config.max_context_len + 2), dtype=tf.int32), [1, -1]) # (1, max_query_len + max_context_len + 2)\n        backward_qa_input_mask = tf.reshape(tf.zeros((1, self.config.max_query_len + self.config.max_context_len + 2), dtype=tf.int32), [1, -1]) # (1, max_query_len + max_context_len + 2)\n        backward_qa_input_token_type_mask = tf.reshape(tf.zeros((1, self.config.max_query_len + self.config.max_context_len + 2), dtype=tf.int32), [1, -1]) # (1, max_query_len + max_context_len + 2)\n        backward_qa_mention_start_in_context = tf.convert_to_tensor(tf.constant([0]), dtype=tf.int32)\n        backward_qa_mention_end_in_context = tf.convert_to_tensor(tf.constant([0]), dtype=tf.int32)\n        \n\n        @tf.function\n        def backward_qa_mention_linking(i, batch_qa_input_ids, batch_qa_input_mask, batch_qa_input_token_type_mask, \n            batch_qa_mention_start_in_context, batch_qa_mention_end_in_context):\n\n            tmp_query_mention_start_idx = tf.gather(forward_topc_mention_start_indices, i)\n            tmp_query_mention_end_idx = tf.gather(forward_topc_mention_end_indices, i)\n\n            tmp_index_for_topk_mention = tf.floor_div(i, self.k)\n    
        tmp_context_mention_start_idx = tf.gather(topk_mention_start_indices, tmp_index_for_topk_mention)\n            tmp_context_mention_end_idx = tf.gather(topk_mention_end_indices, tmp_index_for_topk_mention)\n\n            query_input_token_ids, mention_start_idx_in_query, mention_end_idx_in_query = self.get_query_token_ids(\n                nonoverlap_doc_input_ids, flat_doc_sentence_map, tmp_query_mention_start_idx, tmp_query_mention_end_idx)\n\n            context_input_token_ids, mention_start_idx_in_context, mention_end_idx_in_context = self.get_query_token_ids(\n                nonoverlap_doc_input_ids, flat_doc_sentence_map, tmp_context_mention_start_idx, tmp_context_mention_end_idx)\n\n            query_pad_token_ids = tf.zeros([self.config.max_query_len - self.get_shape(query_input_token_ids, 0)], dtype=tf.int32)\n            context_pad_token_ids = tf.zeros([self.config.max_context_len - self.get_shape(context_input_token_ids, 0)], dtype=tf.int32)\n\n            pad_query_input_token_ids = tf.concat([query_input_token_ids, query_pad_token_ids], axis=0) # (max_query_len)\n            pad_query_input_token_mask = tf.ones_like(pad_query_input_token_ids, tf.int32)\n            pad_query_input_token_type_mask = tf.zeros_like(pad_query_input_token_ids, tf.int32)\n\n            pad_context_input_token_ids = tf.concat([context_input_token_ids, context_pad_token_ids], axis=0) # (max_context_len)\n            pad_context_input_token_mask = tf.ones_like(pad_context_input_token_ids, tf.int32)\n            pad_context_input_token_type_mask = tf.ones_like(pad_context_input_token_ids, tf.int32)\n\n            sep_tokens = tf.cast(tf.fill([1], self.sep_in_vocab), tf.int32) # (num_window, 1)\n            cls_tokens = tf.cast(tf.fill([1], self.cls_in_vocab), tf.int32) # (num_window, 1)\n\n            query_context_input_token_ids = tf.concat([cls_tokens, pad_query_input_token_ids, sep_tokens, pad_context_input_token_ids], axis=0)\n            
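The loop body pads the query and context token sequences to fixed lengths and packs them as [CLS] query [SEP] context, with a token-type mask of 0 over the query segment and 1 over the context segment. A pure-Python sketch of the same packing (hypothetical token ids; `cls_id`/`sep_id` are stand-ins for `self.cls_in_vocab`/`self.sep_in_vocab`):

```python
def pack_query_context(query_ids, context_ids, max_query_len, max_context_len,
                       cls_id=101, sep_id=102):
    """Illustrative sketch of the [CLS] query [SEP] context packing above."""
    # zero-pad each segment to its fixed length
    q = query_ids + [0] * (max_query_len - len(query_ids))
    c = context_ids + [0] * (max_context_len - len(context_ids))
    input_ids = [cls_id] + q + [sep_id] + c  # length max_query_len + max_context_len + 2
    # token-type mask: 0 for [CLS] + query + [SEP], 1 for the context segment,
    # mirroring pad_query_input_token_type_mask / pad_context_input_token_type_mask
    token_type = [0] * (1 + max_query_len + 1) + [1] * max_context_len
    return input_ids, token_type
```

For example, `pack_query_context([5, 6], [7], max_query_len=3, max_context_len=2)` yields ids `[101, 5, 6, 0, 102, 7, 0]` and types `[0, 0, 0, 0, 0, 1, 1]`.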
query_context_input_token_mask = tf.concat([tf.ones_like(cls_tokens, tf.int32), pad_query_input_token_mask, tf.zeros_like(sep_tokens, tf.int32), pad_context_input_token_mask], axis=0)\n            query_context_input_token_type_mask = tf.concat([tf.zeros_like(cls_tokens, tf.int32), pad_query_input_token_type_mask, tf.zeros_like(sep_tokens, tf.int32), pad_context_input_token_type_mask], axis=0)\n\n            query_context_input_token_ids = tf.reshape(query_context_input_token_ids, [1, self.config.max_query_len + self.config.max_context_len + 2])\n            query_context_input_token_mask = tf.reshape(query_context_input_token_mask, [1, self.config.max_query_len + self.config.max_context_len + 2])\n            query_context_input_token_type_mask = tf.reshape(query_context_input_token_type_mask, [1, self.config.max_query_len + self.config.max_context_len + 2])\n            mention_start_idx_in_context = tf.reshape(mention_start_idx_in_context, [-1])\n            mention_end_idx_in_context = tf.reshape(mention_end_idx_in_context, [-1])\n\n            # accumulate onto the loop variables (not the initial tensors outside the loop), otherwise only the last iteration survives\n            return [tf.math.add(i, 1), tf.concat([batch_qa_input_ids, query_context_input_token_ids], 0), \n                    tf.concat([batch_qa_input_mask, query_context_input_token_mask], 0), \n                    tf.concat([batch_qa_input_token_type_mask, query_context_input_token_type_mask], 0), \n                    tf.concat([batch_qa_mention_start_in_context, mention_start_idx_in_context], 0), \n                    tf.concat([batch_qa_mention_end_in_context, mention_end_idx_in_context], 0)]\n\n\n        _, stack_backward_qa_input_ids, stack_backward_qa_input_mask, stack_backward_qa_input_type_mask, stack_backward_mention_start_in_context, stack_backward_mention_end_in_context = tf.while_loop(\n            cond=lambda i, o1, o2, o3, o4, o5: i < self.k * self.c,\n            body=backward_qa_mention_linking, \n            loop_vars=[i0, backward_qa_input_ids, backward_qa_input_mask, backward_qa_input_token_type_mask, 
backward_qa_mention_start_in_context, backward_qa_mention_end_in_context], \n            shape_invariants=[i0.get_shape(), tf.TensorShape([None, None]), tf.TensorShape([None, None]), tf.TensorShape([None, None]), \n            tf.TensorShape([None]), tf.TensorShape([None])]) \n\n        # stack_backward_qa_input_ids, stack_backward_qa_input_mask, stack_backward_qa_input_type_mask -> (k*c, max_query_len + max_context_len + 2)\n        # stack_backward_mention_start_in_context, stack_backward_mention_end_in_context -> (k*c,)\n\n        batch_backward_qa_input_ids = tf.reshape(stack_backward_qa_input_ids, [-1, self.config.max_query_len+self.config.max_context_len+2])\n        batch_backward_qa_input_mask = tf.reshape(stack_backward_qa_input_mask, [-1, self.config.max_query_len+self.config.max_context_len+2])\n        batch_backward_qa_input_type_mask = tf.reshape(stack_backward_qa_input_type_mask, [-1, self.config.max_query_len+self.config.max_context_len+2])\n\n        backward_qa_linking_model = modeling.BertModel(config=self.bert_config, is_training=is_training, \n            input_ids=batch_backward_qa_input_ids, input_mask=batch_backward_qa_input_mask, \n            token_type_ids=batch_backward_qa_input_type_mask, use_one_hot_embeddings=False, \n            scope=\"bert\")\n\n        backward_query_context_embs = backward_qa_linking_model.get_sequence_output() # (k*c, max_query_len + max_context_len + 2, hidden_size)\n        backward_query_context_embs = tf.reshape(backward_query_context_embs, [self.k*self.c, -1, self.config.hidden_size])\n        flat_batch_backward_qa_input_type_mask = tf.reshape(batch_backward_qa_input_type_mask, [self.k*self.c, -1])\n\n        backward_context_flat_embs = self.transform_overlap_sliding_windows_to_original_document(backward_query_context_embs, flat_batch_backward_qa_input_type_mask) # (k*c, max_context_len, hidden_size)\n        batch_backward_mention_start_in_context = tf.reshape(stack_backward_mention_start_in_context, 
[-1]) + tf.range(0, self.c*self.k) * self.config.max_context_len # row stride is max_context_len once only the context positions are kept \n        batch_backward_mention_end_in_context = tf.reshape(stack_backward_mention_end_in_context, [-1]) + tf.range(0, self.c*self.k) * self.config.max_context_len \n\n        backward_qa_mention_span_embs, backward_qa_mention_start_embs, backward_qa_mention_end_embs = self.get_candidate_span_embedding(tf.reshape(backward_context_flat_embs, \n                [-1, self.config.hidden_size]), tf.reshape(batch_backward_mention_start_in_context, [-1]), tf.reshape(batch_backward_mention_end_in_context, [-1]))\n        # backward_qa_mention_span_embs -> (k*c, 2*hidden_size)\n        # backward_qa_mention_start_embs, backward_qa_mention_end_embs -> (k*c, hidden_size)\n\n        backward_qa_mention_span_scores, backward_qa_mention_start_scores, backward_qa_mention_end_scores = self.get_mention_score_and_loss(backward_qa_mention_span_embs, \n                backward_qa_mention_start_embs, backward_qa_mention_end_embs, name_scope=\"backward_qa\")\n        # backward_qa_mention_span_scores -> (k*c)\n        # backward_qa_mention_start_scores, backward_qa_mention_end_scores -> (k*c)\n\n        if self.config.sec_qa_mention_score:\n            backward_qa_mention_span_scores = (backward_qa_mention_span_scores + backward_qa_mention_start_scores + backward_qa_mention_end_scores)/3.0\n        else:\n            backward_qa_mention_span_scores = backward_qa_mention_span_scores \n        #############\n        ############# backward QA computation ends\n        \n        forward_topc_mention_span_scores = tf.reshape(forward_topc_mention_span_scores, [-1])\n        expand_forward_topc_mention_span_scores = tf.reshape(forward_topc_mention_span_scores, [self.k, self.c]) # s(j|i) scores from the forward direction -> (k, c)\n        expand_forward_topc_mention_span_scores_in_mention_proposal = 
tf.reshape(forward_topc_mention_span_scores_in_mention_proposal, [self.k, self.c]) # mention proposal scores of the top-c spans -> (k, c)\n        expand_topk_mention_span_scores = tf.tile(tf.expand_dims(topk_mention_span_scores, 1), [1, self.c]) # (k, c)\n\n        backward_qa_mention_span_scores = tf.reshape(backward_qa_mention_span_scores, [self.k, self.c]) # (k, c)\n\n        mention_span_linking_scores = (expand_forward_topc_mention_span_scores + backward_qa_mention_span_scores) / 2.0 \n        mention_span_linking_scores = mention_span_linking_scores + expand_forward_topc_mention_span_scores_in_mention_proposal + expand_topk_mention_span_scores\n        mention_span_linking_scores = tf.reshape(mention_span_linking_scores, [self.k, self.c]) # (k, c)\n        dummy_scores = tf.zeros([self.k, 1]) # (k, 1)\n\n        top_mention_span_linking_scores = tf.concat([dummy_scores, mention_span_linking_scores], axis=1) # (k, c+1)\n\n        forward_topc_span_cluster_ids = tf.reshape(forward_topc_span_cluster_ids, [self.k, self.c]) # (k, c)\n        same_cluster_indicator = tf.equal(forward_topc_span_cluster_ids, tf.expand_dims(topk_mention_span_cluster_ids, 1))  \n        non_dummy_indicator = tf.expand_dims(topk_mention_span_cluster_ids > 0, 1)\n        pairwise_labels = tf.logical_and(same_cluster_indicator, non_dummy_indicator)\n        dummy_labels = tf.logical_not(tf.reduce_any(pairwise_labels, 1, keepdims=True)) \n        top_mention_span_linking_labels = tf.concat([dummy_labels, pairwise_labels], 1)\n\n        linking_loss = self.marginal_likelihood_loss(top_mention_span_linking_scores, top_mention_span_linking_labels)\n\n        total_loss = mention_proposal_loss + linking_loss \n\n        return total_loss, (topk_mention_start_indices, topk_mention_end_indices), (forward_topc_mention_start_indices, forward_topc_mention_end_indices), top_mention_span_linking_scores \n\n\n    def marginal_likelihood_loss(self, antecedent_scores, antecedent_labels):\n        \"\"\"\n        Desc:\n            marginal 
likelihood of the gold antecedent spans that form a coreference cluster \n        Args:\n            antecedent_scores: [k, c+1] the predicted scores by the model\n            antecedent_labels: [k, c+1] the ground-truth cluster labels\n        Returns:\n            a scalar loss \n        \"\"\"\n        gold_scores = tf.math.add(antecedent_scores, tf.log(tf.to_float(antecedent_labels)))\n        marginalized_gold_scores = tf.math.reduce_logsumexp(gold_scores, [1])  # [k]\n        log_norm = tf.math.reduce_logsumexp(antecedent_scores, [1])  # [k]\n        loss = log_norm - marginalized_gold_scores  # [k]\n        return tf.math.reduce_sum(loss)\n\n\n    def get_query_token_ids(self, nonoverlap_doc_input_ids, sentence_map, mention_start_idx, mention_end_idx, padding=True):\n        \"\"\"\n        Desc:\n            construct the question for a selected mention, i.e., the sentence that contains the mention. \n        \"\"\"\n        nonoverlap_doc_input_ids = tf.reshape(nonoverlap_doc_input_ids, [-1])\n\n        sentence_idx_for_mention = tf.gather(sentence_map, mention_start_idx)\n        sentence_mask_for_mention = tf.math.equal(sentence_map, sentence_idx_for_mention)\n        query_token_input_ids = self.boolean_mask_1d(nonoverlap_doc_input_ids, sentence_mask_for_mention, name_scope=\"query_mention\", use_tpu=self.use_tpu)\n\n        sentence_start = tf.where(tf.equal(nonoverlap_doc_input_ids, tf.gather(query_token_input_ids, tf.constant(0))))\n\n        mention_start_in_sent = mention_start_idx - tf.cast(sentence_start, tf.int32) \n        mention_end_in_sent = mention_end_idx - tf.cast(sentence_start, tf.int32) \n\n        return query_token_input_ids, mention_start_in_sent, mention_end_in_sent \n\n\n\n    def get_mention_score_and_loss(self, candidate_mention_span_embs, candidate_mention_start_embs, candidate_mention_end_embs, \n        gold_label_candidate_mention_spans=None, gold_label_candidate_mention_starts=None, gold_label_candidate_mention_ends=None, expect_length_of_labels=None, \n        
name_scope=\"mention\"):\n\n        candidate_mention_span_logits = self.ffnn(candidate_mention_span_embs, self.config.hidden_size*2, 1, dropout=self.dropout, name_scope=\"{}_span\".format(name_scope))\n        candidate_mention_start_logits = self.ffnn(candidate_mention_start_embs, self.config.hidden_size, 1, dropout=self.dropout, name_scope=\"{}_start\".format(name_scope))\n        candidate_mention_end_logits = self.ffnn(candidate_mention_end_embs, self.config.hidden_size, 1, dropout=self.dropout, name_scope=\"{}_end\".format(name_scope))\n\n        if gold_label_candidate_mention_spans is None or gold_label_candidate_mention_starts is None or gold_label_candidate_mention_ends is None: \n            candidate_mention_span_scores = tf.math.log(tf.sigmoid(candidate_mention_span_logits))\n            candidate_mention_start_scores = tf.math.log(tf.sigmoid(candidate_mention_start_logits))\n            candidate_mention_end_scores = tf.math.log(tf.sigmoid(candidate_mention_end_logits))\n\n            return candidate_mention_span_scores, candidate_mention_start_scores, candidate_mention_end_scores\n\n\n        start_loss, candidate_mention_start_probability = self.compute_mention_score_and_loss(candidate_mention_start_logits, gold_label_candidate_mention_starts)\n        end_loss, candidate_mention_end_probability = self.compute_mention_score_and_loss(candidate_mention_end_logits, gold_label_candidate_mention_ends)\n        span_loss, candidate_mention_span_probability = self.compute_mention_score_and_loss(candidate_mention_span_logits, gold_label_candidate_mention_spans)\n\n        \n        total_loss = start_loss + end_loss + span_loss\n        candidate_mention_span_scores = (tf.math.log(candidate_mention_start_probability) + tf.math.log(candidate_mention_end_probability) + tf.math.log(candidate_mention_span_probability)) / 3.0 \n\n        return total_loss, candidate_mention_start_probability, candidate_mention_end_probability, 
candidate_mention_span_probability, candidate_mention_span_scores\n\n\n    def compute_mention_score_and_loss(self, pred_sequence_logits, gold_sequence_labels, loss_mask=None):\n        \"\"\"\n        Desc:\n            compute the uniform start/end loss and probabilities. \n        Args:\n            pred_sequence_logits: (input_shape, 1) \n            gold_sequence_labels: (input_shape, 1)\n            loss_mask: [optional] if not None, a (input_shape) tf.int32 0/1 tensor. \n            For start/end score and loss, input_shape should be num_subtoken_in_doc.\n            For span score and loss, input_shape should be num_subtoken_in_doc * num_subtoken_in_doc. \n        \"\"\"\n        pred_sequence_probabilities = tf.cast(tf.reshape(tf.sigmoid(pred_sequence_logits), [-1]),tf.float32) # (input_shape)\n        expand_pred_sequence_scores = tf.stack([(1 - pred_sequence_probabilities), pred_sequence_probabilities], axis=-1) # (input_shape, 2)\n        expand_gold_sequence_labels = tf.cast(tf.one_hot(tf.reshape(gold_sequence_labels, [-1]), 2, axis=-1), tf.float32) # (input_shape, 2)\n\n        loss = tf.keras.losses.binary_crossentropy(expand_gold_sequence_labels, expand_pred_sequence_scores)\n        # loss -> shape is (input_shape)\n\n        if loss_mask is not None:\n            loss = tf.multiply(loss, tf.cast(loss_mask, tf.float32))\n\n        total_loss = tf.reduce_mean(loss)\n        # total_loss -> a scalar \n\n        return total_loss, pred_sequence_probabilities\n\n\n    def get_candidate_span_embedding(self, doc_sequence_embeddings, candidate_span_starts, candidate_span_ends):\n        doc_sequence_embeddings = tf.reshape(doc_sequence_embeddings, [-1, self.config.hidden_size])\n\n        span_start_embedding = tf.gather(doc_sequence_embeddings, candidate_span_starts)\n        span_end_embedding = tf.gather(doc_sequence_embeddings, candidate_span_ends)\n        span_embedding = tf.concat([span_start_embedding, span_end_embedding], 
1) \n\n        return span_embedding, span_start_embedding, span_end_embedding \n\n    def get_candidate_mention_gold_sequence_label(self, candidate_mention_starts, candidate_mention_ends, \n        gold_start_index_labels, gold_end_index_labels, expect_length_of_labels):\n\n        gold_start_sequence_label = self.scatter_gold_index_to_label_sequence(gold_start_index_labels, expect_length_of_labels)\n        gold_end_sequence_label = self.scatter_gold_index_to_label_sequence(gold_end_index_labels, expect_length_of_labels)\n\n        gold_label_candidate_mention_starts = tf.gather(gold_start_sequence_label, candidate_mention_starts)\n        gold_label_candidate_mention_ends = tf.gather(gold_end_sequence_label, candidate_mention_ends)\n\n        gold_mention_sparse_label = tf.reshape(tf.stack([gold_start_index_labels, gold_end_index_labels], axis=1), [2, -1])\n        gold_span_value = tf.reshape(tf.ones_like(gold_start_index_labels, tf.int32), [-1])\n        gold_span_shape = tf.Variable([expect_length_of_labels, expect_length_of_labels], shape=(2, )) \n        gold_span_label = tf.cast(tf.scatter_nd(gold_mention_sparse_label, gold_span_value, gold_span_shape), tf.int32)\n\n        candidate_mention_spans = tf.stack([candidate_mention_starts, candidate_mention_ends], axis=1)\n        gold_label_candidate_mention_spans = tf.gather_nd(gold_span_label, tf.expand_dims(candidate_mention_spans, 1))\n \n        return gold_label_candidate_mention_spans, gold_label_candidate_mention_starts, gold_label_candidate_mention_ends\n\n\n    def scatter_gold_index_to_label_sequence(self, gold_index_labels, expect_length_of_labels):\n        \"\"\"\n        Desc:\n            transform the mention start/end position index tf.int32 Tensor to a tf.int32 Tensor with 1/0 labels for the input subtoken sequences.\n            1 denotes this subtoken is the start/end for a mention. 
\n        Args:\n            gold_index_labels: a tf.int32 Tensor with mention start/end position index in the original document. \n            expect_length_of_labels: the number of subtokens in the original document. \n        \"\"\"\n        gold_labels_pos = tf.reshape(gold_index_labels, [-1, 1]) # (num_of_mention, 1)\n        gold_value = tf.reshape(tf.ones_like(gold_index_labels), [-1]) # (num_of_mention)\n        label_shape = tf.Variable(expect_length_of_labels) \n        label_shape = tf.reshape(label_shape, [1]) # [1]\n        gold_label_sequence = tf.cast(tf.scatter_nd(gold_labels_pos, gold_value, label_shape), tf.int32) # (num_subtoken_in_doc)\n        return gold_label_sequence\n\n\n    def scatter_span_sequence_labels(self, gold_start_index_labels, gold_end_index_labels, expect_length_of_labels):\n        \"\"\"\n        Desc:\n            transform the mention (start, end) position pairs to a span matrix gold_span_sequence_labels. \n                matrix[i][j]: whether the subtokens between the position $i$ to $j$ can be a mention.  \n                if matrix[i][j] == 0: from $i$ to $j$ is not a mention. \n                if matrix[i][j] == 1: from $i$ to $j$ is a mention.\n        Args:\n            gold_start_index_labels: a tf.int32 Tensor with mention start position index in the original document. \n            gold_end_index_labels: a tf.int32 Tensor with mention end position index in the original document. 
\n            expect_length_of_labels: a scalar, should equal num_subtoken_in_doc\n        \"\"\" \n        gold_span_index_labels = tf.stack([gold_start_index_labels, gold_end_index_labels], axis=1) # (num_of_mention, 2)\n        gold_span_value = tf.reshape(tf.ones_like(gold_start_index_labels, tf.int32), [-1]) # (num_of_mention)\n        gold_span_label_shape = tf.Variable([expect_length_of_labels, expect_length_of_labels]) \n        gold_span_label_shape = tf.reshape(gold_span_label_shape, [-1])\n\n        gold_span_sequence_labels = tf.cast(tf.scatter_nd(gold_span_index_labels, gold_span_value, gold_span_label_shape), tf.int32) # (num_subtoken_in_doc, num_subtoken_in_doc)\n        return gold_span_sequence_labels \n\n\n    def get_candidate_cluster_labels(self, candidate_mention_starts, candidate_mention_ends, \n            gold_mention_starts, gold_mention_ends, gold_cluster_ids):\n        \n        # pairwise comparison between every gold mention and every candidate, so the matmul below sees a (num_gold_mention, num_candidates) matrix\n        same_mention_start = tf.equal(tf.expand_dims(gold_mention_starts, 1), tf.expand_dims(candidate_mention_starts, 0))\n        same_mention_end = tf.equal(tf.expand_dims(gold_mention_ends, 1), tf.expand_dims(candidate_mention_ends, 0)) \n        same_mention_span = tf.logical_and(same_mention_start, same_mention_end)\n        \n        candidate_cluster_idx_labels = tf.matmul(tf.expand_dims(gold_cluster_ids, 0), tf.to_int32(same_mention_span))  # [1, num_candidates]\n        candidate_cluster_idx_labels = tf.squeeze(candidate_cluster_idx_labels, 0)  # [num_candidates]\n\n        return candidate_cluster_idx_labels \n\n\n    def transform_overlap_sliding_windows_to_original_document(self, overlap_window_inputs, overlap_window_mask):\n        \"\"\"\n        Desc:\n            hidden_size should be equal to embedding_size. 
\n        Args:\n            overlap_window_inputs: (num_window, window_size, hidden_size). \n                the output of forwarding the (num_window, window_size) input_ids through the BERT model. \n            overlap_window_mask: (num_window, window_size). A tf.int32 Tensor containing 0/1. \n                0 means the token in this position should be dropped; 1 means it should be kept. \n        \"\"\"\n        ones_input_mask = tf.ones_like(overlap_window_mask, tf.int32) # (num_window, window_size)\n        cumsum_input_mask = tf.math.cumsum(ones_input_mask, axis=1) # (num_window, window_size), 1-based position of every token inside its window\n        # offset_input_mask = tf.tile(tf.expand_dims(tf.range(self.config.num_window) * self.config.window_size, 1), [1, self.config.window_size]) # (num_window, window_size)\n        row_cumsum_input_mask = self.get_shape(cumsum_input_mask, 0)\n        col_cumsum_input_mask = self.get_shape(cumsum_input_mask, 1)\n        offset_input_mask = tf.tile(tf.expand_dims(tf.range(row_cumsum_input_mask) * col_cumsum_input_mask, 1), [1, col_cumsum_input_mask])\n\n\n        offset_cumsum_input_mask = offset_input_mask + cumsum_input_mask # (num_window, window_size), 1-based global position\n        global_input_mask = tf.math.multiply(overlap_window_mask, offset_cumsum_input_mask) # (num_window, window_size), zero out the positions to drop\n        global_input_mask = tf.reshape(global_input_mask, [-1]) # (num_window * window_size)\n        global_input_mask_index = self.boolean_mask_1d(global_input_mask, tf.math.greater(global_input_mask, tf.zeros_like(global_input_mask, tf.int32))) # (num_subtoken_in_doc), still 1-based\n\n        overlap_window_inputs = tf.reshape(overlap_window_inputs, [self.config.num_window * self.config.window_size, -1]) # (num_window * window_size, hidden_size)\n        original_doc_inputs = tf.gather(overlap_window_inputs, global_input_mask_index - 1)  # (num_subtoken_in_doc, hidden_size); shift back to 0-based indices\n\n        return original_doc_inputs\n\n\n    def 
ffnn(self, inputs, hidden_size, output_size, dropout=None, name_scope=\"fully-connected-neural-network\",\n        hidden_initializer=tf.truncated_normal_initializer(stddev=0.02)):\n        \"\"\"\n        Desc:\n            fully-connected neural network. \n            non-linearly transforms the [inputs] tensor with last dimension [hidden_size] to a fixed [output_size]. \n        Args: \n            hidden_size: should be the size of the last dimension of [inputs]. \n        \"\"\"\n        with tf.variable_scope(name_scope, reuse=tf.AUTO_REUSE):\n            hidden_weights = tf.get_variable(\"hidden_weights\", [hidden_size, output_size],\n                initializer=hidden_initializer)\n            hidden_bias = tf.get_variable(\"hidden_bias\", [output_size], initializer=tf.zeros_initializer())\n            outputs = tf.nn.relu(tf.nn.xw_plus_b(inputs, hidden_weights, hidden_bias))\n\n            if dropout is not None:\n                outputs = tf.nn.dropout(outputs, dropout)\n        return outputs\n\n\n    def boolean_mask_1d(self, itemtensor, boolmask_indicator, name_scope=\"boolean_mask1d\", use_tpu=False):\n        \"\"\"\n        Desc:\n            the same functionality as tf.boolean_mask. \n            The tf.boolean_mask operation is not available on the cloud TPU. \n        Args:\n            itemtensor : a 1-rank Tensor of [tf.int32, tf.float32] numbers.\n            boolmask_indicator : a 1-rank tf.bool Tensor. \n            name_scope : name scope for the operation. \n            use_tpu : kept for API compatibility; this TPU-compatible implementation is always used.  
\n        \"\"\"\n        with tf.name_scope(name_scope):\n\n            boolmask_sum = tf.reduce_sum(tf.cast(boolmask_indicator, tf.int32))\n            selected_positions = tf.cast(boolmask_indicator, dtype=tf.float32)\n            indexed_positions = tf.cast(tf.multiply(tf.cumsum(selected_positions), selected_positions),dtype=tf.int32)\n            one_hot_selector = tf.one_hot(indexed_positions - 1, boolmask_sum, dtype=tf.float32)\n            sampled_indices = tf.cast(tf.tensordot(tf.cast(tf.range(tf.shape(boolmask_indicator)[0]), dtype=tf.float32),\n                one_hot_selector,axes=[0, 0]),dtype=tf.int32)\n            sampled_indices = tf.reshape(sampled_indices, [-1])\n            mask_itemtensor = tf.gather(itemtensor, sampled_indices)\n\n            return mask_itemtensor\n\n\n    def get_dropout(self, dropout_rate, is_training):\n        return 1 - (tf.to_float(is_training) * dropout_rate)\n\n\n    def get_shape(self, x, dim):\n        \"\"\"\n        Desc:\n            return the size of input x in DIM. 
\n        \"\"\" \n        return x.get_shape()[dim].value or tf.shape(x)[dim]\n\n\n    def evaluate(self, top_span_starts, top_span_ends, predicted_antecedents, \n            gold_clusters, gold_starts, gold_ends):\n        \"\"\"\n        Desc:\n            expected cluster ids is : [[[21, 25], [18, 18]], [[63, 65], [46, 48], [27, 29]], [[88, 88], [89, 89]]]\n        Args:\n            top_span_starts: \n            top_span_ends:\n            predicted_antecedents: \n        Returns:\n            predicted_clusters: \n            gold_clusters:\n            mention_to_predicted:\n            mention_to_gold: \n        \"\"\" \n        # predicted_antecedents = np.argmax(predicted_antecedents, axis=-1)\n        top_span_starts, top_span_ends, predicted_antecedents = top_span_starts.tolist(), top_span_ends.tolist(), predicted_antecedents.tolist()\n        gold_clusters, gold_starts, gold_ends =  gold_clusters.tolist()[0], gold_starts.tolist()[0], gold_ends.tolist()[0]\n\n        def transform_gold_labels(gold_clusters, gold_starts, gold_ends):\n            gold_clusters_idx = [tmp for tmp in gold_clusters if tmp >= 0]\n            gold_starts = [tmp for tmp in gold_starts if tmp >= 0]\n            gold_ends = [tmp for tmp in gold_ends if tmp >= 0]\n\n            gold_clusters_dict = {}\n            gold_cluster_lst = []\n\n            for idx, (tmp_start, tmp_end) in enumerate(zip(gold_starts, gold_ends)):\n                tmp_cluster_idx = gold_clusters_idx[idx]\n                if tmp_cluster_idx not in gold_clusters_dict.keys():\n                    gold_cluster_lst.append(tmp_cluster_idx)\n                    gold_clusters_dict[tmp_cluster_idx] = [[tmp_start, tmp_end]]\n                else:\n                    gold_clusters_dict[tmp_cluster_idx].append([tmp_start, tmp_end])\n\n            gold_cluster = [gold_clusters_dict[tmp_idx] for tmp_idx in gold_cluster_lst]\n\n            return gold_cluster, gold_starts, gold_ends\n\n        gold_clusters, 
gold_starts, gold_ends = transform_gold_labels(gold_clusters, gold_starts, gold_ends)\n\n        gold_clusters = [tuple(tuple(m) for m in gc) for gc in gold_clusters]\n        mention_to_gold = {}\n        for gc in gold_clusters:\n            for mention in gc:\n                mention_to_gold[mention] = gc\n\n        predicted_clusters, mention_to_predicted = self.get_predicted_clusters(top_span_starts, top_span_ends, predicted_antecedents)\n    \n        return predicted_clusters, gold_clusters, mention_to_predicted, mention_to_gold\n\n\n"
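`marginal_likelihood_loss` above computes, per span, logsumexp over all antecedent scores minus logsumexp over the gold antecedent scores, then sums over the k spans. A pure-Python sketch of the same quantity (illustrative only, not the TF implementation):

```python
import math

def marginal_likelihood_loss(scores, labels):
    """Illustrative: loss = logsumexp(all scores) - logsumexp(gold scores),
    summed over rows; labels are booleans marking the gold antecedents,
    mirroring the log(labels) masking trick in the TF code."""
    def logsumexp(xs):
        m = max(xs)
        return m + math.log(sum(math.exp(x - m) for x in xs))
    total = 0.0
    for row_scores, row_labels in zip(scores, labels):
        gold = [s for s, l in zip(row_scores, row_labels) if l]
        total += logsumexp(row_scores) - logsumexp(gold)
    return total
```

When every antecedent is gold the loss is 0; with scores `[[0.0, 0.0]]` and a single gold label the loss is log 2, since probability mass is split evenly between the gold and non-gold option.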
  },
  {
    "path": "models/mention_proposal.py",
    "content": "#!/usr/bin/env python3 \n# -*- coding: utf-8 -*- \n\n\n\n# author: xiaoy li \n# description:\n# the mention proposal model for pre-training the Span-BERT model. \n\n\nimport os\nimport sys \n\nrepo_path = \"/\".join(os.path.realpath(__file__).split(\"/\")[:-2])\nif repo_path not in sys.path:\n    sys.path.insert(0, repo_path)\n\nimport tensorflow as tf\nfrom bert import modeling\n\n\n\nclass MentionProposalModel(object):\n    def __init__(self, config):\n        self.config = config \n        self.bert_config = modeling.BertConfig.from_json_file(config.bert_config_file)\n        self.bert_config.hidden_dropout_prob = config.dropout_rate\n\n    def get_mention_proposal_and_loss(self, instance, is_training, use_tpu=False):\n        \"\"\"\n        Desc:\n            forward function for training mention proposal module. \n        Args:\n            instance: a tuple of train/dev/test data instance. \n                e.g., (flat_input_ids, flat_doc_overlap_input_mask, flat_sentence_map, text_len, speaker_ids, gold_starts, gold_ends, cluster_ids)\n            is_training: True/False is in the training process. \n        \"\"\"\n        self.use_tpu = use_tpu \n        self.dropout = self.get_dropout(self.config.dropout_rate, is_training)\n\n        flat_input_ids, flat_doc_overlap_input_mask, flat_sentence_map, text_len, speaker_ids, gold_starts, gold_ends, cluster_ids = instance\n        # flat_input_ids: (num_window, window_size)\n        # flat_doc_overlap_input_mask: (num_window, window_size)\n        # flat_sentence_map: (num_window, window_size)\n        # text_len: dynamic length and is padded to fix length\n        # gold_start: (max_num_mention), mention start index in the original (NON-OVERLAP) document. Pad with -1 to the fix length max_num_mention.\n        # gold_end: (max_num_mention), mention end index in the original (NON-OVERLAP) document. 
Pad with -1 to the fix length max_num_mention.\n        # cluster_ids/speaker_ids is not used in the mention proposal model.\n\n        flat_input_ids = tf.math.maximum(flat_input_ids, tf.zeros_like(flat_input_ids, tf.int32)) # (num_window * window_size)\n        \n        flat_doc_overlap_input_mask = tf.where(tf.math.greater_equal(flat_doc_overlap_input_mask, 0), \n            x=tf.ones_like(flat_doc_overlap_input_mask, tf.int32), y=tf.zeros_like(flat_doc_overlap_input_mask, tf.int32)) # (num_window * window_size)\n        # flat_doc_overlap_input_mask = tf.math.maximum(flat_doc_overlap_input_mask, tf.zeros_like(flat_doc_overlap_input_mask, tf.int32))\n        flat_sentence_map = tf.math.maximum(flat_sentence_map, tf.zeros_like(flat_sentence_map, tf.int32)) # (num_window * window_size)\n        \n        gold_start_end_mask = tf.cast(tf.math.greater_equal(gold_starts, tf.zeros_like(gold_starts, tf.int32)), tf.bool) # (max_num_mention)\n        gold_start_index_labels = self.boolean_mask_1d(gold_starts, gold_start_end_mask, name_scope=\"gold_starts\", use_tpu=self.use_tpu) # (num_of_mention)\n        gold_end_index_labels = self.boolean_mask_1d(gold_ends, gold_start_end_mask, name_scope=\"gold_ends\", use_tpu=self.use_tpu) # (num_of_mention)\n\n        text_len = tf.math.maximum(text_len, tf.zeros_like(text_len, tf.int32)) # (num_of_non_empty_window)\n        num_subtoken_in_doc = tf.math.reduce_sum(text_len) # the value should be num_subtoken_in_doc \n\n        input_ids = tf.reshape(flat_input_ids, [-1, self.config.window_size]) # (num_window, window_size)\n        input_mask = tf.ones_like(input_ids, tf.int32) # (num_window, window_size)\n\n        model = modeling.BertModel(config=self.bert_config, is_training=is_training, \n            input_ids=input_ids, input_mask=input_mask, \n            use_one_hot_embeddings=False, scope='bert')\n\n        doc_overlap_window_embs = model.get_sequence_output() # (num_window, window_size, hidden_size)\n        
doc_overlap_input_mask = tf.reshape(flat_doc_overlap_input_mask, [self.config.num_window, self.config.window_size]) # (num_window, window_size)\n\n        doc_flat_embs = self.transform_overlap_windows_to_original_doc(doc_overlap_window_embs, doc_overlap_input_mask) \n        doc_flat_embs = tf.reshape(doc_flat_embs, [-1, self.config.hidden_size]) # (num_subtoken_in_doc, hidden_size)\n\n        expand_start_embs = tf.tile(tf.expand_dims(doc_flat_embs, 1), [1, num_subtoken_in_doc, 1]) # (num_subtoken_in_doc, num_subtoken_in_doc, hidden_size)\n        expand_end_embs = tf.tile(tf.expand_dims(doc_flat_embs, 0), [num_subtoken_in_doc, 1, 1]) # (num_subtoken_in_doc, num_subtoken_in_doc, hidden_size)\n        expand_mention_span_embs = tf.concat([expand_start_embs, expand_end_embs], axis=-1) # (num_subtoken_in_doc, num_subtoken_in_doc, 2*hidden_size)\n        expand_mention_span_embs = tf.reshape(expand_mention_span_embs, [-1, self.config.hidden_size*2])\n        span_sequence_logits = self.ffnn(expand_mention_span_embs, self.config.hidden_size*2, 1, dropout=self.dropout, name_scope=\"mention_span\") # (num_subtoken_in_doc * num_subtoken_in_doc)\n\n        if self.config.start_end_share:\n            start_end_sequence_logits = self.ffnn(doc_flat_embs, self.config.hidden_size, 2, dropout=self.dropout, name_scope=\"mention_start_end\") # (num_subtoken_in_doc, 2)\n            start_sequence_logits, end_sequence_logits = tf.split(start_end_sequence_logits, num_or_size_splits=2, axis=1)\n            # start_sequence_logits -> (num_subtoken_in_doc, 1)\n            # end_sequence_logits -> (num_subtoken_in_doc, 1)\n        else:\n            start_sequence_logits = self.ffnn(doc_flat_embs, self.config.hidden_size, 1, dropout=self.dropout, name_scope=\"mention_start\") # (num_subtoken_in_doc)\n            end_sequence_logits = self.ffnn(doc_flat_embs, self.config.hidden_size, 1, dropout=self.dropout, name_scope=\"mention_end\") # (num_subtoken_in_doc)\n\n        gold_start_sequence_labels = 
self.scatter_gold_index_to_label_sequence(gold_start_index_labels, num_subtoken_in_doc) # (num_subtoken_in_doc)\n        gold_end_sequence_labels = self.scatter_gold_index_to_label_sequence(gold_end_index_labels, num_subtoken_in_doc) # (num_subtoken_in_doc)\n\n        start_loss, start_sequence_probabilities = self.compute_score_and_loss(start_sequence_logits, gold_start_sequence_labels)\n        end_loss, end_sequence_probabilities = self.compute_score_and_loss(end_sequence_logits, gold_end_sequence_labels)\n        # *_loss -> a scalar \n        # *_sequence_scores -> (num_subtoken_in_doc)\n\n        gold_span_sequence_labels = self.scatter_span_sequence_labels(gold_start_index_labels, gold_end_index_labels, num_subtoken_in_doc) # (num_subtoken_in_doc * num_subtoken_in_doc)\n        span_loss, span_sequence_probabilities = self.compute_score_and_loss(span_sequence_logits, gold_span_sequence_labels)\n        # span_loss -> a scalar \n        # span_sequence_probabilities -> (num_subtoken_in_doc * num_subtoken_in_doc)\n        \n        total_loss = self.config.loss_start_ratio * start_loss + self.config.loss_end_ratio * end_loss + self.config.loss_span_ratio * span_loss \n        return total_loss, start_sequence_probabilities, end_sequence_probabilities, span_sequence_probabilities\n\n\n    def get_gold_mention_sequence_labels_from_pad_index(self, pad_gold_start_index_labels, pad_gold_end_index_labels, pad_text_len):\n        \"\"\"\n        Desc:\n            the original gold labels is padded to the fixed length and only contains the position index of gold mentions. \n            return the gold sequence of labels for evaluation. \n        Args:\n            pad_gold_start_index_labels: a tf.int32 tensor with a fixed length (self.config.max_num_mention). \n                every element in the tensor is the start position index for the mentions. \n            pad_gold_end_index_labels: a tf.int32 tensor with a fixed length (self.config.max_num_mention). 
\n                every element in the tensor is the end position index of a mention. \n            pad_text_len: a tf.int32 tensor with a fixed length (self.config.num_window). \n                every positive element in the tensor indicates the number of subtokens in the window. \n        Returns:\n            gold_start_sequence_labels: a tf.int32 tensor with the shape of (num_subtoken_in_doc). \n                if the element in the tensor equals 0, this subtoken is not the start of a mention. \n                if the element in the tensor equals 1, this subtoken is the start of a mention. \n            gold_end_sequence_labels: a tf.int32 tensor with the shape of (num_subtoken_in_doc). \n                if the element in the tensor equals 0, this subtoken is not the end of a mention. \n                if the element in the tensor equals 1, this subtoken is the end of a mention. \n            gold_span_sequence_labels: a tf.int32 tensor with the shape of (num_subtoken_in_doc, num_subtoken_in_doc). \n                if element[i][j] equals 0, the subtokens from $i$ to $j$ are not a mention. \n                if element[i][j] equals 1, the subtokens from $i$ to $j$ are a mention. 
\n        \"\"\"\n        text_len = tf.math.maximum(pad_text_len, tf.zeros_like(pad_text_len, tf.int32)) # (num_of_non_empty_window)\n        num_subtoken_in_doc = tf.math.reduce_sum(text_len) # the value should be num_subtoken_in_doc \n        \n        gold_start_end_mask = tf.cast(tf.math.greater_equal(pad_gold_start_index_labels, tf.zeros_like(pad_gold_start_index_labels, tf.int32)), tf.bool) # (max_num_mention)\n        gold_start_index_labels = self.boolean_mask_1d(pad_gold_start_index_labels, gold_start_end_mask, name_scope=\"gold_starts\", use_tpu=self.use_tpu) # (num_of_mention)\n        gold_end_index_labels = self.boolean_mask_1d(pad_gold_end_index_labels, gold_start_end_mask, name_scope=\"gold_ends\", use_tpu=self.use_tpu) # (num_of_mention)\n\n        gold_start_sequence_labels = self.scatter_gold_index_to_label_sequence(gold_start_index_labels, num_subtoken_in_doc) # (num_subtoken_in_doc)\n        gold_end_sequence_labels = self.scatter_gold_index_to_label_sequence(gold_end_index_labels, num_subtoken_in_doc) # (num_subtoken_in_doc)\n        gold_span_sequence_labels = self.scatter_span_sequence_labels(gold_start_index_labels, gold_end_index_labels, num_subtoken_in_doc) # (num_subtoken_in_doc, num_subtoken_in_doc)\n\n        return gold_start_sequence_labels, gold_end_sequence_labels, gold_span_sequence_labels\n\n\n    def scatter_gold_index_to_label_sequence(self, gold_index_labels, expect_length_of_labels):\n        \"\"\"\n        Desc:\n            transform the mention start/end position index tf.int32 Tensor to a tf.int32 Tensor with 1/0 labels for the input subtoken sequences.\n            1 denotes this subtoken is the start/end for a mention. \n        Args:\n            gold_index_labels: a tf.int32 Tensor with mention start/end position index in the original document. \n            expect_length_of_labels: the number of subtokens in the original document. 
\n        \"\"\"\n        gold_labels_pos = tf.reshape(gold_index_labels, [-1, 1]) # (num_of_mention, 1)\n        gold_value = tf.reshape(tf.ones_like(gold_index_labels), [-1]) # (num_of_mention)\n        # use a plain tensor for the scatter shape; a tf.Variable cannot be initialized from a data-dependent value\n        label_shape = tf.reshape(expect_length_of_labels, [1]) # [1]\n        gold_sequence_labels = tf.cast(tf.scatter_nd(gold_labels_pos, gold_value, label_shape), tf.int32) # (num_subtoken_in_doc)\n        return gold_sequence_labels\n\n\n    def scatter_span_sequence_labels(self, gold_start_index_labels, gold_end_index_labels, expect_length_of_labels):\n        \"\"\"\n        Desc:\n            transform the mention (start, end) position pairs to a span matrix gold_span_sequence_labels. \n                matrix[i][j]: whether the subtokens from position $i$ to $j$ form a mention. \n                if matrix[i][j] == 0: from $i$ to $j$ is not a mention. \n                if matrix[i][j] == 1: from $i$ to $j$ is a mention.\n        Args:\n            gold_start_index_labels: a tf.int32 Tensor with mention start position index in the original document. \n            gold_end_index_labels: a tf.int32 Tensor with mention end position index in the original document. 
\n            expect_length_of_labels: a scalar; should equal num_subtoken_in_doc.\n        \"\"\" \n        gold_span_index_labels = tf.stack([gold_start_index_labels, gold_end_index_labels], axis=1) # (num_of_mention, 2)\n        gold_span_value = tf.reshape(tf.ones_like(gold_start_index_labels, tf.int32), [-1]) # (num_of_mention)\n        # use a plain tensor for the scatter shape; a tf.Variable cannot be initialized from a data-dependent value\n        gold_span_label_shape = tf.stack([expect_length_of_labels, expect_length_of_labels]) # [2]\n\n        gold_span_sequence_labels = tf.cast(tf.scatter_nd(gold_span_index_labels, gold_span_value, gold_span_label_shape), tf.int32) # (num_subtoken_in_doc, num_subtoken_in_doc)\n        return gold_span_sequence_labels\n\n\n    def compute_score_and_loss(self, pred_sequence_logits, gold_sequence_labels, loss_mask=None):\n        \"\"\"\n        Desc:\n            compute the uniform start/end loss and probabilities. \n        Args:\n            pred_sequence_logits: (input_shape, 1) \n            gold_sequence_labels: (input_shape, 1)\n            loss_mask: [optional] if not None, it should be a tf.int32 0/1 tensor of shape (input_shape). \n            FOR start/end score and loss, input_shape should be num_subtoken_in_doc.\n            FOR span score and loss, input_shape should be num_subtoken_in_doc * num_subtoken_in_doc. 
\n        \"\"\"\n        pred_sequence_probabilities = tf.cast(tf.reshape(tf.sigmoid(pred_sequence_logits), [-1]), tf.float32) # (input_shape)\n        expand_pred_sequence_scores = tf.stack([(1 - pred_sequence_probabilities), pred_sequence_probabilities], axis=-1) # (input_shape, 2)\n        expand_gold_sequence_labels = tf.cast(tf.one_hot(tf.reshape(gold_sequence_labels, [-1]), 2, axis=-1), tf.float32) # (input_shape, 2)\n\n        loss = tf.keras.losses.binary_crossentropy(expand_gold_sequence_labels, expand_pred_sequence_scores)\n        # loss -> shape is (input_shape)\n\n        if loss_mask is not None:\n            loss = tf.multiply(loss, tf.cast(loss_mask, tf.float32))\n\n        total_loss = tf.reduce_mean(loss)\n        # total_loss -> a scalar \n\n        return total_loss, pred_sequence_probabilities\n\n\n    def transform_overlap_windows_to_original_doc(self, doc_overlap_window_embs, doc_overlap_input_mask):\n        \"\"\"\n        Desc:\n            map the overlapping window representations back to the original (non-overlapping) document. \n            hidden_size should be equal to embedding_size. \n        Args:\n            doc_overlap_window_embs: (num_window, window_size, hidden_size). \n                the output of forwarding the (num_window, window_size) input_ids through the BERT model. \n            doc_overlap_input_mask: (num_window, window_size). a tf.int32 Tensor containing 0/1. \n                0 means the token at this position should be neglected; 1 means it should be kept. 
\n        \"\"\"\n        ones_input_mask = tf.ones_like(doc_overlap_input_mask, tf.int32) # (num_window, window_size)\n        cumsum_input_mask = tf.math.cumsum(ones_input_mask, axis=1) # (num_window, window_size), in-window position index + 1\n        offset_input_mask = tf.tile(tf.expand_dims(tf.range(self.config.num_window) * self.config.window_size, 1), [1, self.config.window_size]) # (num_window, window_size)\n        offset_cumsum_input_mask = offset_input_mask + cumsum_input_mask # (num_window, window_size), global flat index + 1\n        global_input_mask = tf.math.multiply(doc_overlap_input_mask, offset_cumsum_input_mask) # (num_window, window_size), zero out the overlapping positions\n        global_input_mask = tf.reshape(global_input_mask, [-1]) # (num_window * window_size)\n        global_input_mask_index = self.boolean_mask_1d(global_input_mask, tf.math.greater(global_input_mask, tf.zeros_like(global_input_mask, tf.int32)), use_tpu=self.use_tpu) # (num_subtoken_in_doc)\n\n        doc_overlap_window_embs = tf.reshape(doc_overlap_window_embs, [-1, self.config.hidden_size]) # (num_window * window_size, hidden_size)\n        original_doc_embs = tf.gather(doc_overlap_window_embs, global_input_mask_index - 1) # (num_subtoken_in_doc, hidden_size); minus 1 undoes the +1 offset from the cumsum\n\n        return original_doc_embs \n\n\n    def ffnn(self, inputs, hidden_size, output_size, dropout=None, name_scope=\"fully-connected-neural-network\",\n        hidden_initializer=tf.truncated_normal_initializer(stddev=0.02)):\n        \"\"\"\n        Desc:\n            fully-connected neural network. \n            non-linearly transform the [inputs] tensor from [hidden_size] to a fixed [output_size]. \n        Args: \n            hidden_size: should be the size of the last dimension of [inputs]. 
\n        \"\"\"\n        with tf.variable_scope(name_scope, reuse=tf.AUTO_REUSE):\n            hidden_weights = tf.get_variable(\"hidden_weights\", [hidden_size, output_size],\n                initializer=hidden_initializer)\n            hidden_bias = tf.get_variable(\"hidden_bias\", [output_size], initializer=tf.zeros_initializer())\n            outputs = tf.nn.relu(tf.nn.xw_plus_b(inputs, hidden_weights, hidden_bias))\n\n            if dropout is not None:\n                outputs = tf.nn.dropout(outputs, dropout)\n\n        return outputs \n\n\n    def get_dropout(self, dropout_rate, is_training):\n        return 1 - (tf.to_float(is_training) * dropout_rate)\n\n\n    def get_shape(self, x, dim):\n        \"\"\"\n        Desc:\n            return the size of input x in dimension dim. \n        \"\"\" \n        return x.get_shape()[dim].value or tf.shape(x)[dim]\n\n\n    def boolean_mask_1d(self, itemtensor, boolmask_indicator, name_scope=\"boolean_mask1d\", use_tpu=False):\n        \"\"\"\n        Desc:\n            the same functionality as tf.boolean_mask. \n            the tf.boolean_mask operation is not available on the cloud TPU. \n        Args:\n            itemtensor : a rank-1 Tensor of [tf.int32, tf.float32] numbers. \n            boolmask_indicator : a rank-1 tf.bool Tensor. \n            name_scope : name scope for the operation. \n            use_tpu : if False, fall back to tf.boolean_mask.  
\n        \"\"\"\n        with tf.name_scope(name_scope):\n            if not use_tpu:\n                return tf.boolean_mask(itemtensor, boolmask_indicator)\n\n            boolmask_sum = tf.reduce_sum(tf.cast(boolmask_indicator, tf.int32))\n            selected_positions = tf.cast(boolmask_indicator, dtype=tf.float32)\n            indexed_positions = tf.cast(tf.multiply(tf.cumsum(selected_positions), selected_positions),dtype=tf.int32)\n            one_hot_selector = tf.one_hot(indexed_positions - 1, boolmask_sum, dtype=tf.float32)\n            sampled_indices = tf.cast(tf.tensordot(tf.cast(tf.range(tf.shape(boolmask_indicator)[0]), dtype=tf.float32),\n                one_hot_selector,axes=[0, 0]),dtype=tf.int32)\n            sampled_indices = tf.reshape(sampled_indices, [-1])\n            mask_itemtensor = tf.gather(itemtensor, sampled_indices)\n\n            return mask_itemtensor\n\n\n\n\n\n\n\n"
  },
  {
    "path": "requirements.txt",
    "content": "pyhocon\ntensorboard==1.15.0\ntensorflow-estimator==1.15.1\ntensorflow-gpu==1.15.0\npyyaml==5.2\nscikit-learn==0.19.1\nscipy\ntorch==1.2.0\n"
  },
  {
    "path": "run/build_dataset_to_tfrecord.py",
    "content": "#!/usr/bin/env python3 \n# -*- coding: utf-8 -*- \n\n\n\n# author: xiaoy li\n# description:\n# generate tfrecord for train/dev/test set for the model. \n# TODO (xiaoya): need to add help description for args  \n\n\n\nimport os\nimport sys \nimport re\nimport json  \nimport argparse \nimport numpy as np \nimport tensorflow as tf\nfrom collections import OrderedDict\n\nREPO_PATH = \"/\".join(os.path.realpath(__file__).split(\"/\")[:-2])\nif REPO_PATH not in sys.path:\n    sys.path.insert(0, REPO_PATH)\n\nfrom data_utils import conll\nfrom bert.tokenization import FullTokenizer\n\nSPEAKER_START = '[unused19]'\nSPEAKER_END = '[unused73]'\nsubtoken_maps = {}\ngold = {}\n\n\n\n\"\"\"\nDesc:\na single training/test example for the squad dataset.\nsuppose origin input_tokens are :\n['[unused19]', 'speaker', '#', '1', '[unused73]', '-', '-', 'basically', ',', 'it', 'was', 'unanimously', 'agreed', 'upon', 'by', 'the', 'various', 'relevant', 'parties', '.', \n'To', 'express', 'its', 'determination', ',', 'the', 'Chinese', 'securities', 'regulatory', 'department', 'compares', 'this', 'stock', 'reform', 'to', 'a', 'die', 'that', \n'has', 'been', 'cast', '.', 'It', 'takes', 'time', 'to', 'prove', 'whether', 'the', 'stock', 'reform', 'can', 'really', 'meet', 'expectations', ',', 'and', 'whether', 'any', \n'de', '##viation', '##s', 'that', 'arise', 'during', 'the', 'stock', 'reform', 'can', 'be', 'promptly', 'corrected', '.', '[unused19]', 'Xu', '_', 'l', '##i', '[unused73]', \n'Dear', 'viewers', ',', 'the', 'China', 'News', 'program', 'will', 'end', 'here', '.', 'This', 'is', 'Xu', 'Li', '.', 'Thank', 'you', 'everyone', 'for', 'watching', '.', 'Coming', \n'up', 'is', 'the', 'Focus', 'Today', 'program', 'hosted', 'by', 'Wang', 'Shi', '##lin', '.', 'Good', '-', 'bye', ',', 'dear', 'viewers', '.'] \nIF sliding window size is 50. \nArgs:\ndoc_idx: a string: cctv/bn/0001\nsentence_map: \n    e.g. 
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 4, 4, 4, 4, 4, 5, 5, 5, 5, 5, 5, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 7, 7, 7, 7, 7, 7, 7]\nsubtoken_map: \n    e.g. [0, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 53, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 97, 98, 99, 99, 99, 100, 101, 102, 103]\nflattened_window_input_ids: [num-window, window-size]\n    e.g. before bert_tokenizer convert subtokens into ids:\n    [['[CLS]', '[unused19]', 'speaker', '#', '1', '[unused73]', '-', '-', 'basically', ',', 'it', 'was', 'unanimously', 'agreed', 'upon', 'by', 'the', 'various', 'relevant', 'parties', '.', 'To', 'express', 'its', 'determination', ',', 'the', 'Chinese', 'securities', 'regulatory', 'department', 'compares', 'this', 'stock', 'reform', 'to', 'a', 'die', 'that', 'has', 'been', 'cast', '.', 'It', 'takes', 'time', 'to', 'prove', 'whether', '[SEP]'],\n    ['[CLS]', ',', 'the', 'Chinese', 'securities', 'regulatory', 'department', 'compares', 'this', 'stock', 'reform', 'to', 'a', 'die', 'that', 'has', 'been', 'cast', '.', 'It', 'takes', 'time', 'to', 'prove', 'whether', 'the', 'stock', 'reform', 'can', 'really', 'meet', 'expectations', ',', 'and', 'whether', 'any', 'de', '##viation', '##s', 'that', 'arise', 'during', 'the', 'stock', 'reform', 'can', 'be', 'promptly', 'corrected', '[SEP]'],\n    ['[CLS]', 'the', 'stock', 'reform', 'can', 'really', 'meet', 'expectations', ',', 'and', 'whether', 'any', 'de', '##viation', '##s', 'that', 'arise', 'during', 'the', 'stock', 'reform', 'can', 
'be', 'promptly', 'corrected', '.', '[unused19]', 'Xu', '_', 'l', '##i', '[unused73]', 'Dear', 'viewers', ',', 'the', 'China', 'News', 'program', 'will', 'end', 'here', '.', 'This', 'is', 'Xu', 'Li', '.', 'Thank', '[SEP]'],\n    ['[CLS]', '.', '[unused19]', 'Xu', '_', 'l', '##i', '[unused73]', 'Dear', 'viewers', ',', 'the', 'China', 'News', 'program', 'will', 'end', 'here', '.', 'This', 'is', 'Xu', 'Li', '.', 'Thank', 'you', 'everyone', 'for', 'watching', '.', 'Coming', 'up', 'is', 'the', 'Focus', 'Today', 'program', 'hosted', 'by', 'Wang', 'Shi', '##lin', '.', 'Good', '-', 'bye', ',', 'dear', 'viewers', '[SEP]'],\n    ['[CLS]', 'you', 'everyone', 'for', 'watching', '.', 'Coming', 'up', 'is', 'the', 'Focus', 'Today', 'program', 'hosted', 'by', 'Wang', 'Shi', '##lin', '.', 'Good', '-', 'bye', ',', 'dear', 'viewers', '.', '[SEP]', '[PAD]', '[PAD]', '[PAD]', '[PAD]', '[PAD]', '[PAD]', '[PAD]', '[PAD]', '[PAD]', '[PAD]', '[PAD]', '[PAD]', '[PAD]', '[PAD]', '[PAD]', '[PAD]', '[PAD]', '[PAD]', '[PAD]', '[PAD]', '[PAD]', '[PAD]', '[PAD]']] \nflattened_window_masked_ids: [num-window, window-size]\n    e.g.: before bert_tokenizer ids:\n    [[-3, -1, -1, -1, -1, -1, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, -2, -2, -2, -2, -2, -2, -2, -2, -2, -2, -2, -2, -3],\n    [-3, -2, -2, -2, -2, -2, -2, -2, -2, -2, -2, -2, -2, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, -2, -2, -2, -2, -2, -2, -2, -2, -2, -2, -2, -2, -3],\n    [-3, -2, -2, -2, -2, -2, -2, -2, -2, -2, -2, -2, -2, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, -1, -1, -1, -1, -1, -1, 68, 69, 70, 71, 72, -2, -2, -2, -2, -2, -2, -2, -2, -2, -2, -2, -2, -3],\n    [-3, -2, -2, -2, -2, -2, -2, -2, -2, -2, -2, -2, -2, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, -2, -2, -2, -2, -2, -2, -2, -2, -2, -2, -2, -2, -3],\n    [-3, -2, -2, -2, -2, -2, -2, 
-2, -2, -2, -2, -2, -2, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, -3, -4, -4, -4, -4, -4, -4, -4, -4, -4, -4, -4, -4, -4, -4, -4, -4, -4, -4, -4, -4, -4, -4, -4]]\nspan_start: \n    e.g.: mention start indices in the original document \n        [17, 20, 26, 43, 60, 85, 86]\nspan_end:\n    e.g.: mention end indices in the original document \ncluster_ids: \n    e.g.: cluster ids for the (span_start, span_end) pairs\n    [1, 1, 2, 2, 2, 3, 3] \ncheck the mention in the subword list: \n    1. ['its']\n    1. ['the', 'Chinese', 'securities', 'regulatory', 'department']\n    2. ['this', 'stock', 'reform']\n    2. ['the', 'stock', 'reform']\n    2. ['the', 'stock', 'reform']\n    3. ['you']\n    3. ['everyone']\n\"\"\"\n\n\n\ndef prepare_train_dataset(input_file, output_data_dir, output_filename, window_size, num_window, \n    tokenizer=None, vocab_file=None, language=\"english\", max_doc_length=None, genres=None, \n    max_num_mention=10, max_num_cluster=30, demo=False, lowercase=False):\n\n    if vocab_file is None:\n        if not lowercase:\n            vocab_file = os.path.join(REPO_PATH, \"data_utils\", \"uppercase_vocab.txt\")\n        else:\n            vocab_file = os.path.join(REPO_PATH, \"data_utils\", \"lowercase_vocab.txt\")\n\n    if tokenizer is None:\n        tokenizer = FullTokenizer(vocab_file=vocab_file, do_lower_case=lowercase)\n\n    writer = tf.python_io.TFRecordWriter(os.path.join(output_data_dir, \"{}.{}.tfrecord\".format(output_filename, language)))\n    doc_map = {}\n    documents = read_conll_file(input_file)\n    for doc_idx, document in enumerate(documents):\n        doc_info = parse_document(document, language)\n        tokenized_document = tokenize_document(genres, doc_info, tokenizer)\n        doc_key = tokenized_document['doc_key']\n        token_windows, mask_windows, text_len = convert_to_sliding_window(tokenized_document, window_size)\n        input_id_windows = [tokenizer.convert_tokens_to_ids(tokens) for tokens in 
token_windows]\n        span_start, span_end, mention_span, cluster_ids = flatten_clusters(tokenized_document['clusters'])\n\n        # speaker ids are not used by the mention proposal model; fill them with zeros\n        tmp_speaker_ids = [[0] * 130] * num_window\n        instance = (input_id_windows, mask_windows, text_len, tmp_speaker_ids, tokenized_document[\"genre\"], span_start, span_end, cluster_ids, tokenized_document['sentence_map'])\n        write_instance_to_example_file(writer, instance, doc_key, window_size=window_size, num_window=num_window, \n            max_num_mention=max_num_mention, max_num_cluster=max_num_cluster)\n        doc_map[doc_idx] = doc_key\n        if demo and doc_idx > 3:\n            break \n    with open(os.path.join(output_data_dir, \"{}.{}.map\".format(output_filename, language)), 'w') as fo:\n        json.dump(doc_map, fo, indent=2)\n\n\n\ndef write_instance_to_example_file(writer, instance, doc_key, window_size=64, num_window=5, max_num_mention=20,\n    max_num_cluster=30, pad_idx=-1):\n\n    input_ids, input_mask, text_len, speaker_ids, genre, gold_starts, gold_ends, cluster_ids, sentence_map = instance \n    input_id_windows = input_ids \n    mask_windows = input_mask \n    flattened_input_ids = [i for j in input_id_windows for i in j]\n    flattened_input_mask = [i for j in mask_windows for i in j]\n    cluster_ids = [int(tmp) for tmp in cluster_ids]\n\n    max_sequence_len = int(num_window)\n    max_seg_len = int(window_size)\n\n    sentence_map = clip_or_pad(sentence_map, max_sequence_len*max_seg_len, pad_idx=pad_idx)\n    text_len = clip_or_pad(text_len, max_sequence_len, pad_idx=pad_idx)\n    tmp_subtoken_maps = clip_or_pad(subtoken_maps[doc_key], max_sequence_len*max_seg_len, pad_idx=pad_idx)\n\n    tmp_speaker_ids = clip_or_pad(speaker_ids[0], max_sequence_len*max_seg_len, pad_idx=pad_idx)\n\n    flattened_input_ids = clip_or_pad(flattened_input_ids, max_sequence_len*max_seg_len, pad_idx=pad_idx)\n    flattened_input_mask = 
clip_or_pad(flattened_input_mask, max_sequence_len*max_seg_len, pad_idx=pad_idx)\n    gold_starts = clip_or_pad(gold_starts, max_num_mention, pad_idx=pad_idx)\n    gold_ends = clip_or_pad(gold_ends, max_num_mention, pad_idx=pad_idx)\n    cluster_ids = clip_or_pad(cluster_ids, max_num_cluster, pad_idx=pad_idx)\n\n    features = OrderedDict()\n    features['sentence_map'] = create_int_feature(sentence_map)\n    features['text_len'] = create_int_feature(text_len)\n    features['subtoken_map'] = create_int_feature(tmp_subtoken_maps)\n    features['speaker_ids'] = create_int_feature(tmp_speaker_ids)\n    features['flattened_input_ids'] = create_int_feature(flattened_input_ids)\n    features['flattened_input_mask'] = create_int_feature(flattened_input_mask)\n    features['span_starts'] = create_int_feature(gold_starts)\n    features['span_ends'] = create_int_feature(gold_ends)\n    features['cluster_ids'] = create_int_feature(cluster_ids)\n\n    tf_example = tf.train.Example(features=tf.train.Features(feature=features))\n    writer.write(tf_example.SerializeToString())\n\n\ndef create_int_feature(values):\n    feature = tf.train.Feature(int64_list=tf.train.Int64List(value=list(values)))\n    return feature\n\n\ndef clip_or_pad(var, max_var_len, pad_idx=-1):\n    \n    if len(var) >= max_var_len:\n        return var[:max_var_len]\n    else:\n        pad_var  = (max_var_len - len(var)) * [pad_idx]\n        var = list(var) + list(pad_var) \n        return var \n\n\ndef flatten_clusters(clusters):\n\n    span_starts = []\n    span_ends = []\n    cluster_ids = []\n    mention_span = []\n    for cluster_id, cluster in enumerate(clusters):\n        for start, end in cluster:\n            span_starts.append(start)\n            span_ends.append(end)\n            mention_span.append((start, end))\n            cluster_ids.append(cluster_id + 1)\n    return span_starts, span_ends, mention_span, cluster_ids\n\n\ndef read_conll_file(conll_file_path):\n    documents = []\n    with 
open(conll_file_path, \"r\", encoding=\"utf-8\") as fi:\n        for line in fi:\n            begin_document_match = re.match(conll.BEGIN_DOCUMENT_REGEX, line)\n            if begin_document_match:\n                doc_key = conll.get_doc_key(begin_document_match.group(1), begin_document_match.group(2))\n                documents.append((doc_key, []))\n            elif line.startswith(\"#end document\"):\n                continue\n            else:\n                documents[-1][1].append(line.strip())\n    return documents\n\n\ndef parse_document(document, language):\n    \"\"\"\n    get basic information from one document annotation.\n    :param document:\n    :param language: english, chinese or arabic\n    :return:\n    \"\"\"\n    doc_key = document[0]\n    sentences = [[]]\n    speakers = []\n    coreferences = []\n    word_idx = -1\n    last_speaker = ''\n    for line_id, line in enumerate(document[1]):\n        row = line.split()\n        sentence_end = len(row) == 0\n        if not sentence_end:\n            assert len(row) >= 12\n            word_idx += 1\n            word = normalize_word(row[3], language)\n            sentences[-1].append(word)\n            speaker = row[9]\n            if speaker != last_speaker:\n                speakers.append((word_idx, speaker))\n                last_speaker = speaker\n            coreferences.append(row[-1])\n        else:\n            sentences.append([])\n    clusters = coreference_annotations_to_clusters(coreferences)\n    doc_info = {'doc_key': doc_key, 'sentences': sentences[: -1], 'speakers': speakers, 'clusters': clusters}\n    return doc_info\n\n\ndef normalize_word(word, language):\n    if language == \"arabic\":\n        word = word[:word.find(\"#\")]\n    if word == \"/.\" or word == \"/?\":\n        return word[1:]\n    else:\n        return word\n\n\ndef coreference_annotations_to_clusters(annotations):\n    \"\"\"\n    convert coreference information to clusters\n    :param annotations:\n    
:return:\n    \"\"\"\n    clusters = OrderedDict()\n    coref_stack = OrderedDict()\n    for word_idx, annotation in enumerate(annotations):\n        if annotation == '-':\n            continue\n        for ann in annotation.split('|'):\n            cluster_id = int(ann.replace('(', '').replace(')', ''))\n            if ann[0] == '(' and ann[-1] == ')':\n                if cluster_id not in clusters.keys():\n                    clusters[cluster_id] = [(word_idx, word_idx)]\n                else:\n                    clusters[cluster_id].append((word_idx, word_idx))\n            elif ann[0] == '(':\n                if cluster_id not in coref_stack.keys():\n                    coref_stack[cluster_id] = [word_idx]\n                else:\n                    coref_stack[cluster_id].append(word_idx)\n            elif ann[-1] == ')':\n                span_start = coref_stack[cluster_id].pop()\n                if cluster_id not in clusters.keys():\n                    clusters[cluster_id] = [(span_start, word_idx)]\n                else:\n                    clusters[cluster_id].append((span_start, word_idx))\n            else:\n                raise NotImplementedError\n    assert all([len(starts) == 0 for starts in coref_stack.values()])\n    return list(clusters.values())\n\n\ndef checkout_clusters(doc_info):\n    words = [i for j in doc_info['sentences'] for i in j]\n    clusters = [[' '.join(words[start: end + 1]) for start, end in cluster] for cluster in doc_info['clusters']]\n    print(clusters)\n\n\ndef tokenize_document(genres, doc_info, tokenizer):\n    \"\"\"\n    tokenize into sub tokens\n    :param doc_info:\n    :param tokenizer:\n    max_doc_length: pad to max_doc_length\n    :return:\n    \"\"\"\n    genres = {g: i for i, g in enumerate(genres)}\n    sub_tokens = []  # all sub tokens of a document\n    sentence_map = []  # collected tokenized tokens -> sentence id\n    subtoken_map = []  # collected tokenized tokens -> original token id\n\n    word_idx = 
-1\n\n    for sentence_id, sentence in enumerate(doc_info['sentences']):\n        for token in sentence:\n            word_idx += 1\n            word_tokens = tokenizer.tokenize(token)\n            sub_tokens.extend(word_tokens)\n            sentence_map.extend([sentence_id] * len(word_tokens))\n            subtoken_map.extend([word_idx] * len(word_tokens))\n    \n\n    subtoken_maps[doc_info['doc_key']] = subtoken_map\n    genre = genres.get(doc_info['doc_key'][:2], 0)\n    speakers = {subtoken_map.index(word_index): tokenizer.tokenize(speaker)\n                for word_index, speaker in doc_info['speakers']}\n    clusters = [[(subtoken_map.index(start), len(subtoken_map) - 1 - subtoken_map[::-1].index(end))\n                 for start, end in cluster] for cluster in doc_info['clusters']]\n    tokenized_document = {'sub_tokens': sub_tokens, 'sentence_map': sentence_map, 'subtoken_map': subtoken_map,\n                          'speakers': speakers, 'clusters': clusters, 'doc_key': doc_info['doc_key'], \n                          \"genre\": genre}\n    return tokenized_document\n\n\ndef convert_to_sliding_window(tokenized_document, sliding_window_size):\n    \"\"\"\n    construct sliding windows, allocate tokens and masks into each window\n    :param tokenized_document:\n    :param sliding_window_size:\n    :return:\n    \"\"\"\n    expanded_tokens, expanded_masks = expand_with_speakers(tokenized_document)\n    sliding_windows = construct_sliding_windows(len(expanded_tokens), sliding_window_size - 2)\n    token_windows = []  # expanded tokens to sliding window\n    mask_windows = []  # expanded masks to sliding window\n    text_len = []\n\n    for window_start, window_end, window_mask in sliding_windows:\n        original_tokens = expanded_tokens[window_start: window_end]\n        original_masks = expanded_masks[window_start: window_end]\n        window_masks = [-2 if w == 0 else o for w, o in zip(window_mask, original_masks)]\n        one_window_token = ['[CLS]'] + 
original_tokens + ['[SEP]'] + ['[PAD]'] * (\n                sliding_window_size - 2 - len(original_tokens))\n        one_window_mask = [-3] + window_masks + [-3] + [-4] * (sliding_window_size - 2 - len(original_tokens))\n        token_calculate = [tmp for tmp in one_window_mask if tmp >= 0]\n        text_len.append(len(token_calculate))\n        assert len(one_window_token) == sliding_window_size\n        assert len(one_window_mask) == sliding_window_size\n        token_windows.append(one_window_token)\n        mask_windows.append(one_window_mask)\n    assert len(tokenized_document['sentence_map']) == sum([i >= 0 for j in mask_windows for i in j])\n\n    text_len = np.array(text_len)\n    return token_windows, mask_windows, text_len\n\n\ndef expand_with_speakers(tokenized_document):\n    \"\"\"\n    add speaker name information\n    :param tokenized_document: tokenized document information\n    :return:\n    \"\"\"\n    expanded_tokens = []\n    expanded_masks = []\n    for token_idx, token in enumerate(tokenized_document['sub_tokens']):\n        if token_idx in tokenized_document['speakers']:\n            speaker = [SPEAKER_START] + tokenized_document['speakers'][token_idx] + [SPEAKER_END]\n            expanded_tokens.extend(speaker)\n            expanded_masks.extend([-1] * len(speaker))\n        expanded_tokens.append(token)\n        expanded_masks.append(token_idx)\n    return expanded_tokens, expanded_masks\n\n\ndef construct_sliding_windows(sequence_length, sliding_window_size):\n    \"\"\"\n    construct sliding windows for BERT processing\n    :param sequence_length: e.g. 9\n    :param sliding_window_size: e.g. 
4\n    :return: [(0, 4, [1, 1, 1, 0]), (2, 6, [0, 1, 1, 0]), (4, 8, [0, 1, 1, 0]), (6, 9, [0, 1, 1])]\n    \"\"\"\n    sliding_windows = []\n    stride = int(sliding_window_size / 2)\n    start_index = 0\n    end_index = 0\n    while end_index < sequence_length:\n        end_index = min(start_index + sliding_window_size, sequence_length)\n        left_value = 1 if start_index == 0 else 0\n        right_value = 1 if end_index == sequence_length else 0\n        mask = [left_value] * int(sliding_window_size / 4) + [1] * int(sliding_window_size / 2) \\\n               + [right_value] * (sliding_window_size - int(sliding_window_size / 2) - int(sliding_window_size / 4))\n        mask = mask[: end_index - start_index]\n        sliding_windows.append((start_index, end_index, mask))\n        start_index += stride\n    assert sum([sum(window[2]) for window in sliding_windows]) == sequence_length\n    return sliding_windows\n\n\n\ndef parse_args():\n    parser = argparse.ArgumentParser()\n    parser.add_argument(\"--source_files_dir\", default=\"/home/lixiaoya/data\", type=str, required=True)\n    parser.add_argument(\"--target_output_dir\", default=\"/home/lixiaoya/tfrecord_data\", type=str, required=True)\n    parser.add_argument(\"--num_window\", default=5, type=int, required=True)\n    parser.add_argument(\"--window_size\", default=64, type=int, required=True)\n    parser.add_argument(\"--max_num_mention\", default=30, type=int)\n    parser.add_argument(\"--max_num_cluster\", default=20, type=int)\n    parser.add_argument(\"--vocab_file\", default=\"/home/lixiaoya/spanbert_large_cased/vocab.txt\", type=str)\n    parser.add_argument(\"--language\", default=\"english\", type=str)\n    parser.add_argument(\"--max_doc_length\", default=600, type=int)\n    parser.add_argument(\"--lowercase\", help=\"Whether or not to lowercase the datasets.\", action=\"store_true\")\n    parser.add_argument(\"--demo\", help=\"Whether to generate a small dataset for testing the code.\", 
action=\"store_true\")\n    parser.add_argument('--genres', default=[\"bc\",\"bn\",\"mz\",\"nw\",\"pt\",\"tc\",\"wb\"])\n    parser.add_argument(\"--seed\", default=2333, type=int)\n\n    args = parser.parse_args()\n\n    os.makedirs(args.target_output_dir, exist_ok=True)\n    np.random.seed(args.seed)\n    tf.set_random_seed(args.seed)\n\n    return args\n\n\ndef main():\n    args_config = parse_args()\n\n    print(\"*\"*60)\n    print(\"***** ***** show configs ***** ***** \")\n    print(\"window_size : {}\".format(str(args_config.window_size)))\n    print(\"num_window : {}\".format(str(args_config.num_window)))\n    print(\"*\"*60)\n\n    for data_sign in [\"train\", \"dev\", \"test\"]:\n        source_data_file = os.path.join(args_config.source_files_dir, \"{}.{}.v4_gold_conll\".format(data_sign, args_config.language))\n        output_filename = \"{}.overlap.corefqa\".format(data_sign)\n        \n        if args_config.demo:\n            if args_config.lowercase:\n                output_filename=\"demo.lowercase.{}.overlap.corefqa\".format(data_sign)\n            else:\n                output_filename=\"demo.{}.overlap.corefqa\".format(data_sign)\n\n        print(\"$\"*60)\n        print(\"generate {}/{}\".format(args_config.target_output_dir, output_filename))\n        prepare_train_dataset(source_data_file, args_config.target_output_dir, output_filename, args_config.window_size, \n            args_config.num_window, vocab_file=args_config.vocab_file, language=args_config.language, \n            max_doc_length=args_config.max_doc_length, genres=args_config.genres, max_num_mention=args_config.max_num_mention,\n            max_num_cluster=args_config.max_num_cluster, demo=args_config.demo, lowercase=args_config.lowercase)\n\n\n\n\nif __name__ == \"__main__\":\n    main()\n\n    # please refer ${REPO_PATH}/scripts/data/generate_tfrecord_dataset.sh \n    # \n    # for generate tfrecord datasets \n    # \n    # python3 build_dataset_to_tfrecord.py \\\n    # 
--source_files_dir /xiaoya/data \\\n    # --target_output_dir /xiaoya/corefqa_data/overlap_64_2 \\\n    # --num_window 2 \\\n    # --window_size 64 \\\n    # --max_num_mention 50 \\\n    # --max_num_cluster 30 \\\n    # --vocab_file /xiaoya/pretrain_ckpt/cased_L-12_H-768_A-12/vocab.txt \\\n    # --language english \\\n    # --max_doc_length 600 \n    # \n\n\n\n"
  },
  {
    "path": "run/run_corefqa.py",
    "content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-  \n\n\"\"\"\nthis file contains training and testing the CorefQA model. \n\"\"\"\n\nimport os \nimport math \nimport logging\nimport random \nimport numpy as np \nimport tensorflow as tf\nfrom utils import util\nfrom utils import metrics\nfrom data_utils.config_utils import ModelConfig\nfrom func_builders.model_fn_builder import model_fn_builder \nfrom func_builders.input_fn_builder import file_based_input_fn_builder\n\n\ntf.app.flags.DEFINE_string('f', '', 'kernel')\nflags = tf.app.flags\n\nflags.DEFINE_string(\"output_dir\", \"data\", \"The output directory.\")\nflags.DEFINE_string(\"bert_config_file\", \"/home/uncased_L-2_H-128_A-2/config.json\", \"The config json file corresponding to the pre-trained BERT model.\")\nflags.DEFINE_string(\"init_checkpoint\", \"/home/uncased_L-2_H-128_A-2/bert_model.ckpt\", \"Initial checkpoint (usually from a pre-trained BERT model).\")\nflags.DEFINE_string(\"vocab_file\", \"/home/uncased_L-2_H-128_A-2/vocab.txt\", \"The vocabulary file that the BERT model was trained on.\")\nflags.DEFINE_string(\"logfile_path\", \"/home/lixiaoya/spanbert_large_mention_proposal.log\", \"the path to the exported log file.\")\nflags.DEFINE_integer(\"num_epochs\", 20, \"Total number of training epochs to perform.\")\nflags.DEFINE_integer(\"keep_checkpoint_max\", 30, \"How many checkpoint models keep at most.\")\nflags.DEFINE_integer(\"save_checkpoints_steps\", 500, \"Save checkpoint every X updates steps.\")\n\n\nflags.DEFINE_string(\"train_file\", \"/home/lixiaoya/train.english.tfrecord\", \"TFRecord file for training. E.g., train.english.tfrecord\")\nflags.DEFINE_string(\"dev_file\", \"/home/lixiaoya/dev.english.tfrecord\", \"TFRecord file for validating. E.g., dev.english.tfrecord\")\nflags.DEFINE_string(\"test_file\", \"/home/lixiaoya/test.english.tfrecord\", \"TFRecord file for testing. 
E.g., test.english.tfrecord\")\n\n\nflags.DEFINE_bool(\"do_train\", True, \"Whether to train a model.\")\nflags.DEFINE_bool(\"do_eval\", False, \"Whether to do evaluation: evaluation is done on a set of trained checkpoints; the checkpoint with the best dev-set score is selected, and its result on the test set is reported.\")\nflags.DEFINE_bool(\"do_predict\", False, \"Whether to test (only) one trained model.\")\nflags.DEFINE_string(\"eval_checkpoint\", \"/home/lixiaoya/mention_proposal_output_dir/bert_model.ckpt\", \"[Optional] The saved checkpoint for evaluation (usually after the training process).\")\nflags.DEFINE_integer(\"iterations_per_loop\", 1000, \"How many steps to make in each estimator call.\")\n\n\nflags.DEFINE_float(\"learning_rate\", 3e-5, \"The initial learning rate for Adam.\")\nflags.DEFINE_float(\"dropout_rate\", 0.3, \"Dropout rate for the training process.\")\nflags.DEFINE_float(\"mention_threshold\", 0.5, \"The threshold for determining whether the span is a mention.\")\nflags.DEFINE_integer(\"hidden_size\", 128, \"The size of hidden layers for the pre-trained model.\")\nflags.DEFINE_integer(\"num_docs\", 5604, \"[Optional] The number of documents in the training files. Only needs to be changed when conducting experiments on the small test sets.\")\nflags.DEFINE_integer(\"window_size\", 384, \"The sliding window size. Each document is split into a set of subdocuments with length set to window_size.\")\nflags.DEFINE_integer(\"num_window\", 5, \"The max number of windows for one document. This is used for fitting a document into a fixed shape for TF computation. \\\n    If a document is longer than num_window*window_size, the exceeding part will be discarded. 
This only affects training and does not affect test, since all the \\\n    docs in the test set are shorter than num_window*window_size\")\nflags.DEFINE_integer(\"max_num_mention\", 30, \"The max number of mentions in one document.\")\nflags.DEFINE_bool(\"start_end_share\", False, \"Whether only to use [start, end] embedding to calculate the start/end scores.\")\nflags.DEFINE_integer(\"max_span_width\", 5, \"The max length of a mention.\")\nflags.DEFINE_integer(\"max_candidate_mentions\", 30, \"The number of candidate mentions.\")\nflags.DEFINE_float(\"top_span_ratio\", 0.2, \"The ratio of spans (relative to the document length) kept as top-scoring mention candidates.\")\nflags.DEFINE_integer(\"max_top_antecedents\", 30, \"The max number of candidate antecedents considered for each mention.\")\nflags.DEFINE_integer(\"max_query_len\", 150, \"The max length (in subtokens) of a query.\")\nflags.DEFINE_integer(\"max_context_len\", 150, \"The max length (in subtokens) of a context.\")\nflags.DEFINE_bool(\"sec_qa_mention_score\", False, \"Whether to use the mention score from the second QA stage.\")\n\n\nflags.DEFINE_bool(\"use_tpu\", False, \"Whether to use TPU or GPU/CPU.\")\nflags.DEFINE_string(\"tpu_name\", None, \"The Cloud TPU to use for training. This should be either the name used when creating the Cloud TPU, or a grpc://ip.address.of.tpu:8470 url.\")\nflags.DEFINE_string(\"tpu_zone\", None, \"[Optional] GCE zone where the Cloud TPU is located in. If not specified, we will attempt to automatically detect the GCE project from metadata.\")\nflags.DEFINE_string(\"gcp_project\", None, \"[Optional] Project name for the Cloud TPU-enabled project. If not specified, we will attempt to automatically detect the GCE project from metadata.\")\nflags.DEFINE_string(\"master\", None, \"[Optional] TensorFlow master URL.\")\nflags.DEFINE_integer(\"num_tpu_cores\", 1, \"[Optional] Only used if `use_tpu` is True. 
Total number of TPU cores to use.\")\nflags.DEFINE_integer(\"seed\", 2333, \"[Optional] Random seed for initialization.\")\n\n\nFLAGS = tf.flags.FLAGS\n\n\nlog_format = '%(asctime)s - %(levelname)s - %(name)s - %(message)s'\nlogging.basicConfig(format=log_format, filename=FLAGS.logfile_path, level=logging.INFO)\nlogger = logging.getLogger(__name__)\nlogger.setLevel(logging.INFO)\n\n\n\ndef main(_):\n\n    tf.logging.set_verbosity(tf.logging.INFO)\n    num_train_steps = FLAGS.num_docs * FLAGS.num_epochs\n\n\n    keep_checkpoint_max = max(math.ceil(num_train_steps / FLAGS.save_checkpoints_steps), FLAGS.keep_checkpoint_max)\n\n    if not FLAGS.do_train and not FLAGS.do_eval and not FLAGS.do_predict:\n        raise ValueError(\"At least one of `do_train`, `do_eval` or `do_predict` must be True.\")\n\n    tf.gfile.MakeDirs(FLAGS.output_dir)\n    tpu_cluster_resolver = None\n    if FLAGS.use_tpu and FLAGS.tpu_name:\n        tpu_cluster_resolver = tf.distribute.cluster_resolver.TPUClusterResolver(\n            FLAGS.tpu_name, zone=FLAGS.tpu_zone, project=FLAGS.gcp_project)\n        tf.config.experimental_connect_to_cluster(tpu_cluster_resolver)\n        tf.tpu.experimental.initialize_tpu_system(tpu_cluster_resolver)\n\n\n    is_per_host = tf.contrib.tpu.InputPipelineConfig.PER_HOST_V2\n    run_config = tf.contrib.tpu.RunConfig(\n        cluster=tpu_cluster_resolver,\n        master=FLAGS.master,\n        model_dir=FLAGS.output_dir,\n        evaluation_master=FLAGS.master,\n        keep_checkpoint_max=keep_checkpoint_max,\n        save_checkpoints_steps=FLAGS.save_checkpoints_steps,\n        session_config=tf.ConfigProto(allow_soft_placement=True, log_device_placement=True),\n        tpu_config=tf.contrib.tpu.TPUConfig(\n            iterations_per_loop=FLAGS.iterations_per_loop,\n            num_shards=FLAGS.num_tpu_cores,\n            per_host_input_for_training=is_per_host))\n\n\n    model_config = ModelConfig(FLAGS, FLAGS.output_dir)\n    
model_config.logging_configs()\n\n\n    model_fn = model_fn_builder(model_config, model_sign=\"corefqa\")\n    estimator = tf.contrib.tpu.TPUEstimator(\n        use_tpu=FLAGS.use_tpu,\n        eval_on_tpu=FLAGS.use_tpu,\n        warm_start_from=tf.estimator.WarmStartSettings(FLAGS.init_checkpoint,\n            vars_to_warm_start=\"bert*\"),\n        model_fn=model_fn,\n        config=run_config,\n        train_batch_size=1,\n        eval_batch_size=1,\n        predict_batch_size=1)\n\n\n    if FLAGS.do_train:\n        estimator.train(input_fn=file_based_input_fn_builder(FLAGS.train_file, num_window=FLAGS.num_window,\n            window_size=FLAGS.window_size, max_num_mention=FLAGS.max_num_mention, is_training=True, drop_remainder=True), \n            max_steps=num_train_steps)\n\n\n    if FLAGS.do_eval:\n        best_dev_f1, best_dev_prec, best_dev_rec, test_f1_when_dev_best, test_prec_when_dev_best, test_rec_when_dev_best = 0, 0, 0, 0, 0, 0\n        best_ckpt_path = \"\"\n        # checkpoints are saved under FLAGS.output_dir (the estimator's model_dir)\n        checkpoints_iterator = [os.path.join(FLAGS.output_dir, \"model.ckpt-{}\".format(str(int(ckpt_idx)))) for ckpt_idx in range(0, num_train_steps+1, FLAGS.save_checkpoints_steps)]\n        model = util.get_model(model_config, model_sign=\"corefqa\")\n        for checkpoint_path in checkpoints_iterator[1:]:\n            dev_coref_evaluator = metrics.CorefEvaluator()\n            for result in estimator.predict(file_based_input_fn_builder(FLAGS.dev_file, num_window=FLAGS.num_window, \n                window_size=FLAGS.window_size, max_num_mention=FLAGS.max_num_mention, is_training=False, drop_remainder=False), \n                checkpoint_path=checkpoint_path, yield_single_examples=False):\n                \n                predicted_clusters, gold_clusters, mention_to_predicted, mention_to_gold = model.evaluate(result[\"topk_span_starts\"], result[\"topk_span_ends\"], result[\"top_antecedent\"],\n                    result[\"cluster_ids\"], result[\"gold_starts\"], 
result[\"gold_ends\"])\n                dev_coref_evaluator.update(predicted_clusters, gold_clusters, mention_to_predicted, mention_to_gold)\n            dev_prec, dev_rec, dev_f1 = dev_coref_evaluator.get_prf()\n            tf.logging.info(\"***** Current ckpt path is ***** : {}\".format(checkpoint_path))\n            tf.logging.info(\"***** EVAL ON DEV SET *****\")\n            tf.logging.info(\"***** [DEV EVAL] ***** : precision: {:.4f}, recall: {:.4f}, f1: {:.4f}\".format(dev_prec, dev_rec, dev_f1))\n            if dev_f1 > best_dev_f1:\n                best_ckpt_path = checkpoint_path\n                best_dev_f1 = dev_f1\n                best_dev_prec = dev_prec\n                best_dev_rec = dev_rec\n                test_coref_evaluator = metrics.CorefEvaluator()\n                for result in estimator.predict(file_based_input_fn_builder(FLAGS.test_file, \n                    num_window=FLAGS.num_window, window_size=FLAGS.window_size, max_num_mention=FLAGS.max_num_mention, \n                    is_training=False, drop_remainder=False), checkpoint_path=checkpoint_path, yield_single_examples=False):\n                    predicted_clusters, gold_clusters, mention_to_predicted, mention_to_gold = model.evaluate(result[\"topk_span_starts\"], result[\"topk_span_ends\"], result[\"top_antecedent\"], \n                        result[\"cluster_ids\"], result[\"gold_starts\"], result[\"gold_ends\"])\n                    test_coref_evaluator.update(predicted_clusters, gold_clusters, mention_to_predicted, mention_to_gold)\n\n                test_prec, test_rec, test_f1 = test_coref_evaluator.get_prf()\n                test_f1_when_dev_best, test_prec_when_dev_best, test_rec_when_dev_best = test_f1, test_prec, test_rec\n                tf.logging.info(\"***** EVAL ON TEST SET *****\")\n                tf.logging.info(\"***** [TEST EVAL] ***** : precision: {:.4f}, recall: {:.4f}, f1: {:.4f}\".format(test_prec, test_rec, test_f1))\n\n        
tf.logging.info(\"*\"*20)\n        tf.logging.info(\"- @@@@@ the path to the BEST DEV result is : {}\".format(best_ckpt_path))\n        tf.logging.info(\"- @@@@@ BEST DEV F1 : {:.4f}, Precision : {:.4f}, Recall : {:.4f}\".format(best_dev_f1, best_dev_prec, best_dev_rec))\n        tf.logging.info(\"- @@@@@ TEST when DEV best F1 : {:.4f}, Precision : {:.4f}, Recall : {:.4f}\".format(test_f1_when_dev_best, test_prec_when_dev_best, test_rec_when_dev_best))\n\n\n    if FLAGS.do_predict:\n        coref_evaluator = metrics.CorefEvaluator()\n        model = util.get_model(model_config, model_sign=\"corefqa\")\n        # predict with the single checkpoint specified by --eval_checkpoint\n        for result in estimator.predict(file_based_input_fn_builder(FLAGS.test_file, \n                    num_window=FLAGS.num_window, window_size=FLAGS.window_size, max_num_mention=FLAGS.max_num_mention, \n                    is_training=False, drop_remainder=False), checkpoint_path=FLAGS.eval_checkpoint, yield_single_examples=False):\n            \n            predicted_clusters, gold_clusters, mention_to_predicted, mention_to_gold = model.evaluate(result[\"topk_span_starts\"], result[\"topk_span_ends\"], \n                result[\"top_antecedent\"], result[\"cluster_ids\"], result[\"gold_starts\"], result[\"gold_ends\"])\n            coref_evaluator.update(predicted_clusters, gold_clusters, mention_to_predicted, mention_to_gold)\n        \n        p, r, f = coref_evaluator.get_prf()\n        tf.logging.info(\"Average precision: {:.4f}, Average recall: {:.4f}, Average F1: {:.4f}\".format(p, r, f))\n\n\n\nif __name__ == '__main__':\n    # set the random seed.\n    random.seed(FLAGS.seed)\n    np.random.seed(FLAGS.seed)\n    tf.set_random_seed(FLAGS.seed)\n    # start train/evaluate the model.\n    tf.app.run()\n\n\n\n\n\n"
  },
  {
    "path": "run/run_mention_proposal.py",
    "content": "#!/usr/bin/env python3\n# -*- coding: utf-8 -*-  \n\n\n\"\"\"\nthis file contains pre-training and testing the mention proposal model\n\"\"\"\n\nimport os \nimport math \nimport random \nimport logging\nimport numpy as np \nimport tensorflow as tf\nfrom data_utils.config_utils import ModelConfig\nfrom func_builders.model_fn_builder import model_fn_builder \nfrom func_builders.input_fn_builder import file_based_input_fn_builder\nfrom utils.metrics import mention_proposal_prediction\n\ntf.app.flags.DEFINE_string('f', '', 'kernel')\nflags = tf.app.flags\n\nflags.DEFINE_string(\"output_dir\", \"data\", \"The output directory of the model training.\")\nflags.DEFINE_string(\"bert_config_file\", \"/home/uncased_L-2_H-128_A-2/config.json\", \"The config json file corresponding to the pre-trained BERT model.\")\nflags.DEFINE_string(\"init_checkpoint\", \"/home/uncased_L-2_H-128_A-2/bert_model.ckpt\", \"Initial checkpoint (usually from a pre-trained BERT model).\")\nflags.DEFINE_string(\"vocab_file\", \"/home/uncased_L-2_H-128_A-2/vocab.txt\", \"The vocabulary file that the BERT model was trained on.\")\nflags.DEFINE_string(\"logfile_path\", \"/home/lixiaoya/spanbert_large_mention_proposal.log\", \"the path to the exported log file.\")\nflags.DEFINE_integer(\"num_epochs\", 20, \"Total number of training epochs to perform.\")\nflags.DEFINE_integer(\"keep_checkpoint_max\", 30, \"How many checkpoint models keep at most.\")\nflags.DEFINE_integer(\"save_checkpoints_steps\", 500, \"Save checkpoint every X updates steps.\")\n\n\nflags.DEFINE_string(\"train_file\", \"/home/lixiaoya/train.english.tfrecord\", \"TFRecord file for training. E.g., train.english.tfrecord\")\nflags.DEFINE_string(\"dev_file\", \"/home/lixiaoya/dev.english.tfrecord\", \"TFRecord file for validating. E.g., dev.english.tfrecord\")\nflags.DEFINE_string(\"test_file\", \"/home/lixiaoya/test.english.tfrecord\", \"TFRecord file for testing. 
E.g., test.english.tfrecord\")\n\n\nflags.DEFINE_bool(\"do_train\", True, \"Whether to train a model.\")\nflags.DEFINE_bool(\"do_eval\", False, \"Whether to do evaluation: evaluation is done on a set of trained checkpoints; the checkpoint with the best score on the dev set will be selected.\")\nflags.DEFINE_bool(\"do_predict\", False, \"Whether to test (only) one trained model.\")\nflags.DEFINE_string(\"eval_checkpoint\", \"/home/lixiaoya/mention_proposal_output_dir/bert_model.ckpt\", \"[Optional] The saved checkpoint for evaluation (usually after the training process).\")\nflags.DEFINE_integer(\"iterations_per_loop\", 1000, \"How many steps to make in each estimator call.\")\n\n\nflags.DEFINE_float(\"learning_rate\", 3e-5, \"The initial learning rate for Adam.\")\nflags.DEFINE_float(\"dropout_rate\", 0.3, \"Dropout rate for the training process.\")\nflags.DEFINE_float(\"mention_threshold\", 0.5, \"The threshold for determining whether the span is a mention.\")\nflags.DEFINE_integer(\"hidden_size\", 128, \"The size of hidden layers for the pre-trained model.\")\nflags.DEFINE_integer(\"num_docs\", 5604, \"[Optional] The number of documents in the training files. Only needs to be changed when conducting experiments on the small test sets.\")\nflags.DEFINE_integer(\"window_size\", 384, \"The sliding window size. Each document is split into a set of subdocuments with length set to window_size.\")\nflags.DEFINE_integer(\"num_window\", 5, \"The max number of windows for one document. This is used for fitting a document into a fixed shape for TF computation. \\\n    If a document is longer than num_window*window_size, the exceeding part will be discarded. 
This only affects training and does not affect test, since all the \\\n    docs in the test set are shorter than num_window*window_size\")\nflags.DEFINE_integer(\"max_num_mention\", 30, \"The max number of mentions in one document.\")\nflags.DEFINE_bool(\"start_end_share\", False, \"Whether only to use [start, end] embedding to calculate the start/end scores.\")\nflags.DEFINE_float(\"loss_start_ratio\", 0.3, \"As described in the paper, the loss for a span being a mention is -loss_start_ratio * log p(the start of the given span is a start).\")\nflags.DEFINE_float(\"loss_end_ratio\", 0.3, \"As described in the paper, the loss for a span being a mention is -loss_end_ratio * log p(the end of the given span is an end).\")\nflags.DEFINE_float(\"loss_span_ratio\", 0.4, \"As described in the paper, the loss for a span being a mention is -loss_span_ratio * log p(the start and the end form a span).\")\n\n\nflags.DEFINE_bool(\"use_tpu\", False, \"Whether to use TPU or GPU/CPU.\")\nflags.DEFINE_string(\"tpu_name\", None, \"The Cloud TPU to use for training. This should be either the name used when creating the Cloud TPU, or a grpc://ip.address.of.tpu:8470 url.\")\nflags.DEFINE_string(\"tpu_zone\", None, \"[Optional] GCE zone where the Cloud TPU is located in. If not specified, we will attempt to automatically detect the GCE project from metadata.\")\nflags.DEFINE_string(\"gcp_project\", None, \"[Optional] Project name for the Cloud TPU-enabled project. If not specified, we will attempt to automatically detect the GCE project from metadata.\")\nflags.DEFINE_string(\"master\", None, \"[Optional] TensorFlow master URL.\")\nflags.DEFINE_integer(\"num_tpu_cores\", 1, \"[Optional] Only used if `use_tpu` is True. 
Total number of TPU cores to use.\")\nflags.DEFINE_integer(\"seed\", 2333, \"[Optional] Random seed for initialization.\")\nFLAGS = tf.flags.FLAGS\n\n\n\nlog_format = '%(asctime)s - %(levelname)s - %(name)s - %(message)s'\nlogging.basicConfig(format=log_format, filename=FLAGS.logfile_path, level=logging.INFO)\nlogger = logging.getLogger(__name__)\nlogger.setLevel(logging.INFO)\n\n\n\ndef main(_):\n\n    tf.logging.set_verbosity(tf.logging.INFO)\n    num_train_steps = FLAGS.num_docs * FLAGS.num_epochs\n    # num_train_steps = 100 \n    keep_checkpoint_max = max(math.ceil(num_train_steps / FLAGS.save_checkpoints_steps), FLAGS.keep_checkpoint_max)\n\n    if not FLAGS.do_train and not FLAGS.do_eval and not FLAGS.do_predict:\n        raise ValueError(\"At least one of `do_train`, `do_eval` or `do_predict` must be True.\")\n\n    tf.gfile.MakeDirs(FLAGS.output_dir)\n    tpu_cluster_resolver = None\n    if FLAGS.use_tpu and FLAGS.tpu_name:\n        tpu_cluster_resolver = tf.distribute.cluster_resolver.TPUClusterResolver(\n            FLAGS.tpu_name, zone=FLAGS.tpu_zone, project=FLAGS.gcp_project)\n        tf.config.experimental_connect_to_cluster(tpu_cluster_resolver)\n        tf.tpu.experimental.initialize_tpu_system(tpu_cluster_resolver)\n\n    is_per_host = tf.contrib.tpu.InputPipelineConfig.PER_HOST_V2\n    run_config = tf.contrib.tpu.RunConfig(\n        cluster=tpu_cluster_resolver,\n        master=FLAGS.master,\n        # evaluation_master=FLAGS.master,\n        model_dir=FLAGS.output_dir,\n        keep_checkpoint_max=keep_checkpoint_max,\n        save_checkpoints_steps=FLAGS.save_checkpoints_steps,\n        # session_config=tf.ConfigProto(allow_soft_placement=True, log_device_placement=True),\n        tpu_config=tf.contrib.tpu.TPUConfig(\n            iterations_per_loop=FLAGS.iterations_per_loop,\n            num_shards=FLAGS.num_tpu_cores,\n            per_host_input_for_training=is_per_host))\n\n\n    model_config = ModelConfig(FLAGS, FLAGS.output_dir)\n    
model_config.logging_configs()\n\n    model_fn = model_fn_builder(model_config, model_sign=\"mention_proposal\")\n    estimator = tf.contrib.tpu.TPUEstimator(\n        use_tpu=FLAGS.use_tpu,\n        # eval_on_tpu=FLAGS.use_tpu,\n        warm_start_from=tf.estimator.WarmStartSettings(FLAGS.init_checkpoint,\n            vars_to_warm_start=\"bert*\"),\n        model_fn=model_fn,\n        config=run_config,\n        train_batch_size=1,\n        predict_batch_size=1)\n\n\n    if FLAGS.do_train:\n        estimator.train(input_fn=file_based_input_fn_builder(model_config.train_file, num_window=model_config.num_window,\n            window_size=model_config.window_size, max_num_mention=model_config.max_num_mention, is_training=True, drop_remainder=True), max_steps=num_train_steps)\n\n\n    if FLAGS.do_eval:\n        # evaluation is done on a set of trained checkpoints; the checkpoint with the best score on the dev set will be selected.\n        best_dev_f1, best_dev_prec, best_dev_rec, test_f1_when_dev_best, test_prec_when_dev_best, test_rec_when_dev_best = 0, 0, 0, 0, 0, 0\n        best_ckpt_path = \"\"\n        # checkpoints are saved under FLAGS.output_dir (the estimator's model_dir)\n        checkpoints_iterator = [os.path.join(FLAGS.output_dir, \"model.ckpt-{}\".format(str(int(ckpt_idx)))) for ckpt_idx in range(0, num_train_steps, FLAGS.save_checkpoints_steps)]\n        for checkpoint_path in checkpoints_iterator[1:]:\n            eval_dev_result = estimator.evaluate(input_fn=file_based_input_fn_builder(FLAGS.dev_file, num_window=FLAGS.num_window, \n                window_size=FLAGS.window_size, max_num_mention=FLAGS.max_num_mention, is_training=False, drop_remainder=False),\n                steps=698, checkpoint_path=checkpoint_path)\n            dev_f1 = 2*eval_dev_result[\"precision\"] * eval_dev_result[\"recall\"] / (eval_dev_result[\"precision\"] + eval_dev_result[\"recall\"]+1e-10)\n            tf.logging.info(\"***** Current ckpt path is ***** : {}\".format(checkpoint_path))\n            tf.logging.info(\"***** EVAL ON DEV SET *****\")\n           
 tf.logging.info(\"***** [DEV EVAL] ***** : precision: {:.4f}, recall: {:.4f}, f1: {:.4f}\".format(eval_dev_result[\"precision\"], eval_dev_result[\"recall\"], dev_f1))\n            if dev_f1 > best_dev_f1:\n                best_dev_f1, best_dev_prec, best_dev_rec = dev_f1, eval_dev_result[\"precision\"], eval_dev_result[\"recall\"]\n                best_ckpt_path = checkpoint_path\n                eval_test_result = estimator.evaluate(input_fn=file_based_input_fn_builder(FLAGS.test_file, \n                    num_window=FLAGS.num_window, window_size=FLAGS.window_size, max_num_mention=FLAGS.max_num_mention, \n                    is_training=False, drop_remainder=False),steps=698, checkpoint_path=checkpoint_path)\n                test_f1 = 2*eval_test_result[\"precision\"] * eval_test_result[\"recall\"] / (eval_test_result[\"precision\"] + eval_test_result[\"recall\"]+1e-10)\n                test_f1_when_dev_best, test_prec_when_dev_best, test_rec_when_dev_best = test_f1, eval_test_result[\"precision\"], eval_test_result[\"recall\"]\n                tf.logging.info(\"***** EVAL ON TEST SET *****\")\n                tf.logging.info(\"***** [TEST EVAL] ***** : precision: {:.4f}, recall: {:.4f}, f1: {:.4f}\".format(eval_test_result[\"precision\"], eval_test_result[\"recall\"], test_f1))\n        tf.logging.info(\"*\"*20)\n        tf.logging.info(\"- @@@@@ the path to the BEST DEV result is : {}\".format(best_ckpt_path))\n        tf.logging.info(\"- @@@@@ BEST DEV F1 : {:.4f}, Precision : {:.4f}, Recall : {:.4f},\".format(best_dev_f1, best_dev_prec, best_dev_rec))\n        tf.logging.info(\"- @@@@@ TEST when DEV best F1 : {:.4f}, Precision : {:.4f}, Recall : {:.4f},\".format(test_f1_when_dev_best, test_prec_when_dev_best, test_rec_when_dev_best))\n        tf.logging.info(\"- @@@@@ mention_proposal_only_concate {}\".format(FLAGS.mention_proposal_only_concate))\n\n\n    if FLAGS.do_predict:\n        tp, fp, fn = 0, 0, 0\n        epsilon = 1e-10\n        for doc_output in 
estimator.predict(file_based_input_fn_builder(FLAGS.test_file,\n            num_window=FLAGS.num_window, window_size=FLAGS.window_size, max_num_mention=FLAGS.max_num_mention,\n            is_training=False, drop_remainder=False), checkpoint_path=FLAGS.eval_checkpoint, yield_single_examples=False): \n            # iterate over each doc for evaluation\n            pred_span_label, gold_span_label = mention_proposal_prediction(FLAGS, doc_output)\n\n            tem_tp = np.logical_and(pred_span_label, gold_span_label).sum()\n            tem_fp = np.logical_and(pred_span_label, np.logical_not(gold_span_label)).sum()\n            tem_fn = np.logical_and(np.logical_not(pred_span_label), gold_span_label).sum()\n\n            tp += tem_tp\n            fp += tem_fp\n            fn += tem_fn\n\n        p = tp / (tp+fp+epsilon)\n        r = tp / (tp+fn+epsilon)\n        f = 2*p*r/(p+r+epsilon)\n        tf.logging.info(\"Average precision: {:.4f}, Average recall: {:.4f}, Average F1 {:.4f}\".format(p, r, f))\n\n\n\nif __name__ == '__main__':\n    # set the random seed. \n    random.seed(FLAGS.seed)\n    np.random.seed(FLAGS.seed)\n    tf.set_random_seed(FLAGS.seed)\n    # start train/evaluate the model.\n    tf.app.run()\n\n\n\n\n\n\n"
  },
  {
    "path": "run/run_squad.py",
    "content": "# coding=utf-8\n# Copyright 2018 The Google AI Language Team Authors.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Run BERT on SQuAD 1.1 and SQuAD 2.0.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport collections\nimport json\nimport math\nimport os\nimport random\nimport six\nimport tensorflow as tf\n\nfrom bert import modeling\nfrom bert import optimization\nfrom bert import tokenization\n\nflags = tf.flags\n\nFLAGS = flags.FLAGS\n\n## Required parameters\nflags.DEFINE_string(\n  \"bert_config_file\", None,\n    \"The config json file corresponding to the pre-trained BERT model. \"\n    \"This specifies the model architecture.\")\n\nflags.DEFINE_string(\"vocab_file\", None,\n                    \"The vocabulary file that the BERT model was trained on.\")\n\nflags.DEFINE_string(\n    \"output_dir\", None,\n    \"The output directory where the model checkpoints will be written.\")\n\n## Other parameters\nflags.DEFINE_string(\"train_file\", None,\n                    \"SQuAD json for training. E.g., train-v1.1.json\")\n\nflags.DEFINE_string(\n    \"predict_file\", None,\n    \"SQuAD json for predictions. 
E.g., dev-v1.1.json or test-v1.1.json\")\n\nflags.DEFINE_string(\n    \"init_checkpoint\", None,\n    \"Initial checkpoint (usually from a pre-trained BERT model).\")\n\nflags.DEFINE_bool(\n    \"do_lower_case\", True,\n    \"Whether to lower case the input text. Should be True for uncased \"\n    \"models and False for cased models.\")\n\nflags.DEFINE_integer(\n    \"max_seq_length\", 384,\n    \"The maximum total input sequence length after WordPiece tokenization. \"\n    \"Sequences longer than this will be truncated, and sequences shorter \"\n    \"than this will be padded.\")\n\nflags.DEFINE_integer(\n    \"doc_stride\", 128,\n    \"When splitting up a long document into chunks, how much stride to \"\n    \"take between chunks.\")\n\nflags.DEFINE_integer(\n    \"max_query_length\", 64,\n    \"The maximum number of tokens for the question. Questions longer than \"\n    \"this will be truncated to this length.\")\n\nflags.DEFINE_bool(\"do_train\", False, \"Whether to run training.\")\n\nflags.DEFINE_bool(\"do_predict\", False, \"Whether to run eval on the dev set.\")\n\nflags.DEFINE_integer(\"train_batch_size\", 32, \"Total batch size for training.\")\n\nflags.DEFINE_integer(\"predict_batch_size\", 8,\n                     \"Total batch size for predictions.\")\n\nflags.DEFINE_float(\"learning_rate\", 5e-5, \"The initial learning rate for Adam.\")\n\nflags.DEFINE_float(\"num_train_epochs\", 3.0,\n                   \"Total number of training epochs to perform.\")\n\nflags.DEFINE_float(\n    \"warmup_proportion\", 0.1,\n    \"Proportion of training to perform linear learning rate warmup for. 
\"\n    \"E.g., 0.1 = 10% of training.\")\n\nflags.DEFINE_integer(\"save_checkpoints_steps\", 1000,\n                     \"How often to save the model checkpoint.\")\n\nflags.DEFINE_integer(\"iterations_per_loop\", 1000,\n                     \"How many steps to make in each estimator call.\")\n\nflags.DEFINE_integer(\n    \"n_best_size\", 20,\n    \"The total number of n-best predictions to generate in the \"\n    \"nbest_predictions.json output file.\")\n\nflags.DEFINE_integer(\n    \"max_answer_length\", 30,\n    \"The maximum length of an answer that can be generated. This is needed \"\n    \"because the start and end predictions are not conditioned on one another.\")\n\nflags.DEFINE_bool(\"use_tpu\", False, \"Whether to use TPU or GPU/CPU.\")\n\ntf.flags.DEFINE_string(\n    \"tpu_name\", None,\n    \"The Cloud TPU to use for training. This should be either the name \"\n    \"used when creating the Cloud TPU, or a grpc://ip.address.of.tpu:8470 \"\n    \"url.\")\n\ntf.flags.DEFINE_string(\n    \"tpu_zone\", None,\n    \"[Optional] GCE zone where the Cloud TPU is located in. If not \"\n    \"specified, we will attempt to automatically detect the GCE project from \"\n    \"metadata.\")\n\ntf.flags.DEFINE_string(\n    \"gcp_project\", None,\n    \"[Optional] Project name for the Cloud TPU-enabled project. If not \"\n    \"specified, we will attempt to automatically detect the GCE project from \"\n    \"metadata.\")\n\ntf.flags.DEFINE_string(\"master\", None, \"[Optional] TensorFlow master URL.\")\n\nflags.DEFINE_integer(\n    \"num_tpu_cores\", 8,\n    \"Only used if `use_tpu` is True. Total number of TPU cores to use.\")\n\nflags.DEFINE_bool(\n    \"verbose_logging\", False,\n    \"If true, all of the warnings related to data processing will be printed. 
\"\n    \"A number of warnings are expected for a normal SQuAD evaluation.\")\n\nflags.DEFINE_bool(\n    \"version_2_with_negative\", False,\n    \"If true, the SQuAD examples contain some that do not have an answer.\")\n\nflags.DEFINE_float(\n    \"null_score_diff_threshold\", 0.0,\n    \"If null_score - best_non_null is greater than the threshold predict null.\")\n\n\nclass SquadExample(object):\n  \"\"\"A single training/test example for simple sequence classification.\n\n     For examples without an answer, the start and end position are -1.\n  \"\"\"\n\n  def __init__(self,\n               qas_id,\n               question_text,\n               doc_tokens,\n               orig_answer_text=None,\n               start_position=None,\n               end_position=None,\n               is_impossible=False):\n    self.qas_id = qas_id\n    self.question_text = question_text\n    self.doc_tokens = doc_tokens\n    self.orig_answer_text = orig_answer_text\n    self.start_position = start_position\n    self.end_position = end_position\n    self.is_impossible = is_impossible\n\n  def __str__(self):\n    return self.__repr__()\n\n  def __repr__(self):\n    s = \"\"\n    s += \"qas_id: %s\" % (tokenization.printable_text(self.qas_id))\n    s += \", question_text: %s\" % (\n        tokenization.printable_text(self.question_text))\n    s += \", doc_tokens: [%s]\" % (\" \".join(self.doc_tokens))\n    if self.start_position:\n      s += \", start_position: %d\" % (self.start_position)\n    if self.end_position:\n      s += \", end_position: %d\" % (self.end_position)\n    if self.is_impossible:\n      s += \", is_impossible: %r\" % (self.is_impossible)\n    return s\n\n\nclass InputFeatures(object):\n  \"\"\"A single set of features of data.\"\"\"\n\n  def __init__(self,\n               unique_id,\n               example_index,\n               doc_span_index,\n               tokens,\n               token_to_orig_map,\n               token_is_max_context,\n               
input_ids,\n               input_mask,\n               segment_ids,\n               start_position=None,\n               end_position=None,\n               is_impossible=None):\n    self.unique_id = unique_id\n    self.example_index = example_index\n    self.doc_span_index = doc_span_index\n    self.tokens = tokens\n    self.token_to_orig_map = token_to_orig_map\n    self.token_is_max_context = token_is_max_context\n    self.input_ids = input_ids\n    self.input_mask = input_mask\n    self.segment_ids = segment_ids\n    self.start_position = start_position\n    self.end_position = end_position\n    self.is_impossible = is_impossible\n\n\ndef read_squad_examples(input_file, is_training):\n  \"\"\"Read a SQuAD json file into a list of SquadExample.\"\"\"\n  with tf.gfile.Open(input_file, \"r\") as reader:\n    input_data = json.load(reader)[\"data\"]\n\n  def is_whitespace(c):\n    if c == \" \" or c == \"\\t\" or c == \"\\r\" or c == \"\\n\" or ord(c) == 0x202F:\n      return True\n    return False\n\n  examples = []\n  for entry in input_data:\n    for paragraph in entry[\"paragraphs\"]:\n      paragraph_text = paragraph[\"context\"]\n      doc_tokens = []\n      char_to_word_offset = []\n      prev_is_whitespace = True\n      for c in paragraph_text:\n        if is_whitespace(c):\n          prev_is_whitespace = True\n        else:\n          if prev_is_whitespace:\n            doc_tokens.append(c)\n          else:\n            doc_tokens[-1] += c\n          prev_is_whitespace = False\n        char_to_word_offset.append(len(doc_tokens) - 1)\n\n      for qa in paragraph[\"qas\"]:\n        qas_id = qa[\"id\"]\n        question_text = qa[\"question\"]\n        start_position = None\n        end_position = None\n        orig_answer_text = None\n        is_impossible = False\n        if is_training:\n\n          if FLAGS.version_2_with_negative:\n            is_impossible = qa[\"is_impossible\"]\n          if (len(qa[\"answers\"]) != 1) and (not is_impossible):\n        
    raise ValueError(\n                \"For training, each question should have exactly 1 answer.\")\n          if not is_impossible:\n            answer = qa[\"answers\"][0]\n            orig_answer_text = answer[\"text\"]\n            answer_offset = answer[\"answer_start\"]\n            answer_length = len(orig_answer_text)\n            start_position = char_to_word_offset[answer_offset]\n            end_position = char_to_word_offset[answer_offset + answer_length -\n                                               1]\n            # Only add answers where the text can be exactly recovered from the\n            # document. If this CAN'T happen it's likely due to weird Unicode\n            # stuff so we will just skip the example.\n            #\n            # Note that this means for training mode, every example is NOT\n            # guaranteed to be preserved.\n            actual_text = \" \".join(\n                doc_tokens[start_position:(end_position + 1)])\n            cleaned_answer_text = \" \".join(\n                tokenization.whitespace_tokenize(orig_answer_text))\n            if actual_text.find(cleaned_answer_text) == -1:\n              tf.logging.warning(\"Could not find answer: '%s' vs. 
'%s'\",\n                                 actual_text, cleaned_answer_text)\n              continue\n          else:\n            start_position = -1\n            end_position = -1\n            orig_answer_text = \"\"\n\n        example = SquadExample(\n            qas_id=qas_id,\n            question_text=question_text,\n            doc_tokens=doc_tokens,\n            orig_answer_text=orig_answer_text,\n            start_position=start_position,\n            end_position=end_position,\n            is_impossible=is_impossible)\n        examples.append(example)\n\n  return examples\n\n\ndef convert_examples_to_features(examples, tokenizer, max_seq_length,\n                                 doc_stride, max_query_length, is_training,\n                                 output_fn):\n  \"\"\"Loads a data file into a list of `InputBatch`s.\"\"\"\n\n  unique_id = 1000000000\n\n  for (example_index, example) in enumerate(examples):\n    query_tokens = tokenizer.tokenize(example.question_text)\n\n    if len(query_tokens) > max_query_length:\n      query_tokens = query_tokens[0:max_query_length]\n\n    tok_to_orig_index = []\n    orig_to_tok_index = []\n    all_doc_tokens = []\n    for (i, token) in enumerate(example.doc_tokens):\n      orig_to_tok_index.append(len(all_doc_tokens))\n      sub_tokens = tokenizer.tokenize(token)\n      for sub_token in sub_tokens:\n        tok_to_orig_index.append(i)\n        all_doc_tokens.append(sub_token)\n\n    tok_start_position = None\n    tok_end_position = None\n    if is_training and example.is_impossible:\n      tok_start_position = -1\n      tok_end_position = -1\n    if is_training and not example.is_impossible:\n      tok_start_position = orig_to_tok_index[example.start_position]\n      if example.end_position < len(example.doc_tokens) - 1:\n        tok_end_position = orig_to_tok_index[example.end_position + 1] - 1\n      else:\n        tok_end_position = len(all_doc_tokens) - 1\n      (tok_start_position, tok_end_position) = 
_improve_answer_span(\n          all_doc_tokens, tok_start_position, tok_end_position, tokenizer,\n          example.orig_answer_text)\n\n    # The -3 accounts for [CLS], [SEP] and [SEP]\n    max_tokens_for_doc = max_seq_length - len(query_tokens) - 3\n\n    # We can have documents that are longer than the maximum sequence length.\n    # To deal with this we do a sliding window approach, where we take chunks\n    # of up to our max length with a stride of `doc_stride`.\n    _DocSpan = collections.namedtuple(  # pylint: disable=invalid-name\n        \"DocSpan\", [\"start\", \"length\"])\n    doc_spans = []\n    start_offset = 0\n    while start_offset < len(all_doc_tokens):\n      length = len(all_doc_tokens) - start_offset\n      if length > max_tokens_for_doc:\n        length = max_tokens_for_doc\n      doc_spans.append(_DocSpan(start=start_offset, length=length))\n      if start_offset + length == len(all_doc_tokens):\n        break\n      start_offset += min(length, doc_stride)\n\n    for (doc_span_index, doc_span) in enumerate(doc_spans):\n      tokens = []\n      token_to_orig_map = {}\n      token_is_max_context = {}\n      segment_ids = []\n      tokens.append(\"[CLS]\")\n      segment_ids.append(0)\n      for token in query_tokens:\n        tokens.append(token)\n        segment_ids.append(0)\n      tokens.append(\"[SEP]\")\n      segment_ids.append(0)\n\n      for i in range(doc_span.length):\n        split_token_index = doc_span.start + i\n        token_to_orig_map[len(tokens)] = tok_to_orig_index[split_token_index]\n\n        is_max_context = _check_is_max_context(doc_spans, doc_span_index,\n                                               split_token_index)\n        token_is_max_context[len(tokens)] = is_max_context\n        tokens.append(all_doc_tokens[split_token_index])\n        segment_ids.append(1)\n      tokens.append(\"[SEP]\")\n      segment_ids.append(1)\n\n      input_ids = tokenizer.convert_tokens_to_ids(tokens)\n\n      # The mask has 1 for 
real tokens and 0 for padding tokens. Only real\n      # tokens are attended to.\n      input_mask = [1] * len(input_ids)\n\n      # Zero-pad up to the sequence length.\n      while len(input_ids) < max_seq_length:\n        input_ids.append(0)\n        input_mask.append(0)\n        segment_ids.append(0)\n\n      assert len(input_ids) == max_seq_length\n      assert len(input_mask) == max_seq_length\n      assert len(segment_ids) == max_seq_length\n\n      start_position = None\n      end_position = None\n      if is_training and not example.is_impossible:\n        # For training, if our document chunk does not contain an annotation\n        # we throw it out, since there is nothing to predict.\n        doc_start = doc_span.start\n        doc_end = doc_span.start + doc_span.length - 1\n        out_of_span = False\n        if not (tok_start_position >= doc_start and\n                tok_end_position <= doc_end):\n          out_of_span = True\n        if out_of_span:\n          start_position = 0\n          end_position = 0\n        else:\n          doc_offset = len(query_tokens) + 2\n          start_position = tok_start_position - doc_start + doc_offset\n          end_position = tok_end_position - doc_start + doc_offset\n\n      if is_training and example.is_impossible:\n        start_position = 0\n        end_position = 0\n\n      if example_index < 20:\n        tf.logging.info(\"*** Example ***\")\n        tf.logging.info(\"unique_id: %s\" % (unique_id))\n        tf.logging.info(\"example_index: %s\" % (example_index))\n        tf.logging.info(\"doc_span_index: %s\" % (doc_span_index))\n        tf.logging.info(\"tokens: %s\" % \" \".join(\n            [tokenization.printable_text(x) for x in tokens]))\n        tf.logging.info(\"token_to_orig_map: %s\" % \" \".join(\n            [\"%d:%d\" % (x, y) for (x, y) in six.iteritems(token_to_orig_map)]))\n        tf.logging.info(\"token_is_max_context: %s\" % \" \".join([\n            \"%d:%s\" % (x, y) for (x, y) in 
six.iteritems(token_is_max_context)\n        ]))\n        tf.logging.info(\"input_ids: %s\" % \" \".join([str(x) for x in input_ids]))\n        tf.logging.info(\n            \"input_mask: %s\" % \" \".join([str(x) for x in input_mask]))\n        tf.logging.info(\n            \"segment_ids: %s\" % \" \".join([str(x) for x in segment_ids]))\n        if is_training and example.is_impossible:\n          tf.logging.info(\"impossible example\")\n        if is_training and not example.is_impossible:\n          answer_text = \" \".join(tokens[start_position:(end_position + 1)])\n          tf.logging.info(\"start_position: %d\" % (start_position))\n          tf.logging.info(\"end_position: %d\" % (end_position))\n          tf.logging.info(\n              \"answer: %s\" % (tokenization.printable_text(answer_text)))\n\n      feature = InputFeatures(\n          unique_id=unique_id,\n          example_index=example_index,\n          doc_span_index=doc_span_index,\n          tokens=tokens,\n          token_to_orig_map=token_to_orig_map,\n          token_is_max_context=token_is_max_context,\n          input_ids=input_ids,\n          input_mask=input_mask,\n          segment_ids=segment_ids,\n          start_position=start_position,\n          end_position=end_position,\n          is_impossible=example.is_impossible)\n\n      # Run callback\n      output_fn(feature)\n\n      unique_id += 1\n\n\ndef _improve_answer_span(doc_tokens, input_start, input_end, tokenizer,\n                         orig_answer_text):\n  \"\"\"Returns tokenized answer spans that better match the annotated answer.\"\"\"\n\n  # The SQuAD annotations are character based. We first project them to\n  # whitespace-tokenized words. But then after WordPiece tokenization, we can\n  # often find a \"better match\". 
For example:\n  #\n  #   Question: What year was John Smith born?\n  #   Context: The leader was John Smith (1895-1943).\n  #   Answer: 1895\n  #\n  # The original whitespace-tokenized answer will be \"(1895-1943).\". However\n  # after tokenization, our tokens will be \"( 1895 - 1943 ) .\". So we can match\n  # the exact answer, 1895.\n  #\n  # However, this is not always possible. Consider the following:\n  #\n  #   Question: What country is the top exporter of electronics?\n  #   Context: The Japanese electronics industry is the largest in the world.\n  #   Answer: Japan\n  #\n  # In this case, the annotator chose \"Japan\" as a character sub-span of\n  # the word \"Japanese\". Since our WordPiece tokenizer does not split\n  # \"Japanese\", we just use \"Japanese\" as the annotation. This is fairly rare\n  # in SQuAD, but does happen.\n  tok_answer_text = \" \".join(tokenizer.tokenize(orig_answer_text))\n\n  for new_start in range(input_start, input_end + 1):\n    for new_end in range(input_end, new_start - 1, -1):\n      text_span = \" \".join(doc_tokens[new_start:(new_end + 1)])\n      if text_span == tok_answer_text:\n        return (new_start, new_end)\n\n  return (input_start, input_end)\n\n\ndef _check_is_max_context(doc_spans, cur_span_index, position):\n  \"\"\"Check if this is the 'max context' doc span for the token.\"\"\"\n\n  # Because of the sliding window approach taken to scoring documents, a single\n  # token can appear in multiple documents. E.g.\n  #  Doc: the man went to the store and bought a gallon of milk\n  #  Span A: the man went to the\n  #  Span B: to the store and bought\n  #  Span C: and bought a gallon of\n  #  ...\n  #\n  # Now the word 'bought' will have two scores from spans B and C. 
We only\n  # want to consider the score with \"maximum context\", which we define as\n  # the *minimum* of its left and right context (the *sum* of left and\n  # right context will always be the same, of course).\n  #\n  # In the example the maximum context for 'bought' would be span C since\n  # it has 1 left context and 3 right context, while span B has 4 left context\n  # and 0 right context.\n  best_score = None\n  best_span_index = None\n  for (span_index, doc_span) in enumerate(doc_spans):\n    end = doc_span.start + doc_span.length - 1\n    if position < doc_span.start:\n      continue\n    if position > end:\n      continue\n    num_left_context = position - doc_span.start\n    num_right_context = end - position\n    score = min(num_left_context, num_right_context) + 0.01 * doc_span.length\n    if best_score is None or score > best_score:\n      best_score = score\n      best_span_index = span_index\n\n  return cur_span_index == best_span_index\n\n\ndef create_model(bert_config, is_training, input_ids, input_mask, segment_ids,\n                 use_one_hot_embeddings):\n  \"\"\"Creates a classification model.\"\"\"\n  model = modeling.BertModel(\n      config=bert_config,\n      is_training=is_training,\n      input_ids=input_ids,\n      input_mask=input_mask,\n      token_type_ids=segment_ids,\n      use_one_hot_embeddings=use_one_hot_embeddings)\n\n  final_hidden = model.get_sequence_output()\n\n  final_hidden_shape = modeling.get_shape_list(final_hidden, expected_rank=3)\n  batch_size = final_hidden_shape[0]\n  seq_length = final_hidden_shape[1]\n  hidden_size = final_hidden_shape[2]\n\n  output_weights = tf.get_variable(\n      \"cls/squad/output_weights\", [2, hidden_size],\n      initializer=tf.truncated_normal_initializer(stddev=0.02))\n\n  output_bias = tf.get_variable(\n      \"cls/squad/output_bias\", [2], initializer=tf.zeros_initializer())\n\n  final_hidden_matrix = tf.reshape(final_hidden,\n                                   [batch_size * 
seq_length, hidden_size])\n  logits = tf.matmul(final_hidden_matrix, output_weights, transpose_b=True)\n  logits = tf.nn.bias_add(logits, output_bias)\n\n  logits = tf.reshape(logits, [batch_size, seq_length, 2])\n  logits = tf.transpose(logits, [2, 0, 1])\n\n  unstacked_logits = tf.unstack(logits, axis=0)\n\n  (start_logits, end_logits) = (unstacked_logits[0], unstacked_logits[1])\n\n  return (start_logits, end_logits)\n\n\ndef model_fn_builder(bert_config, init_checkpoint, learning_rate,\n                     num_train_steps, num_warmup_steps, use_tpu,\n                     use_one_hot_embeddings):\n  \"\"\"Returns `model_fn` closure for TPUEstimator.\"\"\"\n\n  def model_fn(features, labels, mode, params):  # pylint: disable=unused-argument\n    \"\"\"The `model_fn` for TPUEstimator.\"\"\"\n\n    tf.logging.info(\"*** Features ***\")\n    for name in sorted(features.keys()):\n      tf.logging.info(\"  name = %s, shape = %s\" % (name, features[name].shape))\n\n    unique_ids = features[\"unique_ids\"]\n    input_ids = features[\"input_ids\"]\n    input_mask = features[\"input_mask\"]\n    segment_ids = features[\"segment_ids\"]\n\n    is_training = (mode == tf.estimator.ModeKeys.TRAIN)\n\n    (start_logits, end_logits) = create_model(\n        bert_config=bert_config,\n        is_training=is_training,\n        input_ids=input_ids,\n        input_mask=input_mask,\n        segment_ids=segment_ids,\n        use_one_hot_embeddings=use_one_hot_embeddings)\n\n    tvars = tf.trainable_variables()\n\n    initialized_variable_names = {}\n    scaffold_fn = None\n    if init_checkpoint:\n      (assignment_map, initialized_variable_names\n      ) = modeling.get_assignment_map_from_checkpoint(tvars, init_checkpoint)\n      if use_tpu:\n\n        def tpu_scaffold():\n          tf.train.init_from_checkpoint(init_checkpoint, assignment_map)\n          return tf.train.Scaffold()\n\n        scaffold_fn = tpu_scaffold\n      else:\n        
tf.train.init_from_checkpoint(init_checkpoint, assignment_map)\n\n    tf.logging.info(\"**** Trainable Variables ****\")\n    for var in tvars:\n      init_string = \"\"\n      if var.name in initialized_variable_names:\n        init_string = \", *INIT_FROM_CKPT*\"\n      tf.logging.info(\"  name = %s, shape = %s%s\", var.name, var.shape,\n                      init_string)\n\n    output_spec = None\n    if mode == tf.estimator.ModeKeys.TRAIN:\n      seq_length = modeling.get_shape_list(input_ids)[1]\n\n      def compute_loss(logits, positions):\n        one_hot_positions = tf.one_hot(\n            positions, depth=seq_length, dtype=tf.float32)\n        log_probs = tf.nn.log_softmax(logits, axis=-1)\n        loss = -tf.reduce_mean(\n            tf.reduce_sum(one_hot_positions * log_probs, axis=-1))\n        return loss\n\n      start_positions = features[\"start_positions\"]\n      end_positions = features[\"end_positions\"]\n\n      start_loss = compute_loss(start_logits, start_positions)\n      end_loss = compute_loss(end_logits, end_positions)\n\n      total_loss = (start_loss + end_loss) / 2.0\n\n      train_op = optimization.create_optimizer(\n          total_loss, learning_rate, num_train_steps, num_warmup_steps, use_tpu)\n\n      output_spec = tf.contrib.tpu.TPUEstimatorSpec(\n          mode=mode,\n          loss=total_loss,\n          train_op=train_op,\n          scaffold_fn=scaffold_fn)\n    elif mode == tf.estimator.ModeKeys.PREDICT:\n      predictions = {\n          \"unique_ids\": unique_ids,\n          \"start_logits\": start_logits,\n          \"end_logits\": end_logits,\n      }\n      output_spec = tf.contrib.tpu.TPUEstimatorSpec(\n          mode=mode, predictions=predictions, scaffold_fn=scaffold_fn)\n    else:\n      raise ValueError(\n          \"Only TRAIN and PREDICT modes are supported: %s\" % (mode))\n\n    return output_spec\n\n  return model_fn\n\n\ndef input_fn_builder(input_file, seq_length, is_training, drop_remainder):\n  \"\"\"Creates 
an `input_fn` closure to be passed to TPUEstimator.\"\"\"\n\n  name_to_features = {\n      \"unique_ids\": tf.FixedLenFeature([], tf.int64),\n      \"input_ids\": tf.FixedLenFeature([seq_length], tf.int64),\n      \"input_mask\": tf.FixedLenFeature([seq_length], tf.int64),\n      \"segment_ids\": tf.FixedLenFeature([seq_length], tf.int64),\n  }\n\n  if is_training:\n    name_to_features[\"start_positions\"] = tf.FixedLenFeature([], tf.int64)\n    name_to_features[\"end_positions\"] = tf.FixedLenFeature([], tf.int64)\n\n  def _decode_record(record, name_to_features):\n    \"\"\"Decodes a record to a TensorFlow example.\"\"\"\n    example = tf.parse_single_example(record, name_to_features)\n\n    # tf.Example only supports tf.int64, but the TPU only supports tf.int32.\n    # So cast all int64 to int32.\n    for name in list(example.keys()):\n      t = example[name]\n      if t.dtype == tf.int64:\n        t = tf.to_int32(t)\n      example[name] = t\n\n    return example\n\n  def input_fn(params):\n    \"\"\"The actual input function.\"\"\"\n    batch_size = params[\"batch_size\"]\n\n    # For training, we want a lot of parallel reading and shuffling.\n    # For eval, we want no shuffling and parallel reading doesn't matter.\n    d = tf.data.TFRecordDataset(input_file)\n    if is_training:\n      d = d.repeat()\n      d = d.shuffle(buffer_size=100)\n\n    d = d.apply(\n        tf.contrib.data.map_and_batch(\n            lambda record: _decode_record(record, name_to_features),\n            batch_size=batch_size,\n            drop_remainder=drop_remainder))\n\n    return d\n\n  return input_fn\n\n\nRawResult = collections.namedtuple(\"RawResult\",\n                                   [\"unique_id\", \"start_logits\", \"end_logits\"])\n\n\ndef write_predictions(all_examples, all_features, all_results, n_best_size,\n                      max_answer_length, do_lower_case, output_prediction_file,\n                      output_nbest_file, output_null_log_odds_file):\n  
\"\"\"Write final predictions to the json file and log-odds of null if needed.\"\"\"\n  tf.logging.info(\"Writing predictions to: %s\" % (output_prediction_file))\n  tf.logging.info(\"Writing nbest to: %s\" % (output_nbest_file))\n\n  example_index_to_features = collections.defaultdict(list)\n  for feature in all_features:\n    example_index_to_features[feature.example_index].append(feature)\n\n  unique_id_to_result = {}\n  for result in all_results:\n    unique_id_to_result[result.unique_id] = result\n\n  _PrelimPrediction = collections.namedtuple(  # pylint: disable=invalid-name\n      \"PrelimPrediction\",\n      [\"feature_index\", \"start_index\", \"end_index\", \"start_logit\", \"end_logit\"])\n\n  all_predictions = collections.OrderedDict()\n  all_nbest_json = collections.OrderedDict()\n  scores_diff_json = collections.OrderedDict()\n\n  for (example_index, example) in enumerate(all_examples):\n    features = example_index_to_features[example_index]\n\n    prelim_predictions = []\n    # keep track of the minimum score of null start+end of position 0\n    score_null = 1000000  # large and positive\n    min_null_feature_index = 0  # the paragraph slice with min null score\n    null_start_logit = 0  # the start logit at the slice with min null score\n    null_end_logit = 0  # the end logit at the slice with min null score\n    for (feature_index, feature) in enumerate(features):\n      result = unique_id_to_result[feature.unique_id]\n      start_indexes = _get_best_indexes(result.start_logits, n_best_size)\n      end_indexes = _get_best_indexes(result.end_logits, n_best_size)\n      # if we could have irrelevant answers, get the min score of irrelevant\n      if FLAGS.version_2_with_negative:\n        feature_null_score = result.start_logits[0] + result.end_logits[0]\n        if feature_null_score < score_null:\n          score_null = feature_null_score\n          min_null_feature_index = feature_index\n          null_start_logit = result.start_logits[0]\n      
    null_end_logit = result.end_logits[0]\n      for start_index in start_indexes:\n        for end_index in end_indexes:\n          # We could hypothetically create invalid predictions, e.g., predict\n          # that the start of the span is in the question. We throw out all\n          # invalid predictions.\n          if start_index >= len(feature.tokens):\n            continue\n          if end_index >= len(feature.tokens):\n            continue\n          if start_index not in feature.token_to_orig_map:\n            continue\n          if end_index not in feature.token_to_orig_map:\n            continue\n          if not feature.token_is_max_context.get(start_index, False):\n            continue\n          if end_index < start_index:\n            continue\n          length = end_index - start_index + 1\n          if length > max_answer_length:\n            continue\n          prelim_predictions.append(\n              _PrelimPrediction(\n                  feature_index=feature_index,\n                  start_index=start_index,\n                  end_index=end_index,\n                  start_logit=result.start_logits[start_index],\n                  end_logit=result.end_logits[end_index]))\n\n    if FLAGS.version_2_with_negative:\n      prelim_predictions.append(\n          _PrelimPrediction(\n              feature_index=min_null_feature_index,\n              start_index=0,\n              end_index=0,\n              start_logit=null_start_logit,\n              end_logit=null_end_logit))\n    prelim_predictions = sorted(\n        prelim_predictions,\n        key=lambda x: (x.start_logit + x.end_logit),\n        reverse=True)\n\n    _NbestPrediction = collections.namedtuple(  # pylint: disable=invalid-name\n        \"NbestPrediction\", [\"text\", \"start_logit\", \"end_logit\"])\n\n    seen_predictions = {}\n    nbest = []\n    for pred in prelim_predictions:\n      if len(nbest) >= n_best_size:\n        break\n      feature = features[pred.feature_index]\n      
if pred.start_index > 0:  # this is a non-null prediction\n        tok_tokens = feature.tokens[pred.start_index:(pred.end_index + 1)]\n        orig_doc_start = feature.token_to_orig_map[pred.start_index]\n        orig_doc_end = feature.token_to_orig_map[pred.end_index]\n        orig_tokens = example.doc_tokens[orig_doc_start:(orig_doc_end + 1)]\n        tok_text = \" \".join(tok_tokens)\n\n        # De-tokenize WordPieces that have been split off.\n        tok_text = tok_text.replace(\" ##\", \"\")\n        tok_text = tok_text.replace(\"##\", \"\")\n\n        # Clean whitespace\n        tok_text = tok_text.strip()\n        tok_text = \" \".join(tok_text.split())\n        orig_text = \" \".join(orig_tokens)\n\n        final_text = get_final_text(tok_text, orig_text, do_lower_case)\n        if final_text in seen_predictions:\n          continue\n\n        seen_predictions[final_text] = True\n      else:\n        final_text = \"\"\n        seen_predictions[final_text] = True\n\n      nbest.append(\n          _NbestPrediction(\n              text=final_text,\n              start_logit=pred.start_logit,\n              end_logit=pred.end_logit))\n\n    # if we didn't include the empty option in the n-best, include it\n    if FLAGS.version_2_with_negative:\n      if \"\" not in seen_predictions:\n        nbest.append(\n            _NbestPrediction(\n                text=\"\", start_logit=null_start_logit,\n                end_logit=null_end_logit))\n    # In very rare edge cases we could have no valid predictions. 
So we\n    # just create a nonce prediction in this case to avoid failure.\n    if not nbest:\n      nbest.append(\n          _NbestPrediction(text=\"empty\", start_logit=0.0, end_logit=0.0))\n\n    assert len(nbest) >= 1\n\n    total_scores = []\n    best_non_null_entry = None\n    for entry in nbest:\n      total_scores.append(entry.start_logit + entry.end_logit)\n      if not best_non_null_entry:\n        if entry.text:\n          best_non_null_entry = entry\n\n    probs = _compute_softmax(total_scores)\n\n    nbest_json = []\n    for (i, entry) in enumerate(nbest):\n      output = collections.OrderedDict()\n      output[\"text\"] = entry.text\n      output[\"probability\"] = probs[i]\n      output[\"start_logit\"] = entry.start_logit\n      output[\"end_logit\"] = entry.end_logit\n      nbest_json.append(output)\n\n    assert len(nbest_json) >= 1\n\n    if not FLAGS.version_2_with_negative:\n      all_predictions[example.qas_id] = nbest_json[0][\"text\"]\n    else:\n      # predict \"\" iff the null score - the score of best non-null > threshold\n      score_diff = score_null - best_non_null_entry.start_logit - (\n          best_non_null_entry.end_logit)\n      scores_diff_json[example.qas_id] = score_diff\n      if score_diff > FLAGS.null_score_diff_threshold:\n        all_predictions[example.qas_id] = \"\"\n      else:\n        all_predictions[example.qas_id] = best_non_null_entry.text\n\n    all_nbest_json[example.qas_id] = nbest_json\n\n  with tf.gfile.GFile(output_prediction_file, \"w\") as writer:\n    writer.write(json.dumps(all_predictions, indent=4) + \"\\n\")\n\n  with tf.gfile.GFile(output_nbest_file, \"w\") as writer:\n    writer.write(json.dumps(all_nbest_json, indent=4) + \"\\n\")\n\n  if FLAGS.version_2_with_negative:\n    with tf.gfile.GFile(output_null_log_odds_file, \"w\") as writer:\n      writer.write(json.dumps(scores_diff_json, indent=4) + \"\\n\")\n\n\ndef get_final_text(pred_text, orig_text, do_lower_case):\n  \"\"\"Project the tokenized 
prediction back to the original text.\"\"\"\n\n  # When we created the data, we kept track of the alignment between original\n  # (whitespace tokenized) tokens and our WordPiece tokenized tokens. So\n  # now `orig_text` contains the span of our original text corresponding to the\n  # span that we predicted.\n  #\n  # However, `orig_text` may contain extra characters that we don't want in\n  # our prediction.\n  #\n  # For example, let's say:\n  #   pred_text = steve smith\n  #   orig_text = Steve Smith's\n  #\n  # We don't want to return `orig_text` because it contains the extra \"'s\".\n  #\n  # We don't want to return `pred_text` because it's already been normalized\n  # (the SQuAD eval script also does punctuation stripping/lower casing but\n  # our tokenizer does additional normalization like stripping accent\n  # characters).\n  #\n  # What we really want to return is \"Steve Smith\".\n  #\n  # Therefore, we have to apply a semi-complicated alignment heuristic between\n  # `pred_text` and `orig_text` to get a character-to-character alignment. This\n  # can fail in certain cases, in which case we just return `orig_text`.\n\n  def _strip_spaces(text):\n    ns_chars = []\n    ns_to_s_map = collections.OrderedDict()\n    for (i, c) in enumerate(text):\n      if c == \" \":\n        continue\n      ns_to_s_map[len(ns_chars)] = i\n      ns_chars.append(c)\n    ns_text = \"\".join(ns_chars)\n    return (ns_text, ns_to_s_map)\n\n  # We first tokenize `orig_text`, strip whitespace from the result\n  # and `pred_text`, and check if they are the same length. If they are\n  # NOT the same length, the heuristic has failed. 
If they are the same\n  # length, we assume the characters are one-to-one aligned.\n  tokenizer = tokenization.BasicTokenizer(do_lower_case=do_lower_case)\n\n  tok_text = \" \".join(tokenizer.tokenize(orig_text))\n\n  start_position = tok_text.find(pred_text)\n  if start_position == -1:\n    if FLAGS.verbose_logging:\n      tf.logging.info(\n          \"Unable to find text: '%s' in '%s'\" % (pred_text, orig_text))\n    return orig_text\n  end_position = start_position + len(pred_text) - 1\n\n  (orig_ns_text, orig_ns_to_s_map) = _strip_spaces(orig_text)\n  (tok_ns_text, tok_ns_to_s_map) = _strip_spaces(tok_text)\n\n  if len(orig_ns_text) != len(tok_ns_text):\n    if FLAGS.verbose_logging:\n      tf.logging.info(\"Length not equal after stripping spaces: '%s' vs '%s'\",\n                      orig_ns_text, tok_ns_text)\n    return orig_text\n\n  # We then project the characters in `pred_text` back to `orig_text` using\n  # the character-to-character alignment.\n  tok_s_to_ns_map = {}\n  for (i, tok_index) in six.iteritems(tok_ns_to_s_map):\n    tok_s_to_ns_map[tok_index] = i\n\n  orig_start_position = None\n  if start_position in tok_s_to_ns_map:\n    ns_start_position = tok_s_to_ns_map[start_position]\n    if ns_start_position in orig_ns_to_s_map:\n      orig_start_position = orig_ns_to_s_map[ns_start_position]\n\n  if orig_start_position is None:\n    if FLAGS.verbose_logging:\n      tf.logging.info(\"Couldn't map start position\")\n    return orig_text\n\n  orig_end_position = None\n  if end_position in tok_s_to_ns_map:\n    ns_end_position = tok_s_to_ns_map[end_position]\n    if ns_end_position in orig_ns_to_s_map:\n      orig_end_position = orig_ns_to_s_map[ns_end_position]\n\n  if orig_end_position is None:\n    if FLAGS.verbose_logging:\n      tf.logging.info(\"Couldn't map end position\")\n    return orig_text\n\n  output_text = orig_text[orig_start_position:(orig_end_position + 1)]\n  return output_text\n\n\ndef _get_best_indexes(logits, n_best_size):\n  
\"\"\"Get the n-best logits from a list.\"\"\"\n  index_and_score = sorted(enumerate(logits), key=lambda x: x[1], reverse=True)\n\n  best_indexes = []\n  for i in range(len(index_and_score)):\n    if i >= n_best_size:\n      break\n    best_indexes.append(index_and_score[i][0])\n  return best_indexes\n\n\ndef _compute_softmax(scores):\n  \"\"\"Compute softmax probability over raw logits.\"\"\"\n  if not scores:\n    return []\n\n  max_score = None\n  for score in scores:\n    if max_score is None or score > max_score:\n      max_score = score\n\n  exp_scores = []\n  total_sum = 0.0\n  for score in scores:\n    x = math.exp(score - max_score)\n    exp_scores.append(x)\n    total_sum += x\n\n  probs = []\n  for score in exp_scores:\n    probs.append(score / total_sum)\n  return probs\n\n\nclass FeatureWriter(object):\n  \"\"\"Writes InputFeature to TF example file.\"\"\"\n\n  def __init__(self, filename, is_training):\n    self.filename = filename\n    self.is_training = is_training\n    self.num_features = 0\n    self._writer = tf.python_io.TFRecordWriter(filename)\n\n  def process_feature(self, feature):\n    \"\"\"Write a InputFeature to the TFRecordWriter as a tf.train.Example.\"\"\"\n    self.num_features += 1\n\n    def create_int_feature(values):\n      feature = tf.train.Feature(\n          int64_list=tf.train.Int64List(value=list(values)))\n      return feature\n\n    features = collections.OrderedDict()\n    features[\"unique_ids\"] = create_int_feature([feature.unique_id])\n    features[\"input_ids\"] = create_int_feature(feature.input_ids)\n    features[\"input_mask\"] = create_int_feature(feature.input_mask)\n    features[\"segment_ids\"] = create_int_feature(feature.segment_ids)\n\n    if self.is_training:\n      features[\"start_positions\"] = create_int_feature([feature.start_position])\n      features[\"end_positions\"] = create_int_feature([feature.end_position])\n      impossible = 0\n      if feature.is_impossible:\n        impossible = 1\n      
features[\"is_impossible\"] = create_int_feature([impossible])\n\n    tf_example = tf.train.Example(features=tf.train.Features(feature=features))\n    self._writer.write(tf_example.SerializeToString())\n\n  def close(self):\n    self._writer.close()\n\n\ndef validate_flags_or_throw(bert_config):\n  \"\"\"Validate the input FLAGS or throw an exception.\"\"\"\n  tokenization.validate_case_matches_checkpoint(FLAGS.do_lower_case,\n                                                FLAGS.init_checkpoint)\n\n  if not FLAGS.do_train and not FLAGS.do_predict:\n    raise ValueError(\"At least one of `do_train` or `do_predict` must be True.\")\n\n  if FLAGS.do_train:\n    if not FLAGS.train_file:\n      raise ValueError(\n          \"If `do_train` is True, then `train_file` must be specified.\")\n  if FLAGS.do_predict:\n    if not FLAGS.predict_file:\n      raise ValueError(\n          \"If `do_predict` is True, then `predict_file` must be specified.\")\n\n  if FLAGS.max_seq_length > bert_config.max_position_embeddings:\n    raise ValueError(\n        \"Cannot use sequence length %d because the BERT model \"\n        \"was only trained up to sequence length %d\" %\n        (FLAGS.max_seq_length, bert_config.max_position_embeddings))\n\n  if FLAGS.max_seq_length <= FLAGS.max_query_length + 3:\n    raise ValueError(\n        \"The max_seq_length (%d) must be greater than max_query_length \"\n        \"(%d) + 3\" % (FLAGS.max_seq_length, FLAGS.max_query_length))\n\n\ndef main(_):\n  tf.logging.set_verbosity(tf.logging.INFO)\n\n  bert_config = modeling.BertConfig.from_json_file(FLAGS.bert_config_file)\n\n  validate_flags_or_throw(bert_config)\n\n  tf.gfile.MakeDirs(FLAGS.output_dir)\n\n  tokenizer = tokenization.FullTokenizer(\n      vocab_file=FLAGS.vocab_file, do_lower_case=FLAGS.do_lower_case)\n\n  tpu_cluster_resolver = None\n  if FLAGS.use_tpu and FLAGS.tpu_name:\n    tpu_cluster_resolver = tf.contrib.cluster_resolver.TPUClusterResolver(\n        FLAGS.tpu_name, 
zone=FLAGS.tpu_zone, project=FLAGS.gcp_project)\n\n  is_per_host = tf.contrib.tpu.InputPipelineConfig.PER_HOST_V2\n  run_config = tf.contrib.tpu.RunConfig(\n      cluster=tpu_cluster_resolver,\n      master=FLAGS.master,\n      model_dir=FLAGS.output_dir,\n      save_checkpoints_steps=FLAGS.save_checkpoints_steps,\n      tpu_config=tf.contrib.tpu.TPUConfig(\n          iterations_per_loop=FLAGS.iterations_per_loop,\n          num_shards=FLAGS.num_tpu_cores,\n          per_host_input_for_training=is_per_host))\n\n  train_examples = None\n  num_train_steps = None\n  num_warmup_steps = None\n  if FLAGS.do_train:\n    train_examples = read_squad_examples(\n        input_file=FLAGS.train_file, is_training=True)\n    num_train_steps = int(\n        len(train_examples) / FLAGS.train_batch_size * FLAGS.num_train_epochs)\n    num_warmup_steps = int(num_train_steps * FLAGS.warmup_proportion)\n\n    # Pre-shuffle the input to avoid having to make a very large shuffle\n    # buffer in the `input_fn`.\n    rng = random.Random(12345)\n    rng.shuffle(train_examples)\n\n  model_fn = model_fn_builder(\n      bert_config=bert_config,\n      init_checkpoint=FLAGS.init_checkpoint,\n      learning_rate=FLAGS.learning_rate,\n      num_train_steps=num_train_steps,\n      num_warmup_steps=num_warmup_steps,\n      use_tpu=FLAGS.use_tpu,\n      use_one_hot_embeddings=FLAGS.use_tpu)\n\n  # If TPU is not available, this will fall back to normal Estimator on CPU\n  # or GPU.\n  estimator = tf.contrib.tpu.TPUEstimator(\n      use_tpu=FLAGS.use_tpu,\n      model_fn=model_fn,\n      config=run_config,\n      train_batch_size=FLAGS.train_batch_size,\n      predict_batch_size=FLAGS.predict_batch_size)\n\n  if FLAGS.do_train:\n    # We write to a temporary file to avoid storing very large constant tensors\n    # in memory.\n    train_writer = FeatureWriter(\n        filename=os.path.join(FLAGS.output_dir, \"train.tf_record\"),\n        is_training=True)\n    convert_examples_to_features(\n        
examples=train_examples,\n        tokenizer=tokenizer,\n        max_seq_length=FLAGS.max_seq_length,\n        doc_stride=FLAGS.doc_stride,\n        max_query_length=FLAGS.max_query_length,\n        is_training=True,\n        output_fn=train_writer.process_feature)\n    train_writer.close()\n\n    tf.logging.info(\"***** Running training *****\")\n    tf.logging.info(\"  Num orig examples = %d\", len(train_examples))\n    tf.logging.info(\"  Num split examples = %d\", train_writer.num_features)\n    tf.logging.info(\"  Batch size = %d\", FLAGS.train_batch_size)\n    tf.logging.info(\"  Num steps = %d\", num_train_steps)\n    del train_examples\n\n    train_input_fn = input_fn_builder(\n        input_file=train_writer.filename,\n        seq_length=FLAGS.max_seq_length,\n        is_training=True,\n        drop_remainder=True)\n    estimator.train(input_fn=train_input_fn, max_steps=num_train_steps)\n\n  if FLAGS.do_predict:\n    eval_examples = read_squad_examples(\n        input_file=FLAGS.predict_file, is_training=False)\n\n    eval_writer = FeatureWriter(\n        filename=os.path.join(FLAGS.output_dir, \"eval.tf_record\"),\n        is_training=False)\n    eval_features = []\n\n    def append_feature(feature):\n      eval_features.append(feature)\n      eval_writer.process_feature(feature)\n\n    convert_examples_to_features(\n        examples=eval_examples,\n        tokenizer=tokenizer,\n        max_seq_length=FLAGS.max_seq_length,\n        doc_stride=FLAGS.doc_stride,\n        max_query_length=FLAGS.max_query_length,\n        is_training=False,\n        output_fn=append_feature)\n    eval_writer.close()\n\n    tf.logging.info(\"***** Running predictions *****\")\n    tf.logging.info(\"  Num orig examples = %d\", len(eval_examples))\n    tf.logging.info(\"  Num split examples = %d\", len(eval_features))\n    tf.logging.info(\"  Batch size = %d\", FLAGS.predict_batch_size)\n\n    all_results = []\n\n    predict_input_fn = input_fn_builder(\n        
input_file=eval_writer.filename,\n        seq_length=FLAGS.max_seq_length,\n        is_training=False,\n        drop_remainder=False)\n\n    # If running eval on the TPU, you will need to specify the number of\n    # steps.\n    all_results = []\n    for result in estimator.predict(\n        predict_input_fn, yield_single_examples=True):\n      if len(all_results) % 1000 == 0:\n        tf.logging.info(\"Processing example: %d\" % (len(all_results)))\n      unique_id = int(result[\"unique_ids\"])\n      start_logits = [float(x) for x in result[\"start_logits\"].flat]\n      end_logits = [float(x) for x in result[\"end_logits\"].flat]\n      all_results.append(\n          RawResult(\n              unique_id=unique_id,\n              start_logits=start_logits,\n              end_logits=end_logits))\n\n    output_prediction_file = os.path.join(FLAGS.output_dir, \"predictions.json\")\n    output_nbest_file = os.path.join(FLAGS.output_dir, \"nbest_predictions.json\")\n    output_null_log_odds_file = os.path.join(FLAGS.output_dir, \"null_odds.json\")\n\n    write_predictions(eval_examples, eval_features, all_results,\n                      FLAGS.n_best_size, FLAGS.max_answer_length,\n                      FLAGS.do_lower_case, output_prediction_file,\n                      output_nbest_file, output_null_log_odds_file)\n\n\nif __name__ == \"__main__\":\n  flags.mark_flag_as_required(\"vocab_file\")\n  flags.mark_flag_as_required(\"bert_config_file\")\n  flags.mark_flag_as_required(\"output_dir\")\n  tf.app.run()\n"
  },
  {
    "path": "run/transform_spanbert_pytorch_to_tf.py",
    "content": "#!/usr/bin/env python3 \n# -*- coding: utf-8 -*- \n\n\n\n# author: xiaoy li \n# description:\n# transform pytorch .bin models to tensorflow ckpt \n\n\nimport os\nimport sys  \nimport shutil \nimport torch\nimport argparse \nimport random \nimport numpy as np \nimport tensorflow as tf  \n\nREPO_PATH = \"/\".join(os.path.realpath(__file__).split(\"/\")[:-2])\n\nif REPO_PATH not in sys.path:\n    sys.path.insert(0, REPO_PATH)\n\n\nfrom bert import modeling \nfrom utils.load_pytorch_to_tf import load_from_pytorch_checkpoint \n\n\ndef load_models(bert_config_path, ):\n    bert_config = modeling.BertConfig.from_json_file(bert_config_path)\n    input_ids = tf.ones((8, 128), tf.int32)\n\n    model = modeling.BertModel(\n        config=bert_config,\n        is_training=False, \n        input_ids=input_ids,\n        use_one_hot_embeddings=False, \n        scope=\"bert\")\n\n    return model, bert_config \n\n\ndef copy_checkpoint(source, target):\n  for ext in (\".index\", \".data-00000-of-00001\"):\n    shutil.copyfile(source + ext, target + ext)\n\n\ndef main(bert_config_path, bert_ckpt_path, pytorch_init_checkpoint, output_tf_dir):\n\n    with tf.Session() as session:\n        model, bert_config = load_models(bert_config_path)\n        tvars = tf.trainable_variables()\n        assignment_map, initialized_variable_names = modeling.get_assignment_map_from_checkpoint(tvars, bert_ckpt_path)\n        session.run(tf.global_variables_initializer())\n        init_from_checkpoint = load_from_pytorch_checkpoint\n        init_from_checkpoint(pytorch_init_checkpoint, assignment_map)\n\n        for var in tvars:\n            init_string = \"\"\n            if var.name in initialized_variable_names:\n                init_string = \", *INIT_FROM_CKPT*\"\n                print(\"name = %s, shape = %s%s\" % (var.name, var.shape, init_string))\n        \n        saver = tf.train.Saver()\n        saver.save(session, os.path.join(output_tf_dir, \"model\"), global_step=100)\n   
     copy_checkpoint(os.path.join(output_tf_dir, \"model-{}\".format(str(100))), os.path.join(output_tf_dir, \"bert_model.ckpt\"))\n        print(\"=*=\"*30)\n        print(\"save models : {}\".format(output_tf_dir))\n        print(\"=*=\"*30)\n\n\ndef parse_args():\n    parser = argparse.ArgumentParser()\n    parser.add_argument(\"--spanbert_config_path\", default=\"/home/lixiaoya/spanbert_base_cased/config.json\", type=str)\n    parser.add_argument(\"--bert_tf_ckpt_path\", default=\"/home/lixiaoya/cased_L-12_H-768_A-12/bert_model.ckpt\", type=str)\n    parser.add_argument(\"--spanbert_pytorch_bin_path\", default=\"/home/lixiaoya/spanbert_base_cased/pytorch_model.bin\", type=str)\n    parser.add_argument(\"--output_spanbert_tf_dir\", default=\"/home/lixiaoya/tf_spanbert_base_case\", type=str)\n    parser.add_argument(\"--seed\", default=2333, type=int)\n\n\n    args = parser.parse_args()\n\n    random.seed(args.seed)\n    np.random.seed(args.seed)\n    torch.manual_seed(args.seed)\n    tf.set_random_seed(args.seed)\n    torch.cuda.manual_seed_all(args.seed)\n\n    os.makedirs(args.output_spanbert_tf_dir, exist_ok=True)\n\n    try:\n        # copy the model config into the output dir alongside the TF checkpoint\n        shutil.copy(args.spanbert_config_path, args.output_spanbert_tf_dir)\n    except OSError:\n        print(\"#=#\"*30)\n        print(\"failed to copy spanbert_config from {} to {}\".format(args.spanbert_config_path, args.output_spanbert_tf_dir))\n\n    return args\n\n\nif __name__ == \"__main__\":\n\n    args_config = parse_args()\n\n    main(args_config.spanbert_config_path, args_config.bert_tf_ckpt_path, args_config.spanbert_pytorch_bin_path, args_config.output_spanbert_tf_dir)\n\n    # \n    # Please refer to scripts/data/transform_ckpt_pytorch_to_tf.sh \n    # \n\n    # for spanbert large \n    # \n    # python3 transform_spanbert_pytorch_to_tf.py \\\n    # --spanbert_config_path /xiaoya/pretrain_ckpt/spanbert_large_cased/config.json \\\n    # --bert_tf_ckpt_path /xiaoya/pretrain_ckpt/cased_L-24_H-1024_A-16/bert_model.ckpt \\\n    # 
--spanbert_pytorch_bin_path /xiaoya/pretrain_ckpt/spanbert_large_cased/pytorch_model.bin \\\n    # --output_spanbert_tf_dir /xiaoya/pretrain_ckpt/tf_spanbert_large_cased\n\n\n    # for spanbert base \n    # \n    # python3 transform_spanbert_pytorch_to_tf.py \\\n    # --spanbert_config_path /xiaoya/pretrain_ckpt/spanbert_base_cased/config.json \\\n    # --bert_tf_ckpt_path /xiaoya/pretrain_ckpt/cased_L-12_H-768_A-12/bert_model.ckpt \\\n    # --spanbert_pytorch_bin_path /xiaoya/pretrain_ckpt/spanbert_base_cased/pytorch_model.bin \\\n    # --output_spanbert_tf_dir /xiaoya/pretrain_ckpt/tf_spanbert_base_cased\n\n"
  },
  {
    "path": "scripts/data/download_pretrained_mlm.sh",
    "content": "#!/usr/bin/env bash \n# -*- coding: utf-8 -*- \n\n\n\n# Author: xiaoy li \n# description:\n# download pretrained model ckpt \n\n\n\nBERT_PRETRAIN_CKPT=$1\nMODEL_NAME=$2\n\n\nif [[ $MODEL_NAME == \"bert_base\" ]]; then\n    mkdir -p $BERT_PRETRAIN_CKPT\n    echo \"DownLoad BERT Cased Base\"\n    wget https://storage.googleapis.com/bert_models/2018_10_18/cased_L-12_H-768_A-12.zip -P $BERT_PRETRAIN_CKPT\n    unzip $BERT_PRETRAIN_CKPT/cased_L-12_H-768_A-12.zip -d $BERT_PRETRAIN_CKPT\n    rm $BERT_PRETRAIN_CKPT/cased_L-12_H-768_A-12.zip\nelif [[ $MODEL_NAME == \"bert_large\" ]]; then\n    echo \"DownLoad BERT Cased Large\"\n    wget https://storage.googleapis.com/bert_models/2018_10_18/cased_L-24_H-1024_A-16.zip -P $BERT_PRETRAIN_CKPT\n    unzip $BERT_PRETRAIN_CKPT/cased_L-24_H-1024_A-16.zip -d $BERT_PRETRAIN_CKPT\n    rm $BERT_PRETRAIN_CKPT/cased_L-24_H-1024_A-16.zip\nelif [[ $MODEL_NAME == \"spanbert_base\" ]]; then\n    echo \"DownLoad Span-BERT Cased Base\"\n    wget https://dl.fbaipublicfiles.com/fairseq/models/spanbert_hf_base.tar.gz -P $BERT_PRETRAIN_CKPT \n    tar -zxvf $BERT_PRETRAIN_CKPT/spanbert_hf_base.tar.gz -C $BERT_PRETRAIN_CKPT\n    rm $BERT_PRETRAIN_CKPT/spanbert_hf_base.tar.gz\nelif [[ $MODEL_NAME == \"spanbert_large\" ]]; then\n    echo \"DownLoad Span-BERT Cased Large\"\n    wget https://dl.fbaipublicfiles.com/fairseq/models/spanbert_hf.tar.gz -P $BERT_PRETRAIN_CKPT\n    tar -zxvf $BERT_PRETRAIN_CKPT/spanbert_hf.tar.gz -C $BERT_PRETRAIN_CKPT\n    rm $BERT_PRETRAIN_CKPT/spanbert_hf.tar.gz\nelif [[ $MODEL_NAME == \"bert_tiny\" ]]; then\n    each \"DownLoad BERT Uncased Tiny; Helps to debug on GPU.\"\n    wget https://storage.googleapis.com/bert_models/2020_02_20/uncased_L-2_H-128_A-2.zip -P $BERT_PRETRAIN_CKPT \n    tar -zxvf $BERT_PRETRAIN_CKPT/uncased_L-2_H-128_A-2.zip -C $BERT_PRETRAIN_CKPT\n    rm $BERT_PRETRAIN_CKPT/uncased_L-2_H-128_A-2.zip\nelse\n    echo 'Unknown argment 2 (Model Sign)'\nfi "
  },
  {
    "path": "scripts/data/generate_tfrecord_dataset.sh",
    "content": "#!/usr/bin/env bash \n# -*- coding: utf-8 -*-\n\n\n\n# author: xiaoy li\n# description:\n# generate train/dev/test tfrecord files for training the model. \n# example:\n# bash generate_tfrecord_dataset.sh /path-to-conll-coreference-resolution-dataset /path-to-save-tfrecord-for-training /cased_L-12_H-768_A-12/vocab.txt\n\n\n\nREPO_PATH=/home/lixiaoya/coref-tf\nexport PYTHONPATH=$REPO_PATH\n\nsource_dir=$1\ntarget_dir=$2\nvocab_file=$3\n\nmkdir -p ${target_dir}\n\n\npython3 ${REPO_PATH}/run/build_dataset_to_tfrecord.py \\\n--source_files_dir $source_dir \\\n--target_output_dir $target_dir \\\n--num_window 2 \\\n--window_size 64 \\\n--max_num_mention 50 \\\n--max_num_cluster 40 \\\n--vocab_file $vocab_file \\\n--language english \\\n--max_doc_length 600 "
  },
  {
    "path": "scripts/data/preprocess_ontonotes_annfiles.sh",
    "content": "#!/usr/bin/env bash \n\n\n# author: xiaoy li \n# description:\n# generate annotated CONLL-2012 coreference resolution datasets from the official released OntoNotes 5.0 dataset.  \n# \n######################################\n# NOTICE:\n######################################\n# the scripts only work with python 2.\n# if you want to run with python 3, please refer to https://github.com/huggingface/neuralcoref/blob/master/neuralcoref/train/training.md#get-the-data\n# Thanks to their amazing job ! \n# \n# Reference: \n# https://github.com/huggingface/neuralcoref/blob/master/neuralcoref/train/training.md#get-the-data\n# https://github.com/mandarjoshi90/coref\n# \n\n\npath_to_ontonotes5.0_directory=$1\npath_to_save_processed_data_directory=$2\nlanguage=$3\n\n\ndlx() {\n  wget -P $path_to_save_processed_data_directory $1/$2\n  tar -xvzf $path_to_save_processed_data_directory/$2 -C $path_to_save_processed_data_directory\n  rm $path_to_save_processed_data_directory/$2\n}\n\n\nconll_url=http://conll.cemantix.org/2012/download\ndlx $conll_url conll-2012-train.v4.tar.gz\ndlx $conll_url conll-2012-development.v4.tar.gz\ndlx $conll_url/test conll-2012-test-key.tar.gz\ndlx $conll_url/test conll-2012-test-official.v9.tar.gz\n\ndlx $conll_url conll-2012-scripts.v3.tar.gz\ndlx http://conll.cemantix.org/download reference-coreference-scorers.v8.01.tar.gz\n\nbash $path_to_save_processed_data_directory/conll-2012/v3/scripts/skeleton2conll.sh -D $path_to_ontonotes5.0_directory/data/files/data $path_to_save_processed_data_directory/conll-2012\n\nfunction compile_partition() {\n    rm -f $2.$5.$3$4\n    cat $path_to_save_processed_data_directory/conll-2012/$3/data/$1/data/$5/annotations/*/*/*/*.$3$4 >> $path_to_save_processed_data_directory/$2.$5.$3$4\n}\n\nfunction compile_language() {\n    compile_partition development dev v4 _gold_conll $1\n    compile_partition train train v4 _gold_conll $1\n    compile_partition test test v4 _gold_conll $1\n}\n\ncompile_language 
$language\n\n\n"
  },
  {
    "path": "scripts/data/transform_ckpt_pytorch_to_tf.sh",
    "content": "#!/usr/bin/env bash \n# -*- coding: utf-8 -*-\n\n\n\n# author: xiaoy li\n# description:\n# transform trained spanbert language model from pytorch(.bin) to tensorflow(.ckpt). \n# PLEASE NOTICE: the same scale(Base/Large) BERT(TF) Models are also necessary. \n\n\n\nREPO_PATH=/home/lixiaoya/coref-tf\nexport PYTHONPATH=${REPO_PATH}\n\n\nMODEL_NAME=$1\nPATH_TO_SPANBERT_PYTORCH_DIR=$2\nPATH_TO_SAME_SCALE_BERT_TF_DIR=$3\nPATH_TO_SAVE_SPANBERT_TF_DIR=$4\n\n\nif [[ $MODEL_NAME == \"spanbert_base\" ]]; then\n    # spanbert large \n    echo \"Transform SpanBERT Cased Base from Pytorch To TF\"\n    python3 ${REPO_PATH}/run/transform_spanbert_pytorch_to_tf.py \\\n        --spanbert_config_path $PATH_TO_SPANBERT_PYTORCH_DIR/config.json \\\n        --bert_tf_ckpt_path $PATH_TO_SAME_SCALE_BERT_TF_DIR/bert_model.ckpt \\\n        --spanbert_pytorch_bin_path $PATH_TO_SPANBERT_PYTORCH_DIR/pytorch_model.bin \\\n        --output_spanbert_tf_dir $PATH_TO_SAVE_SPANBERT_TF_DIR\nelif [[ $MODEL_NAME == \"spanbert_large\" ]]; then\n    # spanbert base \n    echo \"Transform SpanBERT Cased Large from Pytorch To TF\"\n    python3 ${REPO_PATH}/run/transform_spanbert_pytorch_to_tf.py \\\n    --spanbert_config_path $PATH_TO_SPANBERT_PYTORCH_DIR/config.json \\\n    --bert_tf_ckpt_path $PATH_TO_SAME_SCALE_BERT_TF_DIR/bert_model.ckpt \\\n    --spanbert_pytorch_bin_path $PATH_TO_SPANBERT_PYTORCH_DIR/pytorch_model.bin \\\n    --output_spanbert_tf_dir $PATH_TO_SAVE_SPANBERT_TF_DIR\nelse\n    echo 'Unknown argment 1 (Model Sign)'\nfi "
  },
  {
    "path": "scripts/models/corefqa_gpu.sh",
    "content": "#!/usr/bin/env bash \n# -*- coding: utf-8 -*- \n\n\n\n# author: xiaoy li \n# description:\n# train and evaluate the middle checkpoints on dev and test sets. \n\n\n\nREPO_PATH=/home/lixiaoya/xiaoy_tf\nexport PYTHONPATH=\"$PYTHONPATH:$REPO_PATH\"\n\noutput_dir=/xiaoya/mention_proposal_output\nbert_dir=/xiaoya/pretrain_ckpt/uncased_L-2_H-128_A-2\ndata_dir=/xiaoya/corefqa_data/final_overlap_64_2\n\n\nrm -rf ${output_dir}\nmkdir -p ${output_dir}\n\n\n\nCUDA_VISIBLE_DEVICES=3 python3 ${REPO_PATH}/run/run_corefqa.py \\\n--output_dir=${output_dir} \\\n--bert_config_file=${bert_dir}/bert_config_nodropout.json \\\n--init_checkpoint=${bert_dir}/bert_model.ckpt \\\n--vocab_file=${bert_dir}/vocab.txt \\\n--logfile_path=${output_dir}/train.log \\\n--num_epochs=20 \\\n--keep_checkpoint_max=50 \\\n--save_checkpoints_steps=500 \\\n--train_file=${data_dir}/train.64.english.tfrecord \\\n--dev_file=${data_dir}/dev.64.english.tfrecord \\\n--test_file=${data_dir}/test.64.english.tfrecord \\\n--do_train=True \\\n--do_eval=False \\\n--do_predict=False \\\n--learning_rate=1e-5 \\\n--dropout_rate=0.0 \\\n--mention_threshold=0.5 \\\n--hidden_size=128 \\\n--num_docs=5604 \\\n--window_size=64 \\\n--num_window=2 \\\n--max_num_mention=20 \\\n--start_end_share=False \\\n--max_span_width=20 \\\n--max_candidate_mentions=50 \\\n--top_span_ratio=0.2 \\\n--max_top_antecedents=30 \\\n--max_query_len=150 \\\n--max_context_len=150 \\\n--sec_qa_mention_score=False \\\n--use_tpu=False \\\n--seed=2333\n\n\n\n"
  },
  {
    "path": "scripts/models/corefqa_tpu.sh",
    "content": "#!/usr/bin/env bash \n# -*- coding: utf-8 -*- \n\n\n\n# author: xiaoy li \n# description:\n# clean code and add comments \n\n\n\nREPO_PATH=/home/xiaoyli1110/xiaoya/Coref-tf\nexport PYTHONPATH=\"$PYTHONPATH:$REPO_PATH\"\nexport TPU_NAME=tensorflow-tpu\nexport TPU_ZONE=europe-west4-a\nexport GCP_PROJECT=xiaoyli-20-04-274510\n\noutput_dir=gs://europe_mention_proposal/output_bertlarge\nbert_dir=gs://europe_pretrain_mlm/uncased_L-2_H-128_A-2\ndata_dir=gs://europe_corefqa_data/final_overlap_64_2\n\n\n\npython3 ${REPO_PATH}/run/run_corefqa.py \\\n--output_dir=${output_dir} \\\n--bert_config_file=${bert_dir}/bert_config_nodropout.json \\\n--init_checkpoint=${bert_dir}/bert_model.ckpt \\\n--vocab_file=${bert_dir}/vocab.txt \\\n--logfile_path=${output_dir}/train.log \\\n--num_epochs=20 \\\n--keep_checkpoint_max=50 \\\n--save_checkpoints_steps=500 \\\n--train_file=${data_dir}/train.64.english.tfrecord \\\n--dev_file=${data_dir}/dev.64.english.tfrecord \\\n--test_file=${data_dir}/test.64.english.tfrecord \\\n--do_train=True \\\n--do_eval=False \\\n--do_predict=False \\\n--learning_rate=1e-5 \\\n--dropout_rate=0.0 \\\n--mention_threshold=0.5 \\\n--hidden_size=128 \\\n--num_docs=5604 \\\n--window_size=64 \\\n--num_window=2 \\\n--max_num_mention=20 \\\n--start_end_share=False \\\n--max_span_width=20 \\\n--max_candidate_mentions=50 \\\n--top_span_ratio=0.2 \\\n--max_top_antecedents=30 \\\n--max_query_len=150 \\\n--max_context_len=150 \\\n--sec_qa_mention_score=False \\\n--use_tpu=True \\\n--tpu_name=$TPU_NAME \\\n--tpu_zone=$TPU_ZONE \\\n--gcp_project=$GCP_PROJECT \\\n--num_tpu_cores=1 \\\n--seed=2333\n"
  },
  {
    "path": "scripts/models/mention_gpu.sh",
    "content": "#!/usr/bin/env bash \n# -*- coding: utf-8 -*- \n\n\n\n# author: xiaoy li \n# description:\n# clean code and add comments \n\n\n\nREPO_PATH=/home/lixiaoya/xiaoy_tf\nexport PYTHONPATH=\"$PYTHONPATH:$REPO_PATH\"\n\noutput_dir=/xiaoya/mention_proposal_output\nbert_dir=/xiaoya/pretrain_ckpt/uncased_L-2_H-128_A-2\ndata_dir=/xiaoya/corefqa_data/final_overlap_64_2\n\n\nrm -rf ${output_dir}\nmkdir -p ${output_dir}\n\n\n\nCUDA_VISIBLE_DEVICES=3 python3 ${REPO_PATH}/run/run_mention_proposal.py \\\n--output_dir=${output_dir} \\\n--bert_config_file=${bert_dir}/bert_config_nodropout.json \\\n--init_checkpoint=${bert_dir}/bert_model.ckpt \\\n--vocab_file=${bert_dir}/vocab.txt \\\n--logfile_path=${output_dir}/train.log \\\n--num_epochs=20 \\\n--keep_checkpoint_max=50 \\\n--save_checkpoints_steps=500 \\\n--train_file=${data_dir}/train.64.english.tfrecord \\\n--dev_file=${data_dir}/dev.64.english.tfrecord \\\n--test_file=${data_dir}/test.64.english.tfrecord \\\n--do_train=True \\\n--do_eval=False \\\n--do_predict=False \\\n--learning_rate=1e-5 \\\n--dropout_rate=0.0 \\\n--mention_threshold=0.5 \\\n--hidden_size=128 \\\n--num_docs=5604 \\\n--window_size=64 \\\n--num_window=2 \\\n--max_num_mention=20 \\\n--start_end_share=False \\\n--loss_start_ratio=0.3 \\\n--loss_end_ratio=0.3 \\\n--loss_span_ratio=0.3 \\\n--use_tpu=False \\\n--seed=2333\n"
  },
  {
    "path": "scripts/models/mention_tpu.sh",
    "content": "#!/usr/bin/env bash \n# -*- coding: utf-8 -*- \n\n\n\n# author: xiaoy li \n# description:\n# clean code and add comments \n\n\n\nREPO_PATH=/home/xiaoyli1110/xiaoya/Coref-tf\nexport PYTHONPATH=\"$PYTHONPATH:$REPO_PATH\"\nexport TPU_NAME=tensorflow-tpu\nexport TPU_ZONE=europe-west4-a\nexport GCP_PROJECT=xiaoyli-20-04-274510\n\noutput_dir=gs://europe_mention_proposal/output_bertlarge\nbert_dir=gs://europe_pretrain_mlm/uncased_L-2_H-128_A-2\ndata_dir=gs://europe_corefqa_data/final_overlap_64_2\n\n\n\npython3 ${REPO_PATH}/run/run_mention_proposal.py \\\n--output_dir=${output_dir} \\\n--bert_config_file=${bert_dir}/bert_config_nodropout.json \\\n--init_checkpoint=${bert_dir}/bert_model.ckpt \\\n--vocab_file=${bert_dir}/vocab.txt \\\n--logfile_path=${output_dir}/train.log \\\n--num_epochs=20 \\\n--keep_checkpoint_max=50 \\\n--save_checkpoints_steps=500 \\\n--train_file=${data_dir}/train.64.english.tfrecord \\\n--dev_file=${data_dir}/dev.64.english.tfrecord \\\n--test_file=${data_dir}/test.64.english.tfrecord \\\n--do_train=True \\\n--do_eval=False \\\n--do_predict=False \\\n--learning_rate=1e-5 \\\n--dropout_rate=0.0 \\\n--mention_threshold=0.5 \\\n--hidden_size=128 \\\n--num_docs=5604 \\\n--window_size=64 \\\n--num_window=2 \\\n--max_num_mention=20 \\\n--start_end_share=False \\\n--loss_start_ratio=0.3 \\\n--loss_end_ratio=0.3 \\\n--loss_span_ratio=0.3 \\\n--use_tpu=True \\\n--tpu_name=$TPU_NAME \\\n--tpu_zone=$TPU_ZONE \\\n--gcp_project=$GCP_PROJECT \\\n--num_tpu_cores=1 \\\n--seed=2333\n"
  },
  {
    "path": "scripts/models/quoref_tpu.sh",
    "content": "#!/usr/bin/env bash \n# -*- coding: utf-8 -*- \n\n\n\n# author: xiaoy li \n# description:\n# finetune the spanbert model on squad 2.0 for data augment.  \n\n\n\nREPO_PATH=/home/shannon/coref-tf\nexport TPU_NAME=tf-tpu\nexport PYTHONPATH=\"$PYTHONPATH:$REPO_PATH\"\nQUOREF_DIR=gs://qa_tasks/quoref\nBERT_DIR=gs://corefqa_output_squad/panbert_large_squad2_2e-5\nOUTPUT_DIR=gs://corefqa_output_quoref/spanbert_large_squad2_best_quoref_3e-5 \n\n\npython3 ${REPO_PATH}/run_quoref.py \\\n--vocab_file=$BERT_DIR/vocab.txt \\\n--bert_config_file=$BERT_DIR/bert_config.json \\\n--init_checkpoint=$BERT_DIR/best_bert_model.ckpt \\\n--do_train=True \\\n--train_file=$QUOREF_DIR/quoref-train-v0.1.json \\\n--do_predict=True \\\n--predict_file=$QUOREF_DIR/quoref-dev-v0.1.json \\\n--train_batch_size=8 \\\n--learning_rate=3e-5 \\\n--num_train_epochs=5 \\\n--max_seq_length=384 \\\n--do_lower_case=False \\\n--doc_stride=128 \\\n--output_dir=${OUTPUT_DIR} \\\n--use_tpu=True \\\n--tpu_name=$TPU_NAME"
  },
  {
    "path": "scripts/models/squad_tpu.sh",
    "content": "#!/usr/bin/env bash \n# -*- coding: utf-8 -*- \n\n\n\n# author: xiaoy li \n# description:\n# finetune the spanbert model on squad 2.0 for data augment.  \n\n\n\nREPO_PATH=/home/shannon/coref-tf\nexport TPU_NAME=tf-tpu\nexport PYTHONPATH=\"$PYTHONPATH:$REPO_PATH\"\nSQUAD_DIR=gs://qa_tasks/squad2\nBERT_DIR=gs://pretrained_mlm_checkpoint/spanbert_large_tf\nOUTPUT_DIR=gs://corefqa_output_squad/spanbert_large_squad2_2e-5  \n\n\npython3 ${REPO_PATH}/run/run_squad.py \\\n--vocab_file=$BERT_DIR/vocab.txt \\\n--bert_config_file=$BERT_DIR/bert_config.json \\\n--init_checkpoint=$BERT_DIR/bert_model.ckpt \\\n--do_train=True \\\n--train_file=$SQUAD_DIR/train-v2.0.json \\\n--do_predict=True \\\n--predict_file=$SQUAD_DIR/dev-v2.0.json \\\n--train_batch_size=8 \\\n--learning_rate=2e-5 \\\n--num_train_epochs=4.0 \\\n--max_seq_length=384 \\\n--do_lower_case=False \\\n--doc_stride=128 \\\n--output_dir=${OUTPUT_DIR} \\\n--use_tpu=True \\\n--tpu_name=$TPU_NAME \\\n--version_2_with_negative=True"
  },
  {
    "path": "tests/bitwise_and.py",
    "content": "#!/usr/bin/env python3 \n# -*- coding: utf-8 -*- \n\n\n\n# author: xiaoy li \n# description:\n# \n\n\nimport tensorflow as tf \n\n\nif __name__ == \"__main__\":\n    sess = tf.compat.v1.InteractiveSession()\n    lhs = tf.constant([0, 5, 3, 14], dtype=tf.int32)\n    rhs = tf.constant([5, 0, 7, 11], dtype=tf.int32)\n\n    res = tf.bitwise.bitwise_and(lhs, rhs)\n    res.eval()\n    # array([ 0, 0, 3, 10], dtype=int32)\n    sess.close()\n\n\n"
  },
  {
    "path": "tests/construct_label.py",
    "content": "#!/usr/bin/env python3\n# -*- coding: utf-8 -*- \n\n\n\n# desc:\n# construct labels \n\n\nimport tensorflow as tf \n\n\n\nif __name__ == \"__main__\":\n    sess = tf.compat.v1.InteractiveSession()\n    gold_starts = tf.constant([1, 2, 3, 4])\n    gold_ends = tf.constant([2, 3, 4, 5])\n    num_word = 10 \n    gold_mention_sparse_label = tf.stack([gold_starts, gold_ends], axis=1)\n    gold_mention_sparse_label.eval()\n    gold_span_value = tf.reshape(tf.ones_like(gold_starts, tf.int32), [-1])\n    gold_span_shape = tf.constant([num_word, num_word])\n    gold_span_label = tf.cast(tf.scatter_nd(gold_mention_sparse_label, gold_span_value, gold_span_shape), tf.int32)\n    gold_span_label.eval()\n\n    candidate_start = tf.constant([1, 4, 5])\n    candidate_end = tf.constant([2, 5, 5])\n    candidate_span = tf.stack([candidate_start, candidate_end], axis=1)\n\n    gold_span_label = tf.gather_nd(gold_span_label, tf.expand_dims(candidate_span, 1))\n    gold_span_label.eval()\n"
  },
  {
    "path": "tests/cumsum.py",
    "content": "#!/usr/bin/env python3 \n# -*- coding: utf-8 -*- \n\n\"\"\"\ninput_a:\n    array([[0, 1, 0, 1, 1],\n       [0, 1, 1, 1, 1],\n       [0, 1, 1, 0, 1],\n       [1, 1, 1, 1, 1],\n       [0, 1, 1, 1, 0]], dtype=int32)\ncum_input_b:\n    array([[1, 2, 3, 4, 5],\n       [1, 2, 3, 4, 5],\n       [1, 2, 3, 4, 5],\n       [1, 2, 3, 4, 5],\n       [1, 2, 3, 4, 5]], dtype=int32)\ninput_c:\n    array([[0, 2, 0, 4, 5],\n       [0, 2, 3, 4, 5],\n       [0, 2, 3, 0, 5],\n       [1, 2, 3, 4, 5],\n       [0, 2, 3, 4, 0]], dtype=int32)\ninput_c:\n    array([[  1,   2,   3,   4,   5],\n       [129, 130, 131, 132, 133],\n       [257, 258, 259, 260, 261],\n       [385, 386, 387, 388, 389],\n       [513, 514, 515, 516, 517]], dtype=int32)\ninput_d:\n    array([[  0,   2,   0,   4,   5],\n       [  0, 130, 131, 132, 133],\n       [  0, 258, 259,   0, 261],\n       [385, 386, 387, 388, 389],\n       [  0, 514, 515, 516,   0]], dtype=int32)\nflat_input_d:\n    array([  0,   2,   0,   4,   5,   0, 130, 131, 132, 133,   0, 258, 259,\n        0, 261, 385, 386, 387, 388, 389,   0, 514, 515, 516,   0], dtype=int32)\nboolean_mask:\n    array([False,  True, False,  True,  True, False,  True,  True,  True,\n        True, False,  True,  True, False,  True,  True,  True,  True,\n        True,  True, False,  True,  True,  True, False])\ninput_f:\n    array([  2,   4,   5, 130, 131, 132, 133, 258, 259, 261, 385, 386, 387,\n       388, 389, 514, 515, 516], dtype=int32)\n\"\"\"\n\n\n\n\nimport tensorflow as tf \n\n\nif __name__ == \"__main__\":\n    sess = tf.compat.v1.InteractiveSession()\n    input_a = tf.constant([\n        [0, 1, 0, 1, 1], [0, 1, 1, 1, 1], [0, 1, 1, 0, 1], [1, 1, 1, 1, 1], [0, 1, 1, 1, 0]])\n    ones_input_b = tf.ones_like(input_a, tf.int32)\n    cum_input_b = tf.math.cumsum(ones_input_b, axis=1)\n    cum_input_b.eval()\n    # input_c = tf.math.multiply(cum_input_b, input_a)\n    # input_c.eval()\n    seq_len = 128\n    offset = tf.tile(tf.expand_dims(tf.range(5) * 
128, 1), [1, 5])\n    offset.eval()\n    input_e = offset + cum_input_b \n    input_e.eval()\n\n    input_d = tf.math.multiply(input_e, input_a)\n    input_d.eval()\n    flat_input_d = tf.reshape(input_d, [-1])\n    flat_input_d.eval()\n\n    boolean_mask = tf.math.greater(flat_input_d, tf.zeros_like(flat_input_d, tf.int32))\n    boolean_mask.eval()\n\n    input_f = tf.boolean_mask(flat_input_d, boolean_mask)\n    input_f.eval()\n\n    sess.close()\n\n\n"
  },
  {
    "path": "tests/gather.py",
    "content": "#!/usr/bin/env python3 \n# -*- coding: utf-8 -*- \n\n\n\n\n# author: xiaoy li \n\n\nimport tensorflow as tf \n\n\n\nif __name__ == \"__main__\":\n    sess = tf.compat.v1.InteractiveSession()\n    lhs = tf.zeros((4, 3))\n\n    slice_lhs = tf.gather(lhs, 1)\n    # slice_lhs_nd = tf.gather_nd(lhs, 1)\n\n    slice_lhs = tf.gather(lhs, [1, 2])\n\n    slice_lhs.eval()\n    # slice_lhs_nd.eval()\n    # array([ 0,  0,  3, 10], dtype=int32)\n    sess.close()"
  },
  {
    "path": "tests/model_fn.py",
    "content": "#!/usr/bin/env python3 \n# -*- coding: utf-8 -*-  \n\n\n\n# author: xiaoy li \n# description:\n# test config in model fn builder \n\n\n\n\ndef model_fn(config):\n\n    def mention_proposal_fn():\n        print(\"the number of document is : \")\n        print(config.document_number)\n        print(config.number_window_size)\n\n    return mention_proposal_fn \n\n\nclass Config:\n    number_window_size = 2 \n    document_number = 3 \n\n\n\n\nif __name__ == \"__main__\":\n    config = Config()\n    get_model_fn = model_fn(config)\n    get_model_fn()"
  },
  {
    "path": "tests/tile_repeat.py",
    "content": "#!/usr/bin/env python3 \n \nimport numpy as np \nimport tensorflow as tf \n\n\na_np = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9], [2, 3, 4], [5, 6, 7], [8, 9, 10]])\n\nprint(a_np.shape)\n# exit()\n\ndef shape(x, dim):\n    return x.get_shape()[dim].value or tf.shape(x)[dim]\n\n\nif __name__ == \"__main__\":\n\n    original_array = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9], [2, 3, 4], [5, 6, 7], [8, 9, 10]])\n    sess = tf.compat.v1.InteractiveSession()\n    start_scores = tf.convert_to_tensor(tf.constant([[1, 2, 3], [4, 5, 6], [7, 8, 9], [2, 3, 4], [5, 6, 7], [8, 9, 10]]))\n    print(tf.shape(start_scores))\n    # exit()\n    expand_scores = tf.tile(tf.expand_dims(start_scores, 2), [1, 1, 3])\n    # (6, 3, 3)\n    print(expand_scores.eval())\n    print(shape(expand_scores, 0))\n    print(shape(expand_scores, 1))\n    print(shape(expand_scores, 2))\n    print(\"=*=\"*20)\n    # tf.convert_to_tensor(data_np, np.float32)\n    # ndarray_scores = tf.make_ndarray(expand_scores)\n    # ndarray_scores = tf.convert_to_tensor(expand_scores, np.int32)\n    # print(ndarray_scores)\n    # exit()\n    ndarray_scores = np.array([[[1, 1, 1], [ 2 , 2 , 2], [ 3 , 3 , 3]],\n        [[ 4 , 4 , 4],[ 5 , 5 , 5],[ 6 , 6 , 6]],\n        [[ 7 , 7 , 7],[ 8 , 8 , 8],[ 9 , 9 , 9]],\n        [[ 2 , 2 , 2], [ 3 , 3 , 3], [ 4 , 4 , 4]],\n        [[ 5 , 5 , 5], [ 6 , 6 , 6], [ 7 , 7 , 7]],\n        [[ 8 , 8 , 8], [ 9 , 9 , 9], [10 , 10 ,10]]])\n    print(\"$=\"*20)\n    print(\"test_a is : {}\".format(str(ndarray_scores[2, 2, 2])))\n    print(\"test_b is : {}\".format(str(original_array[2, 2])))\n    print(\"^-\"*20)\n    print(\"test_a is : {}\".format(str(ndarray_scores[2, 1, 1])))\n    print(\"test_b is : {}\".format(str(original_array[2, 1])))\n    print(\"^-\"*20)\n    print(\"test_a is : {}\".format(str(ndarray_scores[2, 0, 0])))\n    print(\"test_b is : {}\".format(str(original_array[2, 0])))\n    sess.close() \n    # span_scores[k][i][j] = start_scores[k][i] + 
end_scores[k][j]\n    # start_scores[k][i][j] = start_scores[k][i]\n    # end_scores[k][i][j] = end_scores[k][j]\n\n    # [[1, 2, 3], [4, 5, 6], [7, 8, 9], [2, 3, 4], [5, 6, 7], [8, 9, 10]]\n\n    \"\"\"\n    [[[1, 1, 1], [ 2 , 2 , 2], [ 3 , 3 , 3]],\n    [[ 4 , 4 , 4],[ 5 , 5 , 5],[ 6 , 6 , 6]],\n    [[ 7 , 7 , 7],[ 8 , 8 , 8],[ 9 , 9 , 9]],\n    [[ 2 , 2 , 2], [ 3 , 3 , 3], [ 4 , 4 , 4]],\n    [[ 5 , 5 , 5], [ 6 , 6 , 6], [ 7 , 7 , 7]],\n    [[ 8 , 8 , 8], [ 9 , 9 , 9], [10 , 10 ,10]]]\n    \"\"\"\n\n"
  },
  {
    "path": "tests/tpu_operation.py",
    "content": "#!/usr/bin/env python3 \n# -*- coding: utf-8 -*- \n\n\n\n# author: xiaoy li \n# descripiton:\n# test math operations in tpu \n\n\nimport tensorflow as tf \nfrom tensorflow.contrib import tpu\nfrom tensorflow.contrib.cluster_resolver import TPUClusterResolver\n\n\n\nTPU_NAME = \"tensorflow-tpu\"\nTPU_ZONE = \"us-central1-f\"\nGCP_PROJECT = \"xiaoyli-20-04-274510\"\n\n\n\nif __name__ == \"__main__\":\n    tpu_cluster_resolver = tf.distribute.cluster_resolver.TPUClusterResolver(TPU_NAME, zone=TPU_ZONE, project=GCP_PROJECT)   \n    # tpu_cluster_resolver = TPUClusterResolver(tpu=['tensorflow-tpu']).get_master()\n    tf.config.experimental_connect_to_cluster(tpu_cluster_resolver)\n    tf.tpu.experimental.initialize_tpu_system(tpu_cluster_resolver)\n\n    scores = tf.constant([1.0, 2.3, 3.2, 4.3, 1.5, 1.8, 98, 2.9])\n    k = 2\n\n    def test_top_k():\n        top_scores, top_index = tf.nn.top_k(scores, k)\n        return top_scores, top_index \n\n    test_op = test_top_k\n\n    # with tf.compat.v1.InteractiveSession(tpu_cluster_resolver) as sess:\n    with tf.compat.v1.Session(tpu_cluster_resolver) as sess:\n        sess.run(tpu.initialize_system())\n\n        scores = tf.constant([1.0, 2.3, 3.2, 4.3, 1.5, 1.8, 98, 2.9])\n        k = 2\n        print(\"ALL Devices: \", tf.config.experimental_list_devices())\n\n        top_scores, top_index = tf.nn.top_k(scores, k) \n\n        print(top_scores.eval())\n        print(top_index.eval())\n\n        sess.run(tpu.shutdown_system())\n\n\n\n\n\n"
  },
  {
    "path": "utils/load_pytorch_to_tf.py",
    "content": "#!/usr/bin/env python3 \n# -*- coding: utf-8 -*- \n\n\n\n# author: xiaoy li \n# description:\n# transform pretrained model checkpoints from [pytorch *.bin] to [tensorflow *.ckpt]\n# Reference:\n# https://github.com/mandarjoshi90/coref/blob/master/pytorch_to_tf.py\n\n\nimport numpy as np\nimport torch\nimport tensorflow as tf\nfrom tensorflow.python.framework import ops\nfrom tensorflow.python.ops import variable_scope as vs\n\n\ntensors_to_transpose = (\n    \"dense/kernel\",\n    \"attention/self/query\",\n    \"attention/self/key\",\n    \"attention/self/value\"\n)\n\nvar_map = (\n    ('layer.', 'layer_'),\n    ('word_embeddings.weight', 'word_embeddings'),\n    ('position_embeddings.weight', 'position_embeddings'),\n    ('token_type_embeddings.weight', 'token_type_embeddings'),\n    ('.', '/'),\n    ('LayerNorm/weight', 'LayerNorm/gamma'),\n    ('LayerNorm/bias', 'LayerNorm/beta'),\n    ('weight', 'kernel')\n)\n\n\ndef to_tf_var_name(name: str):\n    for patt, repl in iter(var_map):\n        name = name.replace(patt, repl)\n    return '{}'.format(name)\n\n\ndef my_convert_keys(model):\n    converted = {}\n    for k_pt, v in model.items():\n        k_tf = to_tf_var_name(k_pt)\n        converted[k_tf] = v\n    return converted\n\n\ndef load_from_pytorch_checkpoint(checkpoint, assignment_map):\n    pytorch_model = torch.load(checkpoint, map_location='cpu')\n    pt_model_with_tf_keys = my_convert_keys(pytorch_model)\n    for _, name in assignment_map.items():\n        store_vars = vs._get_default_variable_store()._vars\n        var = store_vars.get(name, None)\n        assert var is not None\n        if name not in pt_model_with_tf_keys:\n            print('WARNING:', name, 'not found in original model.')\n            continue\n        array = pt_model_with_tf_keys[name].cpu().numpy()\n        if any([x in name for x in tensors_to_transpose]):\n            array = array.transpose()\n        assert tuple(var.get_shape().as_list()) == 
tuple(array.shape)\n        init_value = ops.convert_to_tensor(array, dtype=np.float32)\n        var._initial_value = init_value\n        var._initializer_op = var.assign(init_value)\n\n\ndef print_vars(pytorch_ckpt, tf_ckpt):\n    tf_vars = tf.train.list_variables(tf_ckpt)\n    tf_vars = {k: v for (k, v) in tf_vars}\n    pytorch_model = torch.load(pytorch_ckpt)\n    pt_model_with_tf_keys = my_convert_keys(pytorch_model)\n    only_pytorch, only_tf, common = [], [], []\n    tf_only = set(tf_vars.keys())\n    for k, v in pt_model_with_tf_keys.items():\n        if k in tf_vars:\n            common.append(k)\n            tf_only.remove(k)\n        else:\n            only_pytorch.append(k)\n    print('-------------------')\n    print('Common', len(common))\n    for k in common:\n        array = pt_model_with_tf_keys[k].cpu().numpy()\n        if any([x in k for x in tensors_to_transpose]):\n            array = array.transpose()\n        tf_shape = tuple(tf_vars[k])\n        pt_shape = tuple(array.shape)\n        if tf_shape != pt_shape:\n            print(k, tf_shape, pt_shape)\n    print('-------------------')\n    print('Pytorch only', len(only_pytorch))\n    for k in only_pytorch:\n        print(k, pt_model_with_tf_keys[k].size())\n    print('-------------------')\n    print('TF only', len(tf_only))\n    for k in tf_only:\n        print(k, tf_vars[k])\n\n\n"
  },
  {
    "path": "utils/metrics.py",
    "content": "#!/usr/bin/env python3 \n# -*- coding: utf-8 -*- \n\n\n\nimport numpy as np\nfrom collections import Counter\nfrom scipy.optimize import linear_sum_assignment\n\n\ndef mention_proposal_prediction(config, current_doc_result, concat_only=True):\n    \"\"\"\n    current_doc_result: \n        \"total_loss\": total_loss,\n        \"start_scores\": start_scores,\n        \"start_gold\": gold_starts,\n        \"end_gold\": gold_ends,\n        \"end_scores\": end_scores, \n        \"span_scores\": span_scores, \n        \"span_gold\": span_mention\n\n    \"\"\"\n\n    span_scores = current_doc_result[\"span_scores\"]\n    span_gold = current_doc_result[\"span_gold\"] \n\n    if concat_only:\n        scores = span_scores\n    else:\n        start_scores = current_doc_result[\"start_scores\"], \n        end_scores = current_doc_result[\"end_scores\"]   \n        # start_scores = tf.tile(tf.expand_dims(start_scores, 2), [1, 1, config[\"max_segment_len\"]])\n        start_scores = np.tile(np.expand_dims(start_scores, axis=2), (1, 1, config[\"max_segment_len\"]))\n        end_scores = np.tile(np.expand_dims(end_scores, axis=2), (1, 1, config[\"max_segment_len\"]))\n        start_scores = np.reshape(start_scores, [-1, config[\"max_segment_len\"], config[\"max_segment_len\"]])\n        end_scores = np.reshape(end_scores, [-1, config[\"max_segment_len\"], config[\"max_segment_len\"]])\n\n        # end_scores -> max_training_sent, max_segment_len \n        scores = (start_scores + end_scores + span_scores)/3\n\n    pred_span_label = scores >= 0.5\n    pred_span_label = np.reshape(pred_span_label, [-1])\n    gold_span_label = np.reshape(span_gold, [-1])\n\n    return pred_span_label, gold_span_label\n\n\ndef f1(p_num, p_den, r_num, r_den, beta=1):\n    p = 0 if p_den == 0 else p_num / float(p_den)\n    r = 0 if r_den == 0 else r_num / float(r_den)\n    return 0 if p + r == 0 else (1 + beta * beta) * p * r / (beta * beta * p + r)\n\n\nclass CorefEvaluator(object):\n   
 def __init__(self):\n        self.evaluators = [Evaluator(m) for m in (muc, b_cubed, ceafe)]\n\n    def update(self, predicted, gold, mention_to_predicted, mention_to_gold):\n        for e in self.evaluators:\n            e.update(predicted, gold, mention_to_predicted, mention_to_gold)\n\n    def get_f1(self):\n        return sum(e.get_f1() for e in self.evaluators) / len(self.evaluators)\n\n    def get_recall(self):\n        return sum(e.get_recall() for e in self.evaluators) / len(self.evaluators)\n\n    def get_precision(self):\n        return sum(e.get_precision() for e in self.evaluators) / len(self.evaluators)\n\n    def get_prf(self):\n        return self.get_precision(), self.get_recall(), self.get_f1()\n\n\nclass Evaluator(object):\n    def __init__(self, metric, beta=1):\n        self.p_num = 0\n        self.p_den = 0\n        self.r_num = 0\n        self.r_den = 0\n        self.metric = metric\n        self.beta = beta\n\n    def update(self, predicted, gold, mention_to_predicted, mention_to_gold):\n        if self.metric == ceafe:\n            pn, pd, rn, rd = self.metric(predicted, gold)\n        else:\n            pn, pd = self.metric(predicted, mention_to_gold)\n            rn, rd = self.metric(gold, mention_to_predicted)\n        self.p_num += pn\n        self.p_den += pd\n        self.r_num += rn\n        self.r_den += rd\n\n    def get_f1(self):\n        return f1(self.p_num, self.p_den, self.r_num, self.r_den, beta=self.beta)\n\n    def get_recall(self):\n        return 0 if self.r_num == 0 else self.r_num / float(self.r_den)\n\n    def get_precision(self):\n        return 0 if self.p_num == 0 else self.p_num / float(self.p_den)\n\n    def get_prf(self):\n        return self.get_precision(), self.get_recall(), self.get_f1()\n\n    def get_counts(self):\n        return self.p_num, self.p_den, self.r_num, self.r_den\n\n\ndef evaluate_documents(documents, metric, beta=1):\n    evaluator = Evaluator(metric, beta=beta)\n    for document in 
documents:\n        evaluator.update(document)\n    return evaluator.get_precision(), evaluator.get_recall(), evaluator.get_f1()\n\n\ndef b_cubed(clusters, mention_to_gold):\n    num, dem = 0, 0\n\n    for c in clusters:\n        if len(c) == 1:\n            continue\n\n        gold_counts = Counter()\n        correct = 0\n        for m in c:\n            if m in mention_to_gold:\n                gold_counts[tuple(mention_to_gold[m])] += 1\n        for c2, count in gold_counts.items():\n            if len(c2) != 1:\n                correct += count * count\n\n        num += correct / float(len(c))\n        dem += len(c)\n\n    return num, dem\n\n\ndef muc(clusters, mention_to_gold):\n    tp, p = 0, 0\n    for c in clusters:\n        p += len(c) - 1\n        tp += len(c)\n        linked = set()\n        for m in c:\n            if m in mention_to_gold:\n                linked.add(mention_to_gold[m])\n            else:\n                tp -= 1\n        tp -= len(linked)\n    return tp, p\n\n\ndef phi4(c1, c2):\n    return 2 * len([m for m in c1 if m in c2]) / float(len(c1) + len(c2))\n\n\ndef ceafe(clusters, gold_clusters):\n    clusters = [c for c in clusters if len(c) != 1]\n    scores = np.zeros((len(gold_clusters), len(clusters)))\n    for i in range(len(gold_clusters)):\n        for j in range(len(clusters)):\n            scores[i, j] = phi4(gold_clusters[i], clusters[j])\n    row_ind, col_ind = linear_sum_assignment(-scores)\n    similarity = sum(scores[row_ind, col_ind])\n    return similarity, len(clusters), similarity, len(gold_clusters)\n\n\ndef lea(clusters, mention_to_gold):\n    num, dem = 0, 0\n\n    for c in clusters:\n        if len(c) == 1:\n            continue\n\n        common_links = 0\n        all_links = len(c) * (len(c) - 1) / 2.0\n        for i, m in enumerate(c):\n            if m in mention_to_gold:\n                for m2 in c[i + 1:]:\n                    if m2 in mention_to_gold and mention_to_gold[m] == mention_to_gold[m2]:\n            
            common_links += 1\n\n        num += len(c) * common_links / float(all_links)\n        dem += len(c)\n\n    return num, dem\n"
  },
  {
    "path": "utils/radam.py",
    "content": "#!/usr/bin/env python3 \n# -*- coding: utf-8 -*- \n\n\n\nimport tensorflow as tf\nfrom tensorflow.python import (\n        ops, math_ops, state_ops, control_flow_ops, resource_variable_ops)\nfrom tensorflow.python.training.optimizer import Optimizer\n\n__all__ = ['RAdam']\n\n\nclass RAdam(Optimizer):\n    \"\"\"Rectified Adam (RAdam) optimizer.\n    According to the paper\n    [On The Variance Of The Adaptive Learning Rate And Beyond](https://arxiv.org/pdf/1908.03265v1.pdf).\n    \"\"\"\n\n    def __init__(self,\n                 learning_rate=0.001,\n                 beta1=0.9,\n                 beta2=0.999,\n                 epsilon=1e-8,\n                 amsgrad=False,\n                 use_locking=False,\n                 name='RAdam'):\n        r\"\"\"Construct a new Rectified Adam optimizer.\n        Args:\n            learning_rate: A Tensor or a floating point value.    The learning rate.\n            beta1: A float value or a constant float tensor. The exponential decay\n                rate for the 1st moment estimates.\n            beta2: A float value or a constant float tensor. The exponential decay\n                rate for the 2nd moment estimates.\n            epsilon: A small constant for numerical stability. This epsilon is\n                \"epsilon hat\" in the Kingma and Ba paper (in the formula just before\n                Section 2.1), not the epsilon in Algorithm 1 of the paper.\n            amsgrad: boolean. Whether to apply AMSGrad variant of this algorithm from\n                the paper \"On the Convergence of Adam and beyond\".\n            use_locking: If `True` use locks for update operations.\n            name: Optional name for the operations created when applying gradients.\n                Defaults to \"Adam\".    
@compatibility(eager) When eager execution is\n                enabled, `learning_rate`, `beta1`, `beta2`, and `epsilon` can each be\n                a callable that takes no arguments and returns the actual value to use.\n                This can be useful for changing these values across different\n                invocations of optimizer functions. @end_compatibility\n        \"\"\"\n\n        super(RAdam, self).__init__(use_locking, name)\n        self._lr = learning_rate\n        self._beta1 = beta1\n        self._beta2 = beta2\n        self._epsilon = epsilon\n        self._amsgrad = amsgrad\n\n    def _get_beta_accumulators(self):\n        with ops.init_scope():\n            graph = ops.get_default_graph()\n            return (self._get_non_slot_variable(\"beta1_power\", graph=graph),\n                    self._get_non_slot_variable(\"beta2_power\", graph=graph),\n                    )\n\n    def _get_niter(self):\n        with ops.init_scope():\n            graph = ops.get_default_graph()\n            return self._get_non_slot_variable(\"niter\", graph=graph)\n\n    def _create_slots(self, var_list):\n        first_var = min(var_list, key=lambda x: x.name)\n        self._create_non_slot_variable(\n            initial_value=self._beta1, name=\"beta1_power\", colocate_with=first_var)\n        self._create_non_slot_variable(\n            initial_value=self._beta2, name=\"beta2_power\", colocate_with=first_var)\n        self._create_non_slot_variable(\n            initial_value=1, name=\"niter\", colocate_with=first_var)\n        for var in var_list:\n            self._zeros_slot(var, 'm', self._name)\n            self._zeros_slot(var, 'v', self._name)\n        if self._amsgrad:\n            for var in var_list:\n                self._zeros_slot(var, 'vhat', self._name)\n\n    def _prepare(self):\n        learning_rate = self._call_if_callable(self._lr)\n        beta1 = self._call_if_callable(self._beta1)\n        beta2 = self._call_if_callable(self._beta2)\n   
     epsilon = self._call_if_callable(self._epsilon)\n\n        self._lr_t = ops.convert_to_tensor(learning_rate, name=\"learning_rate\")\n        self._beta1_t = ops.convert_to_tensor(beta1, name=\"beta1\")\n        self._beta2_t = ops.convert_to_tensor(beta2, name=\"beta2\")\n        self._epsilon_t = ops.convert_to_tensor(epsilon, name=\"epsilon\")\n\n    def _apply_dense_shared(self, grad, var):\n        var_dtype = var.dtype.base_dtype\n        beta1_power, beta2_power = self._get_beta_accumulators()\n        beta1_power = math_ops.cast(beta1_power, var_dtype)\n        beta2_power = math_ops.cast(beta2_power, var_dtype)\n        niter = self._get_niter()\n        niter = math_ops.cast(niter, var_dtype)\n        lr_t = math_ops.cast(self._lr_t, var_dtype)\n        beta1_t = math_ops.cast(self._beta1_t, var_dtype)\n        beta2_t = math_ops.cast(self._beta2_t, var_dtype)\n        epsilon_t = math_ops.cast(self._epsilon_t, var_dtype)\n\n        sma_inf = 2.0 / (1.0 - beta2_t) - 1.0\n        sma_t = sma_inf - 2.0 * niter * beta2_power / (1.0 - beta2_power)\n\n        m = self.get_slot(var, 'm')\n        m_t = state_ops.assign(m,\n                               beta1_t * m + (1.0 - beta1_t) * grad,\n                               use_locking=self._use_locking)\n        m_corr_t = m_t / (1.0 - beta1_power)\n\n        v = self.get_slot(var, 'v')\n        v_t = state_ops.assign(v,\n                               beta2_t * v + (1.0 - beta2_t) * math_ops.square(grad),\n                               use_locking=self._use_locking)\n\n        if self._amsgrad:\n            vhat = self.get_slot(var, 'vhat')\n            vhat_t = state_ops.assign(vhat,\n                                      math_ops.maximum(vhat, v_t),\n                                      use_locking=self._use_locking)\n            v_corr_t = math_ops.sqrt(vhat_t / (1.0 - beta2_power) + epsilon_t)\n        else:\n            v_corr_t = math_ops.sqrt(v_t / (1.0 - beta2_power) + epsilon_t)\n\n        r_t = 
math_ops.sqrt((sma_t - 4.0) / (sma_inf - 4.0) *\n                            (sma_t - 2.0) / (sma_inf - 2.0) *\n                            sma_inf / sma_t)\n\n        var_t = tf.where(sma_t > 5.0, r_t * m_corr_t / v_corr_t, m_corr_t)\n\n        var_update = state_ops.assign_sub(var,\n                                          lr_t * var_t,\n                                          use_locking=self._use_locking)\n\n        updates = [var_update, m_t, v_t]\n        if self._amsgrad:\n            updates.append(vhat_t)\n        return control_flow_ops.group(*updates)\n\n    def _apply_dense(self, grad, var):\n        return self._apply_dense_shared(grad, var)\n\n    def _resource_apply_dense(self, grad, var):\n        return self._apply_dense_shared(grad, var.handle)\n\n    def _apply_sparse_shared(self, grad, var, indices, scatter_add):\n        var_dtype = var.dtype.base_dtype\n        beta1_power, beta2_power = self._get_beta_accumulators()\n        beta1_power = math_ops.cast(beta1_power, var_dtype)\n        beta2_power = math_ops.cast(beta2_power, var_dtype)\n        niter = self._get_niter()\n        niter = math_ops.cast(niter, var_dtype)\n        lr_t = math_ops.cast(self._lr_t, var_dtype)\n        beta1_t = math_ops.cast(self._beta1_t, var_dtype)\n        beta2_t = math_ops.cast(self._beta2_t, var_dtype)\n        epsilon_t = math_ops.cast(self._epsilon_t, var_dtype)\n\n        sma_inf = 2.0 / (1.0 - beta2_t) - 1.0\n        sma_t = sma_inf - 2.0 * niter * beta2_power / (1.0 - beta2_power)\n\n        m = self.get_slot(var, 'm')\n        m_t = state_ops.assign(m, beta1_t * m, use_locking=self._use_locking)\n        with ops.control_dependencies([m_t]):\n            m_t = scatter_add(m, indices, grad * (1 - beta1_t))\n        m_corr_t = m_t / (1.0 - beta1_power)\n\n        v = self.get_slot(var, 'v')\n        v_t = state_ops.assign(v, beta2_t * v, use_locking=self._use_locking)\n        with ops.control_dependencies([v_t]):\n            v_t = scatter_add(v, 
indices, (1.0 - beta2_t) * math_ops.square(grad))\n\n        if self._amsgrad:\n            vhat = self.get_slot(var, 'vhat')\n            vhat_t = state_ops.assign(vhat,\n                                      math_ops.maximum(vhat, v_t),\n                                      use_locking=self._use_locking)\n            v_corr_t = math_ops.sqrt(vhat_t / (1.0 - beta2_power) + epsilon_t)\n        else:\n            v_corr_t = math_ops.sqrt(v_t / (1.0 - beta2_power) + epsilon_t)\n\n        r_t = math_ops.sqrt((sma_t - 4.0) / (sma_inf - 4.0) *\n                            (sma_t - 2.0) / (sma_inf - 2.0) *\n                            sma_inf / sma_t)\n\n        var_t = tf.where(sma_t > 5.0, r_t * m_corr_t / v_corr_t, m_corr_t)\n\n        var_update = state_ops.assign_sub(var,\n                                          lr_t * var_t,\n                                          use_locking=self._use_locking)\n\n        updates = [var_update, m_t, v_t]\n        if self._amsgrad:\n            updates.append(vhat_t)\n        return control_flow_ops.group(*updates)\n\n    def _apply_sparse(self, grad, var):\n        return self._apply_sparse_shared(\n            grad.values,\n            var,\n            grad.indices,\n            lambda x, i, v: state_ops.scatter_add(  # pylint: disable=g-long-lambda\n                x,\n                i,\n                v,\n                use_locking=self._use_locking))\n\n    def _resource_apply_sparse(self, grad, var, indices):\n        return self._apply_sparse_shared(grad, var, indices,\n                                         self._resource_scatter_add)\n\n    def _resource_scatter_add(self, x, i, v):\n        with ops.control_dependencies(\n                [resource_variable_ops.resource_scatter_add(x.handle, i, v)]):\n            return x.value()\n\n    def _finish(self, update_ops, name_scope):\n        # Update the power accumulators.\n        with ops.control_dependencies(update_ops):\n            beta1_power, beta2_power = 
self._get_beta_accumulators()\n            niter = self._get_niter()\n            with ops.colocate_with(beta1_power):\n                update_beta1 = beta1_power.assign(\n                    beta1_power * self._beta1_t, use_locking=self._use_locking)\n                update_beta2 = beta2_power.assign(\n                    beta2_power * self._beta2_t, use_locking=self._use_locking)\n                update_niter = niter.assign(\n                    niter + 1, use_locking=self._use_locking)\n        # Parenthesized for clarity: the splat applies to the concatenated list.\n        return control_flow_ops.group(\n            *(update_ops + [update_beta1, update_beta2, update_niter]),\n            name=name_scope)\n"
  },
  {
    "path": "utils/util.py",
    "content": "#!/usr/bin/env python3\n# -*- coding: utf-8 -*-\n\n\nimport codecs\nimport collections\nimport errno\nimport os\nimport shutil\nimport pyhocon\nimport tensorflow as tf\nfrom models import corefqa\nfrom models import mention_proposal\n\n\nrepo_path = \"/\".join(os.path.realpath(__file__).split(\"/\")[:-2])\n\n\ndef get_model(config, model_sign=\"corefqa\"):\n    if model_sign == \"corefqa\":\n        return corefqa.CorefQAModel(config)\n    else:\n        return mention_proposal.MentionProposalModel(config)\n\n\ndef initialize_from_env(eval_test=False, config_params=\"train_spanbert_base\", config_file=\"experiments_tinybert.conf\", use_tpu=False, print_info=False):\n    # eval_test and use_tpu are kept for call-site compatibility; the same\n    # config_file is parsed either way, so log the actual filename.\n    print(\"loading {} ... \".format(config_file))\n    config = pyhocon.ConfigFactory.parse_file(os.path.join(repo_path, config_file))\n\n    config = config[config_params]\n\n    if print_info:\n        tf.logging.info(\"%*%\" * 20)\n        tf.logging.info(\"%*%\" * 20)\n        tf.logging.info(\"%%%%%%%% Configs are shown as follows : %%%%%%%%\")\n        for tmp_key, tmp_value in config.items():\n            tf.logging.info(str(tmp_key) + \" : \" + str(tmp_value))\n\n        tf.logging.info(\"%*%\" * 20)\n        tf.logging.info(\"%*%\" * 20)\n\n    config[\"log_dir\"] = mkdirs(os.path.join(config[\"log_root\"], config_params))\n\n    if print_info:\n        tf.logging.info(pyhocon.HOCONConverter.convert(config, \"hocon\"))\n    return config\n\n\ndef copy_checkpoint(source, target):\n    for ext in (\".index\", \".data-00000-of-00001\"):\n        shutil.copyfile(source + ext, target + ext)\n\n\ndef make_summary(value_dict):\n    return tf.Summary(value=[tf.Summary.Value(tag=k, simple_value=v) for k, v in value_dict.items()])\n\n\ndef flatten(l):\n    return [item for sublist in l for item in 
sublist]\n\n\ndef set_gpus(*gpus):\n    os.environ[\"CUDA_VISIBLE_DEVICES\"] = \",\".join(str(g) for g in gpus)\n    print(\"Setting CUDA_VISIBLE_DEVICES to: {}\".format(os.environ[\"CUDA_VISIBLE_DEVICES\"]))\n\n\ndef mkdirs(path):\n    try:\n        os.makedirs(path)\n    except OSError as exception:\n        if exception.errno != errno.EEXIST:\n            raise\n    return path\n\n\ndef load_char_dict(char_vocab_path):\n    vocab = [u\"<unk>\"]\n    with codecs.open(char_vocab_path, encoding=\"utf-8\") as f:\n        vocab.extend(l.strip() for l in f.readlines())\n    char_dict = collections.defaultdict(int)\n    char_dict.update({c: i for i, c in enumerate(vocab)})\n    return char_dict\n\n\ndef maybe_divide(x, y):\n    return 0 if y == 0 else x / float(y)\n\n\ndef shape(x, dim):\n    # Prefer the static dimension; fall back to the dynamic one.\n    return x.get_shape()[dim].value or tf.shape(x)[dim]\n\n\ndef ffnn(inputs, num_hidden_layers, hidden_size, output_size, dropout,\n         output_weights_initializer=tf.truncated_normal_initializer(stddev=0.02),\n         hidden_initializer=tf.truncated_normal_initializer(stddev=0.02)):\n    # NOTE: despite its name, this applies a single ReLU projection;\n    # num_hidden_layers, dropout, and output_weights_initializer are unused.\n    if len(inputs.get_shape()) > 3:\n        raise ValueError(\"FFNN with rank {} not supported\".format(len(inputs.get_shape())))\n    current_inputs = inputs\n    hidden_weights = tf.get_variable(\"hidden_weights\", [hidden_size, output_size],\n                                     initializer=hidden_initializer)\n    hidden_bias = tf.get_variable(\"hidden_bias\", [output_size], initializer=tf.zeros_initializer())\n    current_outputs = tf.nn.relu(tf.nn.xw_plus_b(current_inputs, hidden_weights, hidden_bias))\n\n    return current_outputs\n\n\ndef batch_gather(emb, indices):\n    batch_size = shape(emb, 0)\n    seqlen = shape(emb, 1)\n    if len(emb.get_shape()) > 2:\n        emb_size = shape(emb, 2)\n    else:\n        emb_size = 1\n    flattened_emb = tf.reshape(emb, [batch_size * seqlen, emb_size])  # [batch_size * seqlen, emb]\n    offset = tf.expand_dims(tf.range(batch_size) * seqlen, 1)  # [batch_size, 1]\n    gathered = tf.gather(flattened_emb, indices + offset)  # [batch_size, num_indices, emb]\n    if len(emb.get_shape()) == 2:\n        gathered = tf.squeeze(gathered, 2)  # [batch_size, num_indices]\n    return gathered\n\n"
  }
]