Repository: rowanz/swagaf
Branch: master
Commit: 0613674cca36
Files: 86
Total size: 80.9 MB
Directory structure:
gitextract_7vl97uy1/
├── .dockerignore
├── Dockerfile
├── LICENSE
├── README.md
├── create_swag/
│ ├── README.md
│ ├── __init__.py
│ ├── generate_candidates/
│ │ ├── README.md
│ │ ├── __init__.py
│ │ ├── classifiers.py
│ │ ├── questions2mturk.py
│ │ ├── rebalance_dataset_ensemble.py
│ │ ├── rebalance_dataset_mlp.py
│ │ ├── sample_candidates.py
│ │ └── sample_candidates.sh
│ ├── lm/
│ │ ├── README.md
│ │ ├── __init__.py
│ │ ├── config.py
│ │ ├── load_data.py
│ │ ├── pretrain_lm.py
│ │ ├── simple_bilm.py
│ │ ├── train_lm.py
│ │ ├── train_lm.sh
│ │ └── vocabulary/
│ │ ├── non_padded_namespaces.txt
│ │ └── tokens.txt
│ └── turktemplate.html
├── data/
│ ├── README.md
│ ├── test.csv
│ ├── train.csv
│ ├── train_full.csv
│ ├── val.csv
│ └── val_full.csv
├── evaluation.yaml
├── pytorch_misc.py
├── raw_data/
│ └── events.py
├── requirements.txt
└── swag_baselines/
├── README.md
├── __init__.py
├── decomposable_attention/
│ ├── README.md
│ ├── __init__.py
│ ├── dataset_reader.py
│ ├── decomposable_attention_swag.py
│ ├── run_experiments.sh
│ ├── train-elmo-goldonly.json
│ ├── train-elmo.json
│ ├── train-glove-840.json
│ ├── train-glove-goldonly-840.json
│ ├── train-glove-goldonly.json
│ ├── train-glove.json
│ ├── train-numberbatch-goldonly.json
│ └── train-numberbatch.json
├── esim/
│ ├── README.md
│ ├── __init__.py
│ ├── dataset_reader.py
│ ├── esim_swag.py
│ ├── predict.py
│ ├── run_experiments.sh
│ ├── train-elmo-goldonly.json
│ ├── train-elmo.json
│ ├── train-glove-goldonly.json
│ ├── train-glove.json
│ ├── train-numberbatch-goldonly.json
│ └── train-numberbatch.json
├── fasttext/
│ ├── README.md
│ ├── __init__.py
│ ├── compute_performance.py
│ └── prep_data.py
└── unarylstm/
├── __init__.py
├── dataset_reader.py
├── lstm_swag.py
├── predict.py
├── run_experiments.sh
├── run_experiments_ending.sh
├── train-cnn.json
├── train-lstmbasic-elmo-endingonly.json
├── train-lstmbasic-elmo-goldonly-endingonly.json
├── train-lstmbasic-elmo-goldonly.json
├── train-lstmbasic-elmo.json
├── train-lstmbasic-glove-endingonly.json
├── train-lstmbasic-glove-goldonly-endingonly.json
├── train-lstmbasic-glove-goldonly.json
├── train-lstmbasic-glove.json
├── train-lstmbasic-numberbatch-endingonly.json
├── train-lstmbasic-numberbatch-goldonly-endingonly.json
├── train-lstmbasic-numberbatch-goldonly.json
├── train-lstmbasic-numberbatch.json
└── train.json
================================================
FILE CONTENTS
================================================
================================================
FILE: .dockerignore
================================================
.dockerignore
**.pyc
**/__pycache__
.gitignore
.git
================================================
FILE: Dockerfile
================================================
FROM python:3.6.3-jessie
ENV LC_ALL=C.UTF-8
ENV LANG=C.UTF-8
ENV PATH /usr/local/nvidia/bin/:$PATH
ENV LD_LIBRARY_PATH /usr/local/nvidia/lib:/usr/local/nvidia/lib64
# Tell nvidia-docker the driver spec that we need as well as to
# use all available devices, which are mounted at /usr/local/nvidia.
# The LABEL supports an older version of nvidia-docker, the env
# variables a newer one.
ENV NVIDIA_VISIBLE_DEVICES all
ENV NVIDIA_DRIVER_CAPABILITIES compute,utility
LABEL com.nvidia.volumes.needed="nvidia_driver"
RUN pip install "git+git://github.com/allenai/allennlp.git@7142962d330ca5a95cade114c26a361c78f2042e"
# download spacy models
RUN python -m spacy download en_core_web_sm
# set the working directory
WORKDIR /swagaf
# install python packages
ADD ./requirements.txt .
RUN pip install -r ./requirements.txt
# add the code as the final step so that when we modify the code
# we don't bust the cached layers holding the dependencies and
# system packages.
ADD . .
ENV PYTHONPATH /swagaf
ENTRYPOINT []
CMD [ "/bin/bash" ]
================================================
FILE: LICENSE
================================================
MIT License
Copyright (c) 2018 Rowan Zellers
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
================================================
FILE: README.md
================================================
# swagaf
### Like this work, or commonsense reasoning in general? You might be interested in checking out my brand new dataset VCR: Visual Commonsense Reasoning, at [visualcommonsense.com](https://visualcommonsense.com)!
SWAG dataset. More info is at [rowanzellers.com/swag](https://rowanzellers.com/swag).
## Setting up your environment
To create an environment you will need to install Python 3.6, PyTorch 0.3.1, and AllenNLP. These
requirements are listed in `requirements.txt`.
You will also need to set PYTHONPATH to the `swagaf` directory. You can do this by running the
following command from the `swagaf` folder.
```
export PYTHONPATH=$(pwd)
```
Alternatively, you can build and run the included Dockerfile to create an environment.
```
docker build -t swagaf .
docker run -it swagaf
```
## Common use cases
There is additional documentation in the subfolders.
* `data/` contains the SWAG dataset.
* `swag_baselines/` contains baseline implementations and instructions for how to run them.
Most people will not need to look at `create_swag` or `raw_data`, but they're there if you need them!
## Citing
```
@inproceedings{zellers2018swagaf,
title={SWAG: A Large-Scale Adversarial Dataset for Grounded Commonsense Inference},
author={Zellers, Rowan and Bisk, Yonatan and Schwartz, Roy and Choi, Yejin},
booktitle = "Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing (EMNLP)",
year={2018}
}
```
================================================
FILE: create_swag/README.md
================================================
# create_swag
This folder contains the scripts used to create SWAG, including adversarial filtering. Here's a rough overview:
1. Compile a bunch of datasets. We used MPII and ActivityNet Captions.
2. Train the LM on those datasets (train first on toronto books). See the folder `lm/` for more info.
3. Oversample and then perform Adversarial Filtering (sketched below). See `generate_candidates/`
4. Ask turkers to rank the distractors. You can use `turktemplate.html` as a starting point.
5. You're done!
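The core of the Adversarial Filtering loop in step 3 looks roughly like the sketch below. This is a minimal sketch only: `scores`, `assign`, and `refresh_assignments` are illustrative names, and the real loop (which retrains the classifier every iteration) lives in `generate_candidates/rebalance_dataset_mlp.py` and `rebalance_dataset_ensemble.py`.
```
import numpy as np

def refresh_assignments(scores, assign, num_swap=2):
    """Swap easy kept distractors for adversarial unkept candidates.

    scores: [num_candidates] classifier scores; index 0 is the gold ending.
    assign: np.ndarray of candidate indices currently kept (gold included).
    """
    high2low = (-scores).argsort()       # candidate indices, most "real-looking" first
    gold_rank = high2low.argsort()[0]    # rank of the gold ending (index 0)
    # unkept candidates the classifier ranks above gold: adversarial
    adversarial = high2low[:gold_rank]
    adversarial = adversarial[~np.in1d(adversarial, assign)]
    # kept distractors ranked below gold, easiest (lowest-scoring) first
    easy = high2low[gold_rank + 1:][::-1]
    easy = easy[np.in1d(easy, assign)]
    n = min(num_swap, len(adversarial), len(easy))
    if n > 0:
        # positions of the easy distractors inside `assign`
        slots = np.argmax(easy[:, None] == assign[None], 1)
        assign[np.random.choice(slots, size=n, replace=False)] = \
            np.random.choice(adversarial, size=n, replace=False)
    return assign
```
Each iteration retrains the classifier on the current assignments, rescores a held-out split, and applies a swap like this per example, so the kept distractors get steadily harder for the classifier.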
### Important note:
This code is pretty hacky and comes with few guarantees (as with adversarial filtering itself); I figure you're probably going to need to do something different anyways. But hopefully it helps! Open up an issue if you notice anything wrong.
================================================
FILE: create_swag/__init__.py
================================================
================================================
FILE: create_swag/generate_candidates/README.md
================================================
# generate_candidates
Stage 1 of the pipeline - generate a bunch of candidates.
Unfortunately, this is pretty slow, so we'll want to parallelize it across several GPUs.
The current pipeline:
1. Generate the candidates on 5 different GPUs
```
export PYTHONPATH=/home/rowan/code/commonsense
export CUDA_VISIBLE_DEVICES=0
nohup python sample_candidates.py -fold 0 > fold_0_log.txt &
```
2. Pretrain the assignments using the LM features. This will also split the data up into 5 folds
```
nohup python rebalance_dataset_mlp.py > mlp_log.txt &
```
3. Do the assignments using more sophisticated features
```
export CUDA_VISIBLE_DEVICES=0
nohup python rebalance_dataset_ensemble.py -fold -1 > rebalance_everything.txt &
```
4. Use `questions2mturk.py` to create a CSV for mturk.
================================================
FILE: create_swag/generate_candidates/__init__.py
================================================
================================================
FILE: create_swag/generate_candidates/classifiers.py
================================================
"""
The big idea will be to add in the worst scoring one. But we want to use a MULTILAYER PERCEPTRON.
Also not using word features for now
"""
import torch
from allennlp.common import Params
from allennlp.modules.augmented_lstm import AugmentedLstm
from allennlp.modules.seq2seq_encoders.pytorch_seq2seq_wrapper import PytorchSeq2SeqWrapper
from allennlp.modules.token_embedders.embedding import Embedding
from torch import nn
from torch.nn import functional as F
from torch.autograd import Variable
import pandas as pd
from model.pytorch_misc import clip_grad_norm, optimistic_restore, print_para, time_batch
from torch import optim
import numpy as np
#################### Model types
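# `reshape` lets each sub-model pretend its input is flat: it collapses the
# leading (batch, num_endings) dims into one before calling `f`, then splits
# them back out on every tensor `f` returns. So models written for [batch, ...]
# inputs transparently handle [batch, num_endings, ...] inputs.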
def reshape(f):
def wrapper(self, *args, **kwargs):
sizes = [x.size() for x in args] + [x.size() for x in kwargs.values()]
batch_size, num_ex = sizes[0][:2]
res = f(self, *[x.view((-1,) + x.size()[2:]) for x in args],
**{k: v.view((-1,) + v.size()[2:]) for k, v in kwargs.items()})
if isinstance(res, tuple):
return tuple([x.view((batch_size, num_ex,) + x.size()[1:]) for x in res])
return res.view((batch_size, num_ex,) + res.size()[1:])
return wrapper
class LMFeatsModel(nn.Module):
def __init__(self, input_dim=5, hidden_dim=1024):
"""
MLP over the language-model perplexity features -> label
:param input_dim: number of LM features per ending
:param hidden_dim: hidden layer width
"""
super(LMFeatsModel, self).__init__()
self.mapping = nn.Sequential(
nn.Linear(input_dim, hidden_dim, bias=True),
nn.SELU(),
nn.AlphaDropout(p=0.2),
)
self.prediction = nn.Sequential(
nn.Linear(hidden_dim, hidden_dim, bias=True),
nn.SELU(),
nn.AlphaDropout(p=0.2),
nn.Linear(hidden_dim, 1, bias=False),
)
@reshape
def forward(self, feats):
"""
:param feats: [batch, dim] LM features
:return: [batch] scores of real-ness.
"""
inter_feats = self.mapping(feats)
preds = self.prediction(inter_feats).squeeze(1)
return preds, inter_feats
def fit(self, data, val_data=None, num_epoch=10):
self.train()
optimizer = optim.Adam(self.parameters(), weight_decay=1e-4, lr=1e-3)
best_val = 0.0
for epoch_num in range(num_epoch):
tr = []
for b, (time_per_batch, batch) in enumerate(time_batch(data, reset_every=100)):
results = self(batch['lm_feats'].cuda(async=True))[0]
loss = F.cross_entropy(results, Variable(results.data.new(results.size(0)).long().fill_(0)))
summ_dict = {'loss': loss.data[0], 'acc': (results.max(1)[1] == 0).float().mean().data[0]}
tr.append(pd.Series(summ_dict))
optimizer.zero_grad()
loss.backward()
clip_grad_norm(
[(n, p) for n, p in self.named_parameters() if p.grad is not None],
max_norm=1.0, verbose=False, clip=True)
optimizer.step()
mean_stats = pd.concat(tr, axis=1).mean(1)
if val_data is not None:
val_acc, val_results = self.validate(val_data)
print("e{:2d}: train loss {:.3f} train acc {:.3f} val acc {:.3f}".format(epoch_num, mean_stats['loss'],
mean_stats['acc'], val_acc), flush=True)
if val_acc < best_val or epoch_num == (num_epoch - 1):
return {'mlp': val_acc, 'fasttext': 0, 'cnn': 0, 'lstm_pos': 0, 'ensemble': 0}
best_val = val_acc
def validate(self, data):
self.eval()
all_predictions = []
for b, (time_per_batch, batch) in enumerate(time_batch(data, reset_every=100)):
results = self(batch['lm_feats'].cuda(async=True))[0]
all_predictions.append(results.data.cpu().numpy())
all_predictions = np.concatenate(all_predictions, 0)
acc = (all_predictions.argmax(1) == 0).mean()
return acc, {'ensemble': all_predictions}
class BoWModel(nn.Module):
def __init__(self, vocab, use_mean=True, embed_dim=100):
"""
Averaged embeddings of ending -> label
:param embed_dim: dimension to use
"""
super(BoWModel, self).__init__()
assert embed_dim == 100
self.embeds = Embedding.from_params(
vocab,
Params({'vocab_namespace': 'tokens',
'embedding_dim': embed_dim,
'trainable': True,
'padding_index': 0,
'pretrained_file':
'https://s3-us-west-2.amazonaws.com/allennlp/datasets/glove/glove.6B.100d.txt.gz'
}))
self.embed_dim = embed_dim
self.use_mean = use_mean
self.embedding_to_label = nn.Linear(self.embed_dim, 1, bias=False)
@reshape
def forward(self, word_ids):
"""
:param word_ids: [batch, length] ids
:return: [batch] scores of real-ness.
"""
embeds = self.embeds(word_ids)
mask = (word_ids.data != 0).long()
seq_lengths = mask.sum(-1, keepdim=True).float()
seq_lengths[seq_lengths < 1] = 1.0
inter_feats = embeds.sum(1) / Variable(seq_lengths) if self.use_mean else embeds.max(1)[0]
preds = self.embedding_to_label(inter_feats).squeeze(1)
return preds, inter_feats
class CNNModel(nn.Module):
def __init__(self, vocab, embed_dim=100, window_sizes=(2, 3, 4, 5), num_filters=128):
super(CNNModel, self).__init__()
self.embeds = Embedding.from_params(
vocab,
Params({'vocab_namespace': 'tokens',
'embedding_dim': embed_dim,
'trainable': True,
'padding_index': 0,
'pretrained_file':
'https://s3-us-west-2.amazonaws.com/allennlp/datasets/glove/glove.6B.100d.txt.gz'
}))
self.binary_feature_embedding = Embedding(2, embed_dim)
self.convs = nn.ModuleList([
nn.Conv1d(embed_dim * 2, num_filters, kernel_size=window_size, padding=window_size - 1) for window_size in
window_sizes
])
self.fc = nn.Linear(num_filters * len(window_sizes), 1, bias=False)
@reshape
def forward(self, word_ids, indicator_ids):
"""
:param word_ids: [batch, length] ids
:param indicator_ids: [batch, length] ids
"""
embeds = torch.cat((self.embeds(word_ids), self.binary_feature_embedding(indicator_ids)), 2)
# mask = (word_ids != 0).long()
embeds_t = embeds.transpose(1, 2) # [B, D, L]
conv_reps = []
for conv in self.convs:
conv_reps.append(F.relu(conv(embeds_t)).max(2)[0]) # Now it's [B, D]
inter_feats = torch.cat(conv_reps, 1)
preds = self.fc(inter_feats).squeeze(1)
return preds, inter_feats
class BLSTMModel(nn.Module):
def __init__(self, vocab, use_postags_only=True, embed_dim=100, hidden_size=200, recurrent_dropout_probability=0.3,
use_highway=False,
maxpool=True):
super(BLSTMModel, self).__init__()
self.embeds = Embedding.from_params(
vocab,
Params({'vocab_namespace': 'pos' if use_postags_only else 'tokens',
'embedding_dim': embed_dim,
'trainable': True,
'padding_index': 0,
'pretrained_file': None if use_postags_only else 'https://s3-us-west-2.amazonaws.com/allennlp/datasets/glove/glove.6B.100d.txt.gz',
}))
self.binary_feature_embedding = Embedding(2, embed_dim)
self.fwd_lstm = PytorchSeq2SeqWrapper(AugmentedLstm(
input_size=embed_dim * 2, hidden_size=hidden_size, go_forward=True,
recurrent_dropout_probability=recurrent_dropout_probability,
use_input_projection_bias=False, use_highway=use_highway), stateful=False)
self.bwd_lstm = PytorchSeq2SeqWrapper(AugmentedLstm(
input_size=embed_dim * 2, hidden_size=hidden_size, go_forward=False,
recurrent_dropout_probability=recurrent_dropout_probability,
use_input_projection_bias=False, use_highway=use_highway), stateful=False)
self.maxpool = maxpool
self.fc = nn.Linear(hidden_size * 2, 1, bias=False)
@reshape
def forward(self, word_ids, indicator_ids):
"""
:param word_ids: [batch, length] ids
:param indicator_ids: [batch, length] ids
"""
embeds = torch.cat((self.embeds(word_ids), self.binary_feature_embedding(indicator_ids)), 2)
mask = (word_ids != 0).long()
fwd_activation = self.fwd_lstm(embeds, mask) # [B, L, D]
bwd_activation = self.bwd_lstm(embeds, mask)
if self.maxpool:
reps = torch.cat((fwd_activation.max(1)[0], bwd_activation.max(1)[0]), 1) # [B*N, 2D]
else:
# Forward and last.
reps = torch.cat((
fwd_activation[torch.arange(0, mask.size(0), out=mask.data.new(mask.size(0))), mask.sum(1) - 1],
bwd_activation[:, 0]
), 1)
return self.fc(reps).squeeze(1), reps
class Ensemble(nn.Module):
def __init__(self, vocab):
super(Ensemble, self).__init__()
self.fasttext_model = BoWModel(vocab, use_mean=True, embed_dim=100)
self.mlp_model = LMFeatsModel(input_dim=8, hidden_dim=1024)
self.lstm_pos_model = BLSTMModel(vocab, use_postags_only=True, maxpool=True)
# self.lstm_lex_model = BLSTMModel(vocab, use_postags_only=False, maxpool=True)
self.cnn_model = CNNModel(vocab)
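# concat dim: 100 (BoW mean embedding) + 1024 (LMFeats hidden)
# + 400 (BLSTM maxpool: 2 directions x 200) + 4 * 128 (CNN: 4 windows x 128 filters)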
self.mlp = nn.Sequential(
nn.Linear(100 + 1024 + 400 + 4 * 128, 2048, bias=True),
# nn.SELU(),
# nn.AlphaDropout(p=0.2),
# nn.Linear(2048, 2048, bias=True),
nn.SELU(),
nn.AlphaDropout(p=0.2),
nn.Linear(2048, 1, bias=False),
)
def forward(self, lm_feats, ending_word_ids, postags_word_ids, ctx_indicator, inds):
"""
:param lm_feats: [batch_size, #options, dim]
:param ending_word_ids: [batch_size, #options, L] word ids
:param postags_word_ids: [batch_size, #options, L] word ids
:param ctx_indicator: [batch_size, #options, L] indicator
:param inds: [batch_size] indices (not needed)
:return:
"""
results = {}
results['mlp'], mlp_feats = self.mlp_model(lm_feats)
results['fasttext'], fasttext_feats = self.fasttext_model(ending_word_ids)
results['cnn'], cnn_feats = self.cnn_model(ending_word_ids, ctx_indicator)
results['lstm_pos'], lstm_feats = self.lstm_pos_model(postags_word_ids, ctx_indicator)
# results['lstm_lex'], _ = self.lstm_lex_model(ending_word_ids, ctx_indicator)
results['ensemble'] = self.mlp(
torch.cat((mlp_feats, fasttext_feats, cnn_feats, lstm_feats), 2)).squeeze(2)
return results
def predict(self, lm_feats, ending_word_ids, postags_word_ids, ctx_indicator, inds):
""" Predict a distribution of probabilities
:return: Dict from model type -> prob dist
"""
results = self.forward(lm_feats, ending_word_ids, postags_word_ids, ctx_indicator, inds)
results = {k: F.softmax(v, 1).data.cpu().numpy() for k, v in results.items()}
return results
def validate(self, val_dataloader):
"""
:param val_dataloader: Dataloader
:return: Accuracies: dict from model -> accuracy
All predictions: Dict from model -> [batch, #ex] distribution.
"""
# Compute the validation performance
self.eval()
all_predictions = {'mlp': [], 'fasttext': [], 'cnn': [], 'lstm_pos': [], #'lstm_lex': [],
'ensemble': []}
for b, (time_per_batch, batch) in enumerate(time_batch(val_dataloader, reset_every=100)):
batch = {k: v.cuda(async=True) if hasattr(v, 'cuda') else v for k, v in batch.items()}
if b % 100 == 0 and b > 0:
print("\nb{:5d}/{:5d} {:.3f}s/batch, {:.1f}m/epoch".format(
b, len(val_dataloader), time_per_batch,
len(val_dataloader) * time_per_batch / 60), flush=True)
for k, v in self.predict(**batch).items():
all_predictions[k].append(v)
all_predictions = {k: np.concatenate(v, 0) for k, v in all_predictions.items()}
accuracies = {k: np.mean(v.argmax(1) == 0) for k, v in all_predictions.items()}
return accuracies, all_predictions
def fit(self, train_dataloader, val_dataloader, num_epoch=5):
"""
:param train_dataloader: Dataloader
:param num_epoch number of epochs to use
"""
print_every = 100
optimizer = optim.Adam([p for p in self.parameters() if p.requires_grad], weight_decay=1e-6, lr=1e-3)
best_val = 0.0
for epoch_num in range(num_epoch):
tr = []
self.train()
for b, (time_per_batch, batch) in enumerate(time_batch(train_dataloader, reset_every=print_every)):
batch = {k: v.cuda(async=True) if hasattr(v, 'cuda') else v for k, v in batch.items()}
results = self(**batch)
losses = {'{}-loss'.format(k): F.cross_entropy(
v, Variable(v.data.new(v.size(0)).long().fill_(0))) for k, v in results.items()}
if any([np.isnan(x.data.cpu().numpy()) for x in losses.values()]):
import ipdb
ipdb.set_trace()
loss = sum(losses.values())
summ_dict = {k: v.data[0] for k, v in losses.items()}
summ_dict.update(
{'{}-acc'.format(k): (v.max(1)[1] == 0).float().mean().data[0] for k, v in results.items()})
tr.append(pd.Series(summ_dict))
optimizer.zero_grad()
loss.backward()
if b % print_every == 0 and b > 0:
print("\ne{:2d}b{:5d}/{:5d} {:.3f}s/batch, {:.1f}m/epoch".format(
epoch_num, b, len(train_dataloader), time_per_batch,
len(train_dataloader) * time_per_batch / 60))
print(pd.concat(tr[-print_every:], axis=1).mean(1))
print('-----------', flush=True)
# clip_grad_norm([(n, p) for n, p in self.named_parameters() if
# p.grad is not None and n.startswith('lstm_lex_model')], max_norm=1.0,
# verbose=b % 100 == 1, clip=True)
clip_grad_norm([(n, p) for n, p in self.named_parameters() if
p.grad is not None and not n.startswith('lstm_lex_model')], max_norm=1.0,
verbose=b % 100 == 1, clip=True)
optimizer.step()
val_results, _ = self.validate(val_dataloader)
val_acc = val_results['ensemble']
if val_acc < best_val or epoch_num == (num_epoch - 1):
print("Stopping on epoch={} with\n{}".format(epoch_num, pd.Series(val_results)), flush=True)
return val_results
else:
print("Continuing on epoch={} with\n{}".format(epoch_num, pd.Series(val_results)), flush=True)
best_val = val_acc
================================================
FILE: create_swag/generate_candidates/questions2mturk.py
================================================
import random
import pickle as pkl
import numpy as np
from tqdm import tqdm
from nltk.tokenize.moses import MosesDetokenizer
import pandas as pd
import re
detokenizer = MosesDetokenizer()
NUM_DISTRACTORS = 5
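# Moses-detokenize a token list, then re-attach the contractions the tokenizer
# left split, e.g. "ca n't" -> "can't" (likewise "do n't", "wo n't").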
def _detokenize(sent):
s0 = detokenizer.detokenize(sent, return_str=True)
s1 = re.sub(r'\b(ca|do|wo)\sn\'t', r"\1n't", s0, flags=re.IGNORECASE)
return s1
assignments = np.load('assignments-22.npy')
start_ind = 0
df = []
for fold_id in range(5):
with open('../../generate_candidates/examples{}-of-5.pkl'.format(fold_id), 'rb') as f:
ex_this_fold = pkl.load(f)
assignments_this_fold = assignments[start_ind:start_ind+len(ex_this_fold)]
start_ind += len(ex_this_fold)
for i, (this_example, assignments_i) in enumerate(zip(tqdm(ex_this_fold), assignments_this_fold)):
selected_gens = [this_example['generations'][i] for i in assignments_i.tolist()]
# Find sent1 from the given sentences
sent1 = _detokenize(this_example['sent1'])
if sent1[0].islower():
sent1 = sent1[0].upper() + sent1[1:]
sent2 = _detokenize(this_example['startphrase'])
# perm = np.random.permutation(NUM_DISTRACTORS+1)
perm = np.arange(10)
series_dict = {
'item_ind': i,
'fold_id': fold_id,
'item_id': this_example['dataset'] + this_example['id'],
'startphrase': '{} {}'.format(sent1,sent2),
'sent1': sent1,
'sent2': sent2,
'gold': int(np.where(perm == 0)[0][0]),
}
for i, perm_ind in enumerate(perm.tolist()):
series_dict['completion-{:d}'.format(i)] = _detokenize(selected_gens[perm_ind])
df.append(pd.Series(series_dict))
random.seed(123456)
random.shuffle(df)
df = pd.DataFrame(df)
batch_size=1
batch_df = []
for j in range(df.shape[0] // batch_size):
batch_df.append(pd.Series({'{}-{}'.format(i, name):val for i, (_, item) in enumerate(df[j*batch_size:(j+1)*batch_size].iterrows()) for name, val in item.items()}))
batch_df = pd.DataFrame(batch_df)
batch_df.to_csv('batch_df_FULL.csv', index=False)
================================================
FILE: create_swag/generate_candidates/rebalance_dataset_ensemble.py
================================================
"""
The big idea will be to add in the worst scoring one. But we want to use a MULTILAYER PERCEPTRON.
Also not using word features for now
"""
import pickle as pkl
from argparse import ArgumentParser
from copy import deepcopy
import numpy as np
import pandas as pd
import spacy
import torch
from allennlp.data import Instance
from allennlp.data import Token
from allennlp.data import Vocabulary
from allennlp.data.dataset import Batch
from allennlp.data.fields import TextField, SequenceLabelField
from allennlp.data.token_indexers import SingleIdTokenIndexer
from torch.autograd import Variable
from torch.utils.data import Dataset
from torch.utils.data.dataloader import DataLoader
from tqdm import tqdm
from create_swag.lm.config import NUM_FOLDS
from create_swag.generate_candidates.classifiers import Ensemble, LMFeatsModel
######### PARAMETERS
NUM_DISTRACTORS = 9
TRAIN_PERC = 0.8
BATCH_SIZE = 1024
vocab = Vocabulary.from_files('../lm/vocabulary')
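# The 'pos' namespace mixes the 100 most frequent word types with spaCy's
# POS-tag names, so the POS-only BLSTM sees mostly delexicalized endings in
# which only very common words stay as words.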
pos_vocab = Vocabulary(counter={'tokens': {name: i + 9000 for i, name in enumerate(
[vocab.get_token_from_index(x) for x in range(100)] + [pos for pos in spacy.parts_of_speech.NAMES.values() if
len(pos) > 0]
)}})
vocab._token_to_index['pos'] = pos_vocab._token_to_index['tokens']
vocab._index_to_token['pos'] = pos_vocab._index_to_token['tokens']
parser = ArgumentParser(description='which fold to use')
parser.add_argument('-fold', dest='fold', help='Which fold to use. If you say -1 we will use ALL OF THEM!', type=int,
default=0)
fold = parser.parse_args().fold
assert fold in set(range(NUM_FOLDS)) or fold == -1
print("~~~~~~~~~USING SPLIT#{}~~~~~~~~~~~~~".format(fold))
if fold == -1:
assignments = []
assignments = np.load('assignments-pretrained.npy')
# for i in range(5):
# assignments.append(np.load('assignments-fold-{}-19.npy'.format(i)))
# assignments = np.concatenate(assignments)
else:
assignments = np.load('assignments-pretrained-fold{}.npy'.format(fold))
#########################################
# TODO can we do this in parallel?
class AssignmentsDataLoader(Dataset):
# TODO: we might need to load the dataset again on every iteration because memory is a big problem.
def __init__(self, instances, inds, train=True, recompute_assignments=False):
self.instances = instances
self.inds = inds
self.train = train
self.recompute_assignments = recompute_assignments
self.dataloader = DataLoader(dataset=self, batch_size=128 if not recompute_assignments else 16,
shuffle=self.train, num_workers=0,
collate_fn=self.collate, drop_last=self.train)
def collate(self, items_l):
# Assume all of these have the same length
index_l, second_sentences_l, pos_tags_l, feats_l, context_len_l = zip(*items_l)
feats = Variable(torch.FloatTensor(np.stack(feats_l)))
inds = np.array(index_l)
instances = []
for second_sentences, pos_tags, context_len in zip(second_sentences_l, pos_tags_l, context_len_l):
for second_sent, pos_tag in zip(second_sentences, pos_tags):
instance_d = {
'words': TextField([Token(token) for token in ['@@bos@@'] + second_sent + ['@@eos@@']],
{'tokens': SingleIdTokenIndexer(namespace='tokens', lowercase_tokens=True)}),
'postags': TextField([Token(token) for token in ['@@bos@@'] + pos_tag + ['@@eos@@']],
{'pos': SingleIdTokenIndexer(namespace='pos', lowercase_tokens=False)}),
}
instance_d['context_indicator'] = SequenceLabelField([1] * (context_len + 1) +
[0] * (len(second_sent) - context_len + 1),
instance_d['words'])
instances.append(Instance(instance_d))
batch = Batch(instances)
batch.index_instances(vocab)
tensor_dict = batch.as_tensor_dict(for_training=self.train)
# instances_mask = torch.LongTensor(np.stack([np.array([len(sub_g) > 0 for sub_g in g], dtype=np.int64)
# for g in selected_gens]))
return {
'lm_feats': feats,
'inds': inds,
'ending_word_ids': tensor_dict['words']['tokens'].view(inds.shape[0], -1,
tensor_dict['words']['tokens'].size(1)),
'postags_word_ids': tensor_dict['postags']['pos'].view(inds.shape[0], -1,
tensor_dict['postags']['pos'].size(1)),
'ctx_indicator': tensor_dict['context_indicator'].view(inds.shape[0], -1,
tensor_dict['context_indicator'].size(1)),
}
def __len__(self):
return len(self.instances)
def __getitem__(self, index):
"""
:param index: index into the list of examples. ps: they are of the form
sent1: List[str] of tokens for the first sentence
startphrase: List[str] of tokens for the first part of the 2nd sentence
generations: List[List[str]] of tokenized responses. The first one is GT.
postags: List[List[str]] of POSTags and some lexicalization for startphrase+generations.
They're all of the same size (1024)
:return: index
second_sentences List[List[str]] full s2's
pos_tags List[List[str]] full PosTags of S2's
feats [#ex, dim] np array of features
context_len length of context size in second_sentences and pos_tags
"""
this_ex = self.instances[index]
second_sentences = [this_ex['startphrase'] + gen for gen in this_ex['generations']]
context_len = len(this_ex['startphrase'])
feats_vals = this_ex['scores'].values
if np.isinf(feats_vals).any():
feats_vals[np.isinf(feats_vals)] = 1e17
feats = np.column_stack((
np.log(feats_vals),
np.array([len(gen) for gen in this_ex['generations']], dtype=np.float32),
np.ones(feats_vals.shape[0], dtype=np.float32) * context_len,
np.ones(feats_vals.shape[0], dtype=np.float32) * len(this_ex['sent1']),
))
return index, second_sentences, this_ex['postags'], feats, context_len
@classmethod
def splits(cls, assignments):
""" if assignments is none we initialize by looking at topN"""
s_idx = 0
train_instances = []
val_instances = []
test_instances = []
train_indices = []
test_indices = []
print("loading the data!", flush=True)
def _load_from_examples(example_list, offset):
idx = np.random.permutation(len(example_list))
train_idx = np.sort(idx[:int(TRAIN_PERC * idx.shape[0])])
val_idx = np.sort(idx[int(TRAIN_PERC * idx.shape[0]):])
train_indices.append(offset + train_idx)
test_indices.append(offset + val_idx)
for i in tqdm(train_idx):
item_copy = example_list[i]
item_copy['generations'] = [example_list[i]['generations'][j] for j in assignments[i + offset]]
item_copy['postags'] = [example_list[i]['postags'][j] for j in assignments[i + offset]]
item_copy['scores'] = example_list[i]['scores'].iloc[assignments[i + offset]]
train_instances.append(item_copy)
for i in tqdm(val_idx):
item_copy = deepcopy(example_list[i])
item_copy['generations'] = [example_list[i]['generations'][j] for j in assignments[i + offset]]
item_copy['postags'] = [example_list[i]['postags'][j] for j in assignments[i + offset]]
item_copy['scores'] = example_list[i]['scores'].iloc[assignments[i + offset]]
val_instances.append(item_copy)
test_instances.append(example_list[i])
return len(example_list)
folds2use = range(5) if fold == -1 else [fold]
for fold_no in folds2use:
print("loading data from fold {}".format(fold_no), flush=True)
with open('examples{}-of-5.pkl'.format(fold_no), 'rb') as f:
ex_this_fold = pkl.load(f)
s_idx += _load_from_examples(ex_this_fold, s_idx)
train_indices = np.concatenate(train_indices, 0)
test_indices = np.concatenate(test_indices, 0)
return cls(train_instances, train_indices, train=True), cls(val_instances, test_indices, train=False), cls(
test_instances, test_indices, train=False, recompute_assignments=True)
def _iter():
train, val, test = AssignmentsDataLoader.splits(assignments)
model = Ensemble(vocab)
model.cuda()
val_results = model.fit(train.dataloader, val.dataloader, num_epoch=10)
# Now get predictions for the best thing
best_scoring_model_name = pd.Series(val_results).argmax()
print("We will rebalance with {}".format(best_scoring_model_name))
test_results, all_predictions = model.validate(test.dataloader)
n2chs = []
for val_ind, pred in zip(test.inds, all_predictions[best_scoring_model_name]):
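# Rank candidates by how real the classifier thinks they look. idx2rank[0]
# is the gold ending's rank: everything ranked above gold is a potential
# adversarial distractor; kept endings ranked below gold are "easy".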
high2low = (-pred).argsort() # Things at the beginning of this list seem real
idx2rank = high2low.argsort()
cur_assign = assignments[val_ind]
adversarial_examples = high2low[:idx2rank[0]]
adversarial_examples = adversarial_examples[
~np.in1d(adversarial_examples, cur_assign)] # not currently assigned
easy_idxs = high2low[idx2rank[0] + 1:][::-1]
easy_idxs = easy_idxs[np.in1d(easy_idxs, cur_assign)]
# Make the easy indices map according to their position in the assignments
easy_inds = np.argmax(easy_idxs[:, None] == cur_assign[None], 1)
assert np.allclose(cur_assign[easy_inds], easy_idxs)
num2change = min(2, adversarial_examples.shape[0], easy_idxs.shape[0])
n2chs.append(num2change)
# print("adversarial ex we can add {:4d} easy idxs {:4d} were changing {:4d}".format(
# adversarial_examples.shape[0], easy_idxs.shape[0], num2change))
if num2change == 0:
pass
else:
# change a random index
ind_loc = np.random.choice(easy_inds, replace=False, size=num2change)
adv_loc = np.random.choice(adversarial_examples, replace=False, size=num2change)
assignments[val_ind, ind_loc] = adv_loc
# Change the first index over.
# ind_loc = easy_inds[0]
# assignments[val_ind, ind_loc] = adversarial_examples[0]
val_results['n2chs'] = np.mean(n2chs)
return pd.Series(val_results)
all_results = []
for i in range(50):
all_results.append(_iter())
if fold == -1:
pd.DataFrame(all_results).to_csv('ensemble-accs.csv', index=False)
np.save('assignments-{}.npy'.format(i), assignments)
else:
pd.DataFrame(all_results).to_csv('ensemble-accs-fold-{}.csv'.format(fold), index=False)
np.save('assignments-fold-{}-{}.npy'.format(fold, i), assignments)
#
# # To extract some things (maybe this is useful? idk)
# from nltk.tokenize.moses import MosesDetokenizer
# def _extract():
# detokenizer = MosesDetokenizer()
# with open('examples0-of-5.pkl', 'rb') as f:
# ex_this_fold = pkl.load(f)
# assignments = np.load('assignments-4.npy')
#
# selected_examples = []
# for ind, (item, assign_i) in enumerate(zip(tqdm(ex_this_fold), assignments)):
# context = pd.Series([detokenizer.detokenize(item['sent1'], return_str=True)] * len(assign_i))
# completions = pd.Series(
# [detokenizer.detokenize(item['startphrase'] + item['generations'][i], return_str=True) for i in
# assign_i.tolist()])
# dataset = pd.Series([item['dataset']] * len(assign_i))
# ids = pd.Series([item['id']] * len(assign_i))
# duration = pd.Series([item['duration']] * len(assign_i))
# inds = pd.Series([ind] * len(assign_i))
#
# df_this_ex = pd.DataFrame(
# data={'inds': inds, 'selections': assign_i, 'context': context, 'completions': completions,
# 'is_gold': (assign_i == 0),
# 'choice': np.arange(NUM_DISTRACTORS + 1),
# 'dataset': dataset, 'ids': ids, 'duration': duration},
# columns=['inds', 'context', 'completions', 'selections', 'is_gold', 'choice', 'dataset', 'ids', 'duration'])
#
# df_with_extra_feats = pd.concat(
# (df_this_ex, item['scores'].iloc[assign_i].reset_index(drop=True)), axis=1)
# selected_examples.append(df_with_extra_feats)
# return pd.concat(selected_examples, 0).reset_index(drop=True)
#
#
# _extract().to_csv('dataset.csv', sep='\t', index=False)
================================================
FILE: create_swag/generate_candidates/rebalance_dataset_mlp.py
================================================
"""
The big idea will be to add in the worst scoring one. But we want to use a MULTILAYER PERCEPTRON.
Also not using word features for now
"""
import matplotlib as mpl
mpl.use('Agg')
import seaborn as sns
import matplotlib.pyplot as plt
from allennlp.data import Vocabulary
from torch.nn import functional as F
from torch import nn
from torch.autograd import Variable
import pickle as pkl
import numpy as np
from torch import optim
import torch
from tqdm import tqdm, trange
from pytorch_misc import clip_grad_norm, time_batch
import pandas as pd
import os
######### PARAMETERS
NUM_DISTRACTORS = 9
TRAIN_PERC = 0.8
vocab = Vocabulary.from_files('../lm/vocabulary')
all_data = []
if os.path.exists('feats_cached.npy'):
all_data = np.load('feats_cached.npy')
else:
print("loading data. this will take hella time probably!", flush=True)
for fold in trange(5):
print("tryna load {}".format(fold, flush=True))
with open('examples{}-of-5.pkl'.format(fold), 'rb') as f:
examples = pkl.load(f)
for this_ex in examples:
feats_vals = this_ex['scores'].values
if np.isinf(feats_vals).any():
feats_vals[np.isinf(feats_vals)] = 1e17
feats = np.column_stack((
np.log(feats_vals),
np.array([len(gen) for gen in this_ex['generations']], dtype=np.float32),
np.ones(feats_vals.shape[0], dtype=np.float32) * len(this_ex['startphrase']),
np.ones(feats_vals.shape[0], dtype=np.float32) * len(this_ex['sent1']),
))
all_data.append(feats)
all_data = np.stack(all_data)
np.save('feats_cached.npy', all_data)
print("There are {} things".format(all_data.shape[0]), flush=True)
assignments = np.arange(NUM_DISTRACTORS + 1, dtype=np.uint16)[None].repeat(all_data.shape[0], axis=0)
class SimpleCudaLoader(object):
""" silly cuda loader"""
def __init__(self,
indices,
is_train=True,
recompute_assignments=False,
batch_size=512,
):
self.indices = indices
self.is_train = is_train
self.recompute_assignments = recompute_assignments
if self.recompute_assignments:
self.feats = all_data[self.indices]
else:
self.feats = all_data[np.arange(all_data.shape[0])[:, None], assignments][self.indices]
self.batch_size = batch_size
def __iter__(self):
"""
Iterator for a cuda type application.
:return:
"""
# First cuda-ize everything
if self.is_train:
perm_vec = np.random.permutation(self.feats.shape[0])
feats_to_use = self.feats[perm_vec]
inds_to_use = self.indices[perm_vec]
else:
feats_to_use = self.feats
inds_to_use = self.indices
feats_cuda = torch.FloatTensor(feats_to_use).contiguous().cuda(async=True)
for s_idx in range(len(self)):
s_ind = s_idx * self.batch_size
e_ind = min(s_ind + self.batch_size, self.feats.shape[0])
if e_ind < self.batch_size and self.is_train:
# Skip small batch on training
return
yield Variable(feats_cuda[s_ind:e_ind], volatile=not self.is_train), inds_to_use[s_ind:e_ind]
@classmethod
def randomsplits(cls):
"""
Makes some random splits! But keeping in mind the (global) assignments info
:return:
"""
idx = np.random.permutation(all_data.shape[0])
train_idx = idx[:int(TRAIN_PERC * idx.shape[0])]
val_idx = np.sort(idx[int(TRAIN_PERC * idx.shape[0]):])
return cls(train_idx, is_train=True), cls(val_idx, is_train=False), cls(val_idx, recompute_assignments=True, is_train=False),
def __len__(self):
if self.is_train:
return self.feats.shape[0] // self.batch_size
else:
return (self.feats.shape[0] + self.batch_size - 1) // self.batch_size
class MLPModel(nn.Module):
def __init__(self):
super(MLPModel, self).__init__()
# self.mapping = nn.Linear(train_data.feats.shape[2], 1, bias=False)
self.mapping = nn.Sequential(
nn.Linear(all_data.shape[-1], 2048, bias=True),
nn.SELU(),
nn.AlphaDropout(p=0.2),
nn.Linear(2048, 2048, bias=True),
nn.SELU(),
nn.AlphaDropout(p=0.2),
nn.Linear(2048, 1, bias=False),
)
def forward(self, feats):
# Contribution from embeddings
# (batch, #ex, length, dim) -> (batch, #ex, dim)
return self.mapping(feats).squeeze(-1)
def fit(self, data, val_data=None, n_epoch=10):
self.train()
optimizer = optim.Adam(self.parameters(), weight_decay=1e-4, lr=1e-3)
best_val = 0.0
for epoch_num in range(n_epoch):
tr = []
for b, (time_per_batch, batch) in enumerate(time_batch(data, reset_every=100)):
feats, inds_to_use = batch
results = model(feats)
loss = F.cross_entropy(results, Variable(results.data.new(results.size(0)).long().fill_(0)))
summ_dict = {'loss': loss.data[0], 'acc': (results.max(1)[1] == 0).float().mean().data[0]}
tr.append(pd.Series(summ_dict))
optimizer.zero_grad()
loss.backward()
clip_grad_norm(
[(n, p) for n, p in model.named_parameters() if p.grad is not None],
max_norm=1.0, verbose=False, clip=True)
optimizer.step()
mean_stats = pd.concat(tr, axis=1).mean(1)
if val_data is not None:
vp, val_acc = self.predict(val_data)
print("e{:2d}: train loss {:.3f} train acc {:.3f} val acc {:.3f}".format(epoch_num, mean_stats['loss'],
mean_stats['acc'], val_acc), flush=True)
if val_acc < best_val or epoch_num == (n_epoch - 1):
return
best_val = val_acc
def predict(self, data):
self.eval()
all_predictions = []
for b, (time_per_batch, batch) in enumerate(time_batch(data, reset_every=100)):
feats, inds_to_use = batch
all_predictions.append(model(feats).data.cpu().numpy())
all_predictions = np.concatenate(all_predictions, 0)
if data.recompute_assignments:
masked_predictions = all_predictions[np.arange(data.feats.shape[0])[:, None], assignments[data.indices]]
else:
masked_predictions = all_predictions
acc = (masked_predictions.argmax(1) == 0).mean()
mr = (-masked_predictions).argsort(1).argsort(1)[:, 0].mean()
# print("acc is {:.3f}, mean rank is {:.3f}".format(acc, mr))
return all_predictions, acc
accs = []
for iter in trange(100):
train, val, test = SimpleCudaLoader.randomsplits()
model = MLPModel()
model.cuda()
model.fit(train, val)
predictions, acc = model.predict(test)
accs.append(acc)
# Now do some remapping
n2chs = []
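# Same easy-for-adversarial swap as in rebalance_dataset_ensemble.py, here
# driven by the LM-feature MLP alone (the "pretraining" of the assignments).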
for pred, val_ind in zip(predictions, test.indices):
high2low = (-pred).argsort() # Things at the beginning of this list seem real
idx2rank = high2low.argsort()
cur_assign = assignments[val_ind]
adversarial_examples = high2low[:idx2rank[0]]
adversarial_examples = adversarial_examples[
~np.in1d(adversarial_examples, cur_assign)] # not currently assigned
easy_idxs = high2low[idx2rank[0] + 1:][::-1]
easy_idxs = easy_idxs[np.in1d(easy_idxs, cur_assign)]
# Make the easy indices map according to their position in the assignments
easy_inds = np.argmax(easy_idxs[:, None] == cur_assign[None], 1)
assert np.allclose(cur_assign[easy_inds], easy_idxs)
num2change = min(2, adversarial_examples.shape[0], easy_idxs.shape[0])
n2chs.append(num2change)
# print("adversarial ex we can add {:4d} easy idxs {:4d} were changing {:4d}".format(
# adversarial_examples.shape[0], easy_idxs.shape[0], num2change))
if num2change == 0:
# print("Continuing, nothing we can change")
pass
else:
# change a random index
ind_loc = np.random.choice(easy_inds, replace=False, size=num2change)
adv_loc = np.random.choice(adversarial_examples, replace=False, size=num2change)
assignments[val_ind, ind_loc] = adv_loc
# Change the first index over.
# ind_loc = easy_inds[0]
# assignments[val_ind, ind_loc] = adversarial_examples[0]
print("{:.3f} val accuracy: {:.3f} n2chs".format(acc, np.mean(n2chs)), flush=True)
assert np.all(assignments[:, 0] == 0)
# Plot the accuracy as time goes by
np.save('assignments-pretrained.npy', assignments)
start_idx = 0
for fold in trange(5):
with open('examples{}-of-5.pkl'.format(fold), 'rb') as f:
examples = pkl.load(f)
assignments_this_fold = assignments[start_idx:start_idx+len(examples)]
np.save('assignments-pretrained-fold{}.npy'.format(fold), assignments_this_fold)
start_idx += len(examples)
plt.clf()
accuracy = pd.Series(np.array(accs))
df = pd.DataFrame(pd.concat([accuracy,
# accuracy.rolling(window=int(1/(1-TRAIN_PERC)), win_type='gaussian', min_periods=1, center=True).mean(std=2)
accuracy.rolling(window=2 * int(1 / (1 - TRAIN_PERC)), win_type=None, min_periods=1,
center=True).mean()
], 0),
columns=['accuracy'])
df['subject'] = 0
df['series'] = ['accuracy'] * accuracy.shape[0] + ['smoothed accuracy'] * accuracy.shape[0]
df.index.rename('iteration', inplace=True)
df.reset_index(inplace=True)
df.to_csv('pretrain-rebalance-mlp.csv')
sns.set(color_codes=True)
fig = sns.tsplot(time='iteration', value='accuracy', data=df, unit='subject', condition='series').get_figure()
fig.savefig('rebalancing-mlp-acc.pdf')
================================================
FILE: create_swag/generate_candidates/sample_candidates.py
================================================
import pickle as pkl
from argparse import ArgumentParser
from copy import deepcopy
from time import time
import numpy as np
import pandas as pd
import torch
from allennlp.commands.predict import Predictor
from allennlp.data import Vocabulary
from allennlp.models.archival import load_archive
from tqdm import tqdm
from create_swag.lm.config import NUM_FOLDS
from create_swag.lm.simple_bilm import SimpleBiLM
from pytorch_misc import optimistic_restore
from spacy.tokens.doc import Doc
from allennlp.common.util import get_spacy_model
import spacy
BATCH_SIZE = 1024
# ARGUMENTS
parser = ArgumentParser(description='which fold to use')
parser.add_argument('-fold', dest='fold', help='Which fold to use', type=int, default=0)
fold = parser.parse_args().fold
assert fold in set(range(NUM_FOLDS))
print("~~~~~~~~~USING SPLIT#{}~~~~~~~~~~~~~".format(fold))
# SETUP
spacy_model = get_spacy_model("en_core_web_sm", pos_tags=True, parse=False, ner=False)
spacy_model.tokenizer = lambda x: Doc(spacy_model.vocab, x)
archive = load_archive('https://s3-us-west-2.amazonaws.com/allennlp/models/elmo-constituency-parser-2018.03.14.tar.gz')
constituency_predictor = Predictor.from_archive(archive, 'constituency-parser')
# This is hella hacky! but it's tokenized already
constituency_predictor._tokenizer.spacy.tokenizer = lambda x: Doc(constituency_predictor._tokenizer.spacy.vocab, x)
vocab = Vocabulary.from_files('../lm/vocabulary')
pos_vocab = Vocabulary(counter={'tokens': {name: i + 9000 for i, name in enumerate(
[vocab.get_token_from_index(x) for x in range(100)] + [pos for pos in spacy.parts_of_speech.NAMES.values() if
len(pos) > 0]
)}})
model = SimpleBiLM(vocab=vocab, recurrent_dropout_probability=0.2, embedding_dropout_probability=0.2)
optimistic_restore(model,
torch.load('../lm/best-{}.tar'.format(fold))['state_dict']) # <- NEED TO DO THIS ON A FOLD LEVEL
# tokens that conditional generation must never emit
model.register_buffer('invalid_tokens', torch.LongTensor([vocab.get_token_index(tok) for tok in
['@@UNKNOWN@@', '@@PADDING@@', '@@bos@@', '@@eos@@',
'@@NEWLINE@@']]))
model.cuda()
model.eval()
with open('../lm/lm-{}-of-{}.pkl'.format(fold, NUM_FOLDS), 'rb') as f:
stories_tokenized = pkl.load(f)
########
# We want to recurse until we find verb phrases
def find_VP(tree):
"""
Recurse on the tree until we find verb phrases
:param tree: constituency parser result
:return:
"""
# Recursion is annoying because we need to check whether each is a list or not
def _recurse_on_children():
assert 'children' in tree
result = []
for child in tree['children']:
res = find_VP(child)
if isinstance(res, tuple):
result.append(res)
else:
result.extend(res)
return result
if 'VP' in tree['attributes']:
# # Now we'll get greedy and see if we can find something better
# if 'children' in tree and len(tree['children']) > 1:
# recurse_result = _recurse_on_children()
# if all([x[1] in ('VP', 'NP', 'CC') for x in recurse_result]):
# return recurse_result
return [(tree['word'], 'VP')]
# base cases
if 'NP' in tree['attributes']:
return [(tree['word'], 'NP')]
# No children
if not 'children' in tree:
return [(tree['word'], tree['attributes'][0])]
# If a node only has 1 child then we'll have to stick with that
if len(tree['children']) == 1:
return _recurse_on_children()
# try recursing on everything
return _recurse_on_children()
def split_on_final_vp(sentence):
""" Splits a sentence on the final verb phrase"""
try:
res = constituency_predictor.predict_json({'sentence': sentence})
except:
return None, None
res_chunked = find_VP(res['hierplane_tree']['root'])
is_vp = [i for i, (word, pos) in enumerate(res_chunked) if pos == 'VP']
if len(is_vp) == 0:
return None, None
vp_ind = max(is_vp)
not_vp = [token for x in res_chunked[:vp_ind] for token in x[0].split(' ')]
is_vp = [token for x in res_chunked[vp_ind:] for token in x[0].split(' ')]
return not_vp, is_vp
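# E.g. for s2_toks = ['he', 'jumps', 'over', 'the', 'fence', '.'] the final
# verb phrase would typically start at 'jumps', giving startphrase = ['he']
# and endphrase = ['jumps', 'over', 'the', 'fence', '.'] (the exact split
# depends on the constituency parser's output).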
good_examples = []
for (instance, s1_toks, s2_toks, item) in tqdm(stories_tokenized):
eos_bounds = [i + 1 for i, x in enumerate(s1_toks) if x in ('.', '?', '!')]
if len(eos_bounds) == 0:
s1_toks.append('.') # Just in case there's no EOS indicator.
context_len = len(s1_toks)
if context_len < 6 or context_len > 100:
print("skipping on {} (too short or long)".format(' '.join(s1_toks + s2_toks)))
continue
# Something I should have done: make sure that there aren't multiple periods, etc. in s2 or in the middle
eos_bounds_s2 = [i + 1 for i, x in enumerate(s2_toks) if x in ('.', '?', '!')]
if len(eos_bounds_s2) == 0:
s2_toks.append('.')
elif len(eos_bounds_s2) > 1 or max(eos_bounds_s2) != len(s2_toks):
continue
# Now split on the VP
startphrase, endphrase = split_on_final_vp(s2_toks)
if startphrase is None or len(startphrase) == 0 or len(endphrase) < 5 or len(endphrase) > 25:
print("skipping on {}->{},{}".format(' '.join(s1_toks + s2_toks), startphrase, endphrase), flush=True)
continue
# if endphrase contains unk then it's hopeless
if any(vocab.get_token_index(tok.lower()) == vocab.get_token_index(vocab._oov_token) for tok in endphrase):
print("skipping on {} (unk!)".format(' '.join(s1_toks + s2_toks)))
continue
context = s1_toks + startphrase
tic = time()
gens0, fwd_scores, ctx_scores = model.conditional_generation(context, gt_completion=endphrase,
batch_size=2 * BATCH_SIZE,
max_gen_length=25)
if len(gens0) < BATCH_SIZE:
print("Couldnt generate enough candidates so skipping")
continue
gens0 = gens0[:BATCH_SIZE]
fwd_scores = fwd_scores[:BATCH_SIZE]
# Now get the backward scores.
full_sents = [context + gen for gen in gens0] # NOTE: #1 is GT
result_dict = model(model.batch_to_ids(full_sents), use_forward=False, use_reverse=True, compute_logprobs=True)
ending_lengths = (fwd_scores < 0).sum(1)
ending_lengths_float = ending_lengths.astype(np.float32)
rev_scores = result_dict['reverse_logprobs'].data.cpu().numpy()
forward_logperp_ending = -fwd_scores.sum(1) / ending_lengths_float
reverse_logperp_ending = -rev_scores[:, context_len:].sum(1) / ending_lengths_float
forward_logperp_begin = -ctx_scores.mean()
reverse_logperp_begin = -rev_scores[:, :context_len].mean(1)
eos_logperp = -fwd_scores[np.arange(fwd_scores.shape[0]), ending_lengths - 1]
print("Time elapsed {:.3f}".format(time() - tic), flush=True)
scores = np.exp(np.column_stack((
forward_logperp_ending,
reverse_logperp_ending,
reverse_logperp_begin,
eos_logperp,
np.ones(forward_logperp_ending.shape[0], dtype=np.float32) * forward_logperp_begin,
)))
# PRINTOUT
low2high = scores[:, 2].argsort()
print("\n\n Dataset={} ctx: {} (perp={:.3f})\n~~~\n".format(item['dataset'], ' '.join(context),
np.exp(forward_logperp_begin)), flush=True)
for i, ind in enumerate(low2high.tolist()):
gen_i = ' '.join(gens0[ind])
if (ind == 0) or (i < 128):
print("{:3d}/{:4d}) ({}, end|ctx:{:5.1f} end:{:5.1f} ctx|end:{:5.1f} EOS|(ctx, end):{:5.1f}) {}".format(
i, len(gens0), 'GOLD' if ind == 0 else ' ', *scores[ind][:-1], gen_i), flush=True)
gt_score = low2high.argsort()[0]
item_full = deepcopy(item)
item_full['sent1'] = s1_toks
item_full['startphrase'] = startphrase
item_full['context'] = context
item_full['generations'] = gens0
item_full['postags'] = [ # parse real fast
[x.orth_.lower() if pos_vocab.get_token_index(x.orth_.lower()) != 1 else x.pos_ for x in y]
for y in spacy_model.pipe([startphrase + gen for gen in gens0], batch_size=BATCH_SIZE)]
item_full['scores'] = pd.DataFrame(data=scores, index=np.arange(scores.shape[0]),
columns=['end-from-ctx', 'end', 'ctx-from-end', 'eos-from-ctxend', 'ctx'])
good_examples.append(item_full)
with open('examples{}-of-{}.pkl'.format(fold, NUM_FOLDS), 'wb') as f:
pkl.dump(good_examples, f)
================================================
FILE: create_swag/generate_candidates/sample_candidates.sh
================================================
#!/usr/bin/env bash
export CUDA_VISIBLE_DEVICES=$1
echo "Sampling the candidates. remember to do this do this for all of the GPUS!"
python sample_candidates.py -fold $1
================================================
FILE: create_swag/lm/README.md
================================================
# LM
Contains hopefully everything you need to run the LM
# Setup
0. Update the config file with where your pretraining text is.
1. Create the vocabulary by running ```python load_data.py``` or copy things around manually, then do the pretraining using `pretrain_lm.py`. Or, you can access my pretrained checkpoint [here](https://drive.google.com/file/d/1Ik7cbGs-wbAKKCeuYA8Uhe5O3pHJwHcj/view?usp=sharing)
2. To finetune on activitynet captions and LSMDC do
```
export PYTHONPATH=/home/rowan/code/swagaf
export CUDA_VISIBLE_DEVICES=0
nohup python train_lm.py -fold 0 > fold_0_log.txt &
export CUDA_VISIBLE_DEVICES=1
nohup python train_lm.py -fold 1 > fold_1_log.txt &
export CUDA_VISIBLE_DEVICES=2
nohup python train_lm.py -fold 2 > fold_2_log.txt &
```
And accordingly on the other machine
```
export CUDA_VISIBLE_DEVICES=0
nohup python train_lm.py -fold 3 > fold_3_log.txt &
export CUDA_VISIBLE_DEVICES=1
nohup python train_lm.py -fold 4 > fold_4_log.txt &
```
One of my checkpoints (for fold 0) is [here](https://drive.google.com/file/d/1J9QPJTIOIDR4V_zGB8ejilWAXXkxrogC/view?usp=sharing)
3. Pick the best checkpoints, then go generate stuff!!
================================================
FILE: create_swag/lm/__init__.py
================================================
================================================
FILE: create_swag/lm/config.py
================================================
# Set this to how many LMs you want to train on diff splits of the data
NUM_FOLDS = 5
# what text to train on (right now it's toronto books)
PRETRAIN_TXT = '/home/mbforbes/repos/ari-holtzman/learning_to_write/data/corpora/tbooks/train.txt'
================================================
FILE: create_swag/lm/load_data.py
================================================
# First make the vocabulary, etc.
import os
import pickle as pkl
import random
import simplejson as json
from allennlp.common.util import get_spacy_model
from allennlp.data import Instance
from allennlp.data import Token
from allennlp.data import Vocabulary
from allennlp.data.dataset import Batch
from allennlp.data.fields import TextField
from allennlp.data.token_indexers import SingleIdTokenIndexer
from allennlp.data.token_indexers.elmo_indexer import ELMoTokenCharactersIndexer
from torch.utils.data import Dataset
from torch.utils.data.dataloader import DataLoader
from tqdm import tqdm
from raw_data.events import DATA_PATH
from pytorch_misc import pairwise
from create_swag.lm.config import NUM_FOLDS
def load_lm_data(fold=None, mode='train'):
"""
Turns the sequential data into instances.
:param split:
:return:
"""
# Get or make vocab
spacy_model = get_spacy_model("en_core_web_sm", pos_tags=False, parse=False, ner=False)
if os.path.exists('vocabulary'):
print("Loading cached vocab. caution if you're building the dataset again!!!!", flush=True)
vocab = Vocabulary.from_files('vocabulary')
with open(os.path.join(DATA_PATH, 'events-3.json'), 'r') as f:
lm_data = json.load(f)
lm_data = [data_item for s in ('train', 'val', 'test') for data_item in lm_data[s]]
else:
assert fold is None
with open(os.path.join(DATA_PATH, 'events-3.json'), 'r') as f:
lm_data = json.load(f)
lm_data = [data_item for s in ('train', 'val', 'test') for data_item in lm_data[s]]
# Manually doing this because I don't want to double count things
vocab = Vocabulary.from_instances(
[Instance({'story': TextField(
[Token(x) for x in ['@@bos@@'] + [x.orth_ for x in spacy_model(sent)] + ['@@eos@@']], token_indexers={
'tokens': SingleIdTokenIndexer(namespace='tokens', lowercase_tokens=True)})}) for data_item in
lm_data for sent in
data_item['sentences']], min_count={'tokens': 3})
vocab.get_index_to_token_vocabulary('tokens')
vocab.save_to_files('vocabulary')
print("VOCABULARY HAS {} ITEMS".format(vocab.get_vocab_size(namespace='tokens')))
if all([os.path.exists('lm-{}-of-{}.pkl'.format(i, NUM_FOLDS)) for i in range(NUM_FOLDS)]):
print("LOADING CACHED DATASET", flush=True)
if mode == 'val':
with open('lm-{}-of-{}.pkl'.format(fold, NUM_FOLDS), 'rb') as f:
print("Loading split{} for {}".format(fold, mode))
instances = pkl.load(f)
else:
instances = []
for other_fold in range(NUM_FOLDS):
if other_fold != fold:
with open('lm-{}-of-{}.pkl'.format(other_fold, NUM_FOLDS), 'rb') as f:
print("Loading split{} for {}".format(other_fold, mode))
instances += pkl.load(f)
return instances, vocab
print("MAKING THE DATASET", flush=True)
assert fold is None
for item in tqdm(lm_data):
item['sentences_tokenized'] = [[st.orth_ for st in spacy_model(sent)] for sent in item['sentences']]
def _to_instances(data):
# flatten this
instances = []
for item in data:
for s1, s2 in pairwise(item['sentences_tokenized']):
instances.append((
Instance({'story': TextField([Token(x) for x in ['@@bos@@'] + s1 + s2 + ['@@eos@@']],
token_indexers={
'tokens': SingleIdTokenIndexer(namespace='tokens',
lowercase_tokens=True)})}),
s1,
s2,
item,
))
return instances
random.seed(123456)
random.shuffle(lm_data)
all_sets = []
for fold_ in range(NUM_FOLDS):
val_set = _to_instances(lm_data[len(lm_data) * fold_ // NUM_FOLDS:len(lm_data) * (fold_ + 1) // NUM_FOLDS])
with open('lm-{}-of-{}.pkl'.format(fold_, NUM_FOLDS), 'wb') as f:
pkl.dump(val_set, f)
all_sets.extend(val_set)
return all_sets, vocab
class RawPassages(Dataset):
def __init__(self, fold, mode):
self.mode = mode
self.fold = fold
self.instances, self.vocab = load_lm_data(fold=self.fold, mode=self.mode)
self.dataloader = DataLoader(dataset=self, batch_size=32,
shuffle=self.mode == 'train', num_workers=0,
collate_fn=self.collate, drop_last=self.mode == 'train')
self.indexer = ELMoTokenCharactersIndexer()
def collate(self, instances_l):
batch = Batch([x[0] for x in instances_l])
batch.index_instances(self.vocab)
batch_dict = {k: v['tokens'] for k, v in batch.as_tensor_dict().items()}
batch_dict['story_tokens'] = [instance[0].fields['story'].tokens for instance in instances_l]
batch_dict['story_full'] = [x[1] + x[2] for x in instances_l]
batch_dict['items'] = [x[3] for x in instances_l]
return batch_dict
def __len__(self):
return len(self.instances)
def __getitem__(self, index):
"""
:param index:
:return: * raw rocstories
* entities
* entity IDs + sentences
* Instance. to print use r3.fields['verb_phrase'].field_list[5].tokens
"""
return self.instances[index]
@classmethod
def splits(cls, fold):
return cls(fold, mode='train'), cls(fold, mode='val')
if __name__ == '__main__':
instances, vocab = load_lm_data()
# train, val = RawPassages.splits()
# for item in train.dataloader:
# for story in item['story_tokens']:
# tok_text = [x.text.lower() for x in story]
# remapped_text = [vocab.get_token_from_index(vocab.get_token_index(x)) for x in tok_text]
# print('({}) {} -> {}'.format('D' if tok_text != remapped_text else ' ',
# ' '.join(tok_text), ' '.join(remapped_text)), flush=True)
================================================
FILE: create_swag/lm/pretrain_lm.py
================================================
import os
import pandas as pd
import torch
from allennlp.data import Instance
from allennlp.data import Token
from allennlp.data import Vocabulary
from allennlp.data.dataset import Batch
from allennlp.data.fields import TextField
from allennlp.data.token_indexers import SingleIdTokenIndexer
from allennlp.data.token_indexers.elmo_indexer import ELMoTokenCharactersIndexer
from torch import optim
from create_swag.lm.simple_bilm import SimpleBiLM
from raw_data.events import _postprocess
from pytorch_misc import clip_grad_norm, print_para, time_batch
from create_swag.lm.config import PRETRAIN_TXT
assert os.path.exists('../vocabulary')
vocab = Vocabulary.from_files('../vocabulary')
indexer = ELMoTokenCharactersIndexer()
def batcher(inp_list):
""" batches, asumming everything is padded and tokenized."""
instances = [Instance({'story': TextField([Token(x) for x in ['@@bos@@'] + subl + ['@@eos@@']], token_indexers={
'tokens': SingleIdTokenIndexer(namespace='tokens', lowercase_tokens=True), 'char_encoding': indexer}),
}) for subl in inp_list]
batch = Batch(instances)
batch.index_instances(vocab)
result_dict = batch.as_tensor_dict()['story']
result_dict['story'] = inp_list
return result_dict
def data_runner(start_point=0, minlength=4):
print("starting at {}".format(start_point))
with open(PRETRAIN_TXT, 'r') as f:
f.seek(start_point)
f.readline() # Clear the partial line
for i, line in enumerate(f):
yield _postprocess(line)
def _sample_a_good_pair(gen, seq_length, min_length=3):
cur_status = []
eos_idxs = [i for i, x in enumerate(cur_status) if x in ('.', '!', '?')]
while len(eos_idxs) < 2:
        cur_status.extend([x for x in next(gen).split(' ') if x != '\n'])  # '!=' not 'is not': compare string values
eos_idxs = [i for i, x in enumerate(cur_status) if x in ('.', '!', '?')]
if eos_idxs[1] >= seq_length:
return _sample_a_good_pair(gen, seq_length, min_length=min_length)
elif (eos_idxs[0] < min_length) or (eos_idxs[1] - eos_idxs[0]) < min_length: # Too short
return _sample_a_good_pair(gen, seq_length, min_length=min_length)
return cur_status[:eos_idxs[1] + 1]
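
# Illustrative sketch (not in the original repo; names are hypothetical):
# _sample_a_good_pair keeps pulling lines from the generator until the token
# stream contains two sentence enders, then returns everything up to and
# including the second one, resampling if either sentence is too short or the
# pair overruns seq_length.
def _demo_good_pair():
    fake_lines = iter(['the man sits down .', 'he opens a book . then he reads .'])
    pair = _sample_a_good_pair(fake_lines, seq_length=50, min_length=3)
    # pair == ['the', 'man', 'sits', 'down', '.', 'he', 'opens', 'a', 'book', '.']
    return pair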
def looped_data_runner(batch_size=128, seq_length=50):
offset = 0
TOTAL_BYTES_TRAIN = 4343022454
generators = [data_runner(start_point=TOTAL_BYTES_TRAIN * i // batch_size + offset, minlength=0) for i in
range(batch_size)]
while True:
for g_i, gen in enumerate(generators):
yield _sample_a_good_pair(gen, seq_length=seq_length, min_length=5)
def bucketed_data_runner(batch_size=64, seq_length=50):
length2batch = [[] for i in range(seq_length + 1)]
# Get diverse samples
for batch in looped_data_runner(batch_size=128, seq_length=seq_length):
length2batch[len(batch)].append(batch)
if len(length2batch[len(batch)]) >= batch_size:
# print("Yielding now of size {}".format(len(batch)))
yield batcher(length2batch[len(batch)])
length2batch[len(batch)] = []
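
# Illustrative sketch (not in the original repo; names are hypothetical): the
# bucketing above groups samples by exact token length so each yielded batch
# can be stacked with no padding. The same idea over any stream of token lists:
def _demo_bucketing(stream, batch_size=2, max_len=50):
    buckets = [[] for _ in range(max_len + 1)]
    for sample in stream:
        buckets[len(sample)].append(sample)
        if len(buckets[len(sample)]) >= batch_size:
            yield buckets[len(sample)]  # every member has the same length
            buckets[len(sample)] = []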
# Dataloader
model = SimpleBiLM(vocab=vocab, recurrent_dropout_probability=0.2, embedding_dropout_probability=0.2)
model.cuda()
tr = []
model.train()
for epoch_num in range(2):
if epoch_num == 0:
optimizer = optim.Adam([p for p in model.parameters() if p.requires_grad], weight_decay=1e-6, lr=1e-3)
else:
optimizer = optim.Adam([p for p in model.parameters() if p.requires_grad], weight_decay=1e-6, lr=1e-4)
print(print_para(model))
for b, (time_per_batch, batch) in enumerate(time_batch(bucketed_data_runner())):
batch['tokens'] = batch['tokens'].cuda(async=True)
model_forward = model(batch['tokens'])
losses = {key: model_forward[key] for key in ['forward_loss', 'reverse_loss']}
tr.append(pd.Series({k: v.data[0] for k, v in losses.items()}))
loss = sum(losses.values())
optimizer.zero_grad()
loss.backward()
if b % 100 == 0 and b > 0:
df_cat = pd.concat(tr[-100:], axis=1).mean(1)
print("b{:8d} {:.3f}s/batch, fwd loss {:.3f} rev loss {:.3f} ".format(b, time_per_batch,
df_cat['forward_loss'],
df_cat['reverse_loss']), flush=True)
clip_grad_norm(
[(n, p) for n, p in model.named_parameters() if p.grad is not None],
max_norm=1.0, verbose=b % 1000 == 1, clip=True)
optimizer.step()
if b % 10000 == 0 and b > 0:
torch.save({'state_dict': model.state_dict()}, 'e{}-tbooks-pretrained-ckpt-{}.tar'.format(epoch_num, b))
================================================
FILE: create_swag/lm/simple_bilm.py
================================================
"""
A wrapper around AI2's ELMo LM to allow for an LM objective...
"""
from typing import Optional, Tuple
from typing import Union, List, Dict
import numpy as np
import torch
from allennlp.common.checks import ConfigurationError
from allennlp.data import Token, Vocabulary, Instance
from allennlp.data.dataset import Batch
from allennlp.data.fields import TextField
from allennlp.data.token_indexers import SingleIdTokenIndexer
from allennlp.modules.augmented_lstm import AugmentedLstm
from allennlp.modules.seq2seq_encoders.pytorch_seq2seq_wrapper import PytorchSeq2SeqWrapper
from allennlp.nn.util import sequence_cross_entropy_with_logits
from torch.autograd import Variable
from torch.nn import functional as F
from torch.nn.utils.rnn import PackedSequence
def _de_duplicate_generations(generations):
"""
Given a list of list of strings, filter out the ones that are duplicates. and return an idx corresponding
to the good ones
:param generations:
:return:
"""
dup_set = set()
unique_idx = []
for i, gen_i in enumerate(generations):
gen_i_str = ' '.join(gen_i)
if gen_i_str not in dup_set:
unique_idx.append(i)
dup_set.add(gen_i_str)
return [generations[i] for i in unique_idx], np.array(unique_idx)
class StackedLstm(torch.nn.Module):
"""
A stacked LSTM.
Parameters
----------
input_size : int, required
The dimension of the inputs to the LSTM.
hidden_size : int, required
The dimension of the outputs of the LSTM.
num_layers : int, required
The number of stacked LSTMs to use.
recurrent_dropout_probability: float, optional (default = 0.0)
The dropout probability to be used in a dropout scheme as stated in
`A Theoretically Grounded Application of Dropout in Recurrent Neural Networks
<https://arxiv.org/abs/1512.05287>`_ .
use_input_projection_bias : bool, optional (default = True)
Whether or not to use a bias on the input projection layer. This is mainly here
for backwards compatibility reasons and will be removed (and set to False)
in future releases.
Returns
-------
output_accumulator : PackedSequence
The outputs of the interleaved LSTMs per timestep. A tensor of shape
(batch_size, max_timesteps, hidden_size) where for a given batch
element, all outputs past the sequence length for that batch are
zero tensors.
"""
def __init__(self,
input_size: int,
hidden_size: int,
num_layers: int,
recurrent_dropout_probability: float = 0.0,
use_highway: bool = True,
use_input_projection_bias: bool = True,
go_forward: bool = True) -> None:
super(StackedLstm, self).__init__()
# Required to be wrapped with a :class:`PytorchSeq2SeqWrapper`.
self.input_size = input_size
self.hidden_size = hidden_size
self.num_layers = num_layers
layers = []
lstm_input_size = input_size
for layer_index in range(num_layers):
layer = AugmentedLstm(lstm_input_size, hidden_size, go_forward,
recurrent_dropout_probability=recurrent_dropout_probability,
use_highway=use_highway,
use_input_projection_bias=use_input_projection_bias)
lstm_input_size = hidden_size
self.add_module('layer_{}'.format(layer_index), layer)
layers.append(layer)
self.lstm_layers = layers
def forward(self, # pylint: disable=arguments-differ
inputs: PackedSequence,
initial_state: Optional[Tuple[torch.Tensor, torch.Tensor]] = None):
"""
Parameters
----------
inputs : ``PackedSequence``, required.
A batch first ``PackedSequence`` to run the stacked LSTM over.
initial_state : Tuple[torch.Tensor, torch.Tensor], optional, (default = None)
A tuple (state, memory) representing the initial hidden state and memory
of the LSTM. Each tensor has shape (1, batch_size, output_dimension).
Returns
-------
output_sequence : PackedSequence
The encoded sequence of shape (batch_size, sequence_length, hidden_size)
final_states: torch.Tensor
The per-layer final (state, memory) states of the LSTM, each with shape
(num_layers, batch_size, hidden_size).
"""
if not initial_state:
hidden_states = [None] * len(self.lstm_layers)
elif initial_state[0].size()[0] != len(self.lstm_layers):
raise ConfigurationError("Initial states were passed to forward() but the number of "
"initial states does not match the number of layers.")
else:
hidden_states = list(zip(initial_state[0].split(1, 0),
initial_state[1].split(1, 0)))
output_sequence = inputs
final_states = []
for i, state in enumerate(hidden_states):
layer = getattr(self, 'layer_{}'.format(i))
# The state is duplicated to mirror the Pytorch API for LSTMs.
output_sequence, final_state = layer(output_sequence, state)
final_states.append(final_state)
final_state_tuple = tuple(torch.cat(state_list, 0) for state_list in zip(*final_states))
return output_sequence, final_state_tuple
class SimpleBiLM(torch.nn.Module):
def __init__(self,
vocab: Vocabulary,
recurrent_dropout_probability: float = 0.0,
embedding_dropout_probability: float = 0.0,
input_size=512,
hidden_size=512) -> None:
"""
:param options_file: for initializing elmo BiLM
:param weight_file: for initializing elmo BiLM
:param requires_grad: Whether or not to finetune the LSTM layers
:param recurrent_dropout_probability: recurrent dropout to add to LSTM layers
"""
super(SimpleBiLM, self).__init__()
self.forward_lm = PytorchSeq2SeqWrapper(StackedLstm(
input_size=input_size, hidden_size=hidden_size, num_layers=2, go_forward=True,
recurrent_dropout_probability=recurrent_dropout_probability,
use_input_projection_bias=False, use_highway=True), stateful=True)
self.reverse_lm = PytorchSeq2SeqWrapper(StackedLstm(
input_size=input_size, hidden_size=hidden_size, num_layers=2, go_forward=False,
recurrent_dropout_probability=recurrent_dropout_probability,
use_input_projection_bias=False, use_highway=True), stateful=True)
# This will also be the encoder
self.decoder = torch.nn.Linear(512, vocab.get_vocab_size(namespace='tokens'))
self.vocab = vocab
self.register_buffer('eos_tokens', torch.LongTensor([vocab.get_token_index(tok) for tok in
['.', '!', '?', '@@UNKNOWN@@', '@@PADDING@@', '@@bos@@',
'@@eos@@']]))
self.register_buffer('invalid_tokens', torch.LongTensor([vocab.get_token_index(tok) for tok in
['@@UNKNOWN@@', '@@PADDING@@', '@@bos@@', '@@eos@@',
'@@NEWLINE@@']]))
self.embedding_dropout_probability = embedding_dropout_probability
def embed_words(self, words):
assert words.dim() == 2
if not self.training:
return F.embedding(words, self.decoder.weight)
# Embedding dropout
vocab_size = self.decoder.weight.size(0)
mask = Variable(
self.decoder.weight.data.new(vocab_size, 1).bernoulli_(1 - self.embedding_dropout_probability).expand_as(
self.decoder.weight) / (1 - self.embedding_dropout_probability))
padding_idx = 0
embeds = self.decoder._backend.Embedding.apply(words, mask * self.decoder.weight, padding_idx, None,
2, False, False)
return embeds
def timestep_to_ids(self, timestep_tokenized: List[str]):
""" Just a single timestep (so dont add BOS or EOS"""
return Variable(torch.LongTensor([self.vocab.get_token_index(x) for x in timestep_tokenized])[:, None],
volatile=not self.training).cuda(async=True)
def batch_to_ids(self, stories_tokenized: List[List[str]]):
"""
Simple wrapper around _elmo_batch_to_ids
:param batch: A list of tokenized sentences.
:return: A tensor of padded character ids.
"""
batch = Batch([Instance(
{'story': TextField([Token('@@bos@@')] + [Token(x) for x in story] + [Token('@@eos@@')],
token_indexers={
'tokens': SingleIdTokenIndexer(namespace='tokens', lowercase_tokens=True)})})
for story in stories_tokenized])
batch.index_instances(self.vocab)
words = {k: v['tokens'] for k, v in batch.as_tensor_dict(for_training=self.training).items()}['story'].cuda(
async=True)
return words
def conditional_generation(self, context, gt_completion, batch_size=128, max_gen_length=25,
same_length_as_gt=False):
"""
Generate conditoned on the context. While we're at it we'll score the GT going forwards
:param context: List of tokens to condition on. We'll add the BOS marker to it
:param gt_completion: The GT completion
:param batch_size: Number of sentences to generate
:param max_gen_length: Max length for genertaed sentences (irrelvant if same_length_as_gt=True)
:param same_length_as_gt: set to True if you want all the sents to have the same length as the gt_completion
:return:
"""
# Forward condition on context, then repeat to be the right batch size:
# (layer_index, batch_size, fwd hidden dim)
forward_logprobs = self(self.batch_to_ids([context]), use_forward=True,
use_reverse=False, compute_logprobs=True)['forward_logprobs']
self.forward_lm._states = tuple(x.repeat(1, batch_size, 1).contiguous() for x in self.forward_lm._states)
# Each item will be (token, score)
generations = [[(context[-1], 0.0)] for i in range(batch_size)]
mask = Variable(forward_logprobs.data.new(batch_size).long().fill_(1))
gt_completion_padded = [self.vocab.get_token_index(gt_token) for gt_token in
[x.lower() for x in gt_completion] + ['@@PADDING@@'] * (
max_gen_length - len(gt_completion))]
for index, gt_token_ind in enumerate(gt_completion_padded):
embeds = self.embed_words(self.timestep_to_ids([gen[-1][0] for gen in generations]))
next_dists = F.softmax(self.decoder(self.forward_lm(embeds, mask[:, None]))[:, 0], 1).data
            # Perform hacky stuff on the distribution (disallowing BOS, EOS, that sorta thing)
sampling_probs = next_dists.clone()
sampling_probs[:, self.invalid_tokens] = 0.0
            # Force row 0 to follow the ground-truth completion, so the GT gets scored alongside the samples
            sampling_probs[0].zero_()
            sampling_probs[0, gt_token_ind] = 1
if same_length_as_gt:
if index == (len(gt_completion) - 1):
sampling_probs.zero_()
sampling_probs[:, gt_token_ind] = 1
else:
sampling_probs[:, self.eos_tokens] = 0.0
sampling_probs = sampling_probs / sampling_probs.sum(1, keepdim=True)
next_preds = torch.multinomial(sampling_probs, 1).squeeze(1)
next_scores = np.log(next_dists[
torch.arange(0, next_dists.size(0),
out=mask.data.new(next_dists.size(0))),
next_preds,
].cpu().numpy())
for i, (gen_list, pred_id, score_i, mask_i) in enumerate(
zip(generations, next_preds.cpu().numpy(), next_scores, mask.data.cpu().numpy())):
if mask_i:
gen_list.append((self.vocab.get_token_from_index(pred_id), score_i))
is_eos = (next_preds[:, None] == self.eos_tokens[None]).max(1)[0]
mask[is_eos] = 0
if mask.sum().data[0] == 0:
break
generation_scores = np.zeros((len(generations), max([len(g) - 1 for g in generations])), dtype=np.float32)
for i, gen in enumerate(generations):
for j, (_, v) in enumerate(gen[1:]):
generation_scores[i, j] = v
generation_toks, idx = _de_duplicate_generations([[tok for (tok, score) in gen[1:]] for gen in generations])
return generation_toks, generation_scores[idx], forward_logprobs.data.cpu().numpy()
def _chunked_logsoftmaxes(self, activation, word_targets, chunk_size=256):
"""
do the softmax in chunks so memory doesnt explode
:param activation: [batch, T, dim]
:param targets: [batch, T] indices
:param chunk_size: you might need to tune this based on GPU specs
:return:
"""
all_logprobs = []
num_chunks = (activation.size(0) - 1) // chunk_size + 1
for activation_chunk, target_chunk in zip(torch.chunk(activation, num_chunks, dim=0),
torch.chunk(word_targets, num_chunks, dim=0)):
assert activation_chunk.size()[:2] == target_chunk.size()[:2]
targets_flat = target_chunk.view(-1)
time_indexer = torch.arange(0, targets_flat.size(0),
out=target_chunk.data.new(targets_flat.size(0))) % target_chunk.size(1)
batch_indexer = torch.arange(0, targets_flat.size(0),
out=target_chunk.data.new(targets_flat.size(0))) / target_chunk.size(1)
all_logprobs.append(F.log_softmax(self.decoder(activation_chunk), 2)[
batch_indexer, time_indexer, targets_flat].view(*target_chunk.size()))
return torch.cat(all_logprobs, 0)
def forward(self, words: torch.Tensor, use_forward=True, use_reverse=True, compute_logprobs=False) -> Dict[
str, Union[torch.Tensor, List[torch.Tensor]]]:
"""
use this for training the LM
:param words: [batch_size, N] words. assuming you're starting with BOS and ending with EOS here
:return:
"""
encoded_inputs = self.embed_words(words)
mask = (words != 0).long()[:, 2:]
word_targets = words[:, 1:-1].contiguous()
result_dict = {
'mask': mask,
'word_targets': word_targets,
}
# TODO: try to reduce duplicate code here
if use_forward:
self.forward_lm.reset_states()
forward_activation = self.forward_lm(encoded_inputs[:, :-2], mask)
if compute_logprobs:
# being memory efficient here is critical if the input tensors are large
result_dict['forward_logprobs'] = self._chunked_logsoftmaxes(forward_activation,
word_targets) * mask.float()
else:
result_dict['forward_logits'] = self.decoder(forward_activation)
result_dict['forward_loss'] = sequence_cross_entropy_with_logits(result_dict['forward_logits'],
word_targets,
mask)
if use_reverse:
self.reverse_lm.reset_states()
reverse_activation = self.reverse_lm(encoded_inputs[:, 2:], mask)
if compute_logprobs:
result_dict['reverse_logprobs'] = self._chunked_logsoftmaxes(reverse_activation,
word_targets) * mask.float()
else:
result_dict['reverse_logits'] = self.decoder(reverse_activation)
result_dict['reverse_loss'] = sequence_cross_entropy_with_logits(result_dict['reverse_logits'],
word_targets,
mask)
return result_dict
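
# Illustrative usage sketch (not in the original repo; the helper name is
# hypothetical, and a GPU plus a loaded vocabulary are assumed): scoring a batch
# of tokenized sentences reduces to one forward pass with compute_logprobs=True.
def _demo_score_sentences(model: SimpleBiLM, sentences: List[List[str]]):
    out = model(model.batch_to_ids(sentences), compute_logprobs=True)
    # [batch, T] per-token log-probabilities, already zeroed past each sequence's length
    return out['forward_logprobs'], out['reverse_logprobs']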
================================================
FILE: create_swag/lm/train_lm.py
================================================
import os
from argparse import ArgumentParser
import numpy as np
import pandas as pd
import torch
from torch import optim
from torch.optim.lr_scheduler import StepLR
from tqdm import tqdm
from create_swag.lm.config import NUM_FOLDS
from create_swag.lm.load_data import load_lm_data, RawPassages
from create_swag.lm.simple_bilm import SimpleBiLM
from pytorch_misc import clip_grad_norm, optimistic_restore, print_para, time_batch
if not os.path.exists('vocabulary') or not all(
[os.path.exists('lm-{}-of-{}.pkl'.format(i, NUM_FOLDS)) for i in range(NUM_FOLDS)]):
print("MAKING THE VOCABULARY / DATA AGAIN", flush=True)
_, vocab = load_lm_data(None)
# ARGUMENTS
parser = ArgumentParser(description='which fold to use')
parser.add_argument('-fold', dest='fold', help='Which fold to use', type=int, default=0)
fold = parser.parse_args().fold
assert fold in set(range(NUM_FOLDS))
if not os.path.exists('checkpoints-{}'.format(fold)):
os.mkdir('checkpoints-{}'.format(fold))
print("~~~~~~~~~USING SPLIT#{}~~~~~~~~~~~~~".format(fold))
train, val = RawPassages.splits(fold=fold)
model = SimpleBiLM(
vocab=train.vocab,
recurrent_dropout_probability=0.2,
embedding_dropout_probability=0.2,
)
model.cuda()
optimistic_restore(model, torch.load('e1-tbooks-pretrained-ckpt-370000.tar')['state_dict'])
optimizer = optim.Adam([p for p in model.parameters() if p.requires_grad], weight_decay=1e-6, lr=1e-3)
# scheduler = ReduceLROnPlateau(optimizer, 'min', patience=3, factor=0.1,
# verbose=True, threshold=0.0001, threshold_mode='abs', cooldown=1)
scheduler = StepLR(optimizer, step_size=5, gamma=0.1)
print(print_para(model))
for epoch_num in range(15):
tr = []
model.train()
for b, (time_per_batch, batch) in enumerate(time_batch(train.dataloader)):
# batch['char_encoding'] = batch['char_encoding'].cuda(async=True)
batch['story'] = batch['story'].cuda(async=True)
model_forward = model(batch['story'])
losses = {key: model_forward[key] for key in ['forward_loss', 'reverse_loss']}
tr.append(pd.Series({k: v.data[0] for k, v in losses.items()}))
optimizer.zero_grad()
loss = sum(losses.values())
loss.backward()
if b % 100 == 0 and b > 0:
print("\ne{:2d}b{:5d}/{:5d} {:.3f}s/batch, {:.1f}m/epoch".format(
epoch_num, b, len(train.dataloader), time_per_batch,
len(train.dataloader) * time_per_batch / 60))
print(pd.concat(tr[-100:], axis=1).mean(1))
print('-----------', flush=True)
clip_grad_norm(
[(n, p) for n, p in model.named_parameters() if p.grad is not None],
max_norm=1, verbose=b % 1000 == 1, clip=True)
optimizer.step()
# Get the validation perplexity
perplexity = []
model.eval()
for batch in tqdm(val.dataloader):
# batch['char_encoding'] = batch['char_encoding'].cuda(async=True)
batch['story'] = batch['story'].cuda(async=True)
model_forward = model(batch['story'])
losses = {key: model_forward[key] for key in ['forward_loss', 'reverse_loss']}
perplexity.append(pd.Series({k: v.data[0] for k, v in losses.items()}))
df_cat = pd.DataFrame(perplexity).mean(0)
print("Epoch {}, fwd loss {:.3f} perplexity {:.3f} bwd loss {:.3f} perplexity {:.3f}".format(
epoch_num, df_cat['forward_loss'], np.exp(df_cat['forward_loss']), df_cat['reverse_loss'],
np.exp(df_cat['reverse_loss'])), flush=True)
    scheduler.step()  # StepLR takes no metric; the summed-loss argument was a leftover from the ReduceLROnPlateau variant above
torch.save({'state_dict': model.state_dict()}, 'checkpoints-{}/ckpt-{}.tar'.format(fold, epoch_num))
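
# Note (illustrative, not in the original repo; the helper name is hypothetical):
# the perplexities printed above are just exp(mean per-token cross-entropy),
# e.g. a loss of 3.0 corresponds to a perplexity of about 20.1.
def _demo_perplexity(mean_loss):
    return float(np.exp(mean_loss))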
================================================
FILE: create_swag/lm/train_lm.sh
================================================
#!/usr/bin/env bash
FOLD_ID=$1
NUM_GPUS=3
export CUDA_VISIBLE_DEVICES=$((FOLD_ID % NUM_GPUS))
echo "Sampling the candidates. remember to do this do this for all of the GPUS and to pretrain first!"
python train_lm.py -fold $1
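
# Example usage (hypothetical fold id; one fold per GPU, pretrain the LM first):
#   bash train_lm.sh 0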
================================================
FILE: create_swag/lm/vocabulary/non_padded_namespaces.txt
================================================
*tags
*labels
================================================
FILE: create_swag/lm/vocabulary/tokens.txt
================================================
@@UNKNOWN@@
@@bos@@
@@eos@@
.
the
a
someone
and
,
to
in
of
on
he
his
is
man
her
with
she
as
at
then
up
are
it
's
people
into
down
they
him
out
woman
back
from
around
camera
an
off
while
over
shown
one
-
looks
two
by
them
seen
another
their
we
other
standing
person
men
turns
see
walks
through
hand
front
for
water
room
away
hands
takes
playing
several
door
head
behind
girl
stands
eyes
ball
holding
now
continues
boy
face
sits
table
side
sitting
screen
begins
car
black
white
talking
more
puts
group
onto
again
hair
young
large
that
shows
floor
across
all
wearing
after
holds
some
goes
steps
pulls
who
runs
moves
gets
starts
outside
watches
inside
shirt
speaking
smiles
along
next
field
each
stares
dog
wall
has
but
walking
small
look
opens
window
grabs
there
red
end
lady
before
ground
riding
stops
looking
how
top
jumps
blue
play
toward
leads
little
continue
game
arms
video
throws
gives
comes
open
close
towards
watch
together
still
leaves
picks
street
appears
well
does
its
falls
walk
air
arm
gaze
against
watching
crowd
glass
talks
about
feet
board
bed
later
doing
running
using
go
house
chair
various
long
himself
stage
put
smile
women
girls
rope
way
past
others
lifts
moving
when
boat
get
pool
stand
under
bar
different
ends
horse
where
three
paper
phone
glances
which
both
dancing
jump
kitchen
mouth
high
reaches
shakes
uses
dance
forward
shots
guy
light
drops
sees
leans
right
snow
run
enters
plays
begin
being
makes
beside
nods
faces
flips
slowly
green
time
dark
ice
followed
view
shoulder
set
like
body
follows
team
building
showing
kids
left
heads
child
tree
between
clips
hits
gun
gym
take
machine
road
bag
shot
move
boys
legs
make
start
bike
first
approaches
bottle
do
sit
area
track
smiling
show
jumping
finds
sink
can
line
points
stick
river
lies
pans
hit
tries
desk
same
waves
beach
above
few
middle
this
very
players
climbs
court
seat
lips
drink
pushes
performing
box
sets
brush
just
places
turn
children
putting
words
times
lights
yellow
dressed
piece
bowl
quickly
pours
spins
raises
title
routine
ingredients
cuts
office
near
going
fire
old
finally
gazes
tricks
roof
stairs
sand
once
cars
speaks
something
returns
bars
lowers
closes
swings
hat
eye
kisses
beam
stop
taking
removes
knife
passes
throw
counter
throwing
grass
mirror
distance
hold
cup
audience
frame
forth
hangs
finishes
moment
drives
suddenly
metal
jacket
hill
rolls
talk
trying
shoots
corner
bottom
many
hitting
appear
neck
yard
also
balls
cut
couple
mat
rises
player
motion
wooden
shoes
slides
chest
ring
truck
cat
leg
rides
lands
be
have
ride
lawn
slow
edge
pole
glasses
four
drums
credits
reads
father
suit
paint
mother
kicks
baby
making
herself
big
lot
foot
place
empty
not
spots
until
cigarette
bathroom
second
pass
getting
fingers
step
apartment
half
covered
fall
smoke
living
come
dirt
cutting
coat
sky
pair
rock
fence
background
catches
demonstrates
peers
picture
staring
kiss
pictures
pan
sides
wave
clothes
plate
nearby
joins
finger
dress
wood
seated
hall
swimming
what
weight
home
wipes
city
so
below
done
working
full
adds
n't
ski
opening
grins
brow
;
'
brown
pink
food
sidewalk
notices
bedroom
book
wheel
cream
towel
rubs
clean
couch
new
fourth
sticks
ready
hurries
closer
use
own
trees
spinning
park
alone
crosses
perform
wide
pauses
soccer
band
speak
pull
tool
guitar
tosses
exercise
ahead
bridge
slide
ocean
wears
work
goal
lays
beer
photo
carrying
kid
keeps
scene
arrives
text
haired
brushes
male
gymnast
tire
stare
night
only
circle
horses
pulling
strides
orange
tears
drinks
doors
bow
washing
shifts
doorway
tennis
tv
knees
pushing
swing
hard
performs
older
pocket
last
leave
stone
swinging
kneeling
fish
bucket
sign
shoulders
blows
coffee
or
corridor
round
lift
turning
painting
dogs
follow
waiting
touches
nails
music
brings
cake
lying
teams
gate
knocks
bends
shoe
lets
facing
leaps
snowy
checks
brushing
hanging
match
diving
center
leaving
hole
gently
tall
cheek
platform
laying
friend
carries
finished
bench
huge
path
breath
guard
deep
ending
ship
female
balance
clip
backs
laughing
fast
enter
straight
bull
athlete
shuts
skateboard
news
blonde
blow
filled
cell
flies
piano
equipment
if
pot
wash
cleaning
climbing
mountain
nose
ear
draws
no
train
expression
short
presses
race
cloth
windows
covers
point
indoor
briefly
find
friends
you
glares
driving
gray
hockey
helps
wand
logo
beneath
frowns
helmet
driver
gear
teeth
flip
finish
without
lake
:
bus
surface
deck
breaks
aside
hallway
cross
help
object
hug
police
school
far
family
figure
net
hugs
flying
laughs
students
ceiling
soldiers
push
closing
son
day
rushes
name
tools
bowling
pile
iron
was
sun
rocks
dances
pinata
break
punches
van
shop
catch
too
teen
shake
us
grab
cover
smoking
gathered
movements
almost
raft
zooms
intro
station
number
silver
been
closed
shaving
slips
works
bit
pumpkin
drive
direction
wrestling
rest
studies
mower
blinks
backwards
skiing
passing
falling
leading
position
card
final
dead
fires
parking
liquid
violin
case
store
lifting
pong
tattoo
cheer
shut
meanwhile
parked
pieces
frisbee
climb
guys
carpet
coach
landing
contact
kick
coming
image
pit
blood
row
slams
plastic
lit
fly
attached
wraps
part
clear
demonstrating
revealing
slightly
say
wife
fighting
themselves
our
rain
drinking
instrument
aims
song
bows
bikes
speed
spin
soldier
shoot
form
opposite
ladder
camel
tank
five
tube
drum
cleans
class
keep
mask
pants
cheers
reading
reach
low
t
heavy
images
handle
call
sword
railing
shooting
volleyball
having
vacuum
nod
dry
third
hula
competition
circles
eating
sandwich
fight
shorts
lens
basketball
uniform
lead
agent
skis
approach
singing
everyone
sprays
microphone
seems
waits
darts
pointing
product
join
plane
laugh
give
baton
lap
clock
tray
emerges
giving
interviewed
calf
try
chain
leaning
staircase
hoop
studio
cheering
elevator
practicing
dives
struggles
giant
roll
entrance
computer
cards
washes
pick
chin
wet
soap
prepares
surrounded
jaw
kicking
great
score
arena
rests
remains
restaurant
meets
applies
tight
land
trunk
slope
bread
kneels
free
shines
performance
ups
narrow
claps
polo
waving
placed
hotel
driveway
tubes
bright
cube
sheet
gloves
outdoors
forehead
wind
dart
mop
taps
even
flipping
following
officer
series
flat
bumper
scores
polish
outdoor
spray
preparing
rear
walls
porch
vehicle
money
shaking
made
purple
letters
belly
attention
eventually
fencing
fills
objects
captures
hammer
mixes
fist
practice
beams
forest
event
nervously
button
thoughtfully
surfing
kayak
spot
cap
flames
pose
shovel
christmas
lacrosse
individual
meet
sound
pins
during
sea
tie
scissors
mixing
gymnastics
share
garage
drawing
wrapped
ax
razor
serves
page
beard
dishes
ropes
multiple
races
chef
doctor
hose
angles
continuously
shoves
rifle
basket
scenes
newspaper
switches
control
flowers
parts
adjusts
key
pipe
paddle
boats
spectators
weights
monkey
instructor
racket
releases
upstairs
makeup
things
ties
pulled
blowing
passenger
grey
stretches
bounces
fades
offers
tiles
website
explains
tape
wine
kite
playground
party
boards
block
square
nail
crashes
rise
knee
balcony
map
dust
arrow
harmonica
hood
photos
rolling
rag
rubbing
gestures
tower
oven
display
reveal
lip
letter
pouring
canvas
life
sumo
instruments
attempts
cups
grabbing
mix
lines
ladies
pasta
alley
strolls
items
hospital
stilts
chases
dancers
tunnel
keys
swim
held
spreads
lane
bank
crouches
container
slaps
answers
dish
broken
grin
style
blanket
tiny
suv
sight
members
flag
says
regards
marches
among
!
waist
drop
thrown
busy
pats
lower
arrive
glance
belt
bags
alien
wins
note
hot
clutching
cab
town
canoe
speeds
underneath
sunglasses
base
shape
thick
spread
overhead
eyed
demonstrate
toy
drawn
turned
waters
athletes
paints
strokes
space
dining
tables
cookies
flash
referee
skating
guests
exits
sister
notes
carefully
javelin
listens
disappears
hops
used
leather
wakes
woods
gather
changes
swims
ping
curling
sofa
film
garden
wrapping
snaps
roller
string
dismounts
reaching
hips
upper
brother
setting
envelope
folds
gold
stirs
presents
tail
stepping
clouds
skin
becomes
daughter
blond
chairs
added
drags
oil
folded
bird
strikes
thumbs
apart
screens
every
dressing
knocking
ducks
club
slices
lamp
dive
opponent
laptop
flashes
shore
raise
puck
properly
guns
rack
lined
stomach
streets
sweeps
continuing
individuals
slack
read
tips
surf
floating
captain
bare
buildings
asleep
approaching
bending
remove
voice
martial
started
bites
sips
pour
wheels
blocks
eats
booth
thin
radio
bearded
colorful
bicycle
palm
collapses
whole
main
wax
bat
than
carriage
entire
shaped
tug
bottles
backyard
security
wrist
paces
good
ramp
underwater
beat
wiping
stadium
will
peeks
gone
pockets
backward
return
welding
beginning
further
lemon
pistol
pops
shadow
file
gift
courtyard
carry
paddling
war
afterwards
stunts
poses
lie
dips
including
marching
pale
throat
lid
cabinet
pen
sinks
darkness
crowded
raised
pretty
flute
faced
extends
traffic
completely
indoors
tent
clothing
passengers
grassy
helping
spits
shield
shrugs
shower
stove
tub
egg
arts
instructions
stack
sighs
proceeds
lobby
outfit
hear
stopping
swallows
length
juice
clapping
straightens
chocolate
spoon
cabin
tied
slows
listening
chase
colored
calls
throughout
upright
sail
scuba
bursts
cautiously
double
bite
leaf
cart
sidelines
saxophone
movie
panel
parallel
wipe
dough
course
parents
framed
contents
sleeping
signs
embrace
heavily
artist
drawer
directly
dock
beyond
cold
sweater
fishing
skateboarding
cheese
process
lifeboat
camels
mixture
tilts
smashes
upside
stuck
heart
piercing
monitor
ballet
rush
alongside
switch
aisle
vault
younger
electric
reflection
adult
gazing
shirts
bunch
plaster
downstairs
sip
shelf
either
guards
papers
clap
nurse
smaller
boxing
explaining
laid
dashes
present
locks
mounts
writing
motionless
add
tile
ironing
sugar
dries
products
karate
blade
vest
weapon
're
warm
husband
eggs
toddler
rips
morning
toilet
wanders
hurry
bald
model
closely
footage
blower
cheerleaders
hedge
disc
games
displayed
elderly
flings
butter
resumes
officers
features
helicopter
curtain
chamber
pain
knitting
host
much
skate
hopscotch
eat
ca
tightly
change
toothbrush
replay
fill
casually
plates
seats
rings
action
powder
sliding
quick
mountains
story
sports
motorcycle
peer
baseball
fixed
let
judges
squeezes
exchange
complete
seeing
information
dryer
really
snowboarding
writes
crawls
stunned
wrap
burning
glow
purse
taken
rowing
sponge
athletic
closet
exit
plants
locked
misses
covering
bend
bouncing
trail
staff
groups
paintball
sure
brunette
six
robe
spraying
widen
touch
salad
costume
check
vehicles
american
stream
beautiful
rail
hang
cage
distant
surfboard
cliff
potato
scans
furniture
wedding
sharp
hookah
tea
descends
examines
grips
were
gymnasium
elliptical
teacher
thug
aim
wrestle
books
level
flame
rider
anxiously
shadows
golden
hesitates
already
rough
played
tracks
peels
yanks
section
camp
mug
cheeks
jeans
cooking
log
rinses
accordion
chasing
mowing
forces
agents
lock
awkwardly
screams
float
sadly
faint
paddles
wallpaper
reporter
your
mixed
wrestler
wake
sends
pillow
windshield
clutches
burst
happy
naked
riders
aged
concrete
circular
surfer
arrows
thing
vegetables
marley
thoughtful
tip
jet
briefcase
color
glowing
needle
lay
barbell
drying
torch
fists
asian
freezes
braid
twirling
military
floats
word
ledge
boxes
intently
snatches
resting
supplies
walked
win
hears
partner
celebrating
storms
sunlight
massive
split
pages
brick
single
rows
winces
posing
magazine
skates
ears
hills
rafts
hoops
starting
trash
bloody
flower
loses
pack
list
addresses
creature
within
taxi
binoculars
wings
flight
fur
cleaner
rink
pushed
celebrate
cellphone
licks
frown
flashlight
frozen
cafe
graphic
boarding
television
web
poles
lotion
jack
wrestlers
bikers
teens
cocks
cash
stumbles
rounds
searches
fun
glancing
focus
pace
blades
tear
castle
sandy
faucet
rod
enjoying
smokes
adults
shoveling
caught
baking
introduction
elsewhere
stool
hides
twists
repeatedly
wait
sings
attempt
sleeve
target
angle
power
fallen
suitcase
apply
painted
cottage
lost
bowls
walkway
desert
serve
prepare
numerous
exercises
snowboard
hurls
sunny
fully
angrily
worker
boots
missing
calm
skirt
cloud
crash
nothing
potatoes
animal
trainer
faster
company
techniques
holes
army
dancer
sprints
creeps
doll
glides
jogs
everything
shirtless
raising
skateboards
waiter
recording
immediately
comb
grooming
frisbees
classroom
sleeps
punch
upward
whips
steel
softly
clearing
bath
unison
quiet
inspector
any
tutorial
hers
grinning
crashing
itself
surprise
racing
straw
goalie
highway
dumps
bell
gas
focuses
rub
fan
cameraman
lunges
glows
record
dad
suits
statue
2
steep
surfers
rocky
seconds
grows
uneven
bushes
proper
heard
rake
sunscreen
energy
collar
sedan
strip
inches
stay
silence
catching
keeping
scarf
photograph
positions
podium
picking
shaves
gates
attempting
croquet
twirls
smirks
furrows
candle
springs
puzzled
tires
press
goals
quietly
most
hop
visible
these
poured
biker
pretends
shave
halt
heading
sparks
desperately
grip
bedside
football
else
barrel
smirk
study
world
bumps
bodies
armed
drifts
vast
punching
slip
sailing
had
measuring
locations
smacks
student
tongue
liquor
crossing
happily
gown
clears
sniffs
sideways
debris
crew
soft
owner
surprised
meat
real
chopping
tiger
richard
parker
cricket
beats
wear
wearily
struggle
boot
stopped
wildly
mom
hiding
dirty
dropping
gesture
wheelchair
horror
lemons
cue
cone
halts
remote
thumb
safe
headlights
self
surrounding
stretching
loose
device
package
confused
placing
wire
trophy
basement
island
leash
barn
trampoline
cookie
shuffleboard
lounge
pavement
hook
types
address
3
art
creating
force
branches
averts
packed
plant
fans
angry
videos
conversation
measure
winner
peeling
engine
gathers
french
timer
customer
i
dusty
stays
rising
drill
lighter
records
german
crack
allows
scrubs
diver
sharpening
shuffles
church
charges
sending
general
protective
smooth
cigar
necklace
milk
limbs
jumped
pumps
curb
dealer
patch
crying
areas
emerge
remain
brief
interview
escape
higher
reception
displays
march
fruit
completes
professional
raking
bungee
bulls
training
cop
grimaces
fights
includes
shotgun
leap
surround
bartender
airport
drag
injured
removed
flags
cable
candy
bin
uniforms
moved
able
lemonade
outro
roping
notice
curls
tenderly
sad
bullets
champagne
atop
entering
hearing
written
attack
structure
struggling
drift
clerk
rubber
manager
serving
crowds
bear
bangs
opened
fabric
sweat
braids
applying
workout
divers
engaged
did
rafting
wagon
zoom
explodes
weapons
fixes
scrambles
halfway
staggers
salon
shelves
figures
kissing
moon
unconscious
policeman
shiny
pin
inner
sport
sauce
scrubbing
presenting
slumps
public
remaining
eyebrows
human
dials
business
feels
frowning
pausing
breaking
napkin
rooftop
safety
abruptly
sleep
practices
whose
socks
dinner
receiver
teammates
special
broom
paw
shocked
master
soda
adding
harness
bullet
jar
separate
fridge
bleachers
frantically
barely
glimpse
furrowed
letting
yards
cement
notebook
meeting
though
edges
kind
gradually
dragon
cook
exercising
include
occasionally
mans
post
foosball
buttons
loads
thrusts
wallet
wounded
alcohol
couples
disappear
mounted
sigh
axe
backpack
excited
4
blank
threw
mopping
applauds
chips
patio
limo
reveals
elbows
hip
retrieves
suite
concerned
grave
overlooking
removing
heels
inserts
motions
photographs
sails
bring
matches
stylist
pitcher
telephone
salt
sisters
shock
answer
sharpen
rapids
kayaks
handsome
batons
colors
names
trots
mid
gloved
animated
downward
fork
lipstick
lab
sphere
flashing
nervous
filming
onlookers
startled
upon
type
went
rafters
screaming
gymnastic
clad
mouths
rapidly
ad
instead
nearly
spies
jerks
claws
glove
enormous
interior
branch
tap
bent
snake
trim
screws
rinse
wild
never
attendant
o
waterfall
technique
compete
lingers
digs
headphones
mansion
waitress
evening
butt
splits
apron
strike
workers
neighborhood
heaves
storm
settles
squats
steady
workshop
directions
thought
celebrates
kayaking
viewers
fountain
locker
rug
fake
reluctantly
tearfully
hide
barber
cord
casts
rim
mobile
alarm
shining
grasp
blouse
lightly
goggles
fell
witch
saw
diner
bush
better
trimming
segment
darkened
vodka
swerves
tumbles
deserted
tugs
floors
vampire
1
returning
manages
shaft
battle
message
shine
shovels
spreading
fix
bounce
ways
ribbon
thread
tomato
canoes
bagpipes
casino
wandering
knives
inflatable
pounds
glare
star
breathes
flaming
lean
hidden
steam
winding
breathing
touching
armored
curly
scoops
interviews
member
pump
causing
pointed
item
pipes
umbrella
winning
winter
foreground
because
batter
leader
shrug
keyboard
winks
tender
limp
flicks
swirling
pause
troubled
yet
helplessly
private
mustached
houses
sergeant
create
introduces
manner
quite
scoring
began
solve
proudly
stuffed
slice
listen
colleague
cast
cameras
ignores
search
firmly
hundreds
site
country
chalk
completed
sing
carved
lenses
size
minutes
year
longer
seem
embarrassed
temple
balancing
teenage
hoodie
larger
disbelief
grand
decorating
cops
portrait
pickup
sucks
stairwell
worried
calmly
scowls
blankly
weary
narrows
gap
saunters
wig
cowboy
replaces
striped
combs
machines
filling
bay
banner
twirl
kites
combing
closeup
solution
puzzle
stacks
crotch
sticking
dangling
curtains
report
scratches
excitedly
candles
cocktail
lifted
wrists
lantern
aiming
crate
lever
construction
chopper
empties
brightly
policemen
bikini
sharply
adjusting
village
penguin
adjust
cave
numbers
helmets
yells
coaches
successfully
led
strong
explain
loud
trick
trips
dresser
palms
trims
puffs
bra
eight
chinese
shades
finding
handful
stalks
came
russian
tractor
tucks
sweep
gentle
conference
carving
hurriedly
animals
mind
mallet
sudden
pad
scrub
strange
bowing
toys
earth
draw
tubing
materials
sometimes
panning
offscreen
ornaments
mustache
wound
called
puff
amid
peering
uneasily
enough
hauls
breast
indicates
tightens
elephant
silent
elbow
nazi
curled
foam
dimly
amongst
spear
coin
tossing
parlor
king
strength
given
discuss
discus
theater
decorated
chicken
sneaks
cool
blast
rolled
indian
passionately
sack
cape
attacks
knits
searching
dot
awake
bubbles
assistant
loaded
travel
support
caresses
elegant
collects
plain
york
spider
washed
cloak
pressed
jungle
buckets
measures
interviewing
marathon
curlers
sharpener
skier
fives
remembers
steering
mows
stretch
boss
extremely
cane
warehouse
ankle
unlocks
gracefully
trailer
launches
guides
best
dried
avoid
glistening
tissue
plaza
pier
passed
growing
become
hugging
parade
movement
curiously
cats
ramps
crane
solves
fear
chopped
years
torn
jawed
broad
tiled
bears
silhouetted
hooded
groom
engage
judge
pressing
outstretched
surroundings
deeper
cry
mats
dodge
easily
gesturing
whiskey
winds
clipping
flops
knock
automaton
deeply
named
mark
silently
channel
penguins
bmx
location
twice
vacuums
shingles
braiding
skiers
parks
fresh
signals
greets
?
interspersed
love
stranger
towering
crouching
aerial
servant
dozens
knocked
swarm
tops
pop
steadily
knit
trails
sweeping
outfits
disappointed
bolts
cleaned
breakfast
splashes
foyer
fireplace
hut
dragged
swords
terrified
blender
cow
competing
scraper
locket
gymnasts
squash
mixer
perfect
fat
horrified
playfully
bomb
toe
bathtub
exhales
travels
toes
drapes
cigarettes
checking
strap
receives
surfaces
elevated
burly
considers
pet
smooths
sized
poster
whispers
formation
load
films
reporters
pattern
licking
blindfolded
repeats
medal
sheets
enclosure
enjoy
pets
bigger
hedges
opposing
titanic
hats
saying
badminton
pommel
bump
observes
cans
cracks
stuffs
refrigerator
dim
unit
explosion
flashback
watched
mic
luggage
serious
crystal
lightning
reappears
splash
market
fold
customers
grasps
lowering
stuff
convertible
stroll
soon
toss
10
balances
stairway
towels
swoops
draped
noodles
horizon
clicks
bumping
tin
applaud
certain
zombie
job
sort
ornate
mini
golf
hooks
paws
typewriter
garbage
pokes
feeds
service
picnic
dodges
pillar
charge
horn
hatch
stabs
commander
barrier
stained
thugs
illuminates
shuffle
reacts
campfire
marks
costumes
boxer
containing
partially
applause
roofed
ponytail
grill
thinks
except
sections
noise
tying
irons
here
got
scoreboard
guest
whistle
those
scraping
fails
flour
cattle
picked
volley
progress
successful
lots
such
polishing
spaghetti
rubik
transitions
hosting
logos
rubix
landscapes
tearful
bicycles
joint
pencil
stall
lurches
ten
auditorium
ambulance
fixing
counts
order
arriving
squeeze
state
downs
kneel
shadowy
stride
president
solemnly
slight
whip
skull
handles
mount
hurling
bills
gathering
slumped
outer
pepper
toothpaste
material
nice
carried
shears
blinds
duck
stern
link
masked
pitch
sharpens
mist
beaming
rifles
hunched
ink
descend
looms
basin
secretary
sleeves
balding
views
spotlight
handstand
reflected
dresses
hydra
mechanical
compartment
gravely
youth
produces
accepts
less
rooms
official
drawings
cartoon
repeated
crewman
advertisement
reacting
scotch
motel
steward
pucks
should
scrapes
mates
obstacles
swimmers
runners
tapes
inch
revolver
marked
profile
stars
pizza
advances
joined
stomps
trudges
braces
logs
unseen
curl
troops
stretched
pensively
clenches
gripping
shifting
inspects
dabs
handbag
drain
backstage
squad
mail
perched
towers
relief
grim
sprinkles
mess
pond
teammate
border
sequence
bolt
client
cooks
ballerina
wets
realizes
would
experience
demonstration
rackets
spell
swipes
tensely
clinging
frustrated
shed
oncoming
solo
packs
design
priest
gleaming
rose
brothers
chops
controls
uniformed
stacked
eyeing
early
uncomfortably
wing
fondly
anxious
dragging
muddy
changing
shallow
instructs
mops
stones
marble
garment
paste
flowing
owl
strings
lighting
fitness
mulch
rakes
excitement
discusses
troll
hogwarts
broomstick
omelette
sanding
targets
warmly
ankles
surveillance
flinches
beige
dozen
lunch
skyline
tumble
birds
limps
cargo
live
solemn
attaches
exterior
martini
marker
blinking
onstage
column
studying
age
forms
weather
build
handed
events
driven
terrace
drumming
knob
motor
blocking
mud
gawks
normal
scenery
height
beast
anchor
effort
orchestra
clippers
thoroughly
racquetball
awkward
uneasy
tumbling
neatly
ballroom
digital
ticket
cluttered
residential
trimmer
know
nudges
hurt
bound
bride
meal
ridge
laundry
bundle
landscape
apple
forming
satisfied
scream
withdraws
archway
mechanism
robot
sweatshirt
armchair
collection
curved
jersey
pierces
breeze
intense
reaction
males
rocking
chewing
spiral
chop
shelter
took
styles
blown
effect
fields
saddle
unable
lettuce
obviously
salesman
victory
however
slopes
crunches
diary
vacuuming
bongo
styling
dodgeball
rollerblading
test
sways
firing
dangles
drawers
folder
determined
wrestles
chatting
trousers
singer
packet
purses
piles
ambles
nuts
bounds
act
trembling
bewildered
retreats
huddle
robes
grenade
cockpit
gulps
breaths
barefoot
blind
squints
manuscript
twins
o.
crossed
pedestrians
triangle
toast
rotates
burn
layer
chuckles
extended
finishing
expectantly
stance
propped
underwear
subway
mercedes
date
cotton
mouse
nodding
suited
shaker
copy
cooked
spills
drummers
sheriff
impressive
aware
activities
need
trolley
tam
pumpkins
seven
skyward
flee
strapped
chews
swigs
abandoned
scurries
fingertips
handing
balloon
bouquet
loft
range
highlights
softens
guide
operator
substance
beauty
unfolds
credit
command
explode
tooth
bunk
streams
runway
stationary
squares
bad
patrons
perfectly
undoes
soaked
cries
procession
results
grimly
bathing
sketch
twist
lion
arc
moments
contains
original
displaying
spatula
feeling
frost
etc
tomatoes
stirring
dipping
tattooing
cheerleader
liquids
subscribe
gang
pedal
acoustic
straps
dons
emergency
shouts
stripes
poker
dripping
journal
canopy
reflects
powerful
sobs
detective
steers
blasts
shares
signal
vision
document
neon
medical
handkerchief
steals
silhouette
aide
fellow
trooper
print
surfs
gifts
sneakers
trunks
huddled
offices
fit
patient
drills
focused
brought
packing
stiffly
clearly
shapes
pressure
graphics
horseback
concern
valley
might
stair
excess
goodbye
foil
swimmer
heat
laces
hardwood
yelling
anything
late
dropped
amazement
goblin
tackles
safely
bringing
20
writer
shopping
relaxes
bruised
gallery
friendly
diamond
complex
sounds
overcoat
nears
plunges
suspended
mass
transport
adjacent
gum
escorts
swiftly
facility
illuminated
sunset
bowler
shift
amused
avoids
farther
wistfully
trophies
coins
coffin
settings
landed
asks
rhythm
fluid
lawnmower
central
boiling
spar
trip
somersaults
scroll
yoga
mall
s
peel
vet
snowboards
shaver
sporting
furiously
bats
recoils
pensive
fiery
plan
've
buries
ushers
easy
lowered
frightened
herd
dazed
oar
knowing
sipping
factory
tucked
onward
sheepishly
tapping
vigorously
upwards
coldly
parted
simple
uncle
jeep
built
engineer
spacious
nine
vent
neighbor
activity
lashes
cones
bureau
flask
drones
drains
helpless
syrup
disk
needed
rodeo
sled
persons
mouthwash
doubles
rapid
beckons
bandage
flees
handgun
glide
aid
flow
wrenches
corpse
dollar
5
reached
strewn
torches
daylight
creates
pajamas
dagger
secret
twisting
scar
makeshift
photographer
d
escort
taped
reverses
brakes
soars
illuminate
fair
proceed
pilot
clasps
eyebrow
gapes
swig
scattered
release
thigh
panels
bongos
canister
greet
balloons
flexes
scoop
hovering
corners
hillside
prison
siting
spare
syringe
farm
horizontal
pleased
earlier
roars
captions
passageway
drone
dome
pretend
olympic
pedals
discs
jewelry
library
amusement
shortly
loudly
tattoos
describing
sailboat
brushed
biking
instructing
newscaster
washer
former
roots
crawl
modern
paperwork
matching
intersection
noticing
smart
african
character
veil
drunk
weakly
plank
slings
hopping
scan
prisoners
reverse
cruiser
scramble
unwraps
estate
bearing
swivels
snap
musicians
momentarily
actually
enclosed
pads
violently
30
supply
exhausted
semi
bill
bug
skillet
embraces
cot
polished
death
grounds
avoiding
stereo
stiff
shell
animation
characters
sailors
strips
prepared
clipboard
attractive
instant
care
replays
rat
turbulent
medals
pumping
backdrop
cutter
anger
brand
funny
scrape
cubes
warming
roads
commercial
gryffindor
skateboarders
groomer
drummer
columns
hatchback
cymbals
eagerly
twin
bleeding
maid
hovers
transfixed
exposed
blazing
spectacles
prisoner
tense
railings
pays
rip
feed
shields
valet
british
pairs
broadly
hesitantly
scooter
pie
performers
airplane
desperate
backseat
hangar
surfboards
bone
trucks
runner
lighted
advance
decorate
fedora
accidentally
oblivious
smells
whacks
quizzically
write
bowed
faintly
swaying
uncertainly
jug
flood
dip
squatting
paved
opponents
zoo
ever
frosting
s.
taller
looked
dorm
tai
cupboard
slam
joy
kicked
showed
voices
capture
assembling
tango
till
decorations
skills
petting
rollers
teaching
interacting
motocross
bake
fishes
struts
handlebars
vaults
english
laughter
directs
question
spikes
swap
glumly
weaves
handcuffs
cradles
resume
kit
spanish
dying
flap
portion
twisted
evenly
wires
relieved
instantly
operating
coats
gunman
coast
belongings
stir
spring
amount
stretcher
hanger
supports
props
crawling
olive
warily
blindfold
ancient
machinery
palace
elaborate
screw
outline
manhattan
tastes
onions
wardrobe
strainer
recipe
haircut
tells
lining
squeezing
wo
butcher
weld
sir
&
gloomy
destroyer
sliced
react
com
mob
minas
archery
selfie
pictured
dealing
skateboarder
obstacle
patrol
bustling
lose
principal
goateed
ex
tuxedo
clothed
popcorn
cemetery
harbor
shyly
speeding
brace
patiently
pacing
luxury
creep
fidgets
arches
tentatively
silk
effects
eyelids
musician
headed
brilliant
featuring
salutes
torso
perimeter
canyon
maneuvers
striking
holder
piled
mattress
current
murky
terrain
perspective
tired
fearfully
bun
briskly
gliding
leotard
grow
blocked
shade
hammers
stable
fumbles
recorder
barricade
ships
baked
cartwheels
gel
tag
professor
response
prow
youtube
jogging
limb
fingernails
groomed
traveling
beers
dismount
magic
mane
pony
repair
skipping
lime
choppy
yarn
hooping
vase
phones
sweet
tan
minute
smashing
hollow
striding
topples
seal
hunter
modest
seizes
crates
arched
zips
gravel
companion
platforms
aircraft
wags
sync
descending
pout
comfortable
scissor
models
dial
rushing
rubble
crumpled
detail
crouched
treads
lone
bully
intensely
fighter
slung
vacant
firm
benches
passionate
label
system
muscles
trimmed
windscreen
bonnet
greeting
jackets
swirl
uncertain
passage
lush
gears
slamming
farmer
headset
tournament
won
nozzle
retrieve
wooded
inflated
lastly
chi
zombies
competitors
presence
describes
completing
tune
choreography
stirred
snitch
congas
contacts
wakeboard
bagpipe
pinches
slap
responds
chucks
cardboard
barges
breasts
blurry
panties
casual
salute
wander
banging
sketches
pat
aimed
readies
bespectacled
trapped
swollen
earnestly
chaos
flailing
casket
lovingly
feather
gingerly
lifeless
slim
baskets
rocket
similar
sprawling
hurrying
loosens
receive
pills
bobs
hikes
slender
mascara
pinned
streaming
padded
convoy
retreat
rockets
carves
flails
battered
campaign
50
puddle
heel
fifth
soup
beating
heaving
photographers
wrecked
muscle
enjoys
carton
trains
goat
sobbing
dots
flare
steaming
politely
uncomfortable
served
peace
oars
twenty
thousands
interest
dies
lamps
stain
disheveled
fancy
countryside
sands
details
london
positioned
wrong
suitcases
cakes
correct
sat
replayed
scatter
instruction
trade
attire
gorilla
sleek
wreck
tattooed
rejoins
attachment
scrolls
colleagues
absently
abdomen
vanishes
clings
vessel
dawn
sprawled
hobbles
forcing
garlic
peeking
earpiece
portable
storage
impact
wreckage
brows
fading
mimes
cashier
twitches
connected
terminal
pretending
skids
blink
sexy
giggles
sub
applauding
assemble
fenced
shorter
troopers
utility
rails
knitted
ventures
ones
folding
china
scoots
advertising
ref
agony
monitors
secure
dash
amidst
glisten
previous
farmhouse
worn
veranda
volume
everywhere
sway
throne
pay
daughters
topped
spit
tighten
result
icy
females
mound
crown
joke
scratch
flickering
facial
spoons
fine
bored
apparently
fashion
zooming
scared
clipper
ease
actions
orcs
congratulate
presented
released
overlaid
referees
sprayer
detergent
participating
contestants
cheerleading
jog
printed
jam
boulder
unzips
u
bacon
pub
forearm
blaze
examining
shove
dusk
swoop
grinds
slicing
flesh
documents
soar
assault
staying
jams
haul
tablet
muscular
losing
extreme
wad
album
x
sex
delighted
confidently
calendar
worriedly
craft
crest
jagged
legged
warning
roughly
upset
chauffeur
array
hunches
satchel
sympathetically
adjoining
45
weeds
cushion
survivors
ash
recognizes
chief
bass
pursuit
clamps
tell
fireworks
reactor
whom
stony
miss
paintings
usual
touched
horns
environment
hung
receptionist
suburban
fairy
dartboard
rinsing
wants
germans
thoughts
fellowship
tourists
cycle
wakeboarding
fooseball
sped
lifter
duct
maroon
feathers
weightlifting
loading
grimacing
links
caption
thanks
escalator
leafy
fangs
thief
scrutinizes
delivers
misty
claw
obeys
glaring
grate
wry
chandelier
hazy
swirls
cranes
butterfly
fierce
backing
index
script
director
blankets
idly
pitches
raw
article
coolly
banks
plugs
propellers
re
trio
drenched
joyfully
shark
soberly
sharing
bits
gentleman
chested
homes
posted
slithers
hastily
local
fbi
killer
mill
guitars
identical
onscreen
cereal
missed
cubicle
tracking
rags
mr
splashing
pill
clasped
missile
rotating
skeleton
cavern
carnival
tossed
separating
neat
arch
mostly
dribbling
lanes
lettering
needles
sandwiches
shampoo
funnel
deals
involved
summer
stump
rather
nowhere
shaved
tightening
grail
designs
marketing
applied
braided
rubiks
cheered
winners
overturned
fastens
bashes
desks
ignoring
releasing
daytime
unties
cracked
masks
heap
hungrily
chains
dummy
ford
carrier
nighttime
becoming
states
frees
clink
builds
common
armor
mirrored
carpeted
painfully
burns
foliage
fours
flaps
hopeful
rugged
pouts
compound
countdown
flicker
false
tow
santa
proud
cabinets
curves
exposing
hover
fetches
skyscrapers
dingy
shawl
bathed
aunt
boxers
actual
waking
cooler
electrical
scowl
wavy
oxygen
pig
sunshine
chubby
inhales
relax
containers
trailing
narrowed
protect
grandfather
warriors
speaker
grief
digging
files
plaid
pregnant
resort
pierced
suspiciously
sheer
harder
zebra
secured
fade
presenter
shatters
illustration
funeral
soaking
queen
gallops
servants
automatic
elder
lasso
rhythmically
rover
iceberg
knows
shaken
needs
frustration
hammering
possible
tirith
montage
helped
balustrade
skiptracer
solving
boarder
downhill
talked
alternating
te
defiantly
distracted
stoop
crazy
posters
motorcycles
spilling
trigger
shells
kidnapper
sacks
crushed
plops
invisible
chats
enemy
attacker
shack
forced
simultaneously
wielding
companions
loving
graceful
plucks
quarter
dreamily
nude
wheeled
polishes
pouch
electronic
jabs
cables
spill
glue
wonder
cliffs
waxing
seeds
spotting
affectionately
raincoat
slipping
newspapers
stoic
fits
prayer
alcove
timekeeper
fighters
unscrews
cia
lingering
moonlight
submerged
distraught
mercenaries
tilted
petals
plods
gave
monk
roar
triumphantly
exchanges
paris
solid
happens
cloudy
scanner
spoonful
incredulously
rotate
earring
resigned
comment
partition
aloft
spiders
astride
consults
gasping
aboard
concentration
apples
conductor
winged
billiard
exactly
could
decks
research
kinds
mjolnir
regular
triple
'd
ripped
wharf
questions
volunteers
replaced
mayonnaise
castles
slytherin
cooker
clinic
fang
accordian
capoeira
stunt
interact
aerobic
racquet
birdie
wiped
squeegee
barbel
hostages
melts
labeled
super
peanut
launch
collapse
departing
massages
$
coke
mustang
scrap
straddles
code
tech
concert
bubble
cartwheel
forwards
cardigan
think
skips
aging
nuzzles
trainers
7
beaten
slashes
stricken
goblet
ashtray
accelerates
collides
sneer
sparkling
levels
zone
vertical
secures
license
unbuttons
writhes
fearful
ghost
stroking
camouflage
flutter
paneled
crumples
businessman
dessert
revolving
stamps
submarine
comic
leafs
utensils
skinny
dangle
popping
boyish
eaten
nostrils
pained
plush
register
flung
champion
skyscraper
paying
corn
newly
wistful
reply
glimpses
walled
illuminating
sweaty
zip
holographic
recovers
den
waddles
strawberries
perches
gasps
somber
villagers
urn
crank
rich
facade
language
vegetable
chunk
featured
emotion
gaping
familiar
d.
skip
rectangular
pillars
tone
ghostly
hundred
relaxing
mutant
nearer
acting
brass
roommate
pirate
disturbed
stills
measured
employees
participants
baker
wands
tights
somewhat
hairstyle
thinking
why
telling
receiving
louder
replace
task
social
divan
hobbits
ham
glider
webs
myrtle
dementors
vinegar
vanilla
aerobics
mow
paintbrush
peeler
recruits
shuffling
chart
south
rams
mirrors
freeway
shy
accompanied
sprint
charred
major
chugs
12
raining
toting
wider
pots
selects
balanced
activates
charging
unshaven
skinned
cannon
hosts
regains
swept
barrels
maneuver
exasperated
midst
plummet
cases
surveys
trained
skeptically
tilting
plummets
wrings
admires
lawyer
observation
tents
singers
strands
pedestal
longingly
me
earnest
source
colonel
tarp
catwalk
stains
steak
propeller
determination
heave
delicate
surrounds
8
ascend
ajar
tipped
resolute
musters
ragged
footsteps
cousin
downtown
sidles
creased
network
heavyset
media
tremble
wiggles
belts
alarmed
sober
doctors
whirls
law
de
laden
9
creatures
accompanies
cocaine
congregation
bathrobe
assembled
groggily
intercepts
motorbike
tilt
defeated
bitterly
maintains
slot
wrench
capsule
caps
stood
ticks
fries
welds
steer
scale
typing
arranged
soapy
taste
sample
furious
sticky
roofs
flanked
performer
changed
falters
bobbing
dull
flock
specs
glowers
flippers
grown
memory
offering
visor
kettle
shredded
scotty
choir
examine
bricks
ages
skies
falcon
disappearing
pathway
gurney
mannequin
assist
mascot
waste
scored
april
bang
clown
hummer
linen
although
particular
appearing
clamp
orc
elf
boils
celebration
racetrack
discussing
sideline
matador
awards
treadmill
fruits
families
sticker
gut
asking
vomits
titles
victim
stories
stock
eager
employee
indicate
messy
sags
copies
consciously
bares
shards
gaining
headline
necks
leaping
frames
emerging
explosions
settle
erupts
duffel
narrowly
collide
shattered
fleeing
raps
seemingly
scanning
butts
trays
ringing
stalls
visitors
laps
chained
collapsible
irritably
delight
lapel
located
knuckles
pod
buy
destroyed
ruins
additional
observing
crouch
chip
collect
tightrope
juggling
paths
entry
trance
reels
pendant
gangster
alleyway
sill
curious
plows
obscured
boyfriend
forlornly
humvee
kills
natural
swats
campus
slowing
buried
cracking
fingernail
dunks
thrust
wheelbarrow
ward
industrial
lick
boom
amber
shivering
statues
pound
brim
sweating
sailor
puppet
musical
peacefully
encouraging
shutting
teaches
coyly
curve
melted
flipped
anchors
exhibit
screwing
headboard
crimson
honey
tunnels
coconut
blends
sander
squat
personal
filter
rights
hind
orb
layers
overalls
variety
fluffy
indicating
entered
peasants
silly
laotong
altar
pirates
parchment
walkie
scrolling
guiding
fiving
giants
shaven
stiffens
lifeboats
invisibility
strumming
fact
narrator
dumbbells
decides
said
siren
sock
grabbed
twelve
difficulty
mean
barman
fencers
afloat
th
robber
somersault
longbourn
bumpy
preparation
kickboxing
stepper
smoothing
skaters
curler
headquarters
brake
escapes
cruises
bottoms
adorned
uncovers
blackjack
thank
sketchbook
mentor
keeper
sinking
clubs
attic
writhing
monster
footing
badge
session
civilian
15
crooked
clenched
flashlights
orders
scampers
pierce
sprinting
engaging
auburn
cafeteria
snapping
coughs
odd
grits
swaggers
briefs
bones
exam
shimmies
birthday
banana
admiring
pursed
america
billboard
chicago
expanse
sleeved
wetsuit
posts
churning
6
cupcake
protruding
rippling
appreciatively
anguished
tearing
bundles
dj
controlled
afterward
rusty
counting
resolutely
mrs.
wolf
projector
dilapidated
astonished
vests
cluster
tiptoes
silverware
velvet
wipers
blazer
sucking
aliens
jelly
intercom
lanky
contract
ruffles
transforms
billowing
midair
kill
extending
program
ashes
flapping
visit
screwdriver
porter
amazed
extend
unrolls
representing
pulse
mood
sausage
teach
hay
combination
banister
dotted
sleeveless
strains
hull
momentum
jaws
shops
fluttering
violent
root
confusion
hostess
checked
floral
poking
hooves
cling
snakes
tenses
smashed
calling
royal
medicine
rice
unicorn
whipping
captured
quill
observe
extension
prevent
clay
ran
chunks
saloon
owners
chess
save
emblem
turkey
bark
may
gambling
blackness
japanese
ruler
ensure
coupe
adhesive
none
neither
routines
disappeared
comments
concentrates
drivers
quince
quaffle
bikinis
waterfalls
intertubes
congratulated
sumos
zumba
awhile
snowboarder
competes
correctly
kickball
slackline
marines
faded
sling
perplexed
cd
discreetly
hustles
tickets
gain
fuel
glitter
limousine
chat
urges
shaggy
delivery
watermelon
exiting
minivan
laser
tongs
sizes
eases
nightstand
handheld
silky
hours
throng
fog
scratching
threshold
parcel
coy
visitor
protesters
pro
sunken
cradling
littered
utensil
jutting
marine
backside
courtroom
dodging
content
grated
triangular
aluminum
hoses
strolling
shielding
awakes
executives
executive
chalkboard
globe
descent
prone
dejected
parting
hollywood
nurses
juts
sparkly
thighs
streaks
vial
daily
bombs
dashboard
mock
scowling
creeping
wheelie
autumn
mourners
squinting
multi
whisks
grimace
awning
pine
jumper
peppers
disguised
padding
cathedral
rummages
damp
progresses
surges
un
dune
college
vendor
joining
snack
revealed
fitting
venue
dissolves
freeze
doorknob
bible
corridors
furnished
freezer
menacing
100
bob
squirts
formed
spears
pinched
tread
impassively
ghetto
mission
scurry
damaged
pillows
redhead
vanity
ruined
groceries
designer
gardens
maintenance
irritated
fried
price
frog
milling
speech
magnificent
quarters
period
carve
exact
journey
n
blurred
welcome
beak
zookeeper
queue
styled
turban
ominous
doom
pinning
accomplice
climber
sorting
property
menu
smash
somebody
expensive
apprehensive
liner
crow
puppy
inn
attachments
fired
sprayed
senses
rabbit
verdell
approval
somewhere
disgust
lakeside
cronies
national
quality
trouble
planks
terror
teller
succeeds
jewish
ss
constantly
mechanics
hobbit
summit
shadowfax
patrolman
competitive
lassos
introduced
roofing
howcast
repeat
capturing
litle
seashore
audiences
stroller
gopro
partners
cuffs
expressions
random
janitor
giggling
robotic
dump
email
tanks
rundown
groin
scruffy
barred
weak
ass
co
miniature
railway
cheerfully
bmw
buffet
shoreline
reeling
swipe
stab
maintain
fleet
oval
sick
flagpole
reclines
weaving
shoving
crushes
recognize
feature
panes
send
bystanders
inspect
metallic
loops
sandals
miserably
tumbler
tighter
tee
racks
announcer
ads
mesmerized
respond
eyeliner
lace
hors
tinted
resembling
hurtles
crashed
seating
mouthed
mugs
rumpled
intricate
cherry
handwritten
360
panic
slate
port
stifles
roaring
downcast
asphalt
smears
unsteadily
wincing
beds
smoky
reel
fiercely
dubiously
cushions
restroom
shabby
slab
unfinished
sullenly
stylish
spilled
jars
disbelievingly
mice
wades
teenagers
headband
retreating
saucer
plaque
comforting
crumbles
server
disgusted
rescue
gaunt
increases
juices
pane
mysterious
signing
cocky
assembles
met
lumbers
sons
paddy
hem
threads
dispenser
foreign
donut
mace
smartly
grocery
editor
chopsticks
violinist
peaceful
weave
bridal
flooring
easel
producer
route
returned
confident
lord
skylight
hero
probe
corpses
archer
teddy
chainsaw
artwork
deliberately
cauldron
staffer
tokens
attacked
shutters
ripping
drip
journalist
vanger
created
clicking
bean
slippers
escorted
expressionless
drumsticks
footprints
cruise
weeping
awe
happening
feel
must
italian
tourist
badly
derby
introduce
occasional
supporters
troop
hurled
struck
hurtle
neighbors
swishes
previously
trudge
everybody
triumph
fetch
switching
important
speedboat
ribbons
frying
competitor
hairdresser
artists
installing
witches
usa
introducing
steppers
cellophane
methods
limes
pews
furry
trumpet
chic
choppers
gains
ranch
abs
handcuffed
deflects
rv
eagle
brightens
bounced
engagement
prints
girlfriend
blur
floorboards
mantle
peek
wrinkles
sultry
pries
powered
cramped
sunlit
sympathetic
melt
smeared
underside
haze
billows
entryway
fragments
otherwise
glassy
gunpoint
docks
rains
bodhi
bye
dig
dolphins
sour
seductively
doorman
bulletin
stroke
east
toned
doffs
flick
earrings
smoldering
ornament
cranks
senator
dr
pilots
count
interrogation
eyeglasses
mountainside
v
flickers
suvs
hoists
sheepish
miles
titled
oversized
wristwatch
related
rosy
hiking
waxes
overcast
emotional
example
acts
lounges
bulky
tripod
prop
brandy
hooked
glowering
harsh
gentlemen
championship
circling
roses
microphones
maitre
beefy
portly
caucasian
atrium
sharks
twilight
lobs
summons
unbuckles
lover
sculpture
surgical
pedestrian
helicopters
ditch
securely
alcoholic
coffees
morphs
launcher
barren
cellar
fail
pliers
gulp
buster
smoothly
sullen
upturned
compact
destroys
navy
straightening
tabby
nest
shimmering
careful
tangled
heated
exotic
lovely
gardening
dumping
senior
litter
feeding
flooded
mast
particles
sucked
contain
resistance
nut
disapprovingly
jokes
saucepan
visibly
healthy
gallop
ginger
cupcakes
squirms
impatiently
audio
health
crutches
shrubs
cavernous
galloping
thrower
mounting
rears
untouched
infant
grasping
homework
steadying
elastic
supporting
gloom
butler
concentrating
intermittently
thermometer
caretaker
reports
recorded
kitten
west
peeled
tension
means
stewards
causes
conscious
situation
tobacco
wildlife
aback
yacht
plow
unaware
halloween
contest
want
mustard
terrible
click
slapping
taping
nets
sense
decision
scooping
arrived
learning
ribs
dwarf
fashionable
pineapple
downwards
foods
brooms
ark
beverage
crop
lesson
bludger
sleds
shingle
alternate
extensions
crochet
install
canoeing
liquors
croquette
snowboarders
lob
leotards
sidewalks
olympics
copyright
chords
beanie
demonstrated
tubers
birders
maize
buckbeak
juvenile
straddling
mailbox
portraits
quad
smirking
tough
panicked
messages
doe
vintage
backup
deal
dramatically
munches
vibrating
expert
ammo
clumsily
weeps
bandaged
crossbow
bookshelf
cute
die
drunkenly
embarrassment
detailed
selection
ceremony
dense
admiringly
grain
hopefully
awaiting
brooding
firewood
wields
union
vulnerable
biting
erupt
sidelong
hurdles
gunmen
understanding
surreptitiously
department
cylinder
massage
fry
lectern
bemused
carpets
m
totally
tickles
sale
tails
slouches
imagines
morosely
glamorous
disperse
la
stoically
pleading
trembles
transfers
injects
technician
huddles
skirts
tour
components
restraints
bands
planes
sniffing
trot
essay
buddy
pleasure
appreciative
appearance
submerges
degree
shaky
motioning
whistling
smug
jab
arcade
rounding
cupping
cuff
circuit
access
drips
compares
scope
miserable
clenching
lonely
citizens
bandages
trap
devastated
reloads
module
rubbish
my
devil
since
nonchalant
blazes
bony
crater
levers
overgrown
evil
girder
stout
officials
specific
smugly
journalists
imposing
dazzling
pegs
traces
tally
widens
goers
tracker
storefront
gratefully
icing
pools
comfort
noses
darting
lazily
bordering
absorbs
peg
unhappily
okay
bells
sparkles
composure
khaki
community
petrol
brisk
topic
clipped
patting
harry
whale
gigantic
scattering
pumped
worktop
loop
pantry
greens
rinsed
tabletop
startles
tried
controlling
severed
stoops
felt
tunic
preacher
reins
'll
newscast
alive
native
chimney
dumbfounded
traditional
magazines
volunteer
rage
stubs
explore
disappointment
rule
formal
settee
gangplank
penalty
lipped
mad
cardio
recumbent
digicam
sensing
least
believe
physical
attitude
ingredient
arguing
static
enthusiastically
downstream
willow
poor
unscrewing
mechanic
sadness
hawaiian
loan
received
clamber
twigs
lieutenant
beans
difference
climbed
adjustment
viewer
hai
clambers
loss
weed
catapults
protection
powdered
incredible
spun
temperature
festival
moped
pincer
grandma
hairspray
chefs
hurdle
congratulates
instructors
grooms
robots
racers
rollerblades
blading
stats
individually
windsurfing
conga
ballerinas
lathers
hoola
oddly
c
computers
poop
kung
fu
nerds
giraffe
disapproving
rally
furtively
hell
overtakes
holster
teenager
expertly
steadies
recalls
baring
goatee
buckles
crowbar
reality
advancing
intruder
plume
embers
comforter
varying
suds
unsure
cinema
eyeballs
hilly
buys
realizing
fitted
sweetly
porcelain
banker
moss
corral
slat
backflips
mural
demurely
questioningly
twinkling
nightgown
cloaks
arrival
rival
rooftops
coal
squarely
frantic
avenue
recover
killing
gunfire
lists
hostage
triumphant
yields
tarmac
retirement
spiky
jacks
casting
rays
baffled
sly
actor
cringes
pixie
collecting
bulbs
stole
purposefully
eyelashes
cork
d'oeuvres
clutch
gape
stocking
pointedly
trench
personnel
grappling
north
backed
depicts
california
separates
seam
isolated
satellite
ignition
tends
stunning
incline
handshake
peak
snacks
restrain
aggressively
envelopes
grieving
babies
synchronized
endless
flattens
breathlessly
ram
armoire
posture
careens
warrior
furrowing
cupped
insert
poke
princess
obliviously
disguise
sniff
meter
sob
swat
ducking
lockers
reclining
european
pallet
signature
extracts
bathes
lobster
shrinks
spotted
flinging
worked
drifting
allowing
gropes
teachers
snorts
non
drunken
munching
perch
henchman
craggy
auto
strapping
slanted
stray
museum
18
enthusiasm
impressed
pristine
fireball
directing
expo
whistles
ladders
bravely
mushrooms
prods
newlyweds
version
manual
crammed
overhand
soul
menacingly
navigate
straighten
obediently
emerald
ornamental
dream
crib
statement
total
parsley
shouting
inter
brave
ferry
anticipation
arranges
filthy
unhappy
polka
sidecar
curving
chute
angeles
mitt
skater
cucumber
etched
symbols
prince
maids
defense
mouthpiece
simply
decorates
17
underground
prep
rental
pawn
dismounting
shrimp
swimsuit
talkie
chased
vapor
burned
wheelhouse
funnels
leashes
kimono
guitarist
success
sweatpants
bulb
pincers
oak
overcome
suspicious
layup
maker
wary
buck
fences
prize
barbells
platter
happened
thunder
express
remark
flatten
gotten
occupied
skiff
approached
appalled
medium
university
drug
pom
youths
acknowledges
shit
treats
raging
honda
boil
batch
mordor
axes
uruk
ames
billiards
actress
performed
via
font
stag
bakes
messes
drilling
serpent
mallets
salsa
bullfighting
kayakers
services
juicer
inline
coloring
loser
melting
plastering
voleyball
interacts
goalkeeper
interviewer
welder
parasailing
builders
exhibiting
placement
sax
justin
trashcan
registration
tranquil
sprawls
kitty
penis
pounces
evidence
chokes
stampede
imaginary
wiper
fasten
discarded
zero
hour
veers
poised
windowed
buckle
cautious
hairy
patches
idea
earphones
days
ascends
mighty
petite
soaks
sidesteps
graves
bellows
stomping
puffy
propelling
history
craning
dejectedly
telegram
bulging
agape
collapsing
shout
nightclub
courthouse
cells
connects
civilians
rumbles
overpass
cartridge
signaling
scrambling
scaffold
waistband
barrage
survey
scaffolding
deposit
cylindrical
retracts
positioning
swerve
samantha
dolls
flutters
hefty
calms
celebrity
unpacks
neighboring
posed
interrupts
beaded
strand
pajama
ovation
playful
pearls
acrobatic
stamp
000
faraway
e
operates
examination
col
subject
escaping
disintegrates
ranks
stolen
chasm
bunker
connecting
engages
picket
horizontally
nursery
growth
lounging
flushed
hinges
imploringly
tentacle
spike
texts
soil
bashful
riverbank
certificate
contentedly
tether
alert
humans
angled
floppy
promoter
somberly
sewing
handled
silhouettes
classical
strain
bodyguards
hint
orchid
console
nibbles
nostril
whisper
unlit
bashfully
solitary
sulks
platoon
po
lecture
mingle
stables
undershirt
rimmed
haunted
shoved
found
noises
annoyed
serenely
comrade
sniper
sickly
tentacles
meadow
duty
engines
burgundy
discovers
arcs
unleashes
ashen
robed
caress
loaf
domed
untie
chanting
demon
bitter
gait
iraqi
buses
scarred
exploring
slinky
clocks
rooted
theatre
crescent
wicker
squeezed
roadster
agitated
vacantly
linger
deposits
fury
accelerator
unhooks
resignedly
onion
stuffing
cursor
kissed
rainy
chuckle
umbrellas
answering
highest
flattened
guided
canal
listlessly
pleasant
verdant
overboard
thumps
tethered
tugging
pail
tests
mournfully
g
shoebox
bouncy
fringe
locking
nephew
outward
pigs
burner
preoccupied
sealing
translucent
patients
brandishing
fashioned
prometheus
matter
organ
engulfed
procedure
looming
accepting
contemplates
assortment
handcuff
recede
muttering
flint
binder
glittering
figurine
decorative
battery
bowlers
punched
inspectors
cows
porthole
barking
stepped
icon
flows
murder
sheds
meters
junk
archives
lawns
shrugging
darkens
lingerie
distressed
presentation
mouthful
dojo
lug
g.
strung
alight
thud
cubicles
adventure
engrossed
broadcast
united
foundation
hairs
emotions
damage
cleared
leopard
jazz
burnham
forty
pov
argument
wanting
horrible
perhaps
mike
slammed
mayor
possibly
occurs
inserted
chemical
always
candies
maybe
kindling
commentators
ill
fiddles
equally
ominously
convicts
thirty
sweets
basically
florence
stumble
slug
insides
darker
project
composes
pursued
ringwraiths
grout
hurts
regarding
slower
clapboard
blend
speedo
quidditch
undergrowth
ok
weighted
toilets
carriages
attach
lorry
glade
cuddles
housekeeper
motorbikes
coaching
dribble
rum
cooling
terraces
helper
tshirt
blended
varnish
ni
intertube
tasks
vaulting
matadors
peolpe
tiling
handstands
concepts
showcase
mainly
wile
websites
describe
sloop
birder
mastermind
tucking
buzz
strut
pew
dramatic
unmoving
frosted
weird
humble
classmates
ignore
precision
tend
bystander
fends
engulfs
sternly
distantly
million
luxurious
portfolio
rules
acknowledge
patterns
pleasantly
sizable
mortar
remainder
dozes
strawberry
riverside
bluff
cradle
overwhelmed
vanish
pistols
ascending
cannons
shapely
detonator
soaring
screeches
crushing
grazes
wryly
unnoticed
breathless
stifling
backhands
primps
bonds
netting
dumpster
limping
seethes
blindly
flares
pamphlet
cringing
imitates
donning
headstone
blinding
dolphin
extinguisher
conducts
twitch
postcard
beads
confetti
plug
curtsies
chaise
boyishly
pearl
upscale
grateful
incredulous
serene
helmeted
apparatus
palette
stark
gauge
parachute
blaster
spark
streak
anguish
unopened
bleary
rods
staggering
ecstatically
vanishing
formations
maps
overlook
buddies
nearing
nestled
swallow
likes
levitates
connect
parka
execute
juggles
nonchalantly
crude
restlessly
grid
bib
tentative
strangely
resident
comforts
scribbles
l
snuggles
woeful
droops
exhaust
trundles
subdued
jetty
dappled
obscures
bundled
topless
waggles
peeps
pounding
straws
palatial
ruefully
ripples
mesh
stainless
data
bonfire
situated
machete
planet
smack
texting
spiked
refuses
stocked
ipod
shelving
confronts
hog
impassive
monks
taut
petrified
rove
acolytes
fingertip
respectfully
articles
skims
stocky
radios
scarves
truth
husky
grimy
repairs
explores
honor
contraption
expectant
disgruntled
extra
relaxed
core
commands
desktop
swaps
locates
siblings
fanning
pepsi
forks
jerk
reassuringly
sentence
clutched
newcomer
quaint
attend
tropical
scales
cowers
vicious
injury
nearest
sneers
gateway
freshly
cobra
advanced
dumb
grandmother
expecting
vines
scientists
apprehensively
gps
depths
tide
google
straining
regard
awakens
animatedly
snowball
liberty
sneak
snowflakes
axle
panting
lather
heartbroken
los
elegantly
comfortably
bordered
clamped
separately
intensity
humanoid
slime
apparent
tremendous
frenchman
addressing
sheath
pyre
11
stagger
ammunition
st
oriental
plump
cloths
distress
increasingly
haggard
dipped
magically
mate
besides
separated
windowsill
armpit
gasp
stores
altogether
mantelpiece
chocolates
commentary
homemade
nicely
duke
glum
chevy
asgard
manage
promenade
rung
rats
rattles
corvette
yell
plains
ugly
farmers
bridesmaids
cyclone
mutants
bins
grunts
escalade
closest
cheap
maneuvering
realize
ponders
lungs
unsteady
often
courage
sorts
fifteen
iced
glazed
description
theme
boiled
garland
rumble
civil
dictaphone
davenport
pranam
grinding
despair
happen
challenge
busily
handling
stools
goon
conducting
nature
toothpick
trotting
jews
pride
jello
robbers
morgul
proffers
bases
hoping
watering
cues
backflip
highly
giggle
turkish
motorboat
steamer
eater
kangaroo
filmed
pedaling
assisting
gypsies
danced
flexing
installed
dinning
distract
snatcher
leeds
beings
shotput
tams
wilderness
seasoning
buffing
messing
drys
skiis
poms
participate
recap
competitions
jerseys
handlebar
narrates
holing
swimsuits
jousting
basketballs
poodle
elongated
installs
etcetera
kneeing
volting
alternates
int
paused
sledding
closeups
bristle
stencil
mayo
hector
nerd
destination
humps
gawk
childhood
nimbly
sealed
gag
thrusting
perp
carts
melody
cohorts
hauling
boulders
incoming
satin
holsters
lapels
sneering
snapshots
pursuers
gunpowder
cassette
keypad
consumed
yawns
loosely
entranced
dvd
challenges
expressionlessly
bulge
angel
fed
lad
bourbon
chalks
unresponsive
shattering
kept
polite
distorted
shin
horsemen
crisp
fort
wealthy
melancholy
brimmed
locomotive
americans
wedges
inspecting
monument
whizzes
payphone
cuffed
attentively
rounded
hesitation
flank
degrees
protectively
carrot
filing
combat
grenades
spotlights
latch
barks
spanning
plunge
themed
gardener
homeless
boardwalk
apologetically
visits
glued
marijuana
ron
celebrities
a.
hued
week
obscuring
scraps
urban
stockings
bracelet
skillfully
crease
platinum
candlelight
patron
indignant
replies
dates
injection
ray
zippo
pursuing
ridden
shudders
chorus
flop
gunner
engulf
sashays
airfield
missiles
impatient
swell
hefts
sunroom
recovering
weathered
seething
focusing
emphatically
creases
overturns
vantage
tweaks
precariously
suck
cruisers
lanterns
suspenders
cola
whispering
seriously
stubble
kiosk
timidly
stealthily
weighs
criminal
flannel
grace
unoccupied
rumples
microwave
yanking
choke
exhaustion
bushy
magnifying
unto
addressed
vertically
ripple
narrowing
iv
limply
frail
elevators
scuttles
bungalow
waiters
grilled
pyramid
coveralls
captive
electricity
bong
afternoon
cozy
wink
wedged
henchmen
relentlessly
skeletal
ignites
mushroom
rearview
crumble
wrapper
possessed
memories
defensively
snarls
desolate
maintaining
tattered
parched
faints
snags
pebble
actors
maximilian
resembles
headlight
increasing
defiant
fascinated
occupants
realization
autographs
criss
gasoline
putty
industries
sixth
folders
pleadingly
smacking
coworker
sprinkled
forlorn
pastry
basil
slurps
bamboo
barbecue
jolts
attendants
discards
sculptures
shoos
strained
lotus
selling
bugs
crestfallen
greenery
mohawk
snarling
remorsefully
taunting
manicure
scientist
swishing
transparent
catapult
severe
wade
gated
hyena
currents
prepping
feebly
pointy
sparkle
darkening
latex
denim
daybreak
governor
en
hardens
seed
robin
necktie
carrots
scary
copper
lighthouse
lunge
dug
cloaked
campsite
jets
crewmen
med
spattered
helm
chapel
joyful
cycles
huts
approvingly
automatically
instructional
sparse
hub
hardware
c.
operation
antique
arrange
partly
century
flakes
gloss
asked
citrus
typed
doleful
victorian
sorcerer
sleepily
grimhold
gem
reappear
spirit
dunes
opera
moored
combined
gulls
doorbell
peach
burnt
teasing
hive
enraged
screech
stripped
purchase
unnerved
accident
puffing
costco
thirties
couches
40
perfume
stabbing
camping
assistants
circus
ravine
observatory
telegraph
flushes
blackie
reigns
wintry
shrouded
horseman
woodland
hunting
packages
tiller
pancake
dishwasher
transition
elk
amazing
basic
anxiety
commotion
beautifully
hum
interrupted
orderly
cause
hopped
bold
utter
nobody
difficult
exhibits
reason
fifty
hydraulic
thumping
gutter
theirs
toothbrushes
fairly
dialogue
sari
residue
splashed
despite
hopelessly
offer
overweight
john
shaded
chairman
boxcar
roadside
stumbling
instructed
tram
bronze
ask
wailing
ranging
advantage
bunny
plenty
decor
borders
hauled
treatment
announces
cobwebs
al
choreographed
festive
washroom
involving
dice
dangerous
vine
spices
teeming
chooses
neuralyzer
smell
apparate
silvery
teases
unmasked
junior
splinter
partying
garnishes
concourse
balmoral
mode
herbs
bandanna
professors
cleaners
advertisements
cycling
solved
nailing
necessary
percussion
saran
required
assists
benefits
zoomed
showcases
toppings
piecing
forested
pogo
seasonings
gearing
bobby
hi
shuttlecock
waterboard
coating
demonstrations
method
torching
2014
lessons
reactions
sneaker
wording
obama
peddling
hairstylist
environments
criminals
werewolf
korean
backpacks
flirtatiously
encounter
delightedly
lookout
underpass
tanker
stagehands
doritos
riddle
medic
arrangement
workbench
barge
onboard
padlock
0
pudgy
condoms
workstation
switched
wrought
flirting
partial
dusted
tavern
superior
lodged
decanter
modified
workroom
dusts
jail
downpour
vampires
invitation
binding
splintering
swooping
danger
stately
anti
sorrowful
averted
unlocking
soot
washington
freight
scantily
guarded
sprinkle
collision
stows
embracing
skeptical
radiator
russians
cower
helipad
mezzanine
devices
plunging
settling
plumes
welcoming
robert
please
yank
departure
jury
zipper
claim
county
months
ceases
corset
reclined
chests
burlesque
ransacked
deejay
costumed
stalk
disbelieving
resentfully
charming
grind
stamped
laboratory
fetal
clinks
groggy
jerking
exploding
strategy
converge
battling
expands
ecstatic
baggy
lugs
graffiti
pre
cropped
blanketed
bluffs
25
graph
anyone
coral
calming
speedily
rustic
tidy
scalp
turtle
groans
surge
videotapes
award
spy
puddles
bust
60
forefinger
sag
foreman
businessmen
vendors
rewinds
spectator
slum
marketplace
passport
thermos
associates
morgue
soaps
rims
bodyguard
alligator
murmurs
suppresses
caller
cheerful
startling
eastern
outlines
clasping
slats
gauze
clutter
reflect
porsche
online
yourself
goats
unsettled
mnu
seals
seller
determinedly
tendrils
gangsters
trapping
manipulates
toenails
whiz
victoriously
discover
galley
cadillac
whisky
foreheads
channels
beacon
projects
inmates
teacup
woozily
potted
concierge
lovers
watery
erupting
affected
quarry
propels
toothy
daze
destruction
lifelessly
wrinkled
executioner
consciousness
blackened
remember
gritted
threatening
stringy
junction
spokes
hurtling
storeroom
dismissively
beret
lizard
cages
slick
listed
brighter
swallowing
distances
sirens
capsules
flared
blueprint
unplugs
roadway
countertop
coil
seventh
cocked
freeing
refills
headsets
knowingly
collared
merry
creamy
rearranges
hairline
kicker
adorn
puppets
peaks
someones
awarded
swift
choking
god
defensive
stumps
wedge
roman
planter
graciously
overlooks
panicking
unfastens
illustrated
spoke
quivering
advice
attaching
hammock
squid
surging
recedes
convention
60s
removal
blizzard
reluctant
buttoning
frock
skulks
piglet
snout
sincere
review
stifle
shutter
awestruck
resolve
artificial
distaste
honks
enraptured
greenhouse
wriggles
instinctively
hacks
vegas
blaring
douses
villain
engraved
beady
stripe
wail
chance
bee
banquet
depart
valve
blowtorch
hostile
flurry
encased
intent
detailing
embroidery
jots
tailored
scuttle
nailed
absorbed
rottweiler
treasure
onlooker
charcoal
toasts
contorted
imploring
adjustments
chessboard
lodge
convenience
camcorder
facebook
mnemonic
clippings
leisurely
unload
circled
testing
unlock
collapsed
wads
boiler
sash
dime
sided
whipped
goofy
saddles
slit
told
teenaged
glee
breathe
bursting
admiral
unique
signed
pruning
motors
eyeball
usher
obvious
pissed
interested
unusual
whatever
wondering
afraid
eleven
satisfaction
whimpering
periodically
allowed
impossible
turrets
croupier
understands
astonishment
echo
phrase
practically
b.
delegates
scenic
wheeling
embankment
clusters
sole
literally
defend
concealed
failed
nightie
relative
deafening
drove
swivel
caddy
rhythmic
knapsack
bishop
ashram
combed
trampled
respect
yes
whether
worse
tomb
windy
forgotten
aghast
taunt
impulse
bridges
tollkeeper
minister
wagging
toad
torchlight
javelins
heights
reverie
devastation
sometime
arizona
ottoman
pitched
mold
dismay
apartments
panned
incident
songwriter
explanation
adjusted
jamming
splattered
phial
battles
nazgul
effortlessly
swamp
scraped
dealt
loosen
panama
diadem
gypsy
cascade
slowed
precisely
ironed
mashing
choosing
waved
painter
finale
40s
knickers
bugle
rubbed
florida
rural
rucksack
aston
investigate
fells
collie
surgery
mistakes
upcoming
welded
buff
batman
groomers
weightlifter
ferociously
transitioning
piercings
gadget
ehow
bleach
heighten
trowel
windsurfers
diaper
countries
suntan
cam
strum
layups
stacking
razors
adidas
unpaved
sponsors
jockeys
vs
informant
hippogriff
twinkle
meekly
undercover
explosive
cordless
duffle
enthusiastic
western
bullpen
gay
beetle
chickens
cinnamon
swarthy
hispanic
arrest
crews
perks
warehouses
sideburns
3d
emblazoned
wired
checkered
smiley
emanating
flamethrower
whites
resemble
tvs
ceramic
condom
dismisses
indeed
beeline
coiled
greetings
happiness
00
twinkie
jostles
trapdoor
snare
victims
moonlit
boldly
recess
spectacular
materializes
disarms
runaway
distinguished
inscription
remorseful
beverages
spitting
coated
dissipates
railcar
downed
abyss
outlined
quizzical
sedans
motorcade
executes
flatbed
guardrail
caged
steeple
chandeliers
distractedly
grade
caked
sawed
withering
hardened
whirlwind
flail
emptying
intensifies
leak
mexican
mimics
shimmy
nuns
nun
tyler
sundae
thousand
roam
averting
thong
sushi
flicking
carousel
sleepy
mischievously
blush
competitively
flourishes
plainly
tequila
ogles
jean
lids
skimpy
trendy
affectionate
reopens
innocently
famous
sarcophagus
barbed
lags
unharmed
waterfront
restrains
newsstand
gi
destroying
dashing
true
unfold
coastal
starry
due
radiant
graze
foamy
respective
bullies
veteran
guilty
scrambled
roiling
dangerously
receding
crystalline
airborne
vomit
disgustedly
fox
searchlight
warms
wince
dolly
cheeked
marquee
posh
patterned
bulldog
trades
reassuring
defending
crumbling
pray
brawny
motorcyclist
boxed
taxis
niece
pivots
markings
supervises
shakily
bulletproof
accompanying
grating
brownstone
paned
framework
reeds
grapples
dividing
deflates
fatigues
cursive
charleston
tactical
budge
abandon
ailing
sorrowfully
soundly
mistake
sunk
seize
numbly
reader
moist
antenna
district
fixedly
webcam
micah
variations
gawking
speakers
atlanta
pupils
shrink
charger
midway
sympathy
dawns
o'clock
surgeon
treading
scorched
demonic
mohawked
camper
alternately
composed
structures
vaulted
amphitheater
stations
summoning
notepad
prominent
grandson
complicated
quicker
shafts
skeletons
heeled
prices
minutemen
pursues
transfer
dayton
caviar
inviting
barriers
quickens
futilely
frenzy
technology
gash
omelet
scars
unisphere
fiddle
mischievous
timber
recipes
cordon
browses
pastries
cock
chipped
celery
manicured
mango
gaggle
aisles
pavilion
fixture
locals
purpose
deftly
hails
reptile
softened
tallest
unroll
mocking
fountains
regain
amplifier
slacks
goods
inadvertently
giddily
splayed
survival
urine
leverage
pry
viewed
photographed
gleefully
whisked
nimrod
lovey
heats
stethoscope
piglets
intact
hilltop
uncaps
sneezes
replica
trough
proceedings
grizzly
catcher
announcers
coastline
flowerbed
attacking
las
entourage
racked
contemplative
lava
layout
pup
urns
whirling
fiddling
passion
spheres
rate
knight
urging
temples
haughtily
demeanor
dumbly
jiggles
listless
matveyev
u.
mashes
emptied
registers
rebreather
toasting
turntable
plastered
curtained
hysterical
noblewoman
wool
cityscape
halves
fireballs
reviews
burger
boogie
parent
prosthetic
ladle
spout
bandana
loom
ruffled
restrained
problem
mention
remnants
wennerstrom
rainbow
hedestad
guardian
v.
archive
lulls
wanted
phoenix
freely
berries
recovery
coaster
embossed
admire
antonio
unveils
flyer
searchingly
raspberries
dribbles
complies
capes
bifrost
taught
technicians
pulleys
listing
risen
carnage
staircases
snoop
sodden
winnings
berth
trouser
erect
lee
talent
favorite
wreath
lassoes
lucky
poison
casserole
crackers
clients
napkins
jaguar
whiteboard
highlighted
scaly
aquarium
rotation
tags
dolce
crutch
buffs
inmate
coordinated
twenties
automobile
burying
slipped
suppress
occasion
strums
buzzer
murmur
consists
restraining
riverbed
sites
lathered
reservoir
indecisive
movies
slapped
grunt
shrieking
immobile
astounded
peddle
gowns
armpits
wore
dominates
rapping
household
chatter
hushed
hindu
splendid
viciously
hopes
india
chelmsford
massaging
amiably
packets
nco
somehow
differently
blending
graveyard
hello
rickety
keenly
outcome
dollars
clatter
tat
constant
sorrow
banners
noose
midget
particularly
mortified
powders
reenters
streamers
afar
conservative
huggies
scooped
especially
unsuccessfully
known
fuse
inhaler
maze
journals
elves
firework
plinth
palantir
killed
ruin
elven
peoples
sting
malevolent
rohan
continued
battlements
lurking
undo
dreams
norther
supermarket
muzzle
tanned
probably
loosening
bellboy
quadrangle
caressing
stake
timing
continually
cheery
archbishop
commendatore
browsing
ketchup
crucifix
ensues
ushered
sewer
tugboat
zeppelin
hubcap
easter
14
failing
dwarfed
siding
taj
gaps
webbing
glaze
fusion
tritium
unconvinced
zig
fronted
cuddling
exhibition
grainy
beaches
whilst
nunchucks
appliance
stylists
multicolored
cobbled
lavatory
defenders
gnn
rafter
hmong
decoration
dementor
accessories
gryffindors
newsroom
sights
replacement
donuts
production
windex
trampolines
trimmers
plated
participation
weigh
amounts
swingset
url
www
2013
teh
ounce
easier
cleanser
chug
windsurfer
accordions
racquets
examples
netted
soaping
clarinet
socializing
maracas
classes
congratulating
sunblock
jetski
commences
productions
walkover
bologna
workouts
brasil
jim
houdini
masters
beiber
firearms
d'artagnan
previa
barmaid
cutout
drama
disturbing
booze
mantel
nipples
role
duel
enemies
seatbelt
bowtie
reload
slideshow
mashed
mounds
assailant
backdoor
datsun
unravels
adorning
sales
launched
packaged
bracket
bedding
knotted
showroom
unimpressed
clowns
hoisted
filtering
expectations
am
shopkeeper
shimmers
blossoms
bloom
angular
monstrous
chattering
fiancee
dismissive
pitchfork
roundhouse
hideous
shacks
captor
veins
keels
demure
regretfully
confidence
bracing
countless
slashing
tangle
consumes
peruses
darkly
checkpoint
snipers
180
swerving
windshields
stored
allow
artillery
disarray
decrepit
forklift
aggressive
sinister
searchlights
fond
breadstick
purchases
raindrops
scrunches
fakes
flirtatious
haughty
contorts
primly
baggage
pies
specifically
shreds
issue
advertises
revolves
phrases
showered
diamonds
coils
nestles
rehearsing
eyelid
mild
housing
bartenders
hypodermic
backroom
bookcase
flanking
comics
foggy
studded
stealing
allied
bomber
slits
planted
wetsuits
vegetation
cleanly
speck
spectacle
lively
knobs
propel
loosened
buoys
respectful
fling
turbulence
treacherous
recliner
soprano
greeted
lunging
adoring
plunks
oldest
musses
snowfall
royce
chuckling
wringing
bale
gould
hoist
drained
rib
doubled
confer
navigates
furtive
marshal
swagger
hungry
slumbering
trace
trailed
snatching
constructed
methodically
conveyor
sent
glinting
venetian
crook
collector
lashed
boardroom
spanks
portions
'm
cleavage
eiffel
hologram
thumbnail
slackens
cylinders
greenish
diagrams
spurts
launching
dunk
kindly
sarcastic
unwavering
sorry
emt
gallon
abandons
convulses
clergyman
brutish
handfuls
socket
reverts
hulking
disintegrate
ridges
inky
ringed
sockets
hilt
rust
armory
slump
overwhelming
arabic
authority
info
comfy
dignified
visual
bead
busty
casing
boathouse
widening
inching
efforts
repaired
detach
lobsters
explorers
darken
fireman
engineers
timed
opulent
stubborn
clasp
pursuer
forcefully
equipped
securing
showers
triangles
sparking
menus
widened
hearted
restless
adoringly
stem
huff
stew
carbon
blog
wish
cocktails
shuttle
bosom
layered
pigtails
buds
skilled
forbidden
lofty
patched
pupil
blearily
woefully
parry
fluidly
tweezers
mr.
superimposed
headgear
japan
hangers
pant
pointer
emitting
crab
goo
raven
khakis
smoothes
diagonally
plus
blackboard
librarian
illustrates
prays
preserver
convulsing
partway
seeps
bait
meerkats
spherical
cascades
pamphlets
smothers
erratically
mocks
skidding
screwed
subtly
gushes
lavender
roves
snowman
webbed
produce
gust
wispy
teapot
sideboard
cousins
gnarled
pens
deed
season
magical
gags
tunes
grandparents
pinball
slots
tiered
dissolve
empire
combines
pods
scottish
mile
indignantly
android
atv
resulting
tracing
units
mistress
oyster
hike
doorways
messenger
pummel
grotesque
canopied
deer
therapist
feeble
harmlessly
welling
attache
ashore
climbers
traced
frosty
bindings
thrashes
matronly
understand
finely
pure
arabian
printing
zipping
restaurants
encouragingly
bottled
beached
steams
dynamite
telescope
marauder
coordinates
stepfather
fumbling
vivian
diploma
shoppers
memorial
disconnects
deadbolt
millennium
earshot
steely
proceeding
likely
ecstasy
swastika
viewing
intrigued
directed
chew
genuinely
future
seductive
designed
noting
sculpting
legal
creepy
chelsea
packaging
transmitter
reference
bouncer
leaders
stuns
citadel
swarming
generator
direct
segments
seamen
bubbling
balconies
leaded
silt
pistons
sketchpad
consisting
slid
ross
despairingly
armful
consulting
application
staffers
spool
bakery
identification
mars
cracker
highlighting
protrude
parallels
saleswoman
tuning
fend
cornered
surprising
deadly
attackers
prizes
twitching
ago
sell
entirely
drown
videotaping
sanctuary
contempt
atmosphere
charged
scent
protest
rapt
recent
lump
nursing
improvement
englishman
reasons
headwaiter
swiss
batting
managed
appeared
terrifying
council
promontory
increase
unmarked
forties
snapped
hardly
indistinctly
lettered
viewpoint
benignly
delicately
doubt
geese
grouped
squeals
anchorman
buttocks
dumped
buzzing
spaces
redcap
concentrate
waistcoat
apprehension
warmth
terraced
cavalry
eerily
scratched
surveying
disclaimer
pas
wit
impression
chill
utterly
abha
knots
painful
mothers
drugstore
directors
associated
hysterically
hurl
unknown
problems
seesaw
pomade
moaning
interrogator
wizard
scrubbed
buffeted
urgent
documentary
rummaging
toaster
markers
dumbstruck
detectives
diapers
popped
muscled
completion
painters
jumpsuit
imperceptibly
isildur
wicked
scabbard
200
thuds
gondorian
armour
argue
fuss
plateau
rohirrim
lair
dozing
grubby
announcing
chevrolet
venice
hugged
sultan
arranging
ended
jigger
knot
mystified
urge
fingered
turret
forte
vain
compose
howls
reflective
cabins
dusting
nasty
scout
turk
freighters
tapestry
assistance
spaceship
detached
potion
advertised
dormitory
inanely
poolside
buffer
mahal
headscarf
balaclava
volleys
brolly
timbers
brimming
flit
benefit
aprons
options
trolleys
throttle
aimlessly
skywards
treat
filofax
holiday
alsatian
virtual
soak
lacy
turquoise
generally
temporarily
footman
intervention
towing
restarts
chiseled
broke
corgis
buckingham
bristol
affect
rivers
gran
torino
colt
marking
basilisk
deluminator
cactus
broomsticks
harp
courses
primed
lorries
achieve
oranges
sharpened
mitten
gort
broth
gren
repetition
template
smoothen
stages
handsprings
shiner
overlay
provided
gargle
bullfighter
fondant
consecutive
duo
2012
bloopers
illustrating
idle
jumpers
lotions
instruct
randomly
reef
volts
fencer
pyramids
brazil
glues
rectangle
sreet
waiving
ellen
cheddar
claus
professionally
helpful
muffin
organization
km
participant
juiced
creek
moping
automated
sunburn
represented
scythe
continuous
knifes
resets
freestyle
jesse
cooled
mulching
boaters
lb
judged
waterboarding
graham
squirrel
lifebuoy
barre
unicycle
bricked
kevin
juicing
windsurf
marshmallows
tuber
situps
lobe
reverend
hag
durmstrang
tounge
sprinter
gq
banquette
bounty
grotto
proceeded
f
hump
stickers
youthful
bleached
drugs
uncontrollably
camaro
newcomers
cymbal
account
peter
hoodies
awaits
ed
rig
yielding
bleed
sunroof
batteries
violence
chokehold
bops
buttoned
getaway
shard
plunger
schematic
ditches
merchandise
divider
mussed
peddles
28
gloomily
subtitles
acknowledgment
architecture
appraises
airy
500
counters
binds
veiled
wafting
deliberate
stupor
slain
flowery
slaves
fray
depiction
specialized
baseboards
jostling
blots
battlefield
smartphone
softening
electronics
abrupt
snug
toppling
intruders
supported
eavesdrops
bond
pelt
slicked
bloodstained
converse
crosswalk
sweats
profusely
austin
powers
roasting
blossom
urinal
unrolling
perusing
adopted
j.
madison
topple
tux
cruising
sequined
labels
beckoning
gyrates
watchful
muted
tambourine
pianist
conceals
whisk
interlaces
peck
cordial
widely
recruitment
dims
provides
sneaking
autograph
escaped
fins
destroy
incinerates
grapple
savings
nibbling
mended
beachfront
rusted
shorthaired
linked
cresting
skimming
conflicted
busts
tuck
huddling
contented
entitled
scrawled
latest
gleams
stokes
pebbles
secluded
rowdy
emotionally
swells
glossy
perpendicular
rejoin
mournful
viewfinder
crystals
slyly
mart
lamppost
authorities
ivory
morsel
plucked
coca
meager
neutral
sofas
hash
crush
peephole
dapper
chestnut
13
vengeful
footwork
wallops
disks
chili
bustle
affects
crossbar
promptly
breaker
buzzes
hatchet
powerfully
sensual
flawless
distributes
stylized
operated
dainty
portico
writhe
scurrying
tripping
screened
penny
coverage
colonial
afghanistan
fastened
appointed
comedy
guiltily
annoyance
handrail
overhears
milkshake
protesting
concoction
snips
techs
shanty
shrubbery
refugee
floods
captors
mothership
projected
dimensional
hearts
olives
meticulously
sews
scarlet
guidance
glittery
iphone
majority
slush
m.
perspiration
sleazy
trashed
rocker
penthouse
affecting
romantic
pacific
converted
defeatedly
fro
appraisingly
crazily
telescopic
outskirts
extinguishes
watchtower
rotting
urinating
passages
surrender
binocular
chant
brawl
bloodshot
spews
ranking
assigned
neglected
delta
prowls
welcomes
tongues
scatters
nipple
scoot
toyshop
girders
cavity
overlapping
cobble
stoned
knights
revelers
magician
spewing
tick
starring
hustling
insistently
slumbers
bearings
indifferent
queens
encounters
suspect
mangled
16
residents
totes
proprietor
questioning
26
reward
plans
cockatoo
emits
thrusters
uncorks
racecars
wiggle
ejects
projection
wastebasket
pulsing
incorporates
ribbed
mayhem
taser
trudging
plots
residence
roams
wafts
parmesan
radiantly
dollop
backtracks
reserved
sieve
tensing
postman
dine
reflecting
sheep
protected
fender
drilled
monastery
exquisite
precise
quivers
slouched
thickly
pursing
stances
sidestep
expect
stomachs
primate
melon
clawed
grumpy
medieval
vortex
today
alertly
dean
refuge
yawn
vibrant
biscuits
kneads
victorious
seagulls
stub
frenzied
saddened
springing
compassionate
sniffles
swabs
anyway
ibs
wisps
slinks
charts
furnishings
crazed
invite
dreamy
unhatched
snuggled
entwined
latches
tulle
hoof
backstroke
compassionately
cheerily
reunited
provocatively
plasters
freckles
pancakes
stripper
shapeless
tricycle
pushups
hawk
jukebox
affection
quote
greek
archers
decked
gauzy
columned
tended
heroes
gorgeous
recreation
gaming
rearrange
splitting
dna
chambers
dye
activate
atvs
crust
rages
twig
revolve
africa
sac
tackling
navel
ignite
saved
precedes
condensation
lash
repels
grizzled
humbly
widow
curtsy
hound
squishes
meaningfully
blissfully
brain
printer
stung
scrutinizing
boar
crevasse
headlamps
trickling
glimmer
textbook
19th
pinching
comatose
firelight
contemporary
eagles
rueful
incantus
pirouettes
excuses
contrast
floodlights
jewel
shrewdly
beloved
skulls
ukulele
helmsman
doorstep
spirits
seemed
allen
sketching
assignments
dusky
doodle
absent
saliva
t.
redial
envelops
vestibule
prescription
trainees
blurs
techie
negative
curtly
motorized
gottfried
obituary
edged
recognizable
fumes
hearse
internet
tailor
shady
stonily
considering
wagoneer
nappy
immaculately
parapet
magnum
leslie
harvey
swanky
england
blunt
jammed
bladed
headbutts
dispatches
idling
poncho
embedded
recovered
gnaws
guarding
exchanged
crewmembers
torrent
winch
cello
elaborately
desserts
roommates
fairies
squirming
shrinking
awed
hopeless
discouraged
defender
whoop
bud
mutters
captivated
suede
stetson
donkey
holstered
straggly
largely
wrest
crevice
ivy
mature
wriggling
ferris
moat
2010
monkeys
flex
biceps
frilly
passersby
approving
plethora
rattling
fuzzy
producing
steels
plum
recruit
strangling
teary
initial
expressing
fastening
banjo
con
bloodied
harnesses
shane
repeating
stronger
professionals
cds
greasy
unhook
smoked
sexual
muffled
freak
nights
aloud
discomfort
praise
definitely
overseer
parrot
departs
evident
tablecloth
attending
crunch
unloading
succession
discussion
beef
collected
precious
perspiring
significantly
fingering
humming
quilt
activist
absolutely
schoolhouse
seeking
praying
vigorous
tarpaulin
styrofoam
harshly
worry
snowing
organic
milkman
guardsmen
gunshots
urinates
manning
cabs
disco
nike
atlantic
scrapbook
watchman
anklet
canned
spoken
walker
sixty
stupefied
anteroom
tortured
wetting
dignitaries
sturdy
temper
plainclothes
miners
convinced
marched
await
intercut
cameramen
babes
seventeen
batsman
remained
magistrate
reasonable
loves
marchers
disdain
postures
resplendent
anthem
conviction
wonders
extensive
photography
headstones
wallaces
cuddle
peanuts
infinitely
shoelaces
savagely
supper
punk
hatred
latter
accomplished
assisted
wresting
convict
deputies
threateningly
hisses
belches
bullwhip
boater
acknowledgement
audible
candlesticks
unpacking
beggars
mauve
intervals
caramel
barrow
basins
clatters
lobbing
wise
untying
tarts
rumbling
efficiently
rap
cornflakes
overheard
ka
thunders
rigid
squealing
entertainment
belongs
mahogany
concentrated
whoever
composer
agility
abstract
secretly
landlord
wee
billfold
35
taunts
unbuttoned
injuries
ale
isengard
draining
woven
gantry
sinuous
goblins
desperation
bedraggled
majestic
osgiliath
trolls
foul
shelob
confronted
trepidation
agreement
ultimately
brute
plunged
walkman
vaguely
cricketers
brothel
sprinkling
wizened
carelessly
morose
hotels
everyday
wounds
miracle
jeweled
skill
christ
decent
steal
attended
cu
experiences
ensemble
announce
terms
physically
leathery
philadelphia
mimic
lumber
electrodes
bookshelves
fraught
disembark
radiomen
leaflets
fuselage
galleon
align
ltd
dear
snowstorm
cloisters
filters
eaters
mobiles
flustered
minus
vhs
mumbai
pickles
20s
tuk
generated
passerby
gravestone
pizzas
unflinching
cowering
shoelace
lyrics
grains
mugshot
symbiote
lifebelt
grapes
bookshop
san
glacier
pda
boarded
flier
windmill
divided
newborn
cuddly
mousse
truffles
headmistress
netherfield
crags
womans
function
tweed
gramophone
inhaling
fugitives
hyperemic
suggests
outfield
rovers
cities
replacing
plowing
labrador
pixies
licked
deflected
flavor
headrest
burmese
thinning
handlers
pager
blazers
suspension
learns
ferret
formula
dabbing
hurting
showcased
lapse
championships
angels
specialist
cucumbers
conclude
waxed
silicone
manis
snorkeling
volleyballs
shrub
texture
navigating
sart
listerine
midle
representative
initially
strongly
lathering
ans
inbetween
joking
toddlers
germany
steve
vaulter
performances
vaulters
paragraph
unsuccessful
handwash
unscrew
waterskiing
commentator
veterinarian
protectors
tutu
aligned
springboard
freebie
intentionally
effectively
gargles
elmo
contestant
cena
doh
emery
goring
piercer
sandcastle
breakdance
demo
sheers
occur
racketball
remover
wheeler
congo
snowslides
fom
installation
remodeling
bathe
futsal
bullfighters
mermaid
corals
acrobatics
learned
implement
acetone
rode
barbershop
symphony
looses
sealife
michele
betty
jockey
compilation
diagram
molder
retriever
mowers
raiser
brad
collage
kilt
aspects
ensuring
subscription
wakeboards
ina
guerrero
cove
deputy
sarah
honing
lense
sfu
tran
jamie
deniro
crabs
esu
poopsie
depository
anorak
skaterat
ripper
contac
metropolitan
academy
boasts
hookers
21
handicapped
scrawny
keyboardist
depicting
science
accusingly
awesome
partygoers
humping
saber
chemicals
spandex
volkswagen
gushing
disheartened
cohort
tailgate
proof
handguns
conceal
tauntingly
prom
dancefloor
veer
spiraling
pleads
jesus
junkyard
understandingly
chimpanzee
exposes
hairless
browse
identity
puffed
marshmallow
ikea
31
copying
humor
afro
flights
mime
fin
75
candlestick
protege
tribal
douglas
pharmacy
pestle
energetically
plantation
boasting
steed
daring
swiping
foe
newfound
petal
crowding
mocha
underlings
embeds
pastor
shuttered
clash
jaunty
branching
recklessly
tipping
trajectory
bayonet
versatile
jostled
beauties
moscow
scour
mercilessly
cobblestone
innocent
plummeting
trailers
standstill
borrowed
regretful
venturing
sawing
mannequins
strobe
ceilinged
hazmat
florescent
wrappings
clockwise
spa
steamy
ricochet
caliber
regaining
quiver
shimmer
gaudencio
gil
naval
loaves
nuzzle
fondles
appraising
oreo
hardcover
dildo
ultrasound
simulates
fondling
piggyback
caboose
genitals
masculine
boulevard
offstage
bodysuit
flatly
cynical
intimate
brochure
naturedly
exasperation
rogers
futuristic
oversees
fortress
vials
serum
taxicab
tacked
frisks
tackled
bash
europe
freed
ben
concave
icily
compass
bombed
merge
deflecting
retract
joystick
yoke
steeply
boxy
uphill
swelling
handiwork
paddleboards
enviously
colossal
frothy
exchanging
crests
executing
broodingly
peacock
seasoned
humiliated
shambles
spying
stormy
plucking
camped
tiers
undulating
foursome
veterans
volcano
churns
lei
expels
lasers
turntables
telekinetically
sporty
populated
encourages
varsity
retired
firefighter
billow
skid
shatter
impaled
salami
propping
moistened
droplets
supervisor
tenement
versus
grills
remembering
medics
coffins
june
sparsely
clinch
slugs
envisions
midriff
magnetic
manhole
commuter
associate
escorting
leggings
reset
gaudy
flattening
evades
detection
assassin
briefing
paintbrushes
chrome
accompany
silencer
uncovered
lustfully
database
element
unlocked
mends
disconnected
textured
trickle
igniting
fleeting
plywood
seashell
mailman
nickel
patrols
mailbag
drowsily
candid
panty
martha
snogs
attract
vulture
saves
wrangle
solar
shepherd
garments
clothesline
johnson
multitude
cords
breach
mercenary
volt
deflect
doggedly
selections
mismatched
caring
saturday
rehearsal
keyhole
chunky
courts
counselor
confessional
gazebo
lolls
ntsb
cordoned
awakened
enlarged
anonymous
scolding
splatters
grazing
ghastly
confusedly
placid
undercarriage
jolt
excavator
shipping
supplier
maniacal
quirks
expose
rotunda
region
demolished
archways
fidgeting
rot
shrieks
heavens
enveloped
prompting
orbits
active
tagged
halting
arbor
composing
bailey
clockwork
oils
pigeons
assembly
le
harold
b
flush
bookseller
headdress
author
pendulum
vibrate
moustache
candlelit
digitally
ticking
clerks
framing
breezy
timekeepers
lawman
stud
lived
deadpan
weeks
outlaws
month
reinserts
extinguishing
ii
sells
disassembled
holograph
pulses
eleventh
earbuds
ample
upholstered
roast
cocking
doubtfully
hunts
tousles
armrest
faux
datebook
cookbook
grocer
dissatisfied
50s
era
flirts
enchanted
childs
villa
processor
preppy
organize
cookware
shivers
roped
manila
dryly
statuesque
dutifully
disembarks
gravity
obese
serpentine
heaped
swatter
paunchy
meek
delivering
mopes
gilded
pivot
disappointedly
pinky
footbridge
insects
rep
seizing
straightened
fevered
calculating
43
charm
cushioned
askew
splintered
freestanding
phonograph
hoe
reptilian
splattering
primitive
clumps
hewn
softball
booklet
tasseled
fistful
fronts
cheesecake
masking
curt
bulbous
adolescent
nosed
flotation
sloshes
abdominal
bananas
carcass
unwrapping
rodent
lazy
pawing
slatted
savage
luck
growls
shipmate
ferocious
hunk
starboard
teal
opaque
tigers
densely
meerkat
width
insurance
survivor
polygraph
arthur
negligee
hilton
sourly
verge
agonized
graphs
frittata
immaculate
engulfing
stinky
imitating
printout
pigeon
imitate
babysitter
hail
tier
hearth
hustle
precipice
arrivals
crops
barley
boutique
jackdaw
sheltering
frankly
edelweiss
rooster
safari
matic
jeffreys
barricades
expanding
shading
w
inquiring
sheen
freedom
almond
poseidon
minotaur
transform
centaur
parries
refreshment
granite
marbled
flooding
nevada
roulette
sears
fissure
emaciated
pinnacle
stationed
meanders
girlish
illustrations
spacecraft
vital
spotless
illumination
domes
user
unlatches
coolers
concertina
beacons
vessels
treelike
reed
deformed
detaches
dotting
smearing
pulley
unfamiliar
nottingham
subtle
chancellor
patronizing
colonnade
watered
thinly
dove
noble
plot
longing
homestead
dashed
decree
therapy
feverishly
jubilant
cannonballs
droopy
hesitating
conversing
solutions
conditioning
feminine
keen
salvage
james
dryers
potomac
swigging
stubbornly
recognition
hesitant
george
breaststroke
sebastian
stepmother
hobbling
disfigured
podiums
incense
ox
adopts
sew
estranged
manuscripts
span
airplanes
matchbox
scuffing
apprentice
longboard
bitten
barbie
preteen
nestor
molding
canes
towed
karaboudjan
pontoon
beggar
parachutes
salaad
grove
horned
assignment
lame
rectangles
rejects
doorframe
calculator
laced
graduation
prowler
911
stripping
restraint
disposable
flaxen
inhale
barista
mentions
directory
mountainous
strangers
pileup
reflections
inclined
footboard
planting
goose
abutting
historic
establishment
contemplatively
mere
corporate
volumes
catalog
pasted
strangles
custom
forkful
sparkler
tissues
attorney
pledges
snatch
nonplussed
corrugated
protecting
florist
fiance
intersecting
obliges
truffle
hasty
wishers
lavish
leggy
ladles
scape
crossover
grille
smithereens
slimy
bizarre
grapefruit
hanky
stealthy
withdraw
blasting
faceless
michelle
movers
sonia
artest
90
development
trapeze
mutt
brandish
ensuing
bejeweled
punctures
shockwave
asgardians
bullhorn
kingdom
tommy
hallways
contained
toppled
suction
compartments
rescuers
carpathia
submersibles
subs
sediment
eel
ledger
insect
stokers
cigars
lowest
wobbling
amnesia
disoriented
crinkle
grudgingly
deathly
heavier
posse
stubby
mine
labored
balled
nook
lopes
receipt
otters
footpath
boars
flamingos
tasteful
humored
dyed
characteristics
stirrups
flounces
preparations
planner
tormentor
miami
tightened
gills
bind
magnet
cuba
supervising
resuming
headlines
independence
processing
birth
infected
median
stash
cologne
fuming
voltage
hoard
critically
spellbound
h
expected
explorer
overnight
happier
learn
hypnotized
asparagus
frightening
glock
tenderness
cassettes
conspicuously
considerable
nor
deliver
behavior
intricately
encouragement
whispered
entranceway
literature
suspects
herded
arrested
varied
evidently
loudspeaker
brains
comforted
horribly
mumbles
squeaking
delegate
sixties
choked
tuns
fragment
coughing
workmen
inert
stucco
fool
minions
appropriate
preventing
paradise
cackles
wars
opportunity
noisily
tiptoe
ramshackle
fudge
inverted
sounding
grape
rustling
achieved
leisure
groaning
yawning
scuttling
hesitate
thirteen
loafer
thanksgiving
loafers
weep
frenetic
thrashing
tantrum
gumballs
lincoln
din
easing
bounding
boarders
clod
alabama
autographed
available
latrine
hippie
vets
isabel
rigging
profound
supposed
receptacle
swallowed
rigidly
forgetting
buying
gardeners
indians
chilling
saris
billy
collars
considerably
foreboding
equal
irritation
stings
remorse
muslim
belligerent
pulpit
challengingly
adc
acknowledging
crumbs
clerical
humanity
peelings
dwelling
bleakly
ordered
listened
seek
advocate
withdrawn
cruel
intake
minor
truly
farewell
immense
surly
appreciation
consider
diners
streetlight
cornstarch
growling
windpipe
crisis
reporting
caravan
shudder
examiner
embroidered
bouncers
befuddled
halls
townspeople
grooves
propelled
waggling
blues
withdrawing
j
mutely
issues
soggy
thus
o'daniel
tins
liqueur
belong
rouged
unloaded
centre
overcoats
woken
texas
gamblers
shotguns
accelerating
gale
chick
shopper
floated
bothering
surprisingly
exaggerated
exhaling
connection
guardedly
tones
sore
registering
sweeper
smock
observed
noticed
progression
sedate
communicate
sheriffs
gunny
traps
sorority
frat
slashed
contemplating
standard
anticipating
seedy
gothic
shire
maggot
booty
mossy
ringwraith
internal
attracts
hack
inscribed
ruffling
wizards
bottomless
celeborn
hen
chalice
1000
pelennor
dank
cirith
ungol
sagging
shared
molten
trumpets
inlet
impossibly
tackle
colossus
rattle
narration
auction
peeing
undone
upholstery
minnesota
jauntily
drinkers
unloads
cares
combo
forget
teetering
lineup
kitchenette
raincoats
footmen
shame
eyelash
whirl
originally
intermittent
tending
heartbeat
aria
harpsichord
riveted
feast
importance
curtsey
inquiringly
creation
holy
occupy
torrential
disapproval
religious
bruises
youngsters
mint
nape
mingling
oc
relents
controller
according
tease
clams
clam
monitoring
deco
intended
infested
hulls
periscope
manipulating
ins
nyc
pug
conclusion
fever
thames
diagon
vinyl
borgin
requirement
fax
fishbowl
cheeky
underfoot
refuse
urchins
tempo
nodes
bunches
animates
capacitors
bees
grater
floored
stabbed
tar
squashed
prowl
supervise
restored
pecks
stilettos
league
sifts
glugs
yellowstone
planning
bridesmaid
pints
totters
stiletto
shoveled
petticoat
cranberry
grates
absentmindedly
4x4
disburse
woolen
garnish
leaflet
contemptuously
alfa
tracked
quay
q
skim
bore
intervenes
shims
imperata
carpenter
underway
cedar
ports
stubbly
glimpsed
30s
blackberry
landmarks
tramp
rewind
spain
lcd
mickey
verandah
outcrop
territory
lakes
islands
liners
evacuation
twirled
dell
shooters
latino
drier
mac
gateau
dursleys
tatty
snatched
poisonous
thestral
dreaming
snatchers
sweaters
privet
snowballs
ministry
python
chaser
irish
breather
serrated
buggy
canteen
songs
forecourt
filed
reapers
alrge
rapper
unstuck
cowboys
rhinestones
alcohols
kitesurfing
carwash
pacifier
extensively
sandbox
ar
illustrate
crowned
paramedics
pits
amazon
eliptical
comedian
dispute
hedging
edging
washboard
criquet
wmoan
bobble
kilts
delicious
canada
pvc
stading
arenas
samurai
mixtures
transportation
australia
prepped
restart
flexible
occurring
stating
theory
athletics
2009
narrating
rollerblade
4th
tutus
lye
sponsored
involves
rigorous
pople
fab
chisel
encourage
molds
statistics
caster
don
wallpapers
waives
artistic
commentates
specifics
ma
woodfire
teresa
hosing
bagpiper
equestrian
bruno
prune
balded
context
racer
oom
spade
intros
relay
gored
toasted
puting
concludes
hairstyles
mason
zest
wacker
popsicle
300
piling
scuff
screenshot
schools
commenting
op
providing
tubs
ipad
min
gloats
harmonicas
kits
lengths
vans
cigarrette
bruce
snowslide
declared
puppies
omelets
tanktop
browned
waching
modeling
curry
lube
leveler
showcasing
breakdancing
clorox
manmade
reversed
adn
mishaps
twirler
hulte
marathons
dominating
spackling
exhale
quotes
heidler
flew
avoided
swiffer
brazilian
curbside
jokingly
kg
101
refs
soles
argentina
walrus
moisturizer
fund
gluing
suma
applauded
amputees
scrubber
cadets
kleva
rappelling
simmons
largest
pickaxe
firebolt
beauxbatons
david
tyson
functions
rendition
clr
juggle
skins
measurements
taylor
explanatory
croutons
pointe
orchard
calango
sid
anh
oliver
jon
knelled
takeuchi
pikachu
mugshots
pedicab
sportensemble
lures
alter
revs
fashions
quiz
sensitive
baggie
boogies
chemistry
keg
operate
sedona
defends
yearbook
onramp
propane
truss
pinatas
conduct
gunfight
corrupt
scrapyard
skimmer
vito
clumsy
tellers
railroad
stomp
kidney
rouse
rouses
nametag
porn
satiny
camisole
studiously
bubbly
mackenzie
sculpted
intertwined
crusader
festooned
twinkies
graduate
processes
despondently
chalked
rowboat
retrieved
rendering
spars
unarmed
forge
stocks
futile
bruise
waltzing
decapitates
yapping
tresses
engraving
rapist
gore
vamp
beheads
bayou
slumber
unused
tubman
spirals
confederate
chaotic
mourning
goblets
infantry
headlamp
shooter
readying
lapping
divide
outfitted
spaced
reversing
fishtails
rpg
passports
switchblade
tourniquet
resentful
cartridges
encasement
maserati
peruse
decaying
inserting
hitch
douse
stagnant
readings
sadistic
rotor
indigo
government
wrinkle
apparel
rustles
appointment
psychiatrist
christian
armload
stickman
slash
sheeting
droppings
adorns
figurines
spikey
fucking
sprig
tortoise
ab
wrathfully
angelina
candidate
gesticulates
avert
headphone
protester
tac
hunters
oral
perching
prance
seeks
capitol
xs
feathery
diva
iowa
windmills
soundman
preps
troupe
complexion
picturesque
pelted
bristles
undergarments
organizing
soundboard
primping
daybed
slouch
chins
reemerges
privates
breathtaking
staggered
octopus
mp
corporal
suites
moons
rotten
activating
timers
ghoulish
bellow
dictation
rescued
converses
pellet
400
coasts
incinerating
labelled
radar
windblown
brooklyn
scruff
calmer
rescuer
cruz
hinge
effortless
notepaper
dizzying
billed
diminishes
brackets
documenting
uninjured
surfacing
mavericks
distorts
steeling
accomplishment
depth
loved
intimidating
lurch
pitching
capsizing
uncurls
froth
22
headlock
ravers
legos
lidded
cubby
shedding
pelts
staining
sawdust
corkscrew
helium
childishly
spurting
waterproof
deflated
cellphones
catholic
lacquer
necklaces
sunrise
sulking
turnstile
eldest
padlocked
bales
automobiles
shushing
virtually
campbell
tensed
arching
champ
acrobatically
alleyways
automatics
lackey
hydrant
accost
snores
hairpin
surrenders
fingerprints
murdered
snifter
lollipop
electrocutes
canvases
laundromat
similarly
denied
cataleya
analysis
hones
perturbed
cradled
sobered
lasagna
camouflaged
marsh
riffles
harm
jumble
jefferson
mule
roundabout
sunbathing
pergola
execs
bum
married
feigns
paraphernalia
puke
ritual
confiscated
milky
mush
tangles
fearlessly
mutilated
abduction
penned
dully
levitating
navigation
crafts
granddaughter
flyers
pricks
breezeway
bustier
sewn
ogle
shamefully
righting
retracting
bulkhead
spire
retro
chore
budweiser
sermon
trevor
cessna
trods
08
lars
grungy
irregular
smudge
nonchalance
critical
ambulances
moldy
openings
sprouting
2000
chants
decay
recoil
purposeful
laboriously
wavers
taupe
detector
amble
papered
scopes
undaunted
inbox
broadcaster
poufy
mama
underhand
clotheslines
guzzle
backboard
worktable
stepladder
visited
bookstore
feathered
breastplate
plumed
trident
godmother
consume
doberman
hinged
conjures
saunter
devours
adverts
earlobe
status
meaningful
doles
underling
per
hoods
flinch
pillared
semicircle
subpoena
chlorophyll
rt
ingot
palladium
racecar
pooling
tinged
formulas
sledgehammer
deploy
gatling
continent
graveled
careless
specimen
wrung
artfully
raid
lovelorn
pizzeria
beau
annabelle
hare
aspiring
sautes
filmy
madame
stature
pinkie
samples
deletes
reciprocates
gooey
uncooked
unpack
dutch
unnaturally
cerulean
bachelor
twine
handbags
tousled
agog
muck
lagoon
intervene
heimlich
bellies
parkers
shiver
yellowed
backhand
treetops
toolbox
graying
tote
brutally
goddess
melee
yang
roaming
fanned
scorpions
jetting
snapshot
orbit
lateral
piercingly
flaring
zipped
flouncy
guesthouse
plop
pylon
queasily
predator
trek
sunbeams
feigning
bluish
throats
expand
hazel
diligently
exams
subdues
stalking
blondes
banyan
ostrich
bucks
bail
orangutan
sputtering
snarl
persists
buoyancy
biscuit
juxtaposed
dew
chewed
heft
toothed
mottled
chomps
gobbles
uncover
suspends
clawing
lightens
comprehending
interlocked
tawny
gurgling
inland
eerie
writers
footstool
unflinchingly
sustengo
minimizes
dizzily
jinx
tubby
shined
nbc
cheekily
shins
bathrooms
flushing
sardines
espresso
bosses
guggenheim
sulky
bewilderedly
sardine
smitten
radiate
snaggletooth
rory
inward
suppressed
slipper
garbed
ticker
baled
pleasing
alights
frolic
mirthfully
doughnut
decidedly
bid
capital
ices
sprawl
bandaging
sponging
courtly
congratulations
hades
greyhound
em
flowering
coiling
pursue
casinos
bellhop
graffitied
ferryman
seams
gush
timepiece
isle
clouded
lawrence
airlock
ardent
computerized
pups
skullcap
rushed
bagged
microscope
giddy
persistently
oblong
hourglass
fatal
ravel
backrest
discolored
touchscreen
normally
forceps
gritting
staples
represents
affirming
terrains
sickle
physician
friar
jowly
arrogantly
vat
indifference
worms
lions
chamois
mulling
devouring
crone
cagey
madly
octagonal
blot
charter
wheat
backlit
imagined
sundress
forever
19
unstable
inscrutable
sim
statuette
depositing
component
venom
dyes
crypt
explosives
thrash
nuclear
snowflake
housed
virgil
restriction
brotherly
sludge
recoiling
porters
settled
instep
stemmed
identically
bereaved
wei
carcasses
counterpart
rebels
refugees
shanghai
playboy
occupant
matters
wring
sopping
gong
refreshments
draft
weakened
stylus
stitch
blanches
nesting
perky
barreling
studs
roach
roaches
wolves
aglow
puritan
griddle
mugger
tesla
takeout
slosh
ferrari
likeness
persian
comical
noodle
wallets
pickpocket
jaggerman
nip
prod
anchored
bianca
talons
rugs
leafless
stonewall
caricature
abreast
lunchroom
punctuated
evergreens
regions
garnished
numb
pottery
international
terminals
congested
locator
descriptive
poem
coworkers
slightest
traverses
underbrush
weakens
webpage
wiring
harriet
volvo
broods
hardening
manor
tases
chilled
cutters
verses
icons
murders
captioned
sanitizer
protrudes
wigs
crooks
mississippi
fidget
mexico
rodrigo
overhang
pm
tailed
glanced
doodles
cameron
academic
licorice
stretchers
er
spelled
blueberries
vows
witness
abashed
vending
dvds
inquisitively
speckled
laptops
televisions
bought
fizzing
gnome
scenarios
doggy
sloppily
quartet
talkies
avail
cartoons
term
monthly
albums
d'oeuvre
dubious
challenging
blah
significant
bicycling
dispirited
klimt
sprinklers
drenching
exists
signatures
ushering
armies
freezing
fractured
jotun
conflict
grilling
jotunheim
thicker
crumbled
planets
facades
gangway
reunite
slanting
submersible
glint
unclasps
remotely
grayish
furnaces
uppermost
unlaced
lacing
countess
slouching
steamed
furnace
watertight
wriggle
roughneck
amp
scoffs
minces
nhl
hanged
outhouse
remounts
exertion
matted
cornbread
subsides
gravestones
decapitated
smudged
rosemoor
frizzy
memorabilia
porcupine
rookie
disdainful
perfunctory
measurement
capuchin
pitying
lumbering
tranquilizer
statements
depot
widower
intelligent
squeegees
prevents
latching
malfunctioning
inspection
unrelenting
overjoyed
breezes
dishrag
boston
tipsily
tussle
overflows
brighten
marina
updo
girlishly
untangle
clamors
trellis
austere
luminous
capped
extravagantly
charmingly
examined
recruiters
warden
rec
disassemble
russia
reappearing
forearms
volcanic
blossoming
barreled
radioman
twister
f.
lunches
announcement
cooper
toiletries
taco
woody
clearance
sighing
riot
tassels
aerosol
innards
corpulent
bodied
pasty
trading
upraised
scraggly
valuables
droop
aftermath
customized
scamper
floyd
haunches
peeping
perfunctorily
threat
thirtyish
fifties
fortyish
sixteen
videotape
dinette
calculations
unmistakably
wandered
tupperware
imprinted
lack
thru
discovered
motivational
unbuttoning
hamper
waitresses
witnessed
discovery
jolted
transit
squirting
saving
sniffling
extraordinarily
gratitude
troubles
thrift
dialing
pandemonium
hectic
justice
debonair
possesses
uninterested
fearing
cynically
noncommittal
attends
bazaar
complains
noticeably
wonderingly
feelings
careening
brightness
pallid
hankie
newer
pondering
sloping
disinfectant
dislodge
shriek
identified
plaques
aides
booths
fran
upstream
unblinking
packard
coroner
spasm
swearing
workman
mariachi
crippled
solidly
ants
duplex
customary
bree
flabbergasted
choices
snickers
potential
sanford
fluorescent
throttles
broadside
begun
domestic
drooling
betrayed
hate
waltzes
moans
defiance
departed
defeat
gurgles
crackle
squishing
calves
openly
hymn
trust
parcels
biggest
creaks
crunching
galoshes
flimsy
creak
garb
diplomas
bunks
vigil
assassination
secretaries
tumbled
applications
ashamed
naturally
bulk
passive
fraction
operators
monuments
standards
cortege
reverence
na
firmness
committed
lamely
addition
reduces
hostility
opinion
relatively
traders
laborers
merchant
manacles
commanding
uncertainty
congress
wives
smiled
wading
buffalo
immensely
intercuts
flexibility
harassed
confront
clubhouse
rude
shelters
caution
ironic
accept
awareness
sepoys
relentless
menace
political
condition
heroic
incredulity
intends
stamping
brittle
morris
optimism
penetrate
significance
knapsacks
inevitable
ribboned
cbs
silenced
timid
fatigue
discreet
indicated
subjects
implication
merely
sony
shifty
pummeled
glad
quiets
sanitarium
squirm
scare
speechless
groping
deaf
expresses
answered
anybody
fanny
wends
saluting
freddie
substantial
undecided
promise
sophisticated
furs
confusing
relating
patrolling
annual
agitatedly
switchboard
powdering
tinkering
insult
outburst
suffering
toll
unsolved
undamaged
issuing
amazingly
efficient
insane
uproar
pennies
chord
ingratiating
flatcar
reek
whittling
clank
chinka
tenor
shambling
bellowing
duet
formally
boring
imperial
erected
blackface
judging
wordlessly
considered
goons
wharvey
gals
octave
passers
jackbooted
sombre
urgently
washers
clambering
bloated
foremen
smallest
oatmeal
rickshaw
baths
clattering
bach
evacuating
wretched
conversations
malibu
heroin
hellish
accent
anymore
wobbly
captives
blasted
lovebirds
freaks
tuxedos
gobs
groan
primes
polaroid
sustained
rocketing
energetic
unblinkingly
cheroot
unhurried
squirt
telephoto
hayseeds
saws
singed
arming
refined
patience
herald
reproachful
lives
admits
assorted
referring
exciting
kelly
romantically
lithe
sox
prominently
affair
anew
belonging
smokey
comp
kneecap
frequently
offense
depressed
unexpected
unnerving
thumper
spurt
requiring
tons
narsil
grasses
hobbiton
settlement
admiration
seeping
spurs
orthanc
prancing
rivendell
evenstar
crows
vista
amon
renewed
deagol
lamb
scrubby
topmost
dais
gully
misery
siege
encampment
fright
gamling
spectre
rallies
shagrat
mumakil
routing
dissolving
companies
gorgoroth
scree
pitiful
baited
tribe
marriage
catfish
curiosity
witnesses
cough
blistered
surplus
stilt
carnies
quieter
dunking
scuffle
cemented
decisions
usually
echoing
marshy
overtake
rotors
stretchy
overtaking
foulard
pelvis
parody
metro
goldfish
reapplies
daimler
gent
sexily
bedspread
typical
pal
banking
gambler
carnation
travelers
reminds
toying
sizing
losers
paid
terra
cotta
uncoils
accurately
acutely
neckline
topcoat
sizzling
disorder
terribly
olympian
caused
dangers
ultimate
mentally
conducted
celebrated
signet
inspired
lease
guilt
regret
primarily
undressing
charity
disturbs
recently
emerged
unclips
spattering
bouquets
drowns
decide
whining
died
gesticulating
howling
dealers
rotund
ruminating
ceilings
consultation
scrunched
mouthing
electrode
prying
antiseptic
coatcheck
thrilled
marx
unexpectedly
oily
ornately
speedboats
nick
motorboats
headlong
biplane
repairing
brethren
grecian
possession
grouping
redneck
nestling
tabs
arquillian
livid
millions
edgy
confounded
caribbean
alleys
guzzling
slytherins
brewing
wrenched
suggestively
cloister
astronomy
spires
bloodstains
moisten
leaks
browed
scrabbles
jive
affluent
rupee
loo
minibus
begs
visiting
towns
mimicking
forefront
tartan
helix
stupid
masonry
superhero
firemen
inferno
tidying
queensboro
autumnal
joe
peeved
yellowish
derelict
threatens
planetarium
pronged
reproachfully
hoverboard
handler
transformed
raced
unraveling
spiderman
hulk
contractor
cutlery
holbrooks
mowed
hoovering
unpleasantly
jig
parades
francisco
scarfs
pickle
checkout
obliterated
antonov
hawaii
presidential
treated
felled
tsunami
arks
cabby
weddings
jumbo
delivered
floss
shirted
brandishes
vases
alerted
dictates
flown
altitude
cyclist
wasteland
sprouts
lure
slippery
recreational
swans
publications
trilby
57
prunes
lilies
diy
dentist
administers
unwittingly
fiat
mustachioed
amd
pastel
si
breeches
reverentially
carefree
seeming
martin
dislodging
dispassionately
unmoved
suburbs
walkways
swung
churchyard
breech
sandbag
triomphe
operations
elevation
newsboy
leaned
skipper
pathways
dinosaur
secrets
nightdress
teeters
spasms
enamel
postcards
wheelman
toils
floorboard
consequence
leftovers
paving
halfheartedly
deusenberg
zags
luscious
antlers
abbey
manually
noon
videoing
canopies
pavements
harnessed
curricle
aboriginal
drovers
crocodile
bucking
vainly
shallows
watchers
unfolding
stabilize
v8
anglia
sty
tract
heal
carriageway
accord
marquis
draco
creator
godfather
fireplaces
horseless
owls
remembrall
declares
tornado
pregnancy
skewered
masse
alike
solders
anther
crucified
comrades
reloading
waterside
zimmer
cuttings
sedately
jellyfish
patted
seafront
beckon
choose
nightmare
vagrant
approves
ducati
dual
chimes
pensieve
stopper
tutor
hummers
fatty
delayed
tuna
simulating
majorette
enjoyment
bodybuilder
victor
fiair
designated
talknig
acrylic
finer
gradual
grades
involve
chose
strategies
newscasters
hammered
colander
preform
greased
caulk
lowe
ninja
drizzled
prefect
overall
matte
snorkeler
violins
oats
cocoa
challenger
waling
greats
summersault
lisa
venues
kettlebell
sweatsuits
michael
sailboard
shifter
ate
tack
ield
steeping
lo
goalies
creatively
paragraphs
average
lengthwise
yolks
solder
casters
uzbekistan
smith
beerpong
layering
shisha
moose
paired
instances
joust
surviving
waterway
consistently
strategically
thowing
inflates
1600
fetching
tandumisgreat
clipart
reuben
hints
scuffs
popsicles
popular
gymnasiums
oher
septum
thinner
omelettes
accelerated
seafood
contraptions
interesting
spicy
manta
audi
straddle
spacers
groves
improve
nyan
mopped
telecast
cools
situations
grease
playmate
paul
dyson
grandmaster
overlays
motorcross
presser
tracy
sailboats
freebies
roofers
selected
narrate
torched
critique
anoher
jumprope
homer
thump
promo
blew
bodybuilders
evaluating
woma
chronometer
epee
2007
briggs
paintballs
spining
injures
tinder
parasail
parasails
alfredo
grinder
skiier
basics
alex
pressurized
rid
chapman
stat
accessory
foos
backgrounds
pointers
certificates
monkeysee
experts
dropper
sequences
technical
decided
yo
renovation
rotations
headpiece
kale
spf
purchasing
rube
bents
ropers
spinner
builder
equips
manicurist
sponges
germs
concession
soldering
sri
lanka
funkin
weightlifters
grabing
melons
waterski
squeezer
forelegs
celebratory
conditions
exuberantly
veet
caped
puma
heatedly
69
bullring
scares
figs
stems
structured
trows
zucchini
cilantro
thumbling
standingin
shear
reps
hoagie
drizzles
sin
sectioned
onesie
interments
wheelchairs
clothe
ho
olimpic
hr
saxaphone
blowout
bali
2011
ombre
entertain
thatched
lecturing
nerdist
overlayed
naming
snowed
hobby
traversing
disney
taunted
tic
handspring
buffalos
guinness
bullseye
cub
effective
weldy
fluff
fuller
extract
spends
centers
pianos
putted
alternatively
6'2
tot
lonesome
pasting
throwers
kayaker
selfies
hiting
sponsor
asians
snowmobiles
coupled
unfortunately
eric
loosed
unboxing
duggar
gator
tortilla
plugged
tested
procedures
highlight
saute
jamaican
demonstrator
davis
pic
introductions
buoy
speedos
nex
columbia
pokemon
wildebeest
capoeria
sifter
cultural
kennel
sean
neil
quincy
drafting
boating
martina
imprisoned
bbq
retakes
creasing
flourish
pullman
tallies
cincinnati
montana
frosts
rabble
musket
sewerage
ake
mics
mak
hairdo
correspondent
epees
duration
genders
putin
meats
geared
niddle
equipments
sculptor
bieber
chaises
clone
anda
jolting
inking
@@NEWLINE@@
scrapping
excessively
offensive
raketball
puzzles
jai
alai
cuting
sandbags
firearm
foward
crossfit
bundt
sward
occasions
twitter
clapped
immersion
measurer
mikan
boobs
swish
skiboard
mobster
girdle
rembrandt
recessed
guestroom
spools
lawmen
melodrama
crime
gable
advent
picker
scabbers
whomping
dorito
ninety
futball
wih
introductory
watchnig
seasons
karateka
trial
mangos
catamarans
porcini
kniting
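The tokens above are the tail of create_swag/lm/vocabulary/tokens.txt, the LM's newline-delimited vocabulary: one token per line (misspellings included, since tokens come from raw captions), beginning with special tokens such as @@UNKNOWN@@, @@bos@@, and @@eos@@, as visible in the condensed preview below. As a minimal sketch assuming only that one-token-per-line layout (the repository itself loads this directory through AllenNLP's Vocabulary, which may offset ids for padding), such a file can be read into a token-to-id map:

# Sketch only: read a newline-delimited vocabulary file into {token: line_index}.
# Assumes one token per line; AllenNLP's own loader may reserve id 0 for padding.
def load_token_index(path: str) -> dict:
    with open(path, encoding="utf-8") as f:
        return {line.rstrip("\n"): idx for idx, line in enumerate(f)}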
================================================
FILE: create_swag/turktemplate.html
================================================
<meta content="width=device-width,initial-scale=1" name="viewport"/>
<section class="container" id="Survey">
<div class="row" id="workContent">
<div class="col-sm-12">
<div class="panel panel-default panel-demo">
<div class="panel-heading">
<h4 class="panel-title">Instructions - please read even if you've done an earlier version of this HIT! <a data-toggle="collapse" href="#instructions"
style="color:#0080ff;font-weight:normal;text-decoration: underline;">(expand/collapse)</a>
</h4>
</div>
<div id="instructions" class="panel-collapse collapse in" style="margin:10px">
Imagine that you are watchin
================================================
SYMBOL INDEX (148 symbols across 17 files)
================================================
FILE: create_swag/generate_candidates/classifiers.py
function reshape (line 23) | def reshape(f):
class LMFeatsModel (line 37) | class LMFeatsModel(nn.Module):
method __init__ (line 38) | def __init__(self, input_dim=5, hidden_dim=1024):
method forward (line 57) | def forward(self, feats):
method fit (line 66) | def fit(self, data, val_data=None, num_epoch=10):
method validate (line 95) | def validate(self, data):
class BoWModel (line 106) | class BoWModel(nn.Module):
method __init__ (line 107) | def __init__(self, vocab, use_mean=True, embed_dim=100):
method forward (line 129) | def forward(self, word_ids):
class CNNModel (line 144) | class CNNModel(nn.Module):
method __init__ (line 145) | def __init__(self, vocab, embed_dim=100, window_sizes=(2, 3, 4, 5), nu...
method forward (line 166) | def forward(self, word_ids, indicator_ids):
class BLSTMModel (line 185) | class BLSTMModel(nn.Module):
method __init__ (line 186) | def __init__(self, vocab, use_postags_only=True, embed_dim=100, hidden...
method forward (line 215) | def forward(self, word_ids, indicator_ids):
class Ensemble (line 236) | class Ensemble(nn.Module):
method __init__ (line 237) | def __init__(self, vocab):
method forward (line 256) | def forward(self, lm_feats, ending_word_ids, postags_word_ids, ctx_ind...
method predict (line 275) | def predict(self, lm_feats, ending_word_ids, postags_word_ids, ctx_ind...
method validate (line 283) | def validate(self, val_dataloader):
method fit (line 305) | def fit(self, train_dataloader, val_dataloader, num_epoch=5):
FILE: create_swag/generate_candidates/questions2mturk.py
function _detokenize (line 12) | def _detokenize(sent):
FILE: create_swag/generate_candidates/rebalance_dataset_ensemble.py
class AssignmentsDataLoader (line 63) | class AssignmentsDataLoader(Dataset):
method __init__ (line 65) | def __init__(self, instances, inds, train=True, recompute_assignments=...
method collate (line 75) | def collate(self, items_l):
method __len__ (line 112) | def __len__(self):
method __getitem__ (line 115) | def __getitem__(self, index):
method splits (line 149) | def splits(cls, assignments):
function _iter (line 199) | def _iter():
FILE: create_swag/generate_candidates/rebalance_dataset_mlp.py
class SimpleCudaLoader (line 58) | class SimpleCudaLoader(object):
method __init__ (line 60) | def __init__(self,
method __iter__ (line 76) | def __iter__(self):
method randomsplits (line 101) | def randomsplits(cls):
method __len__ (line 111) | def __len__(self):
class MLPModel (line 118) | class MLPModel(nn.Module):
method __init__ (line 119) | def __init__(self):
method forward (line 133) | def forward(self, feats):
method fit (line 138) | def fit(self, data, val_data=None, n_epoch=10):
method predict (line 168) | def predict(self, data):
FILE: create_swag/generate_candidates/sample_candidates.py
function find_VP (line 63) | def find_VP(tree):
function split_on_final_vp (line 103) | def split_on_final_vp(sentence):
FILE: create_swag/lm/load_data.py
function load_lm_data (line 24) | def load_lm_data(fold=None, mode='train'):
class RawPassages (line 103) | class RawPassages(Dataset):
method __init__ (line 104) | def __init__(self, fold, mode):
method collate (line 113) | def collate(self, instances_l):
method __len__ (line 124) | def __len__(self):
method __getitem__ (line 127) | def __getitem__(self, index):
method splits (line 138) | def splits(cls, fold):
FILE: create_swag/lm/pretrain_lm.py
function batcher (line 24) | def batcher(inp_list):
function data_runner (line 36) | def data_runner(start_point=0, minlength=4):
function _sample_a_good_pair (line 45) | def _sample_a_good_pair(gen, seq_length, min_length=3):
function looped_data_runner (line 59) | def looped_data_runner(batch_size=128, seq_length=50):
function bucketed_data_runner (line 69) | def bucketed_data_runner(batch_size=64, seq_length=50):
FILE: create_swag/lm/simple_bilm.py
function _de_duplicate_generations (line 23) | def _de_duplicate_generations(generations):
class StackedLstm (line 40) | class StackedLstm(torch.nn.Module):
method __init__ (line 70) | def __init__(self,
method forward (line 97) | def forward(self, # pylint: disable=arguments-differ
class SimpleBiLM (line 138) | class SimpleBiLM(torch.nn.Module):
method __init__ (line 139) | def __init__(self,
method embed_words (line 174) | def embed_words(self, words):
method timestep_to_ids (line 189) | def timestep_to_ids(self, timestep_tokenized: List[str]):
method batch_to_ids (line 194) | def batch_to_ids(self, stories_tokenized: List[List[str]]):
method conditional_generation (line 210) | def conditional_generation(self, context, gt_completion, batch_size=12...
method _chunked_logsoftmaxes (line 277) | def _chunked_logsoftmaxes(self, activation, word_targets, chunk_size=2...
method forward (line 299) | def forward(self, words: torch.Tensor, use_forward=True, use_reverse=T...
FILE: pytorch_misc.py
function optimistic_restore (line 15) | def optimistic_restore(network, state_dict):
function pairwise (line 37) | def pairwise(iterable):
function get_ranking (line 44) | def get_ranking(predictions, labels, num_guesses=5):
function cache (line 65) | def cache(f):
class Flattener (line 84) | class Flattener(nn.Module):
method __init__ (line 85) | def __init__(self):
method forward (line 91) | def forward(self, x):
function to_variable (line 95) | def to_variable(f):
function arange (line 111) | def arange(base_tensor, n=None):
function to_onehot (line 118) | def to_onehot(vec, num_classes, on_fill=1, off_fill=0):
function save_net (line 139) | def save_net(fname, net):
function load_net (line 145) | def load_net(fname, net):
function batch_index_iterator (line 156) | def batch_index_iterator(len_l, batch_size, skip_end=True):
function batch_map (line 174) | def batch_map(f, a, batch_size):
function const_row (line 191) | def const_row(fill, l, volatile=False):
function print_para (line 198) | def print_para(model):
function accuracy (line 219) | def accuracy(output, target, topk=(1,)):
function nonintersecting_2d_inds (line 235) | def nonintersecting_2d_inds(x):
function intersect_2d (line 246) | def intersect_2d(x1, x2):
function np_to_variable (line 263) | def np_to_variable(x, is_cuda=True, dtype=torch.FloatTensor):
function gather_nd (line 270) | def gather_nd(x, index):
function enumerate_by_image (line 293) | def enumerate_by_image(im_inds):
function diagonal_inds (line 317) | def diagonal_inds(tensor):
function enumerate_imsize (line 331) | def enumerate_imsize(im_sizes):
function argsort_desc (line 341) | def argsort_desc(scores):
function unravel_index (line 351) | def unravel_index(index, dims):
function de_chunkize (line 360) | def de_chunkize(tensor, chunks):
function random_choose (line 367) | def random_choose(tensor, num):
function transpose_packed_sequence_inds (line 385) | def transpose_packed_sequence_inds(lengths):
function right_shift_packed_sequence_inds (line 407) | def right_shift_packed_sequence_inds(lengths):
function clip_grad_norm (line 437) | def clip_grad_norm(named_parameters, max_norm, clip=False, verbose=False):
function time_batch (line 478) | def time_batch(gen, reset_every=100):
function update_lr (line 494) | def update_lr(optimizer, lr=1e-4):
function all_upper_triangular_pairs (line 499) | def all_upper_triangular_pairs(gen):
function pad_last_dim (line 507) | def pad_last_dim(tensor, new_size):
FILE: raw_data/events.py
function remove_allcaps (line 25) | def remove_allcaps(sent):
function load_rocstories (line 47) | def load_rocstories(split):
function _to_time (line 112) | def _to_time(pandas_col):
function _lsmdc_to_list (line 117) | def _lsmdc_to_list(lsmdc, lsmdc_window=20):
function load_mpii (line 144) | def load_mpii(split):
function load_mpii_depersonized (line 180) | def load_mpii_depersonized(split):
function load_visual_madlibs (line 249) | def load_visual_madlibs(split):
function load_vist (line 343) | def load_vist(split):
function load_lsmdc (line 359) | def load_lsmdc(split):
function load_didemo (line 421) | def load_didemo(split):
function load_anet (line 445) | def load_anet(split):
function load_ava (line 456) | def load_ava(split):
function n2w_1k (line 473) | def n2w_1k(x, use_ordinal=False):
function _postprocess (line 478) | def _postprocess(sentence):
function load_everything (line 507) | def load_everything():
FILE: swag_baselines/decomposable_attention/dataset_reader.py
class SwagReader (line 25) | class SwagReader(DatasetReader):
method __init__ (line 40) | def __init__(self,
method _read (line 50) | def _read(self, file_path: str):
method text_to_instance (line 70) | def text_to_instance(self, # type: ignore
method from_params (line 89) | def from_params(cls, params: Params) -> 'SwagReader':
FILE: swag_baselines/decomposable_attention/decomposable_attention_swag.py
class DecomposableAttention (line 22) | class DecomposableAttention(Model):
method __init__ (line 68) | def __init__(self, vocab: Vocabulary,
method forward (line 122) | def forward(self, # type: ignore
method get_metrics (line 223) | def get_metrics(self, reset: bool = False) -> Dict[str, float]:
method from_params (line 229) | def from_params(cls, vocab: Vocabulary, params: Params) -> 'Decomposab...
FILE: swag_baselines/esim/dataset_reader.py
class SwagReader (line 25) | class SwagReader(DatasetReader):
method __init__ (line 40) | def __init__(self,
method _read (line 50) | def _read(self, file_path: str):
method text_to_instance (line 70) | def text_to_instance(self, # type: ignore
method from_params (line 89) | def from_params(cls, params: Params) -> 'SwagReader':
FILE: swag_baselines/esim/esim_swag.py
class VariationalDropout (line 27) | class VariationalDropout(torch.nn.Dropout):
method forward (line 28) | def forward(self, input):
class ESIM (line 44) | class ESIM(Model):
method __init__ (line 81) | def __init__(self, vocab: Vocabulary,
method forward (line 126) | def forward(self, # type: ignore
method get_metrics (line 264) | def get_metrics(self, reset: bool = False) -> Dict[str, float]:
method from_params (line 270) | def from_params(cls, vocab: Vocabulary, params: Params) -> 'ESIM':
FILE: swag_baselines/fasttext/prep_data.py
function _tokenize (line 9) | def _tokenize(sent):
FILE: swag_baselines/unarylstm/dataset_reader.py
class SwagReader (line 24) | class SwagReader(DatasetReader):
method __init__ (line 39) | def __init__(self,
method _read (line 51) | def _read(self, file_path: str):
method text_to_instance (line 71) | def text_to_instance(self, # type: ignore
method from_params (line 90) | def from_params(cls, params: Params) -> 'SwagReader':
FILE: swag_baselines/unarylstm/lstm_swag.py
class LstmSwag (line 21) | class LstmSwag(Model):
method __init__ (line 51) | def __init__(self, vocab: Vocabulary,
method forward (line 76) | def forward(self, # type: ignore
method get_metrics (line 125) | def get_metrics(self, reset: bool = False) -> Dict[str, float]:
method from_params (line 131) | def from_params(cls, vocab: Vocabulary, params: Params) -> 'LstmSwag':
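The index above is navigation only, but one entry names a pattern worth spelling out: optimistic_restore (pytorch_misc.py, line 15) suggests the common PyTorch trick of loading a checkpoint "optimistically", copying in only the tensors whose names and shapes match the current model. A minimal sketch of that pattern, using standard torch APIs and not reproducing the repository's exact implementation:

import torch

def optimistic_restore_sketch(network: torch.nn.Module, state_dict: dict) -> list:
    """Copy in only the checkpoint tensors whose names and shapes match the model."""
    own_state = network.state_dict()
    skipped = []  # names that were absent from the model or shape-mismatched
    with torch.no_grad():
        for name, tensor in state_dict.items():
            if name in own_state and own_state[name].shape == tensor.shape:
                own_state[name].copy_(tensor)
            else:
                skipped.append(name)
    return skipped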
================================================
CONDENSED PREVIEW — 86 files, each showing path, character count, and a content snippet
================================================
[
{
"path": ".dockerignore",
"chars": 52,
"preview": ".dockerignore\n**.pyc\n**/__pycache__\n.gitignore\n.git\n"
},
{
"path": "Dockerfile",
"chars": 1037,
"preview": "FROM python:3.6.3-jessie\n\nENV LC_ALL=C.UTF-8\nENV LANG=C.UTF-8\n\nENV PATH /usr/local/nvidia/bin/:$PATH\nENV LD_LIBRARY_PATH"
},
{
"path": "LICENSE",
"chars": 1070,
"preview": "MIT License\n\nCopyright (c) 2018 Rowan Zellers\n\nPermission is hereby granted, free of charge, to any person obtaining a c"
},
{
"path": "README.md",
"chars": 1457,
"preview": "# swagaf\n\n### Like this work, or commonsense reasoning in general? You might be interested in checking out my brand new "
},
{
"path": "create_swag/README.md",
"chars": 756,
"preview": "# create_swag\n\nthis folder contains the scripts used to create SWAG, including adversarial filtering. Here's the rough o"
},
{
"path": "create_swag/__init__.py",
"chars": 0,
"preview": ""
},
{
"path": "create_swag/generate_candidates/README.md",
"chars": 835,
"preview": "# generate_candidates\n\nStage 1 of the pipeline - generate a bunch of candidates.\n\nUnfortunately, this is pretty slow, so"
},
{
"path": "create_swag/generate_candidates/__init__.py",
"chars": 0,
"preview": ""
},
{
"path": "create_swag/generate_candidates/classifiers.py",
"chars": 15626,
"preview": "\"\"\"\nThe big idea will be to add in the worst scoring one. But we want to use a MULTILAYER PERCEPTRON.\nAlso not using wor"
},
{
"path": "create_swag/generate_candidates/questions2mturk.py",
"chars": 2111,
"preview": "import random\nimport pickle as pkl\nimport numpy as np\nfrom tqdm import tqdm\nfrom nltk.tokenize.moses import MosesDetoken"
},
{
"path": "create_swag/generate_candidates/rebalance_dataset_ensemble.py",
"chars": 13160,
"preview": "\"\"\"\nThe big idea will be to add in the worst scoring one. But we want to use a MULTILAYER PERCEPTRON.\nAlso not using wor"
},
{
"path": "create_swag/generate_candidates/rebalance_dataset_mlp.py",
"chars": 10256,
"preview": "\"\"\"\nThe big idea will be to add in the worst scoring one. But we want to use a MULTILAYER PERCEPTRON.\nAlso not using wor"
},
{
"path": "create_swag/generate_candidates/sample_candidates.py",
"chars": 8718,
"preview": "import pickle as pkl\nfrom argparse import ArgumentParser\nfrom copy import deepcopy\nfrom time import time\n\nimport numpy a"
},
{
"path": "create_swag/generate_candidates/sample_candidates.sh",
"chars": 169,
"preview": "#!/usr/bin/env bash\n\nexport CUDA_VISIBLE_DEVICES=$1\necho \"Sampling the candidates. remember to do this do this for all o"
},
{
"path": "create_swag/lm/README.md",
"chars": 1232,
"preview": "# LM\n\nContains hopefully everything you need to run the LM\n\n# Setup\n\n0. Update the config file with where your pretraini"
},
{
"path": "create_swag/lm/__init__.py",
"chars": 0,
"preview": ""
},
{
"path": "create_swag/lm/config.py",
"chars": 242,
"preview": "# Set this to how many LMs you want to train on diff splits of the data\n\nNUM_FOLDS = 5\n\n\n# what text to train on (right "
},
{
"path": "create_swag/lm/load_data.py",
"chars": 6285,
"preview": "# First make the vocabulary, etc.\n\nimport os\nimport pickle as pkl\nimport random\n\nimport simplejson as json\nfrom allennlp"
},
{
"path": "create_swag/lm/pretrain_lm.py",
"chars": 4759,
"preview": "import os\n\nimport pandas as pd\nimport torch\nfrom allennlp.data import Instance\nfrom allennlp.data import Token\nfrom alle"
},
{
"path": "create_swag/lm/simple_bilm.py",
"chars": 16902,
"preview": "\"\"\"\nA wrapper around ai2s elmo LM to allow for an lm objective...\n\"\"\"\n\nfrom typing import Optional, Tuple\nfrom typing im"
},
{
"path": "create_swag/lm/train_lm.py",
"chars": 3709,
"preview": "import os\nfrom argparse import ArgumentParser\n\nimport numpy as np\nimport pandas as pd\nimport torch\nfrom torch import opt"
},
{
"path": "create_swag/lm/train_lm.sh",
"chars": 226,
"preview": "#!/usr/bin/env bash\n\nFOLD_ID=$1\nNUM_GPUS=3\nexport CUDA_VISIBLE_DEVICES=$((FOLD_ID % NUM_GPUS))\necho \"Sampling the candid"
},
{
"path": "create_swag/lm/vocabulary/non_padded_namespaces.txt",
"chars": 14,
"preview": "*tags\n*labels\n"
},
{
"path": "create_swag/lm/vocabulary/tokens.txt",
"chars": 107079,
"preview": "@@UNKNOWN@@\n@@bos@@\n@@eos@@\n.\nthe\na\nsomeone\nand\n,\nto\nin\nof\non\nhe\nhis\nis\nman\nher\nwith\nshe\nas\nat\nthen\nup\nare\nit\n's\npeople\n"
},
{
"path": "create_swag/turktemplate.html",
"chars": 22698,
"preview": "<meta content=\"width=device-width,initial-scale=1\" name=\"viewport\"/>\n<section class=\"container\" id=\"Survey\">\n\n <div c"
},
{
"path": "data/README.md",
"chars": 2502,
"preview": "# SWAG dataset\nEach item in the CSV is an example. It's conveniently in two formats. \n\n\n## full\nIn `train_full.csv` or `"
},
{
"path": "data/test.csv",
"chars": 7817885,
"preview": ",video-id,fold-ind,startphrase,sent1,sent2,gold-source,ending0,ending1,ending2,ending3\n0,anetv_pIUpJihiju0,11871,\"A pers"
},
{
"path": "data/val.csv",
"chars": 7893588,
"preview": ",video-id,fold-ind,startphrase,sent1,sent2,gold-source,ending0,ending1,ending2,ending3,label\n0,lsmdc1052_Harry_Potter_an"
},
{
"path": "data/val_full.csv",
"chars": 8929065,
"preview": "video-id,fold-ind,startphrase,gold-ending,distractor-0,distractor-1,distractor-2,distractor-3,gold-source,gold-type,dist"
},
{
"path": "evaluation.yaml",
"chars": 550,
"preview": "description: ESIM validation predictions for SWAG\ntasks:\n - spec:\n blueprint: rowanz/swag-baseline-image\n res"
},
{
"path": "pytorch_misc.py",
"chars": 15621,
"preview": "\"\"\"\nMiscellaneous functions that might be useful for pytorch\n\"\"\"\n\nimport h5py\nimport numpy as np\nimport torch\nfrom torch"
},
{
"path": "raw_data/events.py",
"chars": 23643,
"preview": "\"\"\"\nDataloader for event data. this includes\n\n- rocstories\n- didemo\n- MPII\n- activitynet captions\n\"\"\"\nimport pandas as p"
},
{
"path": "requirements.txt",
"chars": 111,
"preview": "pandas==0.20.3\ntorch==0.3.1\ngit+git://github.com/allenai/allennlp.git@7142962d330ca5a95cade114c26a361c78f2042e\n"
},
{
"path": "swag_baselines/README.md",
"chars": 560,
"preview": "# swag_baselines\n\nCurrently there are 4 baselines here: [FastText](https://fasttext.cc), [Decomposable Attention](https:"
},
{
"path": "swag_baselines/__init__.py",
"chars": 0,
"preview": ""
},
{
"path": "swag_baselines/decomposable_attention/README.md",
"chars": 215,
"preview": "#to run\n\npython -m allennlp.run train train.json -s tmp/output0 --include-package swag_baselines.decomposable_attention\n"
},
{
"path": "swag_baselines/decomposable_attention/__init__.py",
"chars": 0,
"preview": ""
},
{
"path": "swag_baselines/decomposable_attention/dataset_reader.py",
"chars": 4065,
"preview": "# Exactly the same as the other dataset reader\n\nfrom typing import Dict, List\nimport json\nimport logging\n\nfrom overrides"
},
{
"path": "swag_baselines/decomposable_attention/decomposable_attention_swag.py",
"chars": 13628,
"preview": "from typing import Dict, Optional\n\nimport torch\n\nfrom allennlp.common import Params\nfrom allennlp.common.checks import c"
},
{
"path": "swag_baselines/decomposable_attention/run_experiments.sh",
"chars": 1192,
"preview": "#!/usr/bin/env bash\n\n# Run skipthoughts with a bunch of different modes\nexport CUDA_VISIBLE_DEVICES=$1\nif [ $1 == \"0\" ];"
},
{
"path": "swag_baselines/decomposable_attention/train-elmo-goldonly.json",
"chars": 2730,
"preview": "{\n \"dataset_reader\": {\n \"type\": \"swag\",\n \"token_indexers\": {\n \"elmo\": {\n \"typ"
},
{
"path": "swag_baselines/decomposable_attention/train-elmo.json",
"chars": 2731,
"preview": "{\n \"dataset_reader\": {\n \"type\": \"swag\",\n \"token_indexers\": {\n \"elmo\": {\n \"typ"
},
{
"path": "swag_baselines/decomposable_attention/train-glove-840.json",
"chars": 1820,
"preview": "{\n \"dataset_reader\": {\n \"type\": \"swag\",\n \"token_indexers\": {\n \"tokens\": {\n \"type\": \"single_id\",\n "
},
{
"path": "swag_baselines/decomposable_attention/train-glove-goldonly-840.json",
"chars": 1819,
"preview": "{\n \"dataset_reader\": {\n \"type\": \"swag\",\n \"token_indexers\": {\n \"tokens\": {\n \"type\": \"single_id\",\n "
},
{
"path": "swag_baselines/decomposable_attention/train-glove-goldonly.json",
"chars": 1817,
"preview": "{\n \"dataset_reader\": {\n \"type\": \"swag\",\n \"token_indexers\": {\n \"tokens\": {\n \"type\": \"single_id\",\n "
},
{
"path": "swag_baselines/decomposable_attention/train-glove.json",
"chars": 1818,
"preview": "{\n \"dataset_reader\": {\n \"type\": \"swag\",\n \"token_indexers\": {\n \"tokens\": {\n \"type\": \"single_id\",\n "
},
{
"path": "swag_baselines/decomposable_attention/train-numberbatch-goldonly.json",
"chars": 1828,
"preview": "{\n \"dataset_reader\": {\n \"type\": \"swag\",\n \"token_indexers\": {\n \"tokens\": {\n \"type\": \"single_id\",\n "
},
{
"path": "swag_baselines/decomposable_attention/train-numberbatch.json",
"chars": 1850,
"preview": "{\n \"dataset_reader\": {\n \"type\": \"swag\",\n \"token_indexers\": {\n \"tokens\": {\n \"type\": \"single_id\",\n "
},
{
"path": "swag_baselines/esim/README.md",
"chars": 567,
"preview": "# ESIM on swag\n\nESIM seems to work pretty well on SWAG, so here's it in action. You can train using the following comman"
},
{
"path": "swag_baselines/esim/__init__.py",
"chars": 0,
"preview": ""
},
{
"path": "swag_baselines/esim/dataset_reader.py",
"chars": 4065,
"preview": "# Exactly the same as the other dataset reader\n\nfrom typing import Dict, List\nimport json\nimport logging\n\nfrom overrides"
},
{
"path": "swag_baselines/esim/esim_swag.py",
"chars": 13868,
"preview": "# TODO: projection dropout with ELMO\n# l2 reg with ELMO\n# multiple ELMO layers\n# doc\n\nfrom typing import Dict, Opt"
},
{
"path": "swag_baselines/esim/predict.py",
"chars": 3983,
"preview": "\"\"\"\nadapted from Allennlp because their version doesn't seem to work 😢😢😢\n\"\"\"\nfrom typing import Dict, Any, Iterable\nimpo"
},
{
"path": "swag_baselines/esim/run_experiments.sh",
"chars": 1037,
"preview": "#!/usr/bin/env bash\n\n# Run experiments with a bunch of different models\nexport CUDA_VISIBLE_DEVICES=$1\nif [ $1 == \"2\" ];"
},
{
"path": "swag_baselines/esim/train-elmo-goldonly.json",
"chars": 2536,
"preview": "{\n \"dataset_reader\": {\n \"type\": \"swag\",\n \"token_indexers\": {\n \"elmo\": {\n \"type\": \"elmo_characters\"\n "
},
{
"path": "swag_baselines/esim/train-elmo.json",
"chars": 2537,
"preview": "{\n \"dataset_reader\": {\n \"type\": \"swag\",\n \"token_indexers\": {\n \"elmo\": {\n \"type\": \"elmo_characters\"\n "
},
{
"path": "swag_baselines/esim/train-glove-goldonly.json",
"chars": 2358,
"preview": "{\n \"dataset_reader\": {\n \"type\": \"swag\",\n \"token_indexers\": {\n \"tokens\": {\n \"type\": \"single_id\",\n "
},
{
"path": "swag_baselines/esim/train-glove.json",
"chars": 2359,
"preview": "{\n \"dataset_reader\": {\n \"type\": \"swag\",\n \"token_indexers\": {\n \"tokens\": {\n \"type\": \"single_id\",\n "
},
{
"path": "swag_baselines/esim/train-numberbatch-goldonly.json",
"chars": 2367,
"preview": "{\n \"dataset_reader\": {\n \"type\": \"swag\",\n \"token_indexers\": {\n \"tokens\": {\n \"type\": \"single_id\",\n "
},
{
"path": "swag_baselines/esim/train-numberbatch.json",
"chars": 2368,
"preview": "{\n \"dataset_reader\": {\n \"type\": \"swag\",\n \"token_indexers\": {\n \"tokens\": {\n \"type\": \"single_id\",\n "
},
{
"path": "swag_baselines/fasttext/README.md",
"chars": 586,
"preview": "# fasttext baseline\n\nThis is a wrapper around the fasttext library for getting results on SWAG. See [https://fasttext.cc"
},
{
"path": "swag_baselines/fasttext/__init__.py",
"chars": 0,
"preview": ""
},
{
"path": "swag_baselines/fasttext/compute_performance.py",
"chars": 772,
"preview": "import numpy as np\nimport argparse\nimport os\n# neg probability, pos prob\n\nparser = argparse.ArgumentParser(description='"
},
{
"path": "swag_baselines/fasttext/prep_data.py",
"chars": 2404,
"preview": "import pandas as pd\nfrom tqdm import tqdm\nfrom allennlp.common.util import get_spacy_model\n\nUSE_ONLY_GOLD_EXAMPLES = Fal"
},
{
"path": "swag_baselines/unarylstm/__init__.py",
"chars": 0,
"preview": ""
},
{
"path": "swag_baselines/unarylstm/dataset_reader.py",
"chars": 4139,
"preview": "# slightly different from the other dataset reader\n\nfrom typing import Dict, List\nimport json\nimport logging\n\nfrom overr"
},
{
"path": "swag_baselines/unarylstm/lstm_swag.py",
"chars": 6737,
"preview": "from typing import Dict, List, TextIO, Optional\n\nfrom overrides import overrides\nimport torch\nfrom torch.nn.modules impo"
},
{
"path": "swag_baselines/unarylstm/predict.py",
"chars": 4020,
"preview": "\"\"\"\nadapted from Allennlp because their version doesn't seem to work 😢😢😢\n\"\"\"\nfrom typing import Dict, Any, Iterable\nimpo"
},
{
"path": "swag_baselines/unarylstm/run_experiments.sh",
"chars": 1156,
"preview": "#!/usr/bin/env bash\n\nexport CUDA_VISIBLE_DEVICES=$1\n\nif [ $1 == \"0\" ]; then\n echo \"fuck! LSTM Numberbatch\"\n python"
},
{
"path": "swag_baselines/unarylstm/run_experiments_ending.sh",
"chars": 1254,
"preview": "#!/usr/bin/env bash\n\nexport CUDA_VISIBLE_DEVICES=$1\n\necho \"ONLY ENDING!!!!\"\nif [ $1 == \"0\" ]; then\n echo \"fuck! LSTM "
},
{
"path": "swag_baselines/unarylstm/train-cnn.json",
"chars": 1406,
"preview": "{\n \"dataset_reader\": {\n \"type\": \"swag\",\n \"token_indexers\": {\n \"tokens\": {\n \"type\": \"single_id\",\n "
},
{
"path": "swag_baselines/unarylstm/train-lstmbasic-elmo-endingonly.json",
"chars": 1527,
"preview": "{\n \"dataset_reader\": {\n \"type\": \"swag\",\n \"token_indexers\": {\n \"elmo\": {\n \"type\": \"elmo_characters\"\n "
},
{
"path": "swag_baselines/unarylstm/train-lstmbasic-elmo-goldonly-endingonly.json",
"chars": 1527,
"preview": "{\n \"dataset_reader\": {\n \"type\": \"swag\",\n \"token_indexers\": {\n \"elmo\": {\n \"type\": \"elmo_characters\"\n "
},
{
"path": "swag_baselines/unarylstm/train-lstmbasic-elmo-goldonly.json",
"chars": 1528,
"preview": "{\n \"dataset_reader\": {\n \"type\": \"swag\",\n \"token_indexers\": {\n \"elmo\": {\n \"type\": \"elmo_characters\"\n "
},
{
"path": "swag_baselines/unarylstm/train-lstmbasic-elmo.json",
"chars": 1529,
"preview": "{\n \"dataset_reader\": {\n \"type\": \"swag\",\n \"token_indexers\": {\n \"elmo\": {\n \"type\": \"elmo_characters\"\n "
},
{
"path": "swag_baselines/unarylstm/train-lstmbasic-glove-endingonly.json",
"chars": 1348,
"preview": "{\n \"dataset_reader\": {\n \"type\": \"swag\",\n \"token_indexers\": {\n \"tokens\": {\n \"type\": \"single_id\",\n "
},
{
"path": "swag_baselines/unarylstm/train-lstmbasic-glove-goldonly-endingonly.json",
"chars": 1347,
"preview": "{\n \"dataset_reader\": {\n \"type\": \"swag\",\n \"token_indexers\": {\n \"tokens\": {\n \"type\": \"single_id\",\n "
},
{
"path": "swag_baselines/unarylstm/train-lstmbasic-glove-goldonly.json",
"chars": 1349,
"preview": "{\n \"dataset_reader\": {\n \"type\": \"swag\",\n \"token_indexers\": {\n \"tokens\": {\n \"type\": \"single_id\",\n "
},
{
"path": "swag_baselines/unarylstm/train-lstmbasic-glove.json",
"chars": 1350,
"preview": "{\n \"dataset_reader\": {\n \"type\": \"swag\",\n \"token_indexers\": {\n \"tokens\": {\n \"type\": \"single_id\",\n "
},
{
"path": "swag_baselines/unarylstm/train-lstmbasic-numberbatch-endingonly.json",
"chars": 1357,
"preview": "{\n \"dataset_reader\": {\n \"type\": \"swag\",\n \"token_indexers\": {\n \"tokens\": {\n \"type\": \"single_id\",\n "
},
{
"path": "swag_baselines/unarylstm/train-lstmbasic-numberbatch-goldonly-endingonly.json",
"chars": 1356,
"preview": "{\n \"dataset_reader\": {\n \"type\": \"swag\",\n \"token_indexers\": {\n \"tokens\": {\n \"type\": \"single_id\",\n "
},
{
"path": "swag_baselines/unarylstm/train-lstmbasic-numberbatch-goldonly.json",
"chars": 1358,
"preview": "{\n \"dataset_reader\": {\n \"type\": \"swag\",\n \"token_indexers\": {\n \"tokens\": {\n \"type\": \"single_id\",\n "
},
{
"path": "swag_baselines/unarylstm/train-lstmbasic-numberbatch.json",
"chars": 1359,
"preview": "{\n \"dataset_reader\": {\n \"type\": \"swag\",\n \"token_indexers\": {\n \"tokens\": {\n \"type\": \"single_id\",\n "
},
{
"path": "swag_baselines/unarylstm/train.json",
"chars": 1352,
"preview": "{\n \"dataset_reader\": {\n \"type\": \"swag\",\n \"token_indexers\": {\n \"tokens\": {\n \"type\": \"single_id\",\n "
}
]
// ... and 2 more files
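The data/*.csv previews above give the column layout: an unnamed row index, then video-id, fold-ind, startphrase, sent1, sent2, gold-source, ending0 through ending3, and, in val.csv, a gold label (test.csv has no label column). As a minimal sketch assuming only those columns, loading the validation split with pandas and scoring the trivial always-pick-ending0 baseline:

import pandas as pd

# Columns as shown in the preview; label is the index (0-3) of the gold ending.
val = pd.read_csv("data/val.csv", index_col=0)

# Four endings, so chance is ~25%; a constant guess should land near that.
print("always-ending0 accuracy:", (val["label"] == 0).mean())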