Repository: spro/practical-pytorch Branch: master Commit: c520c52e68e9 Files: 50 Total size: 991.7 KB Directory structure: gitextract_lif5hpq3/ ├── .gitignore ├── LICENSE ├── README.md ├── char-rnn-classification/ │ ├── .gitignore │ ├── char-rnn-classification.ipynb │ ├── data.py │ ├── model.py │ ├── predict.py │ ├── server.py │ └── train.py ├── char-rnn-generation/ │ ├── README.md │ ├── char-rnn-generation.ipynb │ ├── generate.py │ ├── helpers.py │ ├── model.py │ └── train.py ├── conditional-char-rnn/ │ ├── conditional-char-rnn.ipynb │ ├── data.py │ ├── generate.py │ ├── model.py │ └── train.py ├── data/ │ └── names/ │ ├── Arabic.txt │ ├── Chinese.txt │ ├── Czech.txt │ ├── Dutch.txt │ ├── English.txt │ ├── French.txt │ ├── German.txt │ ├── Greek.txt │ ├── Irish.txt │ ├── Italian.txt │ ├── Japanese.txt │ ├── Korean.txt │ ├── Polish.txt │ ├── Portuguese.txt │ ├── Russian.txt │ ├── Scottish.txt │ ├── Spanish.txt │ └── Vietnamese.txt ├── glove-word-vectors/ │ └── glove-word-vectors.ipynb ├── reinforce-gridworld/ │ ├── helpers.py │ ├── reinforce-gridworld.ipynb │ └── reinforce-gridworld.py └── seq2seq-translation/ ├── images/ │ ├── attention-decoder-network.dot │ ├── decoder-network.dot │ └── encoder-network.dot ├── masked_cross_entropy.py ├── seq2seq-translation-batched.ipynb ├── seq2seq-translation-batched.py └── seq2seq-translation.ipynb ================================================ FILE CONTENTS ================================================ ================================================ FILE: .gitignore ================================================ *.swp *.swo *.pt .ipynb_checkpoints __pycache__ data/eng-*.txt *.csv ================================================ FILE: LICENSE ================================================ The MIT License (MIT) Copyright (c) 2017 Sean Robertson Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software 
without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ================================================ FILE: README.md ================================================ **These tutorials have been merged into [the official PyTorch tutorials](https://github.com/pytorch/tutorials). Please go there for better maintained versions of these tutorials compatible with newer versions of PyTorch.** --- ![Practical Pytorch](https://i.imgur.com/eBRPvWB.png) Learn PyTorch with project-based tutorials. These tutorials demonstrate modern techniques with readable code and use regular data from the internet. ## Tutorials #### Series 1: RNNs for NLP Applying recurrent neural networks to natural language tasks, from classification to generation. 
* [Classifying Names with a Character-Level RNN](https://github.com/spro/practical-pytorch/blob/master/char-rnn-classification/char-rnn-classification.ipynb) * [Generating Shakespeare with a Character-Level RNN](https://github.com/spro/practical-pytorch/blob/master/char-rnn-generation/char-rnn-generation.ipynb) * [Generating Names with a Conditional Character-Level RNN](https://github.com/spro/practical-pytorch/blob/master/conditional-char-rnn/conditional-char-rnn.ipynb) * [Translation with a Sequence to Sequence Network and Attention](https://github.com/spro/practical-pytorch/blob/master/seq2seq-translation/seq2seq-translation.ipynb) * [Exploring Word Vectors with GloVe](https://github.com/spro/practical-pytorch/blob/master/glove-word-vectors/glove-word-vectors.ipynb) * *WIP* Sentiment Analysis with a Word-Level RNN and GloVe Embeddings #### Series 2: RNNs for timeseries data * *WIP* Predicting discrete events with an RNN ## Get Started The quickest way to run these on a fresh Linux or Mac machine is to install [Anaconda](https://www.continuum.io/anaconda-overview): ``` curl -LO https://repo.continuum.io/archive/Anaconda3-4.3.0-Linux-x86_64.sh bash Anaconda3-4.3.0-Linux-x86_64.sh ``` Then install PyTorch: ``` conda install pytorch -c soumith ``` Then clone this repo and start Jupyter Notebook: ``` git clone https://github.com/spro/practical-pytorch cd practical-pytorch jupyter notebook ``` ## Recommended Reading ### PyTorch basics * http://pytorch.org/ for installation instructions * [Official PyTorch tutorials](http://pytorch.org/tutorials/) for more tutorials (some of these tutorials are included there) * [Deep Learning with PyTorch: A 60-minute Blitz](http://pytorch.org/tutorials/beginner/deep_learning_60min_blitz.html) to get started with PyTorch in general * [Introduction to PyTorch for former Torchies](https://github.com/pytorch/tutorials/blob/master/Introduction%20to%20PyTorch%20for%20former%20Torchies.ipynb) if you are a former Lua Torch user * [jcjohnson's
PyTorch examples](https://github.com/jcjohnson/pytorch-examples) for a more in-depth overview (including custom modules and autograd functions) ### Recurrent Neural Networks * [The Unreasonable Effectiveness of Recurrent Neural Networks](http://karpathy.github.io/2015/05/21/rnn-effectiveness/) shows a bunch of real-life examples * [Deep Learning, NLP, and Representations](http://colah.github.io/posts/2014-07-NLP-RNNs-Representations/) for an overview of word embeddings and RNNs for NLP * [Understanding LSTM Networks](http://colah.github.io/posts/2015-08-Understanding-LSTMs/) is about how LSTMs work specifically, but is also informative about RNNs in general ### Machine translation * [Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation](http://arxiv.org/abs/1406.1078) * [Sequence to Sequence Learning with Neural Networks](http://arxiv.org/abs/1409.3215) ### Attention models * [Neural Machine Translation by Jointly Learning to Align and Translate](https://arxiv.org/abs/1409.0473) * [Effective Approaches to Attention-based Neural Machine Translation](https://arxiv.org/abs/1508.04025) ### Other RNN uses * [A Neural Conversational Model](http://arxiv.org/abs/1506.05869) ### Other PyTorch tutorials * [Deep Learning For NLP In PyTorch](https://github.com/rguthrie3/DeepLearningForNLPInPytorch) ## Feedback If you have ideas or find mistakes [please leave a note](https://github.com/spro/practical-pytorch/issues/new).
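Once the Get Started steps are done, a quick smoke test (standard library only, so it runs even without PyTorch installed) is to try the Unicode-to-ASCII normalization helper that the name-classification tutorial uses to clean its data; a minimal sketch:

```python
import string
import unicodedata

# Alphabet used throughout the char-rnn tutorials:
# ASCII letters plus the punctuation that appears in names.
all_letters = string.ascii_letters + " .,;'"

def unicode_to_ascii(s):
    # Decompose accented characters (NFD), then drop the
    # combining marks (category 'Mn') and anything outside the alphabet.
    return ''.join(
        c for c in unicodedata.normalize('NFD', s)
        if unicodedata.category(c) != 'Mn' and c in all_letters
    )

print(unicode_to_ascii('Ślusàrski'))  # prints: Slusarski
```

The same NFD-decompose-and-drop-combining-marks trick appears near the top of `char-rnn-classification.ipynb`.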
================================================ FILE: char-rnn-classification/.gitignore ================================================ *.pt *.swp *.swo __pycache__ .ipynb_checkpoints ================================================ FILE: char-rnn-classification/char-rnn-classification.ipynb ================================================ { "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "![](https://i.imgur.com/eBRPvWB.png)\n", "\n", "# Practical PyTorch: Classifying Names with a Character-Level RNN\n", "\n", "We will be building and training a basic character-level RNN to classify words. A character-level RNN reads words as a series of characters - outputting a prediction and \"hidden state\" at each step, feeding its previous hidden state into each next step. We take the final prediction to be the output, i.e. which class the word belongs to.\n", "\n", "Specifically, we'll train on a few thousand surnames from 18 languages of origin, and predict which language a name is from based on the spelling:\n", "\n", "```\n", "$ python predict.py Hinton\n", "(-0.47) Scottish\n", "(-1.52) English\n", "(-3.57) Irish\n", "\n", "$ python predict.py Schmidhuber\n", "(-0.19) German\n", "(-2.48) Czech\n", "(-2.68) Dutch\n", "```" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Recommended Reading\n", "\n", "I assume you have at least installed PyTorch, know Python, and understand Tensors:\n", "\n", "* http://pytorch.org/ for installation instructions\n", "* [Deep Learning with PyTorch: A 60-minute Blitz](http://pytorch.org/tutorials/beginner/deep_learning_60min_blitz.html) to get started with PyTorch in general\n", "* [jcjohnson's PyTorch examples](https://github.com/jcjohnson/pytorch-examples) for an in-depth overview\n", "* [Introduction to PyTorch for former Torchies](https://github.com/pytorch/tutorials/blob/master/Introduction%20to%20PyTorch%20for%20former%20Torchies.ipynb) if you are a former Lua Torch user\n", "\n", "It would also be
useful to know about RNNs and how they work:\n", "\n", "* [The Unreasonable Effectiveness of Recurrent Neural Networks](http://karpathy.github.io/2015/05/21/rnn-effectiveness/) shows a bunch of real life examples\n", "* [Understanding LSTM Networks](http://colah.github.io/posts/2015-08-Understanding-LSTMs/) is about LSTMs specifically but also informative about RNNs in general" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Preparing the Data\n", "\n", "Included in the `data/names` directory are 18 text files named as \"[Language].txt\". Each file contains a bunch of names, one name per line, mostly romanized (but we still need to convert from Unicode to ASCII).\n", "\n", "We'll end up with a dictionary of lists of names per language, `{language: [names ...]}`. The generic variables \"category\" and \"line\" (for language and name in our case) are used for later extensibility." ] }, { "cell_type": "code", "execution_count": 1, "metadata": { "collapsed": false, "scrolled": true }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "['../data/names/Arabic.txt', '../data/names/Chinese.txt', '../data/names/Czech.txt', '../data/names/Dutch.txt', '../data/names/English.txt', '../data/names/French.txt', '../data/names/German.txt', '../data/names/Greek.txt', '../data/names/Irish.txt', '../data/names/Italian.txt', '../data/names/Japanese.txt', '../data/names/Korean.txt', '../data/names/Polish.txt', '../data/names/Portuguese.txt', '../data/names/Russian.txt', '../data/names/Scottish.txt', '../data/names/Spanish.txt', '../data/names/Vietnamese.txt']\n" ] } ], "source": [ "import glob\n", "\n", "all_filenames = glob.glob('../data/names/*.txt')\n", "print(all_filenames)" ] }, { "cell_type": "code", "execution_count": 2, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Slusarski\n" ] } ], "source": [ "import unicodedata\n", "import string\n", "\n", "all_letters = string.ascii_letters + \" 
.,;'\"\n", "n_letters = len(all_letters)\n", "\n", "# Turn a Unicode string to plain ASCII, thanks to http://stackoverflow.com/a/518232/2809427\n", "def unicode_to_ascii(s):\n", " return ''.join(\n", " c for c in unicodedata.normalize('NFD', s)\n", " if unicodedata.category(c) != 'Mn'\n", " and c in all_letters\n", " )\n", "\n", "print(unicode_to_ascii('Ślusàrski'))" ] }, { "cell_type": "code", "execution_count": 3, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "n_categories = 18\n" ] } ], "source": [ "# Build the category_lines dictionary, a list of names per language\n", "category_lines = {}\n", "all_categories = []\n", "\n", "# Read a file and split into lines\n", "def readLines(filename):\n", " lines = open(filename).read().strip().split('\\n')\n", " return [unicode_to_ascii(line) for line in lines]\n", "\n", "for filename in all_filenames:\n", " category = filename.split('/')[-1].split('.')[0]\n", " all_categories.append(category)\n", " lines = readLines(filename)\n", " category_lines[category] = lines\n", "\n", "n_categories = len(all_categories)\n", "print('n_categories =', n_categories)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now we have `category_lines`, a dictionary mapping each category (language) to a list of lines (names). We also kept track of `all_categories` (just a list of languages) and `n_categories` for later reference." 
] }, { "cell_type": "code", "execution_count": 4, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "['Abandonato', 'Abatangelo', 'Abatantuono', 'Abate', 'Abategiovanni']\n" ] } ], "source": [ "print(category_lines['Italian'][:5])" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Turning Names into Tensors\n", "\n", "Now that we have all the names organized, we need to turn them into Tensors to make any use of them.\n", "\n", "To represent a single letter, we use a \"one-hot vector\" of size `<1 x n_letters>`. A one-hot vector is filled with 0s except for a 1 at the index of the current letter, e.g. `\"b\" = <0 1 0 0 0 ...>`.\n", "\n", "To make a word we join a bunch of those into a 2D matrix `<line_length x n_letters>`.\n", "\n", "That extra 1 dimension is because PyTorch assumes everything is in batches - we're just using a batch size of 1 here." ] }, { "cell_type": "code", "execution_count": 5, "metadata": { "collapsed": false }, "outputs": [], "source": [ "import torch\n", "\n", "# Just for demonstration, turn a letter into a <1 x n_letters> Tensor\n", "def letter_to_tensor(letter):\n", " tensor = torch.zeros(1, n_letters)\n", " letter_index = all_letters.find(letter)\n", " tensor[0][letter_index] = 1\n", " return tensor\n", "\n", "# Turn a line into a <line_length x 1 x n_letters>,\n", "# or an array of one-hot letter vectors\n", "def line_to_tensor(line):\n", " tensor = torch.zeros(len(line), 1, n_letters)\n", " for li, letter in enumerate(line):\n", " letter_index = all_letters.find(letter)\n", " tensor[li][0][letter_index] = 1\n", " return tensor" ] }, { "cell_type": "code", "execution_count": 6, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "\n", "\n", "Columns 0 to 12 \n", " 0 0 0 0 0 0 0 0 0 0 0 0 0\n", "\n", "Columns 13 to 25 \n", " 0 0 0 0 0 0 0 0 0 0 0 0 0\n", "\n", "Columns 26 to 38 \n", " 0 0 0 0 0 0 0 0 0 1 0 0 0\n", "\n", "Columns 39 to 51 \n", " 0 0 0 0 0 0 0 0 0 0 0 0 0\n",
"\n", "Columns 52 to 56 \n", " 0 0 0 0 0\n", "[torch.FloatTensor of size 1x57]\n", "\n" ] } ], "source": [ "print(letter_to_tensor('J'))" ] }, { "cell_type": "code", "execution_count": 7, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "torch.Size([5, 1, 57])\n" ] } ], "source": [ "print(line_to_tensor('Jones').size())" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Creating the Network\n", "\n", "Before autograd, creating a recurrent neural network in Torch involved cloning the parameters of a layer over several timesteps. The layers held hidden state and gradients, which are now entirely handled by the graph itself. This means you can implement an RNN in a very \"pure\" way, as regular feed-forward layers.\n", "\n", "This RNN module (mostly copied from [the PyTorch for Torch users tutorial](https://github.com/pytorch/tutorials/blob/master/Introduction%20to%20PyTorch%20for%20former%20Torchies.ipynb)) is just 2 linear layers which operate on an input and hidden state, with a LogSoftmax layer after the output.\n", "\n", "![](https://i.imgur.com/Z2xbySO.png)" ] }, { "cell_type": "code", "execution_count": 8, "metadata": { "collapsed": false }, "outputs": [], "source": [ "import torch.nn as nn\n", "from torch.autograd import Variable\n", "\n", "class RNN(nn.Module):\n", " def __init__(self, input_size, hidden_size, output_size):\n", " super(RNN, self).__init__()\n", " \n", " self.input_size = input_size\n", " self.hidden_size = hidden_size\n", " self.output_size = output_size\n", " \n", " self.i2h = nn.Linear(input_size + hidden_size, hidden_size)\n", " self.i2o = nn.Linear(input_size + hidden_size, output_size)\n", " self.softmax = nn.LogSoftmax()\n", " \n", " def forward(self, input, hidden):\n", " combined = torch.cat((input, hidden), 1)\n", " hidden = self.i2h(combined)\n", " output = self.i2o(combined)\n", " output = self.softmax(output)\n", " return output, hidden\n", "\n", " def
init_hidden(self):\n", " return Variable(torch.zeros(1, self.hidden_size))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Manually testing the network\n", "\n", "With our custom `RNN` class defined, we can create a new instance:" ] }, { "cell_type": "code", "execution_count": 9, "metadata": { "collapsed": true, "scrolled": true }, "outputs": [], "source": [ "n_hidden = 128\n", "rnn = RNN(n_letters, n_hidden, n_categories)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "To run a step of this network we need to pass an input (in our case, the Tensor for the current letter) and a previous hidden state (which we initialize as zeros at first). We'll get back the output (probability of each language) and a next hidden state (which we keep for the next step).\n", "\n", "Remember that PyTorch modules operate on Variables rather than straight up Tensors." ] }, { "cell_type": "code", "execution_count": 10, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "output.size = torch.Size([1, 18])\n" ] } ], "source": [ "input = Variable(letter_to_tensor('A'))\n", "hidden = rnn.init_hidden()\n", "\n", "output, next_hidden = rnn(input, hidden)\n", "print('output.size =', output.size())" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "For the sake of efficiency we don't want to be creating a new Tensor for every step, so we will use `line_to_tensor` instead of `letter_to_tensor` and use slices. This could be further optimized by pre-computing batches of Tensors." 
] }, { "cell_type": "code", "execution_count": 11, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Variable containing:\n", "\n", "Columns 0 to 9 \n", "-2.8658 -2.8801 -2.7945 -2.9082 -2.8309 -2.9718 -2.9366 -2.9416 -2.7900 -2.8467\n", "\n", "Columns 10 to 17 \n", "-2.9495 -2.9496 -2.8707 -2.8984 -2.8147 -2.9442 -2.9257 -2.9363\n", "[torch.FloatTensor of size 1x18]\n", "\n" ] } ], "source": [ "input = Variable(line_to_tensor('Albert'))\n", "hidden = Variable(torch.zeros(1, n_hidden))\n", "\n", "output, next_hidden = rnn(input[0], hidden)\n", "print(output)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "As you can see the output is a `<1 x n_categories>` Tensor, where every item is the likelihood of that category (higher is more likely)." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Preparing for Training\n", "\n", "Before going into training we should make a few helper functions. The first is to interpret the output of the network, which we know to be a likelihood of each category. 
We can use `Tensor.topk` to get the index of the greatest value:" ] }, { "cell_type": "code", "execution_count": 12, "metadata": { "collapsed": false, "scrolled": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "('Irish', 8)\n" ] } ], "source": [ "def category_from_output(output):\n", " top_n, top_i = output.data.topk(1) # Tensor out of Variable with .data\n", " category_i = top_i[0][0]\n", " return all_categories[category_i], category_i\n", "\n", "print(category_from_output(output))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We will also want a quick way to get a training example (a name and its language):" ] }, { "cell_type": "code", "execution_count": 13, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "category = Italian / line = Campana\n", "category = Korean / line = Koo\n", "category = Irish / line = Mochan\n", "category = Japanese / line = Kitabatake\n", "category = Vietnamese / line = an\n", "category = Korean / line = Kwak\n", "category = Portuguese / line = Campos\n", "category = Vietnamese / line = Chung\n", "category = Japanese / line = Ise\n", "category = Dutch / line = Romijn\n" ] } ], "source": [ "import random\n", "\n", "def random_training_pair(): \n", " category = random.choice(all_categories)\n", " line = random.choice(category_lines[category])\n", " category_tensor = Variable(torch.LongTensor([all_categories.index(category)]))\n", " line_tensor = Variable(line_to_tensor(line))\n", " return category, line, category_tensor, line_tensor\n", "\n", "for i in range(10):\n", " category, line, category_tensor, line_tensor = random_training_pair()\n", " print('category =', category, '/ line =', line)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Training the Network\n", "\n", "Now all it takes to train this network is show it a bunch of examples, have it make guesses, and tell it if it's wrong.\n", "\n", "For the [loss function 
`nn.NLLLoss`](http://pytorch.org/docs/nn.html#nllloss) is appropriate, since the last layer of the RNN is `nn.LogSoftmax`." ] }, { "cell_type": "code", "execution_count": 14, "metadata": { "collapsed": false }, "outputs": [], "source": [ "criterion = nn.NLLLoss()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We will also create an \"optimizer\" which updates the parameters of our model according to its gradients. We will use the vanilla SGD algorithm with a low learning rate." ] }, { "cell_type": "code", "execution_count": 15, "metadata": { "collapsed": true }, "outputs": [], "source": [ "learning_rate = 0.005 # If you set this too high, it might explode. If too low, it might not learn\n", "optimizer = torch.optim.SGD(rnn.parameters(), lr=learning_rate)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Each loop of training will:\n", "\n", "* Create input and target tensors\n", "* Create a zeroed initial hidden state\n", "* Read each letter in and\n", " * Keep hidden state for next letter\n", "* Compare final output to target\n", "* Back-propagate\n", "* Return the output and loss" ] }, { "cell_type": "code", "execution_count": 16, "metadata": { "collapsed": false }, "outputs": [], "source": [ "def train(category_tensor, line_tensor):\n", " rnn.zero_grad()\n", " hidden = rnn.init_hidden()\n", " \n", " for i in range(line_tensor.size()[0]):\n", " output, hidden = rnn(line_tensor[i], hidden)\n", "\n", " loss = criterion(output, category_tensor)\n", " loss.backward()\n", "\n", " optimizer.step()\n", "\n", " return output, loss.data[0]" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now we just have to run that with a bunch of examples. Since the `train` function returns both the output and loss we can print its guesses and also keep track of loss for plotting. Since there are 1000s of examples we print only every `print_every` time steps, and take an average of the loss." 
] }, { "cell_type": "code", "execution_count": 17, "metadata": { "collapsed": false, "scrolled": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "5000 5% (0m 7s) 2.7940 Neil / Chinese ✗ (Irish)\n", "10000 10% (0m 14s) 2.7166 O'Kelly / English ✗ (Irish)\n", "15000 15% (0m 23s) 1.1694 Vescovi / Italian ✓\n", "20000 20% (0m 31s) 2.1433 Mikhailjants / Greek ✗ (Russian)\n", "25000 25% (0m 40s) 2.0299 Planick / Russian ✗ (Czech)\n", "30000 30% (0m 48s) 1.9862 Cabral / French ✗ (Portuguese)\n", "35000 35% (0m 55s) 1.5634 Espina / Spanish ✓\n", "40000 40% (1m 5s) 3.8602 MaxaB / Arabic ✗ (Czech)\n", "45000 45% (1m 13s) 3.5599 Sandoval / Dutch ✗ (Spanish)\n", "50000 50% (1m 20s) 1.3855 Brown / Scottish ✓\n", "55000 55% (1m 27s) 1.6269 Reid / French ✗ (Scottish)\n", "60000 60% (1m 35s) 0.4495 Kijek / Polish ✓\n", "65000 65% (1m 43s) 1.0269 Young / Scottish ✓\n", "70000 70% (1m 50s) 1.9761 Fischer / English ✗ (German)\n", "75000 75% (1m 57s) 0.7915 Rudaski / Polish ✓\n", "80000 80% (2m 5s) 1.7026 Farina / Portuguese ✗ (Italian)\n", "85000 85% (2m 12s) 0.1878 Bakkarevich / Russian ✓\n", "90000 90% (2m 19s) 0.1211 Pasternack / Polish ✓\n", "95000 95% (2m 25s) 0.6084 Otani / Japanese ✓\n", "100000 100% (2m 33s) 0.2713 Alesini / Italian ✓\n" ] } ], "source": [ "import time\n", "import math\n", "\n", "n_epochs = 100000\n", "print_every = 5000\n", "plot_every = 1000\n", "\n", "# Keep track of losses for plotting\n", "current_loss = 0\n", "all_losses = []\n", "\n", "def time_since(since):\n", " now = time.time()\n", " s = now - since\n", " m = math.floor(s / 60)\n", " s -= m * 60\n", " return '%dm %ds' % (m, s)\n", "\n", "start = time.time()\n", "\n", "for epoch in range(1, n_epochs + 1):\n", " # Get a random training input and target\n", " category, line, category_tensor, line_tensor = random_training_pair()\n", " output, loss = train(category_tensor, line_tensor)\n", " current_loss += loss\n", " \n", " # Print epoch number, loss, name and guess\n", " if 
epoch % print_every == 0:\n", " guess, guess_i = category_from_output(output)\n", " correct = '✓' if guess == category else '✗ (%s)' % category\n", " print('%d %d%% (%s) %.4f %s / %s %s' % (epoch, epoch / n_epochs * 100, time_since(start), loss, line, guess, correct))\n", "\n", " # Add current loss avg to list of losses\n", " if epoch % plot_every == 0:\n", " all_losses.append(current_loss / plot_every)\n", " current_loss = 0" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Plotting the Results\n", "\n", "Plotting the historical loss from `all_losses` shows the network learning:" ] }, { "cell_type": "code", "execution_count": 18, "metadata": { "collapsed": false }, "outputs": [ { "data": { "text/plain": [ "[]" ] }, "execution_count": 18, "metadata": {}, "output_type": "execute_result" }, { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAg0AAAFkCAYAAACjCwibAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAAPYQAAD2EBqD+naQAAIABJREFUeJzt3Xd4VVX2//H3ogiIGjtYEFFHsI1+EymOg9gLjqKCJaKj\nYBccRWcsOFjHPrZRELsIGgsOKjqD6NgbYqKoI1hRVBCswCBNsn5/rORHElLuTW5Jbj6v57kP3nP2\nOWfdE+Su7LP32ubuiIiIiNSlRbYDEBERkaZBSYOIiIgkREmDiIiIJERJg4iIiCRESYOIiIgkREmD\niIiIJERJg4iIiCRESYOIiIgkREmDiIiIJERJg4iIiCQkqaTBzE41s2lmNr/s9bqZ7V/HMbubWbGZ\nLTGzj83suIaFLCIiItmQbE/DV8B5QD5QADwPPGFm21TX2Mw2B54C/gPsCNwM3GVm+9QzXhEREckS\na+iCVWb2A/Bnd7+3mn3XAAe4+28rbCsC8ty9b4MuLCIiIhlV7zENZtbCzI4CVgfeqKFZL+C5Ktue\nAXap73VFREQkO1ole4CZbU8kCW2BhcCh7j6jhuYdgblVts0F1jKzNu6+tIZrrAfsB3wBLEk2RhER\nkWasLbA58Iy7/5DKEyedNAAziPEJecAA4H4z262WxKE+9gMeSOH5REREmpuBwIOpPGHSSYO7/wp8\nXvb2HTPrAZwJnFZN82+BDlW2dQAW1NTLUOYLgHHjxrHNNtWOsZQ0GDZsGDfeeGO2w2hWdM8zT/c8\n83TPM2v69Okcc8wxUPZdmkr16WmoqgXQpoZ9bwAHVNm2LzWPgSi3BGCbbbYhPz+/YdFJwvLy8nS/\nM0z3PPN0zzNP9zxrUv54P6mkwcyuBP4NzALWJLo++hCJAGZ2FbCxu5fXYhgNDCmbRXEPsBfxSEMz\nJ0RERJqYZHsaNgTGABsB84H3gH3d/fmy/R2BTuWN3f0LMzsQuBH4E/A1cIK7V51RISIiIo1cUkmD\nu59Yx/5B1Wx7mSgEJSIiIk2Y1p6Q/6+wsDDbITQ7uueZp3ueebrnuaPBFSHTwczygeLi4mINnhER\nEUlCSUkJBQUFAAXuXpLKc6unQURERBKipEFEREQSoqRBREREEqKkQURERBKipEFEREQS
oqRBRERE\nEqKkQURERBKipEFEREQSoqRBREREEqKkQURERBKipEFEREQSoqRBREREEqKkQURERBKipEFEREQS\noqRBREREEqKkQURERBLSqJMG92xHICIiIuUaddIwcWK2IxAREZFyjTppuPlm+OGHbEchIiIi0MiT\nhuXLYfjwbEchIiIi0MiThiFD4I474M03sx2JiIiIJJU0mNkFZvaWmS0ws7lmNsHMtk7guIFm9q6Z\nLTKz2WZ2t5mtW9dxAwZAfj6ceir8+msykYqIiEiqJdvT0Bu4BegJ7A20BiabWbuaDjCzXYExwJ3A\ntsAAoAdwR10Xa9kSRo+G996DkSOTjFRERERSKqmkwd37uvtYd5/u7u8DxwObAQW1HNYLmOnuI939\nS3d/HbidSBzq1L179DSMGAFz5iQTrYiIiKRSQ8c0rA048GMtbd4AOpnZAQBm1gE4HHg60YtccQW0\naQPnndeQUEVERKQh6p00mJkBNwGvuvuHNbUr61k4BnjYzJYBc4CfgKGJXmuddeDKK2HsWHj99fpG\nLCIiIg3RkJ6GUcQYhaNqa2Rm2wI3A5cA+cB+QBfiEUXCBg+GggIYOhRWrKhXvCIiItIA5vWo1Wxm\ntwIHAb3dfVYdbe8H2rr7ERW27Qq8Amzk7nOrOSYfKN5tt93Iy8v7/9t/+glefbWQ0aMLOeWUpMMW\nERHJKUVFRRQVFVXaNn/+fF5++WWAAncvSeX1kk4ayhKGfkAfd/88gfbjgWXufnSFbbsArwKbuPu3\n1RyTDxQXFxeTn59fad/xx8NTT8HHH8O6dU7aFBERaV5KSkooKCiANCQNydZpGAUMBI4GFplZh7JX\n2wptrjSzMRUOmwj0N7NTzaxLWS/DzcCU6hKGulx9NSxbFrMpREREJHOSHdNwKrAW8CIwu8LriApt\nNgI6lb9x9zHA2cAQ4H3gYWA60L8+AXfsCJdeGvUbpk2rzxlERESkPlol09jd60wy3H1QNdtGAikr\nzzR0KNx5Z0zBnDQpVWcVERGR2jTqtSdq0ro1/PWv8Mwz8MEH2Y5GRESkeWiSSQPA4YfDJpvATTdl\nOxIREZHmockmDa1bwxlnwLhxMHeVSZsiIiKSak02aQA4+eRY1Oq227IdiYiISO5r0knDOutEpchR\no2Dx4mxHIyIiktuadNIAcOaZ8P338MAD2Y5EREQktzX5pGGrraBfP7jhBqhHRWwRERFJUJNPGgDO\nPhumT1fNBhERkXTKiaTh97+HnXeO3gYRERFJj5xIGsyit+G55+D997MdjYiISG7KiaQBYMAAaN8e\n/vWvbEciIiKSm3ImaWjdGnr1gtdey3YkIiIiuSlnkgaAXXeF11/XLAoREZF0yLmk4Ycf4KOPsh2J\niIhI7smppKFXL2jRQo8oRERE0iGnkoa11oIddlDSICIikg45lTRAPKJQ0iAiIpJ6OZk0fPwxfPdd\ntiMRERHJLTmZNIB6G0RERFIt55KGzTaDTTdV0iAiIpJqOZc0mGlcg4iISDrkXNIAkTQUF8OSJdmO\nREREJHfkbNKwbBm8/Xa2IxEREckdOZk0/Pa3sXiVHlGIiIikTlJJg5ldYGZvmdkCM5trZhPMbOsE\njlvNzK4wsy/MbImZfW5mx9c76jq0aqXFq0RERFIt2Z6G3sAtQE9gb6A1MNnM2tVx3KPAHsAgYGug\nEEjrChFavEpERCS1WiXT2N37Vnxf1lswDygAXq3uGDPbn0g2tnD3n8s2z0o60iTtuitcdlksXtWt\nW7qvJiIikvsaOqZhbcCBH2tpcxDwNnCemX1tZh+Z2XVm1raB166VFq8SERFJrXonDWZmwE3Aq+7+\nYS1NtyB6GrYDDgHOBAYAI+t77URo8SoREZHUSurxRBWjgG2BXeto1wIoBY529/8BmNnZwKNmdrq7\nL63pwGHDhpGXl1dpW2FhIYWFhQkF+Pvfw+TJCTUV
ERFpcoqKiigqKqq0bf78+Wm7nnk9Rgqa2a3E\nY4fe7l7r+AQzuw/4nbtvXWFbN+C/wNbu/lk1x+QDxcXFxeTn5ycdX7kHH4SBA+H772G99ep9GhER\nkSajpKSEgoICgAJ3L0nluZN+PFGWMPQD9qgrYSjzGrCxma1eYVtXovfh62Svn4xeveLPKVPSeRUR\nEZHmIdk6DaOAgcDRwCIz61D2aluhzZVmNqbCYQ8CPwD3mtk2ZrYbcC1wd22PJlKhSxfYYAN48810\nXkVERKR5SLan4VRgLeBFYHaF1xEV2mwEdCp/4+6LgH2ImRZTgbHAE8SAyLQyi94GJQ0iIiINl2yd\nhjqTDHcfVM22j4H9krlWqvTqBddcA6WlMQVTRERE6ifnv0Z79YIFC2DGjGxHIiIi0rTlfNLQvXs8\nptAjChERkYbJ+aRhzTVh++2VNIiIiDRUzicNoMGQIiIiqdBskoYPPoCFC7MdiYiISNPVbJIGd3j7\n7WxHIiIi0nQ1i6ShW7dYwEqPKEREROqvWSQNLVpAz55KGkRERBqiWSQNsHIwZD3W5xIRERGaWdIw\nbx588UW2IxEREWmamk3S0LNn/KlHFCIiIvXTbJKG9daDrbZS0iAiIlJfzSZpABV5EhERaYhmlzS8\n8w4sWZLtSERERJqeZpc0LF8eiYOIiIgkp1klDb/9LbRtC1OmZDsSERGRpqdZJQ2tW8csin//O9uR\niIiIND3NKmkAGDQIJk+GTz7JdiQiIiJNS7NLGo48MqZf3nZbtiMRERFpWppd0tC2LZxwAtxzDyxa\nlO1oREREmo5mlzQAnHoqLFgADz6Y7UhERESajmaZNHTpAn/4A4wcqQWsREREEtUskwaAIUNg2jR4\n/fVsRyIiItI0JJU0mNkFZvaWmS0ws7lmNsHMtk7i+F3NbLmZlSQfamrts0+sRTFyZLYjERERaRqS\n7WnoDdwC9AT2BloDk82sXV0HmlkeMAZ4Ltkg06FFCzj9dBg/Hr79NtvRiIiINH5JJQ3u3tfdx7r7\ndHd/Hzge2AwoSODw0cADQKNZMur446FVK7jzzmxHIiIi0vg1dEzD2oADP9bWyMwGAV2ASxt4vZRa\nZx0YOBBuvx1+/TXb0YiIiDRu9U4azMyAm4BX3f3DWtr9BrgSGOjupfW9Xrqcfjp88w385z/ZjkRE\nRKRxa0hPwyhgW+ComhqYWQvikcTF7v5Z+eYGXDPldtoJ1l1Xi1iJiIjUpVV9DjKzW4G+QG93n1NL\n0zWBnYGdzKx8nkKLOIUtA/Z19xdrOnjYsGHk5eVV2lZYWEhhYWF9wq6WGXTvDm+9lbJTioiIZERR\nURFFRUWVts2fPz9t1zNPsrpRWcLQD+jj7p/X0daAbapsHgLsAfQHvnD3xdUclw8UFxcXk5+fn1R8\n9TFiBNxxR8yisEbVDyIiIpKckpISCgoKAArcPaUlDpKt0zAKGAgcDSwysw5lr7YV2lxpZmMAPHxY\n8QXMA5aUzcBYJWHIhh49YN48mDUr25GIiIg0XsmOaTgVWAt4EZhd4XVEhTYbAZ1SEVymdO8ef06d\nmt04REREGrNk6zS0cPeW1bzur9BmkLvvWcs5LnX39D9zSELHjrDpphrXICIiUptmu/ZEVT16qKdB\nRESkNkoaynTvDsXFsGJFtiMRERFpnJQ0lOnRAxYuhI8+ynYkIiIijZOShjIFZatn6BGFiIhI9ZQ0\nlMnLg65dNRhSRESkJkoaKtBgSBERkZopaaige3eYNg2WLs12JCIiIo2PkoYKevSAZcvgvfeyHYmI\niEjjo6Shgh13hFat9IhCRESkOkoaKmjbNhIHDYYUERFZlZKGKrp3V0+DiIhIdZQ0VNG9O0yfHoWe\nREREZCUlDVX06AHuUVJaREREVlLSUMU220D79npEISIiUpWShipatoyS0hoMKSIiUpmShmp07w4v\nvABvvpntSERE
[figure omitted: base64 PNG output data — training-loss curve produced by `plt.plot(all_losses)`]

```python
import matplotlib.pyplot as plt
import matplotlib.ticker as ticker
%matplotlib inline

plt.figure()
plt.plot(all_losses)
```

# Evaluating the Results

To see how well the network performs on different categories, we will create a confusion matrix, indicating for every actual language (rows) which language the network guesses (columns). To calculate the confusion matrix a bunch of samples are run through the network with `evaluate()`, which is the same as `train()` minus the backprop.
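The tally-and-normalize logic behind that confusion matrix can be sketched self-containedly. This is a minimal illustration, not the notebook's code: `evaluate_stub` stands in for the notebook's `evaluate()` (a forward pass with no backprop), and the cyclic sampling stands in for random training-pair draws.

```python
# Minimal sketch of building a per-language confusion matrix.
# evaluate_stub is a stand-in for the notebook's evaluate(); here it
# classifies categories 0 and 1 correctly but confuses category 2 with 0.

n_categories = 3
confusion = [[0.0] * n_categories for _ in range(n_categories)]

def evaluate_stub(true_index):
    return 0 if true_index == 2 else true_index

for i in range(30):
    true_index = i % n_categories            # stand-in for a random sample
    guess_index = evaluate_stub(true_index)
    confusion[true_index][guess_index] += 1  # rows = actual, columns = guessed

# Normalize each row so every cell is the fraction of that language's
# samples assigned to each guessed language
for i in range(n_categories):
    row_sum = sum(confusion[i])
    confusion[i] = [c / row_sum for c in confusion[i]]

print(confusion[2])  # → [1.0, 0.0, 0.0]: category 2 always misread as 0
```

The normalized rows are exactly what the notebook's heatmap displays: a bright diagonal means correct classification, and bright off-diagonal cells show which languages the network confuses.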
[figure omitted: base64 PNG output data — confusion-matrix heatmap produced by the evaluation cell (execution_count 19)]
1gJm9D0wAkLQU8Btcgd+WDvklsF9KpwnwgqTP4cUsLgR2AOYDVs9UmxqTO42A\nnc3svXSeC4GNCIUcBEFQHl1W7qnbFHLh/CuS5sQzNfzFzE5KbSOApYCzJWVzVQ8HJqbXqwIPNSj9\nOLaijBPjgQUaj+ow+iYW+CaeMCQIgqBTeQx318nSx0hZPq06aBXoI2kv4Gd4Od1HgB+b2X01jj0X\n2Bl36M3qof+YWeFsT92mkJ/BL8jywDW1DpI0E3A5MAmf+VaYI/3dA7g3163iBlAkjU4+FZJRaNVi\nFO1k6gqCIBgcVqbvb9d44Mz+PW2rccgNfo0lbQecCHwf1wUjgRslLWtmb1Tpsg9wQOb9cLwq4eUl\nDmtoYWYTgRuBvSTNlt8vaa708hS8lvK3zOyDTP8JwCvAUmb2fG57IR32KPB5SXP364cJgiAI6tN/\nccgjgTPM7AIzewrYE3gP2K3awWb2jplNqGzA2sDcwHnNfJyuUsiJvfDLfa+krSUtLWl5SfsAd0va\nBfghfoGVYpYXzHhBjwIOkvRjSctIWknSLpJGpv2X4HWNr5a0rqQl0nm+MLAfMwiCICgbSTMDawC3\nVtrMzIBbgC8WFLMbcIuZjWvm3F2nkM1sDLA68HfgBHxx4yZgUzwueQP8c1+Lz4Yr236p/9m4yXpX\nfDb8D3xt4Pm0/0NgE9wx7Lp0zAF0bGRbEARBl9I/iUHmxyd1r+XaX8PXk+siaWFgC+APhT9HotvW\nkAEws9dwm/4+VXb/DVe29fpfClxaZ/84YNsa+w7DvbOybacCp9YfdRAEQdAUBdaQL3ndtyxv9e/0\naRfcCbimH1MtulIhB0EQBDMABcKetl/YtywPvgNrPFizyxu4xXPBXPuCeN6JRuwKXGBmHxU4thdd\nZ7IOgiAIZhD6wakrLUs+gOeOANzZKL2/u95wJH2FFDrbyseJGXJH8T7tFSfv41jeEq/OtmTbMr5k\nz7Ut4041/YBZhRVKkAGeDr1dVilBRu3H+uLko/Ja5c0SZOTj7ltgz7JquTxZgowyflK3LkFGGZ8F\nPNFgq0wuaQx16L845JOA8yQ9QE/Y0wiS17Sko4FPm9nOuX67A/82s5a+gFDIQRAEQZDBzC6XND9w\nOG6qfhjYzMwqq9ELAYtk+6RkU9+muu9SIUIhB0EQBEOTfkoMAmBmvwV+W2NfH8dgM3ubnuRSLRFr\nyCUiabFUOaoM22QQBEFQj/5LDDIodJ1ClnRuprziB5JelXSTpF3TwnxRORskOXM2OYSyFrSCIAiC\neoRCHhLcgNv4FwM2B27D44D/kvJYF0H0TRRetF8QBEHQ37RSC7mydSDdqpDfN7PXzWy8mT1sZsfQ\nU594l2qmZUlzpbb1JS2GK3GAiWm2fU46TpL2l/SMpKmSxko6KHf+pSTdJmmypIclrTMQHzoIgmCG\nImbIQxMz+zteQqsSU1DPtPwi8J30ehlgYeAn6f0xwP54Nq4VgO3oGyx+JHAcXqpxNHBxEzPzIAiC\nYAZkRvOyfoqeGmE1TctmZpIqQZavJ+85JM2Bu7T/yMz+mPaPAf6dE3G8mf0t9RmFFwpdGlfOQRAE\nQRkUyNRVs18HMqMp5Mq6cKusAMxCjzm7Fo9lXo9P512Ahgr5V/RNlLBl2oIgCDqVB+ibtKadJEcF\naXU9OBRyR7ACPqOdnt5nZ8lFUjEVvcOyqZAqDwAFTNYH42WagyAIhhJrpC3LOLzgXj/SZTPkGWZd\nU9KGuLn6SqCSbSWbcnw1es+eP0h/s1/dM8BUMjlOqxBhT0EQBANBlzl1desM+ROSFsQv+4J4bcoD\n8RrIF6Y14nuAAyWNTccckZPxAq5ct5J0PTDFzCZLOhY4TtKHwF3Ap4DPmdk5qV+EPQVBEAwEMUMe\nEmwOvIKbp28ANgD2NrNvmVllBrsb/lXejycS/3lWgJm9AozCvapfBU5Lu44ATsS9r
J/A6yZ/Ktu1\nynhi1hwEQRDUpetmyCnHaJ88o1WOewr4Uq55WO6Yo4Cjcm0GHJ22vMwXqsh4K98WBEEQlEA4dQVB\nEARBB9BlJutQyEEQBMHQJBRy0H8siicGa5WSvs6pb7ct4k4tXcJASpAx/2fblwHwxjMlCCnj+1m9\nBBmPliAD4J0SZMxWgowyxgEeFdkBbFPCPXtlWdeknXv2/ZLGUIcuM1l3q1NXEARBEAwpYoYcBEEQ\nDE26zGQdM+QqSNpZ0sTM+1GS8nnhavUdJemh/htdEARBAHRdYpAhp5AlnZvKJE5Lfyuvry/5VNnY\n4eOpn52rXt8gCIKgP+gyhTxUTdY3ALvQOytWv3kQmNl7wHv9JT8IgiBogXDq6gjeN7PXzWxCZnsL\nIM2Yd5d0laTJkkZL2irbWdI3Uvt7km6StGPqN2e1k+XN0JK+Iunfkt6VNFHSHZIWyfX5nqQxkiZJ\nukTS7P1xIYIgCGZYumyGPFQVciMOxVNargxcD1wkaW4ASUsAVwBXAasCZ+F1DxuZmS31Hwb8Gfg7\nsBKwDnBmrv/SwDeBrwFfx1N3HljC5wqCIAi6lKGqkLeS9E5me1tSVuGda2aXm9nzeE3DOYC1074f\nAE+Z2YFm9oyZXQ6c18S550zbdWY21syeNrMLzeylzDECdjazJ83sLuBCmluDDoIgCBrRZTPkobqG\nfBuwJ73XkN/MvH6s8sLM3pP0NrBAaloWuC8n796iJzaziZLOB26SdDNwC3C5mb2aOWxsWneuMD5z\n/jqMBObKtW2ftiAIgk7lYeCRXNvU/j9tl60hD1WFPNnMxtTZ/2HuvVGiNcDMdpN0Kl5VajvgSEkb\nm1lFsbd4/pMpJxNTEATBQPL5tGV5mZ4ief1ExCEPeZ4G1sy1rV3twHqY2SNmdqyZrQc8DvxvGYML\ngiAICtJlJuuhqpA/IWnB3DZfwb5nAMtLOkbSMpK2BXZO+xrGD0taXNKvJK0jaVFJm+IJqJ9o7aME\nQRAELTETPWbrZrYCmk/SXilSZoqkeySt1eD4WSQdJWmspKmSnpe0SzMfZ6iarDcHXsm1PQ2sSHWl\n+nGbmY2VtA1wIrAP8C+85vFvKRbL/B6wPLATMB++PnyamZ3Z5GcIgiAIOhBJ2+E64vu4j9FI4EZJ\ny5rZGzW6XQF8CtgVeA5YmCYnvUNOIZvZrvgHrrW/jzHCzObNvf8r8NfKe0k/B14ysw/S/vOB8zPH\nHwYcll5PALauc/6Pj820nQqcWu9zBUEQBE1SMUG30q8+I4EzzOwCAEl74iGsuwHH5Q+WtDnwZWBJ\nM5uUml9sdlhD1WTdFpJ+KGlNSUtI2hH4Gc2FPgVBEASDTT+sIUuaGVgDuLXSZmaGR9R8sUa3rYD7\ngQMkvSTpaUnHS5q1mY8z5GbIJbEM8AtgHvwp5njgmEEdURAEQdAc/eNlPX864rVc+2vAcjX6LInP\nkEDNHNgAACAASURBVKcC30oyfgfMC+xedFgzpEI2s32BfQd7HH2Yg/a+kUkflTSQa0uQ8XIJMpZu\nW8Llr+9TwjhgW51Vipz2KVR0rAHzNj6kEFNKkDFbCTIuK0EGlHdd2uTKehGdA00733G/lRfooeLU\n1Uq/cpkJmA78r5m9CyBpX+AKST8ys0IXY4ZUyEEQBEEXUGAN+ZJ/wiV39G57a3LdLm8A04AFc+0L\nAq/2PRxw596XK8o48SSevOqzuJNXQ0IhB0EQBF3L9hv4luXB52CNkdWPN7MPJT2Apzu+FkCS0vtf\n1zjNXcA2kkZksjQuh8+aX6rRpw8zpFNXEARB0AX0X2KQk4D/J2knScsDvwdGkJx/JR2dUihXuBj4\nL3CupBUkrY97Y59d1FwNMUMuFUlLAc8AK5lZJAoJgiDoT/ppDdnMLpc0P3A4bqp+GNjMzF5PhywE\nLJI5frKkTfBcoffhyvky4JBmhjVkFbKkc/EMW
0ZPkQkDlklVngaLhtm+giAIghLox1zWZvZbPGFU\ntX19cmGY2WhgsxZG8zFDViEnbgB2oXfVp9fzB0ma2czyBR/6CzU+JAiCIGib/ksMMigM9TXk983s\ndTObkNlM0h2STpF0qqQ3SFm5JM0j6RxJr0uaJOlmSStVhEk6QtJ9ad1gbDrmj5JGZI6RpIMkPZvy\nlY6RtH9mTAYsI+kfkiZLekhS08UrgiAIghmLoa6Q67Er8C6wDrB3arsKLzi8CV7x6THgFklzZvot\nB3wtbVsBGwP/l9l/Ah7DfCiwAl6seEJmv4AjgV8BqwLPAxclL70gCIKgLPqxuMRg0KET98JsJemd\nzPvrzWy79PopM/t5ZYekDYCVgYXM7KPUth+eVWVrelJnGrCLmU1Nx1yEu7sflhT33sAeZnZxOn4M\ncE9uXMea2U2p/y9xh4AlcOUcBEEQlEGX1UMe6gr5NmBPetZts+He9+eOXRWYG5iYm6zOCiyVef98\nRRknxgMLpNefw6/ZbQ3G9Viuv5KM+gp5ykjQXL3bZtnetyAIgo7lMbwsfJap1Q4sly5bQ+7QYRVm\nspnVyjOXz8UyBzAO2JC+jlcTM6/zzl9Gj4GjaB65rIyK13VjI8lsJ8Pw1QueIgiCoFNYOW1ZxgP9\nXJU2ZshDlgeBTwMfmFmriZZHAx/gJuwLahwTYU9BEAQDQefksi6FGUkh34gHbF8j6UDgWeAzeI3L\ny8zskUYCzOw9SccDJ0qaBtyNB40vb2bnpcPCeSsIgiBomm5VyH1mqSkcanPc+/k8vDzWeOB2entJ\nN2IUPks+ElgYeIXewePVZsgxaw6CICibMFl3BtUypWT2rV+j/V1gn7RV238IuVRnZnYicGLmveHK\n+Mgq/Z8j91Wb2X/zbUEQBEEJhFNXEARBEHQAsYYcBEEQBB1AmKyDfuPdd4G32xBQVrru75Ug44ES\nZNzStoRt9bsSxgHzfvRR2zLeHP5sCSMp41+2aPReI+YtQcZs7YuYf5f2ZQC88YcShLR/n7ifaLss\n0PiQQrzZRt9PlDSGOnSZQu7QiXsQBEEQzFjEDDkIgiAYmoRTVxAEQRAMPjYTWAvmZ+tQ23CHDsuR\ntGAqofiMpCmSxqfSintKKmHxKQiCIBiqTBsG04a3sHXoGnLHzpAlLYFnwnoTOBDPXP4+njD1+8BL\npDrHTcqd2czK8n4KgiAIBonpSSG30q8T6eQZ8u/wjFhrmNmfzOxpMxtrZn8xs63M7K8AkuaSdJak\nCZLeknSLpFUqQiSNkvSQpN0lPU9yMZX0d0m/lnSypDclvZqOGSHpHElvp5n55hlZM6VzPS/pPUlP\nSeqVZETSuZL+LGk/Sa9IekPS6ZI69BYIgiAYmkwbJj4aNlPT27RhnZnhuCMVsqR5gU2A03OlEKtx\nJTAfsBmwOl5E4hZJc2eOWRqvefxt4POZ9p2A14G1gF8DvweuAO4CVgNuAi6QNGs6fia8YtR3gBWA\nw4CjJG2TG9NXgSWBr6Rz7JK2IAiCIKhKp5qsl8aLNIzONkp6Ha9fDHA6brJeE1ggY4beX9K3gW2A\ns1LbzMCOZpYPqnvEzH6VZB8DHAS8bmZnp7bDgR8CqwD3mtlHuBKu8IKkdYFt8QeDCm8Ce6c0m6Ml\nXYdXiDq76SsRBEEQVGXasGFMG978vHLasOmUEzNeLp2qkGuxFj5LvRiPOl8V+CTwptTLBDErsFTm\n/QtVlDHAo5UXZjZd0n/xStuVtteS3I+j7CXtBewKLIpnNZgFeCgn9z9JGVcYD6zU+OMdBMyVa9sm\nbUEQBJ3Kw0C+YF4j42b7TB82jGnDmlfI04eJUMjFeRavkLRcttHMxgJIqqQamgOvtrQBfcseTsq8\nnlzjPHnnLqvSBsm0L+l/gOOBkcA9wDvA/sDaBeQWuGuOprdFPQiCYCjwefr+dr0MnNavZ53GTExr\nIe3WtH4YS
xl0pEI2szcl3QzsLek0M6uV6+9BYCFgmpm9OABDWxe4y8zOqDRIWqrO8UEQBEE/MY1h\nfNRFCrkjnboSP8IfGO6XtK2k5SUtK+l7wPLAR2Z2Cz5TvVrSJpIWk7SupCMlrd4PY3oGWFPSppKW\nSWvMa/XDeYIgCIIZjI6cIQOY2fOSVgMOBn4FfBaPQ34COA4PiwLYAjgKOAf4FPAqcDvwWqNTtNB2\nBm6XuTS1XwL8Jo0hCIIgGECmM4xpLaix6f0wljLoWIUM7lQF/CRttY6ZDPw0bdX2H0Zvz+hK+4ZV\n2pas0jYs8/oDYPe0Zfl55phdq8gYWWv8QRAEQWu0vobcmSr5/7d35/F2Tff/x1/vhKihpaiE789M\nUFSJKqGUqBhrnqrE2GpjCqVaQfDFN2oeS41RQ4l5SJDUUDOZxBCEJEJiSA1JRELu/fz+WOsk++57\nzrln2Cf33JPP8/E4D/fsvdba69wk1llrr/351POStXPOOVdQmCGX/2ouYRCX1FfShBi2+UVJBW9P\nStpGUnPq1SSprDyYdT1Dds455wpprnCG3NzGti5J+wMXEcI0v0x4suYxSd3NbFqBagZ0Jzx9Ew6Y\nfVpOv3xAritDgDeqqJ9V4vkBGbSRRfL6LD7PvRm0AZ8vUk2i9uCy1ndOynY8Z1bdRn09f/lW9U1M\ne6n6NgDomkEbIzNo4+8ZtJFVuP5q/g3OyagPhc2lU0W7rOe2vTjcD7jWzAYBSDoa2AU4nLCHqZDP\nzGx62R2KfMnaOeeciyQtCvQAhueOxUBPw4AtilUFRsccBo/HKI5l8Rmyc865DqmZRSrcZV10yXp5\noDOtn9T5hFSwqoSpwO+BVwlRJI8CnpK0mZmNLrVfPiBnTNI2wJPAMtUsXTjnnCuulHvIQ+6YzpA7\nZrQ4NvOrbHdZm9k7tMy98GIMGtUP6FNqOw05IEvqSnh+eWfC88tfEsJx3gbcUiTyV1byPc/snHMu\nQ6U89rTDgT9khwN/2OLYWyO/4aAeEwtVmUYI5pXeVNCVEOeiVC8DW5ZRvvEGZEmrA88TMi6dCrxO\n2F2wIWHH3IeELFHpeovEbE7OOec6gMpDZxauY2bfSRpByND3IIBClqFehDS9pfopYSm7ZI24qesa\n4Fugh5ndY2Zvm9lEM3vIzHYzs4cB4nNiR0t6QNJMwowaSRtIelTSDEkfSxokablc4wr+Iul9SbMk\njZK0d6HOSFpc0hBJ/5H0gxp/duecW2jkInWV+yrhOeSLgaMkHSJpXcLW9yWAmwEknS/pllxhScdL\n+rWkNSWtL+lSYFtCmuCSNdSALGlZ4FfAlWZWSu6vMwnPxWwA3ChpacLOuhHAJkBvQurFuxJ1/gr8\nljDb/jFwCXCrpF/k6c8yhJ15Bmzv95Sdc67+mdldwJ+AswnpdX8C9Dazz2KRbsDKiSpdCM8tvwY8\nRViR7WVmT5Vz3UZbsl6LsPU8eXMdSZ8RciRDGKz/En++zcyS33JOA0aa2emJY0cCH0haC/iAkLS4\nl5nlHn6cGAfj3wP/SVx2ReBfwNvAQb4c7pxz2cpF3qqkXlvM7Grg6gLnDku9/xshNW9VGm1ALuRn\nhNWA2wlb0nNGpMptBGwnaUbquAFrEr4FLQE8Ee8p5CxKy4gAAp4AXgIOiM+wlWAwsHjq2KZ4Qinn\nXH0bS9iuk1TKImV1Ko/UVZ+Lw402II8nDJ4tnhUzs4kAktK7q79OvV+KcBP/FMKgmjSVsAwBYff2\nlNT5dFiah4G9gfVp/Te1gH2AVUor6pxzdWND5v/vMWcqcF1Nr1p5cgkfkGvOzD6X9ARwjKQrKni8\naSSwFzDJzFo9qCbpTcLAu6qZPVusK4Qd3l8DwyX90swyiBHonHMupxa7rNtTfX5NqM4fCV80XpW0\nn6R1JXWX9FtgXYoH8r2KEIT5TkmbSlpDUm9JN0qSmc0ELgQuibvv1pC0saR
jJB2caEcAZnYy4dnn\nf0sqFOHFOedcBWq4y7pdNNQMGcDM3pe0MWE39HmEwCBzgDcJN91zN+lb3dc1s6mStgQGAo8R7jdP\nAobm7gOb2emSPiXMgNcgBB0ZGa81r6lEmydK6sz8mfL4LD+vc865xtBwAzKAmX0CHB9fhcrk/Ypk\nZu8RbuYWa/8K4IoC556Gll+/zKxoX5xzzpXP7yE755xzdaC5wseefMnaOeecy1BThfmQfYbsSrAv\nIUBYpcoJs1rM/2TQRhb5O9bOoI2sErVXH9fleP636jbOsS+qbuN0/bDtQiVZL6N2qvVURu2slUEb\nZeUSKOBXGbRxbgZtAKyWUTu10RQ3dVVSrx7V59cE55xzbiHjM2TnnHMdUqPdQ/YZcgliZqhfZ13W\nOedc5XK7rMt/1efQt9DPkCXdBCxtZnsVKdYNqP7mnXPOucw0WqSuhX5ALkbSomb2nZl92t59cc45\n11JzhZu6fMm6A5D0pKQrJF0SUzYOjcfnLUNLWlTSlZKmSPpG0gRJf0419SNJ90r6WtI7knZb0J/F\nOecaXaMtWddnr9rXIYRQmz2Bo/OcPx7YlRDNqztwEDAxVeYM4E5C+pNHgdskLVOj/jrnnGsAvmTd\n2rtmdmqR8yvHMs/H95PzlLnJzO4CkPRX4DhgM+DxTHvqnHMLsUbbZe0Dcmsj2jh/M/CEpLcJS9oP\nm9kTqTJjcz+Y2SxJ04EV2r50P2Dp1LED48s55+rVi8BLqWOzan7V5gpjWTfX6eKwD8itfV3spJmN\nkrQasBOwPXCXpGFmtm+iWDo8lFHS7YFLqC5Sl3POtYfN4ytpInBWTa86t8Jd1pXUWRB8QK5AzIt8\nN3C3pHuAoZKWMbMv27lrzjm30Gi0XdY+IJdJUj9gKjCKMPPdD5jqg7Fzzi1Ynn6xMVkZ52cApxAi\n0TcBrwA7t9FWW+0755xbyC30A7KZHZb4edsCZTonfr4euL5Ie62+rpnZslV20znnXIrvsnbOOefq\ngOdDds455+pAo+VD9gHZOedch+RL1q6GngY+rKL+jIz6kUU7EzNoY1IGbeyZQRsAIzNo4wdVt3C6\nqu+FvdO/+kYAdd8yg1Y+yaCN7TNoA+D/ZdDGNRm0MTGDNvpk0AYU2S5TgqkZ9aGwWgYGkdQX+BMh\n298Y4Fgze6WEelsCTwFjzayswBL1uZDunHPOtRNJ+wMXAWcCGxMG5MckLd9GvaWBW4BhlVzXB2Tn\nnHMdUg2zPfUDrjWzQWY2jpBoaBZweBv1/g7cRoglWjYfkBMk3STp3sT7JyVd3J59cs45l19TDJ1Z\n7qvYMrekRYEewPDcMTMzwqx3iyL1DgNWp4p4oQ1zD1nSTcDSZraXpCeBUWZ2YpXN7knruNTOOefq\nQI1CZy4PdKb1BodPgHXyVZC0NnAesJWZNUuVbfZomAG5FjwcpnPO1a9SQme+c8co3rljdItjc76a\nnVkfJHUiLFOfaWbv5Q5X0lbDDchxprwNsLWkEwhhK1cnbF++DtiOsGvuA+BqM7u8SFstZtqSfgsc\nT/iW9DXwb+AEM/ssnt8GeJKw7XMg8GNgNHComb2b/ad1zjlXTPcDN6b7gRu3OPbpyA/5V4+C/+uf\nRgiL3DV1vCvwcZ7y3wc2BX4q6ap4rBMgSd8CO5jZU6X0tRHvIR8HvAD8g/ALXBGYTPisk4G9gfUI\n6/znStqnjLYXAfoDPwF2B1YFbspT7n8JmwJ6AHOBGyv5IM455wprLnszV3gVW7I2s++AEUCv3DGF\nNehewPN5qkwHNgB+CmwUX38HxsWf04miC2q4GbKZzYjfSmblZq7RXFrebJ8kqSchW9PgEtu+OfF2\nYpyBvyRpCTPLZeM24K9m9iyApP8DHpbUxcy+rexTOeecS6th6MyLgZsljQBeJkywlgBuBpB0PrCS\nmfWJG77eTFaW9Ckw28zeKqdfDTcgFxM
f9D4MWAVYHOhCSKNYav0ehOfSNgJ+yPwVhlUI34ZyxiZ+\nzj0dvwJtRv24gfBnnrR1fDnnXL0aC7yeOpbdfdpCmlikwtCZxeuY2V3xmeOzCSuto4HeiUleN2Dl\nsi/choVmQJZ0APA3wjedF5mfRnGzEusvAQwFhgC/AT4jLFkPJQzsScmd2bnUiyXcHjgCWLOU7jjn\nXB3ZML6SphK27dROLSN1mdnVwNUFzh2W73ji/FlU8PhTow7I30KrP6WewHNmdm3ugKRyRr91gWWB\nv5jZR7F+SYO5c8657JWyy7pQvXpUn72q3kTg55JWlbRcvCH/LrCppB0krS3pbOBnZbT5AWGgP07S\n6pJ+TdjglZZvu3sGEYidc841skYdkC8kbFt/E/iUsNZ/LXAvcCdhyXpZ4KpCDUQ27wezacChwD7A\nG4Tl7pOK1WnjmHPOuSrUYpd1e2qYJevkmn585jdfKpoj4ivptHxtxPfbpd7/C/hXqn7nxPmnSS2V\nm9mY9DHnnHPVq+Eu63bRMAOyc865hUtThaEzK7nvvCD4gOycc65Dyi1ZV1KvHvmAXFfWpfWjA+UY\n23aRknyUUTvVyiKvx30ZtJGV6e3dAQDUPV9wufK9yLZVt7F53kB35ZqYQRuQzb+fZTNoI4t/f7dl\n0AaEcA2VWiyjPhTmu6ydc845lzmfITvnnOuQcvmQK6lXj3xAds451yHVKB9yu2nXATmmSlzazPZq\nz34455zreBrtHrLPkJ1zznVIjbbLum6+JkjqLek/kr6QNE3SQ5LWSJxfVVKzpP0lPSfpG0ljJW2d\nKNNJ0vWS3pc0S9I4ScelrnOTpPsknSRpSrzWlZI6J8p0kXShpA8lzZT0gqRtEudXkfSgpM/j+bGS\ndkyc30DSo5JmSPpY0iBJy9Xut+ecc66jq5sBGVgSuAjYBNiOEPoy3zMrFxCyNv0UeAF4UNIP47lO\nwGRgb2A9QraNcyXtk2pjW2AN4JfAIYSQmIcmzl8F/JyQK3lD4G5gSCIZxdWEDE9bERJT/xmYCSBp\naWA4IcH1JkBvQurFdIQv55xzVchF6ir35UvWbTCze5PvJR0JfCrpx2aWTP58hZndH8v8AdiREA7z\nQjObS8uUV5Mk9SQMrIMTxz8HjomJpd+R9AjQC7hB0iqEwXllM/s4lr9Y0k6EXMr9CbGxByf6NTHR\n9jHASDM7PfVZPpC0lpmNL+sX45xzLi+P1FUjktYmDKY/B5YnzHYNWIWQJCLnxdwPZtYk6VXCbDjX\nTl/CwLkK4an2LsCo1OXeiINxzlTCTJf4386EgTqZpakLMC3+fDlwjaTewDDgHjPLRRXYCNhO0ozU\nNY2Q7LjIgHwW8P3Usd2BPQpXcc65djcaGJM6NrvmV220e8h1MyADDwETgCOBKYQB+Q3CQFgSSQcQ\nlrP7EQbuGYSsTOm8xekQUMb85fulgLmE5ebmVLmZAGZ2g6ShwC7ADsBfJJ1oZlfF+g/G66bTLk4t\n/gnOpLpIXc451x5+Gl9JHwFX1PSqzRXusm72JevCJC0LdAeOMLPn4rGtChTfHHg2lukM9CDMWAF6\nAs+Z2bWJttds1UJxowgz5K65vuRjZh8B1wHXSToPOIpw73kksBcwyczSA7pzzrmMNFU4Q67XJet6\n+ZrwBfBf4HeS1pS0HWGDV748wn0l7SFpHcLmqmVgXkDcd4FNJe0gaW1JZwM/K6cjMXXj7cAgSXtK\nWk3SZpJOjfeRkXRJvMZqkjYhbBLLLatfRQhoe6ekTSWtEXeQ35haAnfOOefmae8BuRMwN97PPYAw\n2x1LGIz/VKDOqfE1mjAj3s3MPo/nrgXuBe4kLFkvSxggy3UoMAi4EBgX29wU+CCe7wxcSRiEH41l\n+gKY2VRCLuZOwGPAa8DFwBep+9bOOeeq4Luss7UCYVaLmQ1n/saqnPS6ggFvmdnm+Rozs28JO66P\nSJ0
6LVHmsDz1+qXeNxF2WJ2VLhvPH5fveOL8e0D6USvnnHMZ8l3WGZC0DOEZ3m0Iy84lV61Nj5xz\nznU0vss6GzcSloAvNLOHyqjnS77OOecA32WdiUqSSZjZJFovYTeY+4GXKq/+0wHZdGN0Ru3UhfQT\nbg4+yaSVzTmz6jbep9UdpLKtwXVVtxF8lFE79eCbjNrpU0Xd16n1Y09z6UTnCoaFuXU6INdnr5xz\nzrmFTHtv6nLOOecq0swiFeZDrs+hrz575ZxzzrWh0e4h12ev2hBTKKaTUewTUzL2K1TPOedc42iK\nA3L5r7aHPkl9JU2I48qLkgoGmZK0paRnYzrfWZLeknRCuZ+nIWbIMZvSFcDvzWxQhW10js8fO+ec\n6wCamzvT1FzBDLmNOpL2JwSo+h3wMiE/wmOSupvZtDxVviaMQa/Fn7cihFWeaWbXl9qvDjlDTpJ0\nCnAZsH9uMJbURdLlkj6J327+I2nTRJ1tJDVL2lHSq5JmE6JrIWl3SSNivfGSzogxs3N1+0l6TdJM\nSR9IukrSkonzfSR9EUNrvilphqQhkrousF+Kc84tBJqaOjF3bueyX01NbQ59/YBrzWyQmY0DjgZm\nAYfnK2xmo83sX2b2lpl9YGa3EyI1/qKcz9OhB2RJ/0eIwrWLmT2YOPU3YE/gYGBjQsrDx2JAkqTz\ngT8T0je+JukXwC3AJcC6wO8J+/7/mqjTBBwL/Bg4hBDHemCq3SWAk4CDCH8gqxDCcDrnnKtjkhYl\nhHEenjsWwx4PA7YosY2NY9mnyrl2R16y3pmQLLiXmT2VOyhpCcK3mUPM7PF47CjgV4SQmhcl2jg9\nhuzM1T0DON/M/hkPTYrHLgDOATCzyxP1P5B0OnANcEzi+CKE5fOJsd0rgdOr/cDOOefma5rbGeZW\nEDpzbtEl6+UJMS/SD+x/AqxTrKKkycCPYv0BZnZTsfJpHXlAHkP4xZ0taScz+zoeX5PwuZ7PFTSz\nuZJeJsyE5x0GRqTa3AjoKal/4lhnoIuk75nZbEnbE5JbrAv8IF5rsdz5WGdWbjCOphLidrdhKPC9\n1LEN8BzJzrn69iAhpX3SjJpftbmpMxQfXJk7+B6aBt/T8uBX02vVpa2ApQhpggdKGm9m/yq1ckce\nkD8iJHB4ChgqacfEoFyqdPmlgDMI2Z1aiIPxqoS/dVcRlrE/JyxJXw90AXIDcjo8lFFSHO4dgRVL\n7rxzztWHX8dX0ut5jmWrqakT1saArD32Y5E99mtxrHnMaJq326ZQlWmEW5PpfT9dgY+LXStGlAR4\nQ1I3YABQ8oDcoe8hm9lkQoKKboR7xEsC7xEGxC1z5SQtQsiL/EYbTY4E1jGz99OveL4HIDP7k5m9\nbGbjgf/J+GM555wrQdPczsz9rvxXsSVrM/uOsHraK3cs5rLvRWLltQSdgcXK+TwdeYYMgJl9KGkb\nwkz5MWAnwj3dv0n6ApgMnAIsTkhqkZNvxno28FC8DzAYaCYsY29gZqcTNoctKuk4wkx5K8LGL+ec\nc43jYuBmSSOY/9jTEsDNAJLOB1Yysz7x/R+BD4Bxsf42hI29l5Zz0Q4/IAOY2ZQ4KD9JuBHbmzDg\nDgK+D7wK7GBmXyWr5WnncUm7EpatTyHMtMcRlqQxs9cknRjPnQc8Q7ifXNGzz8455ypnzZ2xpgqG\nsTaeQzazuyQtT5ikdQVGA73N7LNYpBuwcqJKJ8JTO6sBcwkrtSebWVmZTzrkgGxmrVLEmNlUwkar\nnBPiK1/9pymQOcrMngCeKHLtywjPPSfdljh/C+HRqWSdBwpdzznnXIXmdmpzU1fBem0ws6uBqwuc\nOyz1/krgyvI70lKHHJCdc845SthlXbBeHfIB2TnnXMfUJJhbwgMs+erVIR+Q68qytN5pX4bRN2fU\nj90yaGNYBm1kkWQ9q4il6RgBHVlWyeurf8BgjRZxeipzGb+rug2A4
zkzg1YWzaCNZTNoI6u/r7e0\nXaSgqRn1oYgmwh3bSurVoQ792JNzzjnXKHyG7JxzrmPyGXLjizkwj0u8b5ZUUsiZcso655yrwtwq\nXnWoIQdkSTfFgbFJ0hxJ70o6XVKln7cbMCTLPjrnnKvSXEK0iHJfdTogN/KS9RDgUEK2hp0Iz5PN\nIWRuKouZfZppz5xzzlWvmcqWn5uz7kg2GnKGHM0xs8/MbHKMljKMkK4RSXtLel3S7Lg8fWKxhpLL\n0JIWlXSlpCmSvon1/5yq8iNJ90r6WtI7krLYtuyccy4pdw+53JffQ253swlpFDchZN+4nZDb8Ezg\nHEmHlNjO8cCuhExT3YGDgImpMmcAdxLyJj4K3CZpmWo/gHPOucbVyEvW88Qcxr2By4ETgWFmdl48\nPV7S+sDJlBaTemXgXTPLZf2YnKfMTWZ2V7z2X4HjgM2Axyv/FM4551qodIOW30Ne4HaTNIPwpL4I\n8aYHAM8C96fKPgccL0lm1irpRMrNwBOS3iYksng4xr9OGpv7wcxmSZoOrNB2lwcTklIlbUrIHOmc\nc/VqLCH/cdLsfAWz1WCPPTXygPxv4GjCnropZtYMENJaVs7MRklajbBRbHvgLknDzGzfRLHv0tUo\n6fbAPsAqVfXPOecWvA3jK2kqUFayo/L5gNxhfG1mE/IcfwvYMnVsK+CdEmbHAJjZTOBu4G5JFMSx\n+gAAG2VJREFU9wBDJS1jZl9W1WPnnHOl8wG5w7sIeFlSf8Lmrp5AX8Jsuk2S+hG++o0izHz3A6b6\nYOyccwuYD8gdW1xy3o+QeLo/YXDtb2a3JoulqyV+ngGcAqxF+GN9Bdi5SN1Cx5xzzrl5GnJATieP\nznP+PuC+IufXSL3vnPj5euD6InVbJdo0syzStzjnnEvKReqqpF4dasgB2Tnn3EKgicqWn33J2jnn\nnMuQ30N2zjnn6oAPyK5mOu8L2qTy+nMHZNSRTzJo4wcZtPFNBm1kdfs+i9/JzzNo46UM2vhlBm1A\nNn2p/s/4eM7MoB9gp5xVdRu6IIu+ZPF3LZvfCVT/O3Gl8wHZOedcx+QzZOecc64ONFgs66qyPUm6\nKaYmbJI0R9K7kk6XVHG7klaNbf6kmr4555xrcA2WfjGLGfIQ4FDge4T4zlcDc4ALym1IUi4RhAfS\ncM45V1yDLVlnkQ95jpl9ZmaTzew6YBiwO4CkvSW9Lmm2pAmSTkxWjMf6S7pF0peESOTvx9Oj40z5\n37Hsk5IuTtW/T9KNiffdJD0iaZak8ZL2i9c4Lp5vNfuWtHQ8tnXi2AaSHpU0Q9LHkgZJWi5xfh9J\nr8XrTJP0uKTFE+ePlPSmpG/if/9Q9W/ZOedcS7nAIOW+GnHJuoDZQBdJmxBiRd8ObEDY9neOpENS\n5U8CRgMbE8JZbkaYJW8HdAP2KuPat8Y6WxNSJ/0B+FGqTNHZt6SlgeHACGATQh7lFYBcfuNu8TNd\nD6wLbAPcG/uMpIMIaR7/Es//FThb0sFlfA7nnHPtSFLfOKH7RtKLkgrmwZW0Z5yYfSrpK0nPS9qh\n3GtmuqlL0vaEAexy4ERgmJmdF0+Pl7Q+cDIwKFFtuJldkmijOf74uZl9Wsa11wV6AT3MbFQ8diTw\nbrpoG00dA4w0s9MTbR8JfCBpLeD7QGfgPjObHIu8kag/ADjJzB6I7yfFz3004QuDc865LNQoUpek\n/QmJiH4HvAz0Ax6T1N3MpuWpsjXwOGEi9iVwOPCQpM3MbEyp3cpiQN5N0gwgd//3NsKg9Cxwf6rs\nc8DxkpRIdTgigz4AdAe+yw3GAGb2nqQvymxnI2C7+JmSDFgTeIKQa/l1SY8R/hAGm9mXkpaIZW6Q\nlIx33Znwh+Sccy4rtbuH3A+41swGAUg6GtiFMNC22h9lZv1Sh06TtDuwG7BAB+R/E2Z/3wFTzKwZ\nQGprIjrP1yWWa6b17HbRUi+Sa
INUO+k2lgIeJGR0Sl9vavx8v5K0BbADcCxwrqTNmB/l4EjCt6qk\ntv8KNPUDLd3ymA6ETge2WdU559rPWOD11LHZtb9sDQbkuLm4B5Bb3cXMTNIwYItSmlcYAL8PfF5O\nt7IYkL82swl5jr8FbJk6thXwTmJ2nM+38b/prEmfASvm3sRHqzYgfCEAeBtYRNLGiSXrtYAfptog\ntpP71rIxLe8rjyTct56U+3KRj5m9ALwg6RxgErCnmV0qaQqwppndWeQz5tf5kuoidTnnXLvYML6S\nphL26dZQbWbIyxPGn3TItE+AdUq8wsnAksS9R6WqZWCQi4CXJfUnbO7qCfQlzKaL+ZQw09xR0kfA\nbDObThh4L5K0M/Ae4R71MrlKZva2pOHAP+Ku5rnAhcAs4oBrZrMlvQicKmki0BU4J3X9qwgz3Dsl\nXUD4hrM2sD9wBPAzwr3qx2NfNyf8Ab4Z658JXCZpOjAUWAzYFFjGzC4t4ffmnHOuFKWkX3zrDhh3\nR8tjc76qVY+Q9BvgdODXBe43F1SzAdnMRknaj7Bzuj/h61J/M0tubGo1UzazJknHAmfEuv8h7Li+\nEfgJcAvhj+ES5s+Ocw4GbgCeBj4m7HBen5ZrJ4cTdki/SphVn0IYXHPXnyppS2Ag8BhhQJ0EDI3L\nFtMJN/CPJwRsngScaGaPx/o3SPo6tnsBYUl+LOCDsXPOLWjrHRheSZ+MhFt7FKoxjTCH7po63pUw\nrhQk6QDCssA+ZvZkuV2takA2s8PaOH8fcF+R82sUOH4jYQBOHptL2AF9TJH2PgF2zb2X9P8IjyyN\nT5QZR1g6T2qxPG5m7xEem8p3jXGEACgFxeXq8pesnXPOla4Gu6zN7DtJIwgroQ/CvHvCvQhPEOUl\n6UDCZG9/MxtaQa8aK5a1pG0Jm7LGAisRZqjvA8+0Z7+cc87VQO12WV8M3BwH5txjT0sANwNIOh9Y\nycz6xPe/ieeOA16RlJtdfxNvuZakoQZkwo7p84DVgRmEx6wONLM6DZTmnHOuYjUakM3sLknLE26b\ndiUEr+ptZrmNwd2AlRNVjiKstF4VXzm3EG6TlqShBuR4Hze91c8551wjKmVTV6F6bTCzqwm5GfKd\nOyz1ftsKetFKQw3IHV7T47QOLNYe1sqgjfFtF+lQls2gjSz+bH+QQRuvZdAGwP9k0MakDNo4LYM2\nQBdU/7u1M06qvh9nP9B2oTad13aRkpQb6iFpAQwvNYrU1V5qEcvaOeecc2XyGbJzzrmOydMvLjzy\npWt0zjlXJ3IDcrkvH5ArI+mmOCg2SfpW0vuSBkpabAFc/gPCbrp0kFbnnHPtrcHyIXeUJeshwKFA\nF0LQ70GERBF/qeVFY8ztklNAOuecW4CaqWy2WzBLQfuq+xlyNMfMPjOzj8zsQUIKxF8BSPplnEHP\n2yIpaaN4bJX4fhVJD0r6XNJMSWMl7RjPLSPptphYepaktyXlHvZusWQtqZOk6+MsfZakcZKOS3Y0\nzujvk3SSpCmSpkm6UlI6WYZzzjk3T0eZIc8jaQNCFqmJ8ZCRJyZ26tjVhM+6FSHZxI+BmfHc/wLr\nAr2B/xKe+Vm8QDudgMnA3oSkEz2B6yRNMbPBiXLbAlOAX8b27gJGEeJsO+ecy0LunnAl9epQRxmQ\nd5M0g9DfxQiLFH8so/7KwGAzy2Vkmpg6NyqXspFw3zhpXk7kGE/7rMS5SZJ6AvsByQH5c+CYuOT9\njqRHCHFQfUB2zrmsNNgu644yIP+bkLZxKUJM0blmdn8Z9S8HrpHUGxgG3GNmY+O5a4B7JPUgZH26\nP+Y6zktSX+AwYBXCTLoLYfab9EYq5/NUQu7mNgwihEtN6knrtNLOOVdPxhBSCCTNzlcwWzWM1NUe\nOsqA/LWZTQCQdAQwRtJhZnYT82/PK1G+RXiZmBJxKLALsAMhH/JJZnaVmQ2N95p3JtyXHi7pSjM
7\nJd2JmFrrb4QvBS8S4mWfAmyWKpr+K2KUdL/+EEIYbuec60g2iq+kKRSIPJkd39TVvuLM8zzg3Pjo\n02eEwXjFRLGN89T7yMyuM7N9CJk8jkqc+6+Z3WpmhwAnAL8rcPmewHNmdq2ZjTGz94E1M/lgzjnn\nyuPPIdeFuwm/0r6EoMmTgQGS1pK0C3BisrCkSyTtIGk1SZsQNl29Gc+dJenXktaUtD4hn/Kb5Pcu\nsGlsa21JZwM/q8kndM45t1DpkANyTKd4JWG5eFHgQMJO6THAybSONt85ln8TeBQYRxjMAb4lzLjH\nAE8Rvj8dmLxc4udrgXuBOwlL1svSMtWWc865BaWS2XGlO7MXgLq/h5xOc5U4PhAYGN8+D/w0VaRz\nouxxFGBm5wLnFjg3KdXOt8AR8ZV0WqJMq/6aWb9C13fOOVch39TlnHPO1YEG29TlA7JzzrmOyZ9D\ndvVrrYzaGZdBG9/PoI0skmxVsp6VTxbPgr+VQRsfZdBGVv/sx2fQxqJtF2nTgxm0ATC96hZ09mVV\nt2H77l59P+4+s+o2qrcA1oUbLFJXh9zU5ZxzzjUanyE755zrmBpsU5fPkEskaUI6s1MWZZ1zzlUo\nt6mr3Fedbuqq6wFZ0vKSrpE0SdJsSVMlDZG0RUbttxo4JfWR9EWe4psC12VxXeeccxlosEhd9b5k\nfS+hjwcDE4CuhKxJy9XwmiJPOkcz+28Nr+mcc65cDbbLum5nyJKWJuQv/rOZPWNmk83sVTMbaGYP\n58pIulbSx5K+kfSapJ0Tbewt6fU4u54g6cTEuSeBVYFLJDVLapK0DXAjsHTi2BmxfIvZtKQBiZn7\nh5IuTX2EJSXdIGl6LHcUzjnnspO7h1zuy+8hl21mfO0hqUv6pCQBQ4EtgN8A6xHCZjbF8z2AfwG3\nE1IfngmcI+mQ2MRewIfA6UA3QnKK5wjJJaYTZuMrAhfmufY+sdxRhGeN9qB17rETgVcIEcSuJqR/\nXLv8X4NzzrmFQd0uWZtZk6Q+wD+AP0gaCTwN3BlzGf+KcF93XTN7L1abmGiiHzDMzM6L78fH5BEn\nA4PM7AtJTcBMM/s0V0nSV+Hy9lmR7q1MyHE8PMbV/hB4NVXmETP7e/x5oKR+hKQW75bxa3DOOVdI\nEy0T75ZTrw7V7YAMYGb3SXoE+AWwObATcHJc/l0B+DAxGKetB9yfOvYccLwkxTSOlbqbMEOeEPMs\nPwo8FAfnnPSM+ePY5yIGAUukjvUkm6AUzjlXK2OB11PHZtf+spUOrD4gVyYmdBgeX+dK+gdwFnmW\nkhdgnz6U1B3YnjBTv4rwRWHrxKCcfjrOaPMWwSHA6tl21jnnam7D+EqaSs0fTGkizxbcEpTw2JOk\nvsCfCLc0xwDHmtkrBcp2Ay4irNquBVxmZifmK1tMPd9DLuQtwjRyDLCypELxIt+i9dRyK+CdxOz4\nWxLZnIoca8XM5pjZI2Z2AmEpegta/410zjlXKzXa1CVpf8IAeyawMWG8eUzS8gWqLAZ8CpwDjK70\n49TtgCxpWUnDJR0kaUNJq0nal3AP+H4z+w/wDHCPpO3j+R0l9Y5NXAT0ktRf0trxfnRf4G+Jy0wE\ntpa0kqTlEseWkrSdpOUkLZ6nb30kHS5pfUmrEx7LmgVMqsGvwjnn3ILVD7jWzAaZ2TjgaML/4w/P\nV9jMJplZPzP7J1UERa/bAZmww/pFwr3apwk3Kc4CrgWOjWX2Iuxkvh14g5AfuROAmY0C9gP2j3UH\nAP3N7NbENc4AVgPeI3y7wcxeAP5O2KH9KeELALRcGPmSsMP6WcI3p+2AXc3sizxlKXLMOedcpSqJ\n0pV7FSBpUaAH4TYpEHb5AsMIK6E1U7f3kOO949Piq1CZL4Eji5y/D7ivyPmXCMsR6eN9CbPp5LE1\nEj8/ADxQpN018hzbpFB555xzFcp+qrM84bblJ6njnwDrZH6
1hLodkJ1zzrnq3RFfSV+1R0fa5AOy\nc865BnZgfCWNJKxK5zWNsKjdNXW8K+Hx1ZrxAbmurEi4pV2pp7LpBntm0MZDGbTxVgZtrJpBGwAv\nZ9DGshm08U31Tezz5+rbABg8IoNGns6gjYxsMKD6Nl4/t+omdPdlVbexgWVzq/N1PV5F7Y45vJjZ\nd5JGEPImPAjzIkP2Ai6v5bU75m/MOeecq52LgZvjwPwyYdf1EsDNAJLOB1Yysz65CpI2IsQNWwr4\nUXz/rZmVPLPwAdk551wHlXsQuZJ6hZnZXfGZ47MJS9Wjgd6JkMrdCCGUk0Yxf4vZJoQcC5OAVpt8\nC/EBuQQxM9SoUiKvlFPWOedcNXIJjiupV5yZXU1IDJTv3GF5jlX9GHFdD8jxG8o5wM6EbylfEL6p\nnB2fF15Q9qSyr2HOOedqpjYz5PZS1wMycC+hjwcDEwiDci9guWKVshafd3bOOVdXmqhscK3P7BJ1\nG6lL0tKE2NN/NrNnzGyymb1qZgPN7OFYplnS0ZIelTRL0nuS9k6183+S3pb0dTx/tqTOifNnShol\n6beSJkj6UtIdkpZMlHlS0sWJ93+U9I6kbyR9LOmuVPc7SRoo6b+Spko6sya/JOecW6jVKJh1O6nb\nAZkQOnMmsIekLkXKnU1Ih/gT4DbgTknJaCrTCWmU1gOOI0T26pdqY01gd8LS+C7ANsCp+S4maVPg\nMqA/0B3oTYipndQn9n0z4BTgDEm9inwG55xzC7m6HZBjGsM+8fWlpGclnSspnVHpLjO7yczGm9kZ\nwKvMj3WNmZ1nZi+Z2Qdm9ggh6cR+qTYE9DGzt8zsOeBWwtJ4PisTBttH4qx9jJldmSrzmpmdY2bv\nxdjZrxZpzznnXEUaa4Zc1/eQzew+SY8AvwA2B3YCTpF0hJkNisVeTFV7Adgo9yam0TqWMAteivCZ\n03HTJprZrMT7qcAKBbr1BGEr+wRJQ4GhwH1mlozY8FqqTrH2Ei6NXUzaIb6cc65ejSHk8EmavQCu\n21j3kOt6QIZ5SSaGx9e5kv5ByPo0qGhFQNIWwD+B04HHCQPxgUD6kaT0Nj2jwOqBmc2UtAnwS8JI\neRYwQNKmZpZLu1Vyey2dAKzbdjHnnKsrG5GYB0VTKPDUUIYaa5d13S5ZF/EWsGTi/eap85szP+bi\nFoTZ7/+Z2Ugze4/qYlMCYGbNZvZvMzuV8LdwNUIKRueccwtMboZc7stnyGWRtCxhs9aNhCXgGcDP\nCPmJ708U3TeGN3sW+G0sk3to+11glbhs/QqwK7BHlf3ahRB55RnCc9G7EO5Bj6umXeecc+VqrBly\n3Q7IhI1TLxLWcdcEFgUmA9cC5yfKnQkcAFxFuFd7gJm9DWBmD0m6BLgCWAx4hLAre0CZfUlm3PwS\n2Cte93uEQf8AMxuXp6xzzjlXkrodkOO949Piq5gpZta7SDun0voRpssT588i3AdO1rmM8GhT7v12\niZ+fA7Ytcr1WS9dmlkX6JOeccy3ULnRme6jbAdk555wrzpes64kvDzvn3ELLH3uqG2bWue1Szjnn\nGpPPkF3NPE/IoVGh5Qdk041pGbVTtW/aLtKmdLCC9vRJe3cgGDygvXuQ8IcM2rgmgzaA10dm007V\nPq+6hdf1SAb9gFtbbq8py0RCAAhXOh+QnXPOdVCNtWSdaWAQSdtIapL0gyzbdc4551prrFjWJQ/I\nkh6UNKTAuV9Iagb+C6yYCCFZSrsTJB1XannnnHMuWHgjdd0ADJa0kplNSZ07DHjFzF7PrmvOOedc\nMY21qaucJeuHgWnAocmDkpYE9gGuj0vWzckla0lbSXpG0ixJkyRdJmnxeO5JYFXgklivKR4/VNIX\nknaQ9KakGZKGSOqaaHdTSY9L+kzSl5KekrRxqm/Nkn4n6SFJX8e2Npe0pqQnJc2U9Jyk1VP1dpc0\nQtI3ksZLOkNS58T5AfG
zzJb0oaRLE+e6SLowHp8p6QVJ25Txe3bOOVeShXTJOuYnHkRqQCbkFu4E\n3JkrmjshaU1gCCEm9QbA/sCWQC5/8F7Ah4TNeN2AFRNtLAGcBBxESL+4CnBh4rrfB24GegI/B94B\nHo1fEJL6x3IbEZJO3A78HTgX6EGIQz0vn7GkXwC3AJcQUi/9npCT+a/x/D6EcJ5HAWsRYmMnt/Je\nFfuzH7Bh/OxD4u/COeecy6vcTV03AmtJ2jpx7FBgsJnNyFP+VOCfZnaFmb1vZrnY1H0kdTGzLwiL\n+TPN7FMz+zRRdxHg92Y2ysxGEwbNXrmTZvakmd1uZu/G2NVHEwbx9Gz0RjO7x8zGAxcQMjP908yG\nxXqXEVIp5pwBnG9m/zSzSWY2PB47Op5fmRAze7iZfWhmr5rZDQCSVo6/j33N7Hkzm2BmFwPPMT/h\nhXPOuUxUcv+40nCbtVfWY09m9rak54HDgWckrUWYvfYvUGUjYENJv00cU/zv6sDbRS43y8wmJt5P\nBVaY14i0AmGWu0083hlYnDCTTkrOXnMPgr6eOvY9SUuZ2czY556Skp+pM9BF0vcIM94TgAmShgKP\nAg/FFYQNY9l3JClRvwthub8NQwn5KpI2iM0651x9eiG+kmYtkCs31j3kSp5DvgG4XFJfwqxvvJn9\np0DZpQjZmS5j/kCc80Eb10n/li3VxiDgh8Cxsa05hOxQXYq0Y0WO5VYLliLMiO9Nd8jMZgMfSuoO\nbA/8ipCB+0/xPvFShD/pTYDmVPWZrT5hKzsyf9XeOec6hi3iK2kiCyIwSGM9h1zJgHwXcCnh3u7B\nhHumhYwEfmxmxcJPfUuYVZarJ/AHM3sM5i0XL19CvbbiX48E1jGz9ws2YDaHkMrxEUlXE3IhbwiM\nInyWrjErlHPOuZpprBly2YFBzOxrwqB8PmEj1i2pIslZ7EDC8u8VkjaStFbcwXxFosxEYGtJK0la\nroyuvAscLGldST8H/klpqyTpmXr62NnAIXFn9Y9j+/tLOgdAUh9Jh0taP+7OPjhed5KZvUvYNDZI\n0p6SVpO0maRTJe1UxmcrIIMwkHPuqL6NTMJR1ksbWbXTSG1k1U4WbdyXQRv18lmyaqc+2kgvUbeP\nxnoOudJIXTcAywBDzezj1Ll5M1AzG0u4x7s28Axh9jkA+ChR/gzCRqv3gOSmrrYcTliyHkH4UnBZ\nnvr5ZsNFj5nZ48CuhOXolwl/704gfHEA+JKww/pZYAywHbBr3KAGYVPXIMKO8HGEpe9NaXuJvgQZ\nPOadyYCcxePm9dJGVu00UhtZtZNFG1kMyPXyWbJqpz7aqI8BuXYk9Y2Bq76R9KKkn7VR/pfxcdnZ\nkt6R1Kfca1YUyzrulm61zGxmT6ePm9kIws3RQm29BGycOnYLqZm3mT2QbNvMxhAeL0q6N1Un3ZdJ\nefqXr89PAE8U6O8DwANFPk8TcFZ8Oeecq5naLFlL2h+4CPgdYWLWD3hMUncza7VBV9JqhFgdVwO/\nIewxul7SlDielCTTWNbOOefcglOzJet+wLVmNsjMxhEee51FWJnN5w/A+2Z2ipm9bWZXAYNjOyXz\nAdk551wHlX2kLkmLEoJGDc8dMzMDhtF6M3nO5vF80mNFyufl6RfrQ3z4uK1HlWcTHscuYG4J+Vyb\nvyqhXJFrlNKPktRLG/XUl3ppY0H25bU2zk8voUxH+ftaT30prY2JRc7NauN8IuFBOrhChj6msh3T\nRf9fuzzhNmY6gfknwDoF6nQrUP4HkhaLT+a0SWHgd+1J0m+A29q7H845VwMHmdntWTYoaRVCKOQl\nqmhmDtDdzFpsuJW0ImHj8RZxj1Pu+EBgazNrNeuV9DYhKuTAxLGdCPeVlyh1QPYZcn14jPBc90TC\nV1fnnOvovkd4guaxrBs2sw8krUdpsScKmZYejHPHCTeZu6aOdyVMyfP5uED56aUOxuADc
l0ws/8S\nnl92zrlG8nytGo6DaQaPk7Zq9ztJIwi5Ex4EiKGQewGXF6j2ApCONbEDZT4d5pu6nHPOuZYuBo6S\ndIikdQkZApcgZA5E0vmSko/m/h1YQ9JASetI+iMhLfHF5VzUZ8jOOedcgpndJWl5QuTGrsBooLeZ\nfRaLdCNk/suVnyhpF0La3uMIaYWPMLP0zuuifFOXc845Vwd8ydo555yrAz4gO+ecc3XAB2TnnHOu\nDviA7JxzztUBH5Cdc865OuADsnPOOVcHfEB2zjnn6oAPyM4551wd8AHZOeecqwM+IDvnnHN1wAdk\n55xzrg78fziXG3mmK1+9AAAAAElFTkSuQmCC\n", "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "# Keep track of correct guesses in a confusion matrix\n", "confusion = torch.zeros(n_categories, n_categories)\n", "n_confusion = 10000\n", "\n", "# Just return an output given a line\n", "def evaluate(line_tensor):\n", " hidden = rnn.init_hidden()\n", " \n", " for i in range(line_tensor.size()[0]):\n", " output, hidden = rnn(line_tensor[i], hidden)\n", " \n", " return output\n", "\n", "# Go through a bunch of examples and record which are correctly guessed\n", "for i in range(n_confusion):\n", " category, line, category_tensor, line_tensor = random_training_pair()\n", " output = evaluate(line_tensor)\n", " guess, guess_i = category_from_output(output)\n", " category_i = all_categories.index(category)\n", " confusion[category_i][guess_i] += 1\n", "\n", "# Normalize by dividing every row by its sum\n", "for i in range(n_categories):\n", " confusion[i] = confusion[i] / confusion[i].sum()\n", "\n", "# Set up plot\n", "fig = plt.figure()\n", "ax = fig.add_subplot(111)\n", "cax = ax.matshow(confusion.numpy())\n", "fig.colorbar(cax)\n", "\n", "# Set up axes\n", "ax.set_xticklabels([''] + all_categories, rotation=90)\n", "ax.set_yticklabels([''] + all_categories)\n", "\n", "# Force label at every tick\n", "ax.xaxis.set_major_locator(ticker.MultipleLocator(1))\n", "ax.yaxis.set_major_locator(ticker.MultipleLocator(1))\n", "\n", "plt.show()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "You can pick out bright spots off the main axis that show which languages it guesses incorrectly, e.g. 
Chinese for Korean, and Spanish for Italian. It seems to do very well with Greek, and very poorly with English (perhaps because of overlap with other languages)." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Running on User Input" ] }, { "cell_type": "code", "execution_count": 20, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "\n", "> Dovesky\n", "(-0.87) Czech\n", "(-0.88) Russian\n", "(-2.44) Polish\n", "\n", "> Jackson\n", "(-0.74) Scottish\n", "(-2.03) English\n", "(-2.21) Polish\n", "\n", "> Satoshi\n", "(-0.77) Arabic\n", "(-1.35) Japanese\n", "(-1.81) Polish\n" ] } ], "source": [ "def predict(input_line, n_predictions=3):\n", " print('\\n> %s' % input_line)\n", " output = evaluate(Variable(line_to_tensor(input_line)))\n", "\n", " # Get top N categories\n", " topv, topi = output.data.topk(n_predictions, 1, True)\n", " predictions = []\n", "\n", " for i in range(n_predictions):\n", " value = topv[0][i]\n", " category_index = topi[0][i]\n", " print('(%.2f) %s' % (value, all_categories[category_index]))\n", " predictions.append([value, all_categories[category_index]])\n", "\n", "predict('Dovesky')\n", "predict('Jackson')\n", "predict('Satoshi')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The final versions of the scripts [in the Practical PyTorch repo](https://github.com/spro/practical-pytorch/tree/master/char-rnn-classification) split the above code into a few files:\n", "\n", "* `data.py` (loads files)\n", "* `model.py` (defines the RNN)\n", "* `train.py` (runs training)\n", "* `predict.py` (runs `predict()` with command line arguments)\n", "* `server.py` (serve prediction as a JSON API with bottle.py)\n", "\n", "Run `train.py` to train and save the network.\n", "\n", "Run `predict.py` with a name to view predictions: \n", "\n", "```\n", "$ python predict.py Hazaki\n", "(-0.42) Japanese\n", "(-1.39) Polish\n", "(-3.51) Czech\n", "```\n", "\n", "Run `server.py` and 
visit http://localhost:5533/Yourname to get JSON output of predictions." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Exercises\n", "\n", "* Try with a different dataset of line -> category, for example:\n", " * Any word -> language\n", " * First name -> gender\n", " * Character name -> writer\n", " * Page title -> blog or subreddit\n", "* Get better results with a bigger and/or better shaped network\n", " * Add more linear layers\n", " * Try the `nn.LSTM` and `nn.GRU` layers\n", " * Combine multiple of these RNNs as a higher level network" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Next**: [Generating Shakespeare with a Character-Level RNN](https://github.com/spro/practical-pytorch/blob/master/char-rnn-generation/char-rnn-generation.ipynb)" ] } ], "metadata": { "anaconda-cloud": {}, "celltoolbar": "Raw Cell Format", "kernelspec": { "display_name": "Python [conda root]", "language": "python", "name": "conda-root-py" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.5.2" } }, "nbformat": 4, "nbformat_minor": 1 } ================================================ FILE: char-rnn-classification/data.py ================================================ import torch import glob import unicodedata import string all_letters = string.ascii_letters + " .,;'-" n_letters = len(all_letters) def findFiles(path): return glob.glob(path) # Turn a Unicode string to plain ASCII, thanks to http://stackoverflow.com/a/518232/2809427 def unicodeToAscii(s): return ''.join( c for c in unicodedata.normalize('NFD', s) if unicodedata.category(c) != 'Mn' and c in all_letters ) # Read a file and split into lines def readLines(filename): lines = open(filename).read().strip().split('\n') return [unicodeToAscii(line) for line in lines] # Build the category_lines dictionary, a list of lines 
per category category_lines = {} all_categories = [] for filename in findFiles('../data/names/*.txt'): category = filename.split('/')[-1].split('.')[0] all_categories.append(category) lines = readLines(filename) category_lines[category] = lines n_categories = len(all_categories) # Find letter index from all_letters, e.g. "a" = 0 def letterToIndex(letter): return all_letters.find(letter) # Turn a line into a <line_length x 1 x n_letters>, # or an array of one-hot letter vectors def lineToTensor(line): tensor = torch.zeros(len(line), 1, n_letters) for li, letter in enumerate(line): tensor[li][0][letterToIndex(letter)] = 1 return tensor ================================================ FILE: char-rnn-classification/model.py ================================================ import torch import torch.nn as nn from torch.autograd import Variable class RNN(nn.Module): def __init__(self, input_size, hidden_size, output_size): super(RNN, self).__init__() self.hidden_size = hidden_size self.i2h = nn.Linear(input_size + hidden_size, hidden_size) self.i2o = nn.Linear(input_size + hidden_size, output_size) self.softmax = nn.LogSoftmax() def forward(self, input, hidden): combined = torch.cat((input, hidden), 1) hidden = self.i2h(combined) output = self.i2o(combined) output = self.softmax(output) return output, hidden def initHidden(self): return Variable(torch.zeros(1, self.hidden_size)) ================================================ FILE: char-rnn-classification/predict.py ================================================ from model import * from data import * import sys rnn = torch.load('char-rnn-classification.pt') # Just return an output given a line def evaluate(line_tensor): hidden = rnn.initHidden() for i in range(line_tensor.size()[0]): output, hidden = rnn(line_tensor[i], hidden) return output def predict(line, n_predictions=3): output = evaluate(Variable(lineToTensor(line))) # Get top N categories topv, topi = output.data.topk(n_predictions, 1, True) predictions = [] for i in range(n_predictions): 
value = topv[0][i] category_index = topi[0][i] print('(%.2f) %s' % (value, all_categories[category_index])) predictions.append([value, all_categories[category_index]]) return predictions if __name__ == '__main__': predict(sys.argv[1]) ================================================ FILE: char-rnn-classification/server.py ================================================ from bottle import route, run from predict import * @route('/<input_line>') def index(input_line): return {'result': predict(input_line, 10)} run(host='localhost', port=5533) ================================================ FILE: char-rnn-classification/train.py ================================================ import torch from data import * from model import * import random import time import math n_hidden = 128 n_epochs = 100000 print_every = 5000 plot_every = 1000 learning_rate = 0.005 # If you set this too high, it might explode. If too low, it might not learn def categoryFromOutput(output): top_n, top_i = output.data.topk(1) # Tensor out of Variable with .data category_i = top_i[0][0] return all_categories[category_i], category_i def randomChoice(l): return l[random.randint(0, len(l) - 1)] def randomTrainingPair(): category = randomChoice(all_categories) line = randomChoice(category_lines[category]) category_tensor = Variable(torch.LongTensor([all_categories.index(category)])) line_tensor = Variable(lineToTensor(line)) return category, line, category_tensor, line_tensor rnn = RNN(n_letters, n_hidden, n_categories) optimizer = torch.optim.SGD(rnn.parameters(), lr=learning_rate) criterion = nn.NLLLoss() def train(category_tensor, line_tensor): hidden = rnn.initHidden() optimizer.zero_grad() for i in range(line_tensor.size()[0]): output, hidden = rnn(line_tensor[i], hidden) loss = criterion(output, category_tensor) loss.backward() optimizer.step() return output, loss.data[0] # Keep track of losses for plotting current_loss = 0 all_losses = [] def timeSince(since): now = time.time() s = now - since m =
math.floor(s / 60) s -= m * 60 return '%dm %ds' % (m, s) start = time.time() for epoch in range(1, n_epochs + 1): category, line, category_tensor, line_tensor = randomTrainingPair() output, loss = train(category_tensor, line_tensor) current_loss += loss # Print epoch number, loss, name and guess if epoch % print_every == 0: guess, guess_i = categoryFromOutput(output) correct = '✓' if guess == category else '✗ (%s)' % category print('%d %d%% (%s) %.4f %s / %s %s' % (epoch, epoch / n_epochs * 100, timeSince(start), loss, line, guess, correct)) # Add current loss avg to list of losses if epoch % plot_every == 0: all_losses.append(current_loss / plot_every) current_loss = 0 torch.save(rnn, 'char-rnn-classification.pt') ================================================ FILE: char-rnn-generation/README.md ================================================ # Practical PyTorch: Generating Shakespeare with a Character-Level RNN ## Dataset Download [this Shakespeare dataset](https://raw.githubusercontent.com/karpathy/char-rnn/master/data/tinyshakespeare/input.txt) (from [Andrej Karpathy's char-rnn](https://github.com/karpathy/char-rnn)) and save as `shakespeare.txt` ## Jupyter Notebook The [Jupyter Notebook version of the tutorial](https://github.com/spro/practical-pytorch/blob/master/char-rnn-generation/char-rnn-generation.ipynb) describes the model and steps in detail. ## Python scripts Run `train.py` with a filename to train and save the network: ``` > python train.py shakespeare.txt Training for 2000 epochs... (10 minutes later) Saved as shakespeare.pt ``` After training the model will be saved as `[filename].pt` — now run `generate.py` with that filename to generate some new text: ``` > python generate.py shakespeare.pt --prime_str "Where" Where, you, and if to our with his drid's Weasteria nobrand this by then. 
AUTENES: It his zersit at he ``` ### Training options ``` Usage: train.py [filename] [options] Options: --n_epochs Number of epochs to train --print_every Log learning rate at this interval --hidden_size Hidden size of GRU --n_layers Number of GRU layers --learning_rate Learning rate --chunk_len Length of chunks to train on at a time ``` ### Generation options ``` Usage: generate.py [filename] [options] Options: -p, --prime_str String to prime generation with -l, --predict_len Length of prediction -t, --temperature Temperature (higher is more chaotic) ``` ================================================ FILE: char-rnn-generation/char-rnn-generation.ipynb ================================================ { "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "![](https://i.imgur.com/eBRPvWB.png)\n", "\n", "# Practical PyTorch: Generating Shakespeare with a Character-Level RNN\n", "\n", "[In the RNN classification tutorial](https://github.com/spro/practical-pytorch/blob/master/char-rnn-classification/char-rnn-classification.ipynb) we used a RNN to classify text one character at a time. This time we'll generate text one character at a time.\n", "\n", "```\n", "> python generate.py -n 500\n", "\n", "PAOLTREDN:\n", "Let, yil exter shis owrach we so sain, fleas,\n", "Be wast the shall deas, puty sonse my sheete.\n", "\n", "BAUFIO:\n", "Sirh carrow out with the knonuot my comest sifard queences\n", "O all a man unterd.\n", "\n", "PROMENSJO:\n", "Ay, I to Heron, I sack, againous; bepear, Butch,\n", "An as shalp will of that seal think.\n", "\n", "NUKINUS:\n", "And house it to thee word off hee:\n", "And thou charrota the son hange of that shall denthand\n", "For the say hor you are of I folles muth me?\n", "```\n", "\n", "This one might make you question the series title — \"is that really practical?\" However, these sorts of generative models form the basis of machine translation, image captioning, question answering and more. 
See the [Sequence to Sequence Translation tutorial](https://github.com/spro/practical-pytorch/blob/master/seq2seq-translation/seq2seq-translation.ipynb) for more on that topic." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Recommended Reading\n", "\n", "I assume you have at least installed PyTorch, know Python, and understand Tensors:\n", "\n", "* http://pytorch.org/ For installation instructions\n", "* [Deep Learning with PyTorch: A 60-minute Blitz](https://github.com/pytorch/tutorials/blob/master/Deep%20Learning%20with%20PyTorch.ipynb) to get started with PyTorch in general\n", "* [jcjohnson's PyTorch examples](https://github.com/jcjohnson/pytorch-examples) for an in-depth overview\n", "* [Introduction to PyTorch for former Torchies](https://github.com/pytorch/tutorials/blob/master/Introduction%20to%20PyTorch%20for%20former%20Torchies.ipynb) if you are a former Lua Torch user\n", "\n", "It would also be useful to know about RNNs and how they work:\n", "\n", "* [The Unreasonable Effectiveness of Recurrent Neural Networks](http://karpathy.github.io/2015/05/21/rnn-effectiveness/) shows a bunch of real-life examples\n", "* [Understanding LSTM Networks](http://colah.github.io/posts/2015-08-Understanding-LSTMs/) is about LSTMs specifically but also informative about RNNs in general\n", "\n", "Also see these related tutorials from the series:\n", "\n", "* [Classifying Names with a Character-Level RNN](https://github.com/spro/practical-pytorch/blob/master/char-rnn-classification/char-rnn-classification.ipynb) uses an RNN for classification\n", "* [Generating Names with a Conditional Character-Level RNN](https://github.com/spro/practical-pytorch/blob/master/conditional-char-rnn/conditional-char-rnn.ipynb) builds on this model to add a category as input" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Prepare data\n", "\n", "The file we are using is a plain text file.
We turn any potential unicode characters into plain ASCII by using the `unidecode` package (which you can install via `pip` or `conda`)." ] }, { "cell_type": "code", "execution_count": 1, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "file_len = 1115394\n" ] } ], "source": [ "import unidecode\n", "import string\n", "import random\n", "import re\n", "\n", "all_characters = string.printable\n", "n_characters = len(all_characters)\n", "\n", "file = unidecode.unidecode(open('../data/shakespeare.txt').read())\n", "file_len = len(file)\n", "print('file_len =', file_len)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "To make inputs out of this big string of data, we will be splitting it into chunks." ] }, { "cell_type": "code", "execution_count": 2, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " will continue that I broach'd in jest.\n", "I can, Petruchio, help thee to a wife\n", "With wealth enough and young and beauteous,\n", "Brought up as best becomes a gentlewoman:\n", "Her only fault, and that is faults en\n" ] } ], "source": [ "chunk_len = 200\n", "\n", "def random_chunk():\n", " start_index = random.randint(0, file_len - chunk_len)\n", " end_index = start_index + chunk_len + 1\n", " return file[start_index:end_index]\n", "\n", "print(random_chunk())" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Build the Model\n", "\n", "This model will take as input the character for step $t_{-1}$ and is expected to output the next character $t$. There are three layers - one linear layer that encodes the input character into an internal state, one GRU layer (which may itself have multiple layers) that operates on that internal state and a hidden state, and a decoder layer that outputs the probability distribution." 
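] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Before defining the class, the three layers can be sanity-checked on their own. This is a minimal sketch with assumed sizes (`n_chars = 100`, `hidden_size = 50`), using plain tensors rather than the `Variable` wrapper, just to confirm the shapes that flow from encoder to GRU to decoder:" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "import torch\n", "import torch.nn as nn\n", "\n", "n_chars, hidden_size, n_layers = 100, 50, 1  # assumed sizes, for illustration only\n", "encoder = nn.Embedding(n_chars, hidden_size)\n", "gru = nn.GRU(hidden_size, hidden_size, n_layers)\n", "decoder = nn.Linear(hidden_size, n_chars)\n", "\n", "inp = torch.zeros(1, dtype=torch.long)          # one character index\n", "hidden = torch.zeros(n_layers, 1, hidden_size)  # initial hidden state\n", "emb = encoder(inp.view(1, -1))                  # (1, 1, hidden_size)\n", "out, hidden = gru(emb.view(1, 1, -1), hidden)   # (1, 1, hidden_size)\n", "logits = decoder(out.view(1, -1))               # (1, n_chars)\n", "print(logits.shape)"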
] }, { "cell_type": "code", "execution_count": 3, "metadata": { "collapsed": false }, "outputs": [], "source": [ "import torch\n", "import torch.nn as nn\n", "from torch.autograd import Variable\n", "\n", "class RNN(nn.Module):\n", " def __init__(self, input_size, hidden_size, output_size, n_layers=1):\n", " super(RNN, self).__init__()\n", " self.input_size = input_size\n", " self.hidden_size = hidden_size\n", " self.output_size = output_size\n", " self.n_layers = n_layers\n", " \n", " self.encoder = nn.Embedding(input_size, hidden_size)\n", " self.gru = nn.GRU(hidden_size, hidden_size, n_layers)\n", " self.decoder = nn.Linear(hidden_size, output_size)\n", " \n", " def forward(self, input, hidden):\n", " input = self.encoder(input.view(1, -1))\n", " output, hidden = self.gru(input.view(1, 1, -1), hidden)\n", " output = self.decoder(output.view(1, -1))\n", " return output, hidden\n", "\n", " def init_hidden(self):\n", " return Variable(torch.zeros(self.n_layers, 1, self.hidden_size))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Inputs and Targets" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Each chunk will be turned into a tensor, specifically a `LongTensor` (used for integer values), by looping through the characters of the string and looking up the index of each character in `all_characters`." 
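] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The index lookup can be tried on its own, independent of the model (a minimal sketch). `string.printable` begins with the ten digits, so `'a'` lands at index 10:" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "import string\n", "import torch\n", "\n", "all_characters = string.printable\n", "indices = [all_characters.index(c) for c in 'abcDEF']\n", "tensor = torch.tensor(indices, dtype=torch.long)\n", "print(indices)  # [10, 11, 12, 39, 40, 41]"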
] }, { "cell_type": "code", "execution_count": 4, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Variable containing:\n", " 10\n", " 11\n", " 12\n", " 39\n", " 40\n", " 41\n", "[torch.LongTensor of size 6]\n", "\n" ] } ], "source": [ "# Turn string into list of longs\n", "def char_tensor(string):\n", " tensor = torch.zeros(len(string)).long()\n", " for c in range(len(string)):\n", " tensor[c] = all_characters.index(string[c])\n", " return Variable(tensor)\n", "\n", "print(char_tensor('abcDEF'))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Finally we can assemble a pair of input and target tensors for training, from a random chunk. The input will be all characters *up to the last*, and the target will be all characters *from the first*. So if our chunk is \"abc\" the input will correspond to \"ab\" while the target is \"bc\"." ] }, { "cell_type": "code", "execution_count": 5, "metadata": { "collapsed": true }, "outputs": [], "source": [ "def random_training_set(): \n", " chunk = random_chunk()\n", " inp = char_tensor(chunk[:-1])\n", " target = char_tensor(chunk[1:])\n", " return inp, target" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Evaluating\n", "\n", "To evaluate the network we will feed one character at a time, use the outputs of the network as a probability distribution for the next character, and repeat. To start generation we pass a priming string to start building up the hidden state, from which we then generate one character at a time." 
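] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The temperature-scaled sampling step can be sketched in isolation with made-up logits (an assumption for illustration only). Dividing by a temperature below 1 sharpens the distribution; above 1 flattens it:" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "import torch\n", "\n", "logits = torch.tensor([1.0, 2.0, 0.5, 0.1, 1.5])  # made-up outputs for a 5-character vocabulary\n", "temperature = 0.8\n", "output_dist = logits.div(temperature).exp()   # unnormalized multinomial weights\n", "top_i = torch.multinomial(output_dist, 1)[0]  # index of the sampled character\n", "print(int(top_i))"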
] }, { "cell_type": "code", "execution_count": 6, "metadata": { "collapsed": false }, "outputs": [], "source": [ "def evaluate(prime_str='A', predict_len=100, temperature=0.8):\n", " hidden = decoder.init_hidden()\n", " prime_input = char_tensor(prime_str)\n", " predicted = prime_str\n", "\n", " # Use priming string to \"build up\" hidden state\n", " for p in range(len(prime_str) - 1):\n", " _, hidden = decoder(prime_input[p], hidden)\n", " inp = prime_input[-1]\n", " \n", " for p in range(predict_len):\n", " output, hidden = decoder(inp, hidden)\n", " \n", " # Sample from the network as a multinomial distribution\n", " output_dist = output.data.view(-1).div(temperature).exp()\n", " top_i = torch.multinomial(output_dist, 1)[0]\n", " \n", " # Add predicted character to string and use as next input\n", " predicted_char = all_characters[top_i]\n", " predicted += predicted_char\n", " inp = char_tensor(predicted_char)\n", "\n", " return predicted" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Training" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "A helper to print the amount of time passed:" ] }, { "cell_type": "code", "execution_count": 7, "metadata": { "collapsed": true }, "outputs": [], "source": [ "import time, math\n", "\n", "def time_since(since):\n", " s = time.time() - since\n", " m = math.floor(s / 60)\n", " s -= m * 60\n", " return '%dm %ds' % (m, s)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The main training function" ] }, { "cell_type": "code", "execution_count": 8, "metadata": { "collapsed": true }, "outputs": [], "source": [ "def train(inp, target):\n", " hidden = decoder.init_hidden()\n", " decoder.zero_grad()\n", " loss = 0\n", "\n", " for c in range(chunk_len):\n", " output, hidden = decoder(inp[c], hidden)\n", " loss += criterion(output, target[c])\n", "\n", " loss.backward()\n", " decoder_optimizer.step()\n", "\n", " return loss.data[0] / chunk_len" ] }, { "cell_type": "markdown", "metadata": {}, 
"source": [ "Then we define the training parameters, instantiate the model, and start training:" ] }, { "cell_type": "code", "execution_count": 9, "metadata": { "collapsed": false, "scrolled": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "[0m 19s (100 5%) 2.1267]\n", "Wh! 'Lod at the to Cell I dy\n", "Whapesfe show dous that,\n", "But thes lo he ther, letrst surave and and cod a \n", "\n", "[0m 38s (200 10%) 1.9876]\n", "Whan the she ciching, doove whath that he gone prie hasigrow nice knotat by wiith haye! ha coll, and i \n", "\n", "[0m 59s (300 15%) 2.0772]\n", "Whurgre of nowif for of agand witeling in fromound be noyed th well and fort and withen a custrone fri \n", "\n", "[1m 19s (400 20%) 1.9062]\n", "Why sleemer chome, I\n", "tence lord thou let not mories, Wherly me cloonger on wit, me cre wort if thing i \n", "\n", "[1m 39s (500 25%) 1.9632]\n", "Whank of winded than inderreast, hids for hink marry, I son will now my be tor think that I be uncient \n", "\n", "[2m 0s (600 30%) 1.9364]\n", "What to youre\n", "Good the dorsentemat.\n", "What the not what a meifery part is be of look\n", "Whait of the hall w \n", "\n", "[2m 20s (700 35%) 1.8673]\n", "Whes Bester,\n", "Bars, and and most man\n", "ingeld my tiement make I lesiefoden as do you same to muse woke o' \n", "\n", "[2m 40s (800 40%) 2.1523]\n", "Whe my bone a me but mast at the face.\n", "Whe he frend him cope a be to with I comes or he God his for ma \n", "\n", "[3m 1s (900 45%) 1.8042]\n", "Whis our namure.\n", "\n", "TRANIO:\n", "May platis the lord,\n", "I wis he we but he hards paron's we for the surven neav \n", "\n", "[3m 21s (1000 50%) 1.9770]\n", "Whis, is at ell demes sy host is in\n", "The revention eart-aly, his the couth stare.\n", "The streath, the so h \n", "\n", "[3m 42s (1100 55%) 1.9771]\n", "Which the called these what mace all bries,\n", "Gow the from ceart repise--tring be of the\n", "Hee he that, of \n", "\n", "[4m 3s (1200 60%) 1.7054]\n", "What 
that hays how the frow he dresers gard.\n", "\n", "BAPTISTA:\n", "That was on a prain their with to goe, all me\n", " \n", "\n", "[4m 23s (1300 65%) 1.6584]\n", "Whe time, like\n", "Those paurstriet.\n", "\n", "SICINIUS:\n", "Glow a and elfers; rother's Rome servest enon't is may thu \n", "\n", "[4m 44s (1400 70%) 1.7370]\n", "When him these;\n", "There and of Have the in of the do best veath and hever the chaw, not pites with at my \n", "\n", "[5m 6s (1500 75%) 1.6769]\n", "Wher he have live the courtas,\n", "I here that whils him I shee my like deated,\n", "To countert a hardor of so \n", "\n", "[5m 26s (1600 80%) 1.7480]\n", "Wh for the grone them with are\n", "Belent dis are couch of my to tell ding.\n", "\n", "Sir:\n", "What the deatred thou as \n", "\n", "[5m 48s (1700 85%) 1.7725]\n", "Why.\n", "\n", "CUMETEL:\n", "I carcithy place, did the forling like grease in ratenforer;\n", "Which ot chatuse, be thy p \n", "\n", "[6m 8s (1800 90%) 1.6781]\n", "What feath wifiten,\n", "Thou kind Maner'd my king: I'll thou\n", "Reven's my streathence,\n", "By civery sow'd king' \n", "\n", "[6m 28s (1900 95%) 1.5265]\n", "What so srome the and any strand?\n", "\n", "BAPTISTA:\n", "Not bother hear are a common int.\n", "\n", "QUEEN MIRGANSIO:\n", "I say \n", "\n", "[6m 49s (2000 100%) 1.5479]\n", "Why, ruse the tort,\n", "And whese a to the vill bear not tell not the the borwading.\n", "\n", "JULIET:\n", "In be our no \n", "\n" ] } ], "source": [ "n_epochs = 2000\n", "print_every = 100\n", "plot_every = 10\n", "hidden_size = 100\n", "n_layers = 1\n", "lr = 0.005\n", "\n", "decoder = RNN(n_characters, hidden_size, n_characters, n_layers)\n", "decoder_optimizer = torch.optim.Adam(decoder.parameters(), lr=lr)\n", "criterion = nn.CrossEntropyLoss()\n", "\n", "start = time.time()\n", "all_losses = []\n", "loss_avg = 0\n", "\n", "for epoch in range(1, n_epochs + 1):\n", " loss = train(*random_training_set()) \n", " loss_avg += loss\n", "\n", " if epoch % print_every == 0:\n", " 
print('[%s (%d %d%%) %.4f]' % (time_since(start), epoch, epoch / n_epochs * 100, loss))\n", " print(evaluate('Wh', 100), '\\n')\n", "\n", " if epoch % plot_every == 0:\n", " all_losses.append(loss_avg / plot_every)\n", " loss_avg = 0" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Plotting the Training Losses\n", "\n", "Plotting the historical loss from all_losses shows the network learning:" ] }, { "cell_type": "code", "execution_count": 10, "metadata": { "collapsed": false }, "outputs": [ { "data": { "text/plain": [ "[]" ] }, "execution_count": 10, "metadata": {}, "output_type": "execute_result" }, { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAg0AAAFkCAYAAACjCwibAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAAPYQAAD2EBqD+naQAAIABJREFUeJzt3Xl8VNX5x/HvYd8XUVlEq4JawAVZBAQFQUDBam1FjRtg\nfwqIRdGq1brUDXfUqohb645Wq+KGgKggIqJBtFpQFEQQZVMCsggh5/fHk+tMhklyJ5lkbpLP+/Wa\n12Tu3OXM3MncZ855zjnOey8AAIDiVMt0AQAAQMVA0AAAAEIhaAAAAKEQNAAAgFAIGgAAQCgEDQAA\nIBSCBgAAEApBAwAACIWgAQAAhELQAAAAQilV0OCc+6tzLs85N76Y9fo457Kdc1udc18654aW5rgA\nAKD8lThocM51lXSupE+KWW9vSa9KmiHpEEl3S3rYOde/pMcGAADlr0RBg3OugaQnJf2fpPXFrD5K\n0hLv/aXe+y+89/dJel7S2JIcGwAAZEZJaxruk/SK9/6tEOt2l/RmwrKpknqU8NgAACADaqS6gXPu\nVEkdJXUJuUkLSasSlq2S1Mg5V9t7/0uSYzSTNFDSN5K2plpGAACqsDqS9pY01Xu/Lp07TilocM61\nlnSXpKO999vTWZAEAyU9VYb7BwCgsjtd0tPp3GGqNQ2dJe0mab5zzuUvqy7pSOfc+ZJqe+99wjY/\nSGqesKy5pA3JahnyfSNJTz75pNq1a5diERFFY8eO1Z133pnpYiBNOJ+VC+ezclm4cKHOOOMMKf9a\nmk6pBg1vSjooYdmjkhZKujlJwCBJ70s6NmHZgPzlhdkqSe3atVOnTp1SLCKiqHHjxpzLSoTzWblw\nPiuttDfvpxQ0eO83Sfpf/DLn3CZJ67z3C/Mfj5O0h/c+GIthoqTRzrlbJP1TUj9JJ0kaVMqyAwCA\ncpSOESETaxdaStrz1ye9/0bSYElHS1og62r5J+99Yo8KAAAQYSn3nkjkve+b8Hh4knVmyfIhAABA\nBcXcEygXWVlZmS4C0ojzWblwPhEWQQPKBV9KlQvns3LhfCIsggYAABAKQQMAAAiFoAEAAIRC0AAA\nAEIhaAAAAKEQNAAAgFAIGgAAQCgEDQAAIBSCBgAAEApBAwAACIWgAQAAhELQAAAAQiFoAAAAoRA0\nAACAUAgaAABAKAQNAAAgFIIGAAAQCkEDAAAIhaABAACEQtAAAABCIWgAAAChEDQAAIBQCBoAAEAo\nBA0AACAUggYAABAKQQMAAAiFoAEAAIRC0AAAAEIhaAAAAKEQNAAAgFAIGgAAQCgEDQAAIBSCBgAA\nEApBAwAACIWgAQAAhELQAAAA
DEixuVHatrVusE2a7NwF8IUXrFzx7e4HHBDrwfLRR7avxK+bHTuKn3vFe+vtkpjv\nsWyZ7XPy5J33GZ+oGMyzIhXfg2PjRjvH558fW7ZuneXD1K1rCakvvGD7+u67wvfz1luW8FuUQYO8\n/+1vrUfMgQd6/7vf2fKsLO8POqjguqtWWS+Uo46yc92zp83jUpwdO6xLcHG5QYn+8hf/6xw0JTV8\nuHUl9t7KH8wbEvRcKs6NN1pycFSRCAlUMg89ZP99zqX+pZls0qslS+yL7/777Yv7iCNio1YmE4xH\n8d57Vo5bbrH7jz6yi26TJtZdNN6aNdb7wjkLKF580QKP+C/voUO93313CyJmzbJ9fv65BQLJLqKF\nGTgwefm3bLFyjx1r+3v99djYF5deavevvVZwm8GDY6/Newt4atb0/t577fGqVfZ8fJdR7y2wq1+/\n+N4Ifft6P2RIwWV5edbb5aqrCi4fM8YmVwsSV59/3o49ZIitX1hPGO+9v/12SzBN/LwsX+79++/b\n30GwUlgy5Hff2T4uv7zo17TXXvZ+em9dkp2zhN0GDZIneH71VSyx9tlnrQyffFL0MYKA6brril4v\n3sqVlkTctq2dwzBBXaLVq+1/pV4929/DD9vjPn3sXIYxdGjBoDVqSIQEKpmuXe3+mGOKT6hMVLfu\nzsv22Uc68UTp1ltjVbd//Wvh+whyFO6+2+bmGD3amjhmzbL2/vXrY5OPBXbdVXrgAatOb9HCjvfR\nR9Kxx8bWueMOq8quXj2Wo7F0aSw57sADw73GYcOsGn7x4oLLP/rIyn3GGZYfMX9+rDr+7LMtmS2+\niWLdutiImIsW2f2331rC5H772ePddrPmm/hkyMcft6ajTZuKTsqUrEo7Pp9Bsqacrl2t+SYwc6aN\n9vndd7HX9fnnUrNmluexdq3NqZLM1q323p555s6fl9atLQ9Hsqau3XYrvOr+gQesav3TTwt/PRs2\n2HsRnKvTTrNzf/LJOzdNBNq0sfdQss9F69b2WosSnLeg6SOMcePs8z99uuVMPPJI+G0Dr71mdTs1\na0rXXWdNPT172nw5X38dbh9BU1LQVJaqn36yZtBNm0q2fSYRNAAZ0KGD9RS49NL07XPsWLtAV6tm\nF6sgCTKZjh1tveeftyG069e3L81337U8gzZtLDkvmS5d7EL61luWAPn738eea9bMelJIlqhXq1Ys\naKhfP3yS2Qkn2ORhF19syZaB996zIOfggy3f4uOP7bU2bmxBwJAhdhH45Rdb//nn7QLRuHEsKTO4\nYAdBQ9ATJggaPvvM8iqGDbPXMmVK4eXcvNm2a9Mm+fv00Ud2/M2bLRk1SCz94AO7//xz+yy0bWs5\nKrffnrwHyJNPWg+JogLB4LV07pw8r2HbNgsaatUqetK0YBj1IGioU8fO8xdf2Pt+wAFFl6FmTem8\n8ywAWr8+tnz9ertYBoIyzpkTO19FWbbMyn/JJfY5Ou00aeLE1HueTJ5sQdaVV1pgOG2aBTpt2kjL\nlyfvIZQoCBriZ+KdO9f2tXJl8ROW3Xqr5RidfLKVf+tW+8z95S+pvZZ4K1bY52fjxpLvI5R0V12k\n4yaaJ4ASmTMnltdQnPbtrYo1mPjryitt5Mldd/X+ssvSU5799rMBqYYNs5yIVLz2mlUbB9Xk3nt/\n3HGxMRUuusja8084wQaT8j42NPRjj9njI4+04cKPOirWhHDPPdbEET80eL9+seePOMLa8TdtsqHH\nW7VKPg7GkiU2SFbt2t5/9tnOz7/yipXlkkuszHXq2MBP++/v/ejRtk779jbegffef/yxvd6grT3e\nwIHhx5L429+8b9Fi5+XBSJWXX273heW7PPiglSO+GeyHH6z848aFK8OKFdakEZwH7+0cDBgQe9yz\np+VNxA+zXpS//MVyOoLclOzs5M1KRdm82Zolbr7ZmoJat/a/ji8SjAb75ZfF72
fPPW3drl3t8YYN\nsWHvJcsJKUxOjuXk9OtnTUVnnBEbXXO33cK/lkRXX23NRxs2kNMAoAyccYZ9AwTzZEybFvvS+/DD\n9BxjwAAbcrtz55KNujh+vJXn9tvtC3+XXWKJo8GgUk2bxtro8/Ls4l+zpuV3OOf9o496P3KkDW7l\nveUV/Pa3BY8zbJiNVPn557bPINH0zTf9TsN3e28jhTZpYkFLYTNarl1r+R0tWtggX08+acvPOsve\nj19+sYvGfffFtvnzn+2Lf/ny2LJNmywwGT8+3HtWWDJk9+52ofr4Y19gvpVEY8ZYYJNoyZLU5pk4\n4ojYxXPxYjtmjRp20dy+3S6yt95q5+/aa4ve1/btNrBZ4vDnPXtaYmbYcgWBXPCZnzzZ8nS8t8HC\nJJvVtih5eXY+OnSwZOOff47lfcyebfk2iYOnxQvml1mxwj6bkvctW8aCueKSVJPZts32EQxlTtAA\nIO2eecZ6IgS/ojdutC/B3/wmffN9jBhhX+p164a/6MXLy7MLhRTrmTF9uj0XXOAlu1AGtm3z/g9/\nsOW1a9tF6q677Jfyjh02xflxxxU8ztVXW43CRRdZTUvwhb91qyVD3nRTbN0HHrD3adCgkk3Idd99\ndvH88EMr4zvvxJ5bv94ujvGJla++austXBhu/8mSIWfOtGUvvWSvqXp17ydOTL59374250lp3XOP\nvc4ff7Taj7p1rQzPPx+rEXrnHaspOuqogtv+/LPV8nz1lT0O3oPES8KCBXaMxKTdwpxzjtV+Jft8\n5+buHMQls369lSWYwn7GDAuQggnLgknq4oeY37LF9r91q13c4wPoGTMsofKLLwp+vlMRBIpBcEvQ\nAKBcHHdceoc/vvlm+wVW0i/DwBdfWJPJgAGx5pfc3NiFKP6XufcWOJx9tl14vLdfj5L9mmzb1n4N\nxnv4YStns2axbQLHH2/NHFu2xOYUOf/84icLK0zQxTPYV+LEVkEzwowZ9nj0aO/33jt8IJeXZxem\n3/3OgqRNm+xC2aNHrEmmXbtYE0mi3XdPrRtwYVautPf0wQetx8jIkXbc4cNjk6Hl5MSGVQ96juTl\neX/yyf7XIc9zc22o74MPTv4eXHedBUHF1Y7t2GG1PonnN17btrH5XQoTzC0yY4bVNp11lj1++ml7\nfutWa/66557Ycffe24LPAw+MzQOTKPg8lyS47t+/4NwvBA0AKqSg+11ZdU/r3t0ukMVdUIOq55df\ntgvMhAkFn49vmgmqrgMTJ9o27dsXvBiU1LZtdpFs1sxqFRLl5Vn+R7du9ve++9oEaql4+WXLS7j4\nYguQ6tSxLo6Bk0+2QChR0P00mEejtHr3tiBEsl/gf/mLXbhHjYo1EQXNJUGNy7hx/tfciyCnpVat\nwi+m27ZZbsnee9t8K7m5Vrvx/PMFc02CmoG5cwsv78CBVvNRlNmz/a9diYPuvI0aFcwB6drV+zPP\ntL+DIHH0aKsBS5yPJV6XLhZUpSJo+nn00dgyggYAFVJQVdusWfqaPOI98EC4fv47dtiFc/Ron7TW\nIxgzoFevnbddtsyCho4drVo9HXr2tOMVNi5AkEtx222xYCdV99wTC4Ruv73gc9ddZ/khiefkrbd8\nSk0hxZkwwfZ34IF2rLff9r/moQSznO7YYY/79LFaHedi41tccon/NRdi1arCj7N4cWxm1ZYt7XxJ\nljtwxx3W5CBZzVdRzjsvNnjVxo3J34egKWDNGmu2kmJ5EYHRo2NB0fXXW1ARTFBXlKImhivMxRfb\n+xcftBA0AKiQ1qyxb5kwIwSWtYMPtnwNyftvvin43ObNltGeONJm4IsvUksCLM5FF1k5xowpfJ2+\nfe2Xdq1a1sZfElddZYmo8T1FvLceB8mSJYNBv0ra9JLohx/swn3XXfZ427bYaJ5Brx3vremieXOr\nZr/xxtgsp1u2WMBxyinhjvfBB5YDM3Gi5U
NcfHEscLrgguID19tvt2aEvDzL66hTZ+dAceJEOy87\ndlhQ7FzB/AXvY6Ohrl9veUN/+EO48o8fb8dMPF+FWbPGEmcTezsRNACokPLyLIExfujjTAnayWvX\nTj71dlGjMaZb0GzzwAOFrzN3rq1Tmmm7C/PVV7bvN96wv0eNssDo3HPTPzzy4sUFL4JDhvgie28k\n2rq18J4IYUybZsOXFzfduvexYCqYer5hQwta4n/FX3ddwWallSt33s/Chf7XZp5q1SyvI4yghikI\nUnv0sB4f8ce65BLrVum9NeHUr2+jXMZjREgAFZJzNgDP6NGZLklsUKI2bZJPRlanTvmV5cgjbfTG\nI44ofJ1u3aTrry/dgD+F2WcfG2xr+nSb1Oz++21SshdfDD9qZ1ht29oIoYEhQ2xyr/jZUotSu3Zs\ntMmS6N/fprkPMwFdMEjXyJH298yZNhhY/CBsq1dLu+8ee9yy5c772X9/mwH0ppukvLxwM69KNkmY\nZINvPfOM9P77NrBX4JFHpNtus5lOv//eZnY9/3wbBbS8EDQAKFOnnWbDO2daUIZgJMhMatHCRpIM\nRs8szJVXhpsBNVXVqtlIlHfcEZvyfNQoG8q6S5f0Hy/ekCE2amK9emV7nJIIhj7/5hubPfbQQ+0i\nfe+99h5JOwcNyVSrZu/j/Pn2Pu+5Z7jj77673T791M5NtWoW2AWjok6ZYvubOdOGgvfeRk0tTwQN\nAKqEKAUNUdCtmw3VPW2aBS/jx9vF8rzzyv7YyeZPiYIGDaTmza1W6rTTbNnQoXYfTMUeJmiQpMMO\ns/uwtQyBgw6yGoVPP5Wuvtqmns/Otvu5c6ULLogNK17etQySVKN8DwcAmbH//jYvQvv2mS5JNNx8\ns03Y1KRJbFmqk6dVRjfdZEFD0KTSqJHVFARzcqxebU05xSlp0HDwwdKMGXaMK66wGoepU20Ol7w8\nmyCudWsLLtq2TW3f6UDQAKBKaNDAfrEVN+FSVVGvXjSbCDJt+PCdl3XoEAsaVq0KV9MweLD0z39K\nffumdvwgr+Hiiy3I7dfPaoO+/tqea93ans9Ukx/NEwCqjIMOslkegVS0b29BQ26uTbceJmioVcsC\nkDAJmPGOP95yWU45xR4PHGgJka++WnAa+kwhaAAAoAgdOljzQDB9epigoaSaNbNeM0FwO3CgJUKu\nW0fQAABA5HXoYD0VZs2yx2UZNCTaZx9L3m3YUOrZs/yOWxhyGgAAKEKQPPv223bfvHn5Hn/MGEvA\nrFmzfI+bDEEDAABFaNjQelAEQUN51jRI1rUyKmieAACgGB06SMuXW4+T+vUzXZrMIWgAAKAYHTrY\nfXnXMkQNQQMAAMUgaDAEDQAAFCNIhiRoAAAARSJoMAQNAAAUo2FDm9grmD67qqLLJQAAIbz3XtXu\nOSERNAAAEErTppkuQebRPAEAAEIhaAAAAKEQNAAAgFAIGgAAQCgEDQAAIBSCBgAAEApBAwAACIWg\nAQAAhELQAAAAQiFoQLmYNGlSpouANOJ8Vi6cT4SVctDgnDvCOfeyc+4751yec+74Ytbvnb9e/G2H\nc66KzxVWtfClVLlwPisXzifCKklNQ31JCySdJ8mH3MZL2k9Si/xbS+/96hIcGwAAZEjKE1Z579+Q\n9IYkOedcCpuu8d5vSPV4AAAgGsorp8FJWuCcW+mcm+acO7ycjgsAANKkPKbG/l7SCEkfSaot6RxJ\n7zjnDvPeLyhkmzqStHDhwnIoHspDTk6O5s+fn+liIE04n5UL57Nyibt21kn3vp33YdMSkmzsXJ6k\n33vvX05xu3ckLfPeDy3k+dMkPVXiggEAgNO990+nc4flUdOQzDxJPYt4fqqk0yV9I2lreRQIAIBK\noo6kvWXX0rTKVNDQUdZskZT3fp2ktEZHAABUIXPKYqcpBw3OufqS2sqSGyVpX+fcIZJ+9N4vd87d\nJKlV0P
TgnLtA0lJJn8uin3MkHSWpfxrKDwAAyklJahq6SHpbNvaCl3RH/vLHJJ0tG4dhz7j1a+Wv\n00rSZkmfSurnvZ9VwjIDAIAMKFUiJAAAqDqYewIAAIRC0AAAAEKJXNDgnBvtnFvqnNvinJvrnOua\n6TKheM65a5JMTPa/hHWuyx8VdLNzbrpzrm2myouCwkxEV9z5c87Vds7d55xb65zb6Jx7nonpMqe4\nc+qc+1eS/9nXE9bhnEaAc+5y59w859wG59wq59yLzrn9k6xX5v+jkQoanHOnyJImr5F0qKRPJE11\nzu2a0YIhrM8kNVdsYrJewRPOucsknS/pXEmHSdokO7e1MlBO7KzIiehCnr+7JA2W9EdJR8qSn/9T\ntsVGEcJMLjhFBf9nsxKe55xGwxGS7pHUTdLRkmpKmuacqxusUG7/o977yNwkzZV0d9xjJ2mFpEsz\nXTZuxZ67ayTNL+L5lZLGxj1uJGmLpJMzXXZuO52rPEnHp3L+8h//IunEuHUOyN/XYZl+TVX9Vsg5\n/ZekF4rYhnMa0ZukXfPPQ6+4ZeXyPxqZmgbnXE1JnSXNCJZ5e1VvSuqRqXIhJfvlV4V+7Zx70jm3\npyQ55/aR/YqJP7cbJH0gzm3khTx/XWRduOPX+ULSt+IcR1mf/OruRc65Cc65XeKe6yzOaVQ1kdUe\n/SiV7/9oZIIGWeRUXdKqhOWrZG8Gom2upGGSBkoaKWkfSbPyBwNrIfuAc24rpjDnr7mkbflfVIWt\ng2iZIuksSX0lXSqpt6TXnXPBwH0txDmNnPzzc5ek2d77IG+s3P5HMzWMNCoZ7338GOefOefmSVom\n6WRJizJTKgCF8d7/O+7h5865/0r6WlIf2QB+iKYJktqr6PmbykyUahrWStohi4biNZf0Q/kXB6Xh\nvc+R9KVsyPEfZPkpnNuKKcz5+0FSLedcoyLWQYR575fKvoeDjHvOacQ45+6VNEhSH+99/PxN5fY/\nGpmgwXu/XVK2pH7BsvxqmH4qo4k3UHaccw1kXz4r87+MflDBc9tIlgnMuY24kOcvW1JuwjoHSNpL\n0vvlVliUmHOutaRmik0myDmNkPyA4QRJR3nvv41/rjz/R6PWPDFe0qPOuWzZ9NljJdWT9GgmC4Xi\nOeduk/SKrEliD0nXStou6Zn8Ve6SdKVz7ivZlOfXy3rGTC73wmInxU1Ep2LOn/d+g3PuEUnjnXM/\nSdoo6R+S3vPezyvXFwNJRZ/T/Ns1su52P+Svd4usdnCqxDmNEufcBFl32OMlbXLOBTUKOd77rfl/\nl8//aKa7jiTpSnJe/gveIot+umS6TNxCnbdJ+R/QLbJs3Kcl7ZOwzt9l3YI2y76Y2ma63Nx+PTe9\nZV2vdiTc/hn2/EmqLetLvjb/C+k5Sbtn+rVV1VtR51Q24/AbsoBhq6Qlku6XtBvnNHq3Qs7jDkln\nJaxX5v+jTFgFAABCiUxOAwAAiDaCBgAAEApBAwAACIWgAQAAhELQAAAAQiFoAAAAoRA0AACAUAga\nAABAKAQNAAAgFIIGAAAQCkEDAAAI5f8BEXt19l83XNQAAAAASUVORK5CYII=\n", "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "import matplotlib.pyplot as plt\n", "import matplotlib.ticker as ticker\n", "%matplotlib inline\n", "\n", "plt.figure()\n", "plt.plot(all_losses)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Evaluating at different \"temperatures\"\n", "\n", "In the `evaluate` function 
above, every time a prediction is made the outputs are divided by the \"temperature\" argument passed. Using a higher number makes all characters more equally likely, and thus gives us \"more random\" outputs. Using a lower value (less than 1) makes high probabilities contribute more. As we turn the temperature towards zero we are choosing only the most likely outputs.\n", "\n", "We can see the effects of this by adjusting the `temperature` argument." ] }, { "cell_type": "code", "execution_count": 11, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Thoo head strant me reporce\n", "O and hears of thou provand of treech.\n", "\n", "LUCI death in that to tellon is head thing come thou that to not him with your firsure but,\n", "They here thyse of yet in thou thy meat to\n" ] } ], "source": [ "print(evaluate('Th', 200, temperature=0.8))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Lower temperatures are less varied, choosing only the more probable outputs:" ] }, { "cell_type": "code", "execution_count": 12, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "This commanderence the forself to the the to the the the to the to the the formands\n", "What to the strange the boy the the have the the to the to to the formands\n", "That the the the the the the sorn the to th\n" ] } ], "source": [ "print(evaluate('Th', 200, temperature=0.2))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Higher temperatures are more varied, choosing less probable outputs:" ] }, { "cell_type": "code", "execution_count": 13, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "That,\n", "henct wto Haste's, norsee'd stave brYiry's is dsem.\n", "Hell hurss Heamous halloR:\n", "Tht a readerty the!\n", "\n", "KuWhrate.\n", "\n", "VLOMAY, mere's no, toojecur' kong.\n", "\n", "DUKE VIx whJos ivistomzliben\n", "The
vrieglad bloot, \n" ] } ], "source": [ "print(evaluate('Th', 200, temperature=1.4))" ] }, { "cell_type": "markdown", "metadata": { "collapsed": true }, "source": [ "# Exercises\n", "\n", "* Train with your own dataset, e.g.\n", " * Text from another author\n", " * Blog posts\n", " * Code\n", "* Increase number of layers and network size to get better results" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Next**: [Generating Names with a Conditional Character-Level RNN](https://github.com/spro/practical-pytorch/blob/master/conditional-char-rnn/conditional-char-rnn.ipynb)" ] } ], "metadata": { "anaconda-cloud": {}, "kernelspec": { "display_name": "Python [conda root]", "language": "python", "name": "conda-root-py" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.5.2" } }, "nbformat": 4, "nbformat_minor": 1 } ================================================ FILE: char-rnn-generation/generate.py ================================================ # https://github.com/spro/practical-pytorch import torch from helpers import * from model import * def generate(decoder, prime_str='A', predict_len=100, temperature=0.8): hidden = decoder.init_hidden() prime_input = char_tensor(prime_str) predicted = prime_str # Use priming string to "build up" hidden state for p in range(len(prime_str) - 1): _, hidden = decoder(prime_input[p], hidden) inp = prime_input[-1] for p in range(predict_len): output, hidden = decoder(inp, hidden) # Sample from the network as a multinomial distribution output_dist = output.data.view(-1).div(temperature).exp() top_i = torch.multinomial(output_dist, 1)[0] # Add predicted character to string and use as next input predicted_char = all_characters[top_i] predicted += predicted_char inp = char_tensor(predicted_char) return predicted if __name__ == '__main__': # Parse 
command line arguments import argparse argparser = argparse.ArgumentParser() argparser.add_argument('filename', type=str) argparser.add_argument('-p', '--prime_str', type=str, default='A') argparser.add_argument('-l', '--predict_len', type=int, default=100) argparser.add_argument('-t', '--temperature', type=float, default=0.8) args = argparser.parse_args() decoder = torch.load(args.filename) del args.filename print(generate(decoder, **vars(args))) ================================================ FILE: char-rnn-generation/helpers.py ================================================ # https://github.com/spro/practical-pytorch import unidecode import string import random import time import math import torch from torch.autograd import Variable # Reading and un-unicode-encoding data all_characters = string.printable n_characters = len(all_characters) def read_file(filename): file = unidecode.unidecode(open(filename).read()) return file, len(file) # Turning a string into a tensor def char_tensor(string): tensor = torch.zeros(len(string)).long() for c in range(len(string)): tensor[c] = all_characters.index(string[c]) return Variable(tensor) # Readable time elapsed def time_since(since): s = time.time() - since m = math.floor(s / 60) s -= m * 60 return '%dm %ds' % (m, s) ================================================ FILE: char-rnn-generation/model.py ================================================ # https://github.com/spro/practical-pytorch import torch import torch.nn as nn from torch.autograd import Variable class RNN(nn.Module): def __init__(self, input_size, hidden_size, output_size, n_layers=1): super(RNN, self).__init__() self.input_size = input_size self.hidden_size = hidden_size self.output_size = output_size self.n_layers = n_layers self.encoder = nn.Embedding(input_size, hidden_size) self.gru = nn.GRU(hidden_size, hidden_size, n_layers) self.decoder = nn.Linear(hidden_size, output_size) def forward(self, input, hidden): input = self.encoder(input.view(1, -1)) 
output, hidden = self.gru(input.view(1, 1, -1), hidden) output = self.decoder(output.view(1, -1)) return output, hidden def init_hidden(self): return Variable(torch.zeros(self.n_layers, 1, self.hidden_size)) ================================================ FILE: char-rnn-generation/train.py ================================================ # https://github.com/spro/practical-pytorch import torch import torch.nn as nn from torch.autograd import Variable import argparse import os from helpers import * from model import * from generate import * # Parse command line arguments argparser = argparse.ArgumentParser() argparser.add_argument('filename', type=str) argparser.add_argument('--n_epochs', type=int, default=2000) argparser.add_argument('--print_every', type=int, default=100) argparser.add_argument('--hidden_size', type=int, default=50) argparser.add_argument('--n_layers', type=int, default=2) argparser.add_argument('--learning_rate', type=float, default=0.01) argparser.add_argument('--chunk_len', type=int, default=200) args = argparser.parse_args() file, file_len = read_file(args.filename) def random_training_set(chunk_len): start_index = random.randint(0, file_len - chunk_len) end_index = start_index + chunk_len + 1 chunk = file[start_index:end_index] inp = char_tensor(chunk[:-1]) target = char_tensor(chunk[1:]) return inp, target decoder = RNN(n_characters, args.hidden_size, n_characters, args.n_layers) decoder_optimizer = torch.optim.Adam(decoder.parameters(), lr=args.learning_rate) criterion = nn.CrossEntropyLoss() start = time.time() all_losses = [] loss_avg = 0 def train(inp, target): hidden = decoder.init_hidden() decoder.zero_grad() loss = 0 for c in range(args.chunk_len): output, hidden = decoder(inp[c], hidden) loss += criterion(output, target[c]) loss.backward() decoder_optimizer.step() return loss.data[0] / args.chunk_len def save(): save_filename = os.path.splitext(os.path.basename(args.filename))[0] + '.pt' torch.save(decoder, save_filename) 
print('Saved as %s' % save_filename) try: print("Training for %d epochs..." % args.n_epochs) for epoch in range(1, args.n_epochs + 1): loss = train(*random_training_set(args.chunk_len)) loss_avg += loss if epoch % args.print_every == 0: print('[%s (%d %d%%) %.4f]' % (time_since(start), epoch, epoch / args.n_epochs * 100, loss)) print(generate(decoder, 'Wh', 100), '\n') print("Saving...") save() except KeyboardInterrupt: print("Saving before quit...") save() ================================================ FILE: conditional-char-rnn/conditional-char-rnn.ipynb ================================================ { "cells": [ { "cell_type": "markdown", "metadata": { "nbpresent": { "id": "9a73330c-27c1-4957-8e95-c3b42bc14a71" } }, "source": [ "![](https://i.imgur.com/eBRPvWB.png)\n", "\n", "# Practical PyTorch: Generating Names with a Conditional Character-Level RNN\n", "\n", "[In the last tutorial](https://github.com/spro/practical-pytorch/blob/master/char-rnn-classification/char-rnn-classification.ipynb) we used a RNN to classify names into their language of origin. This time we'll turn around and generate names from languages. 
This model will improve upon the RNN we used to [generate Shakespeare one character at a time](https://github.com/spro/practical-pytorch/blob/master/char-rnn-generation/char-rnn-generation.ipynb) by adding another input (representing the language) so we can specify what kind of name to generate.\n", "\n", "```\n", "> python generate.py Russian\n", "Rovakov\n", "Uantov\n", "Shavakov\n", "\n", "> python generate.py German\n", "Gerren\n", "Ereng\n", "Rosher\n", "\n", "> python generate.py Spanish\n", "Salla\n", "Parer\n", "Allan\n", "\n", "> python generate.py Chinese\n", "Chan\n", "Hang\n", "Iun\n", "```\n", "\n", "Being able to \"prime\" the generator with a specific category brings us a step closer to the [Sequence to Sequence model](https://github.com/spro/practical-pytorch/blob/master/seq2seq-translation/seq2seq-translation.ipynb) used for machine translation." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Recommended Reading\n", "\n", "I assume you have at least installed PyTorch, know Python, and understand Tensors:\n", "\n", "* http://pytorch.org/ For installation instructions\n", "* [Deep Learning with PyTorch: A 60-minute Blitz](https://github.com/pytorch/tutorials/blob/master/Deep%20Learning%20with%20PyTorch.ipynb) to get started with PyTorch in general\n", "* [jcjohnson's PyTorch examples](https://github.com/jcjohnson/pytorch-examples) for an in-depth overview\n", "* [Introduction to PyTorch for former Torchies](https://github.com/pytorch/tutorials/blob/master/Introduction%20to%20PyTorch%20for%20former%20Torchies.ipynb) if you are a former Lua Torch user\n", "\n", "It would also be useful to know about RNNs and how they work:\n", "\n", "* [The Unreasonable Effectiveness of Recurrent Neural Networks](http://karpathy.github.io/2015/05/21/rnn-effectiveness/) shows a bunch of real life examples\n", "* [Understanding LSTM Networks](http://colah.github.io/posts/2015-08-Understanding-LSTMs/) is about LSTMs specifically but also informative about
RNNs in general\n", "\n", "I also suggest the previous tutorials:\n", "\n", "* [Classifying Names with a Character-Level RNN](https://github.com/spro/practical-pytorch/blob/master/char-rnn-classification/char-rnn-classification.ipynb) for using an RNN to classify text into categories\n", "* [Generating Shakespeare with a Character-Level RNN](https://github.com/spro/practical-pytorch/blob/master/char-rnn-generation/char-rnn-generation.ipynb) for using an RNN to generate one character at a time" ] }, { "cell_type": "markdown", "metadata": { "nbpresent": { "id": "cc294dae-dd8f-4288-8d3c-bb9fd3ad19bc" } }, "source": [ "# Preparing the Data\n", "\n", "See [Classifying Names with a Character-Level RNN](https://github.com/spro/practical-pytorch/blob/master/char-rnn-classification/char-rnn-classification.ipynb) for more detail - we're using the exact same dataset. In short, there are a bunch of plain text files `data/names/[Language].txt` with a name per line. We split lines into an array, convert Unicode to ASCII, and end up with a dictionary `{language: [names ...]}`." 
] }, { "cell_type": "code", "execution_count": 1, "metadata": { "collapsed": false, "nbpresent": { "id": "6a9d80df-1d38-4c41-849c-95e38da98cc7" } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "O'Neal\n" ] } ], "source": [ "import glob\n", "import unicodedata\n", "import string\n", "\n", "all_letters = string.ascii_letters + \" .,;'-\"\n", "n_letters = len(all_letters) + 1 # Plus EOS marker\n", "EOS = n_letters - 1\n", "\n", "# Turn a Unicode string to plain ASCII, thanks to http://stackoverflow.com/a/518232/2809427\n", "def unicode_to_ascii(s):\n", " return ''.join(\n", " c for c in unicodedata.normalize('NFD', s)\n", " if unicodedata.category(c) != 'Mn'\n", " and c in all_letters\n", " )\n", "\n", "print(unicode_to_ascii(\"O'Néàl\"))" ] }, { "cell_type": "code", "execution_count": 2, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "# categories: 18 ['Arabic', 'Chinese', 'Czech', 'Dutch', 'English', 'French', 'German', 'Greek', 'Irish', 'Italian', 'Japanese', 'Korean', 'Polish', 'Portuguese', 'Russian', 'Scottish', 'Spanish', 'Vietnamese']\n" ] } ], "source": [ "# Read a file and split into lines\n", "def read_lines(filename):\n", " lines = open(filename).read().strip().split('\\n')\n", " return [unicode_to_ascii(line) for line in lines]\n", "\n", "# Build the category_lines dictionary, a list of lines per category\n", "category_lines = {}\n", "all_categories = []\n", "for filename in glob.glob('../data/names/*.txt'):\n", " category = filename.split('/')[-1].split('.')[0]\n", " all_categories.append(category)\n", " lines = read_lines(filename)\n", " category_lines[category] = lines\n", "\n", "n_categories = len(all_categories)\n", "\n", "print('# categories:', n_categories, all_categories)" ] }, { "cell_type": "markdown", "metadata": { "nbpresent": { "id": "4ff5f52a-2523-47f0-beba-f6c29d412e5f" } }, "source": [ "# Creating the Network\n", "\n", "This network extends [the last 
tutorial's RNN](#Creating-the-Network) with an extra argument for the category tensor, which is concatenated along with the others. The category tensor is a one-hot vector just like the letter input.\n", "\n", "We will interpret the output as the probability of the next letter. When sampling, the most likely output letter is used as the next input letter.\n", "\n", "I added a second linear layer `o2o` (after combining hidden and output) to give it more muscle to work with. There's also a dropout layer, which [randomly zeros parts of its input](https://arxiv.org/abs/1207.0580) with a given probability (here 0.1) and is usually used to fuzz inputs to prevent overfitting. Here we're using it towards the end of the network to purposely add some chaos and increase sampling variety.\n", "\n", "![](https://i.imgur.com/jzVrf7f.png)" ] }, { "cell_type": "code", "execution_count": 3, "metadata": { "collapsed": true, "nbpresent": { "id": "597a765d-634b-41a8-a0c6-be5c019da150" } }, "outputs": [], "source": [ "import torch\n", "import torch.nn as nn\n", "from torch.autograd import Variable\n", "\n", "class RNN(nn.Module):\n", " def __init__(self, input_size, hidden_size, output_size):\n", " super(RNN, self).__init__()\n", " self.input_size = input_size\n", " self.hidden_size = hidden_size\n", " self.output_size = output_size\n", " \n", " self.i2h = nn.Linear(n_categories + input_size + hidden_size, hidden_size)\n", " self.i2o = nn.Linear(n_categories + input_size + hidden_size, output_size)\n", " self.o2o = nn.Linear(hidden_size + output_size, output_size)\n", " self.softmax = nn.LogSoftmax()\n", " \n", " def forward(self, category, input, hidden):\n", " input_combined = torch.cat((category, input, hidden), 1)\n", " hidden = self.i2h(input_combined)\n", " output = self.i2o(input_combined)\n", " output_combined = torch.cat((hidden, output), 1)\n", " output = self.o2o(output_combined)\n", " return output, hidden\n", "\n", " def init_hidden(self):\n", " return 
Variable(torch.zeros(1, self.hidden_size))" ] }, { "cell_type": "markdown", "metadata": { "nbpresent": { "id": "8ff6da45-57cd-46ca-b14a-3f560ce4d345" } }, "source": [ "# Preparing for Training\n", "\n", "First of all, helper functions to get random pairs of (category, line):" ] }, { "cell_type": "code", "execution_count": 4, "metadata": { "collapsed": true }, "outputs": [], "source": [ "import random\n", "\n", "# Get a random category and random line from that category\n", "def random_training_pair():\n", " category = random.choice(all_categories)\n", " line = random.choice(category_lines[category])\n", " return category, line" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "For each timestep (that is, for each letter in a training word) the inputs of the network will be `(category, current letter, hidden state)` and the outputs will be `(next letter, next hidden state)`. So for each training set, we'll need the category, a set of input letters, and a set of output/target letters.\n", "\n", "Since we are predicting the next letter from the current letter for each timestep, the letter pairs are groups of consecutive letters from the line - e.g. for `\"ABCD\"` we would create (\"A\", \"B\"), (\"B\", \"C\"), (\"C\", \"D\"), (\"D\", \"EOS\").\n", "\n", "![](https://i.imgur.com/JH58tXY.png)\n", "\n", "The category tensor is a [one-hot tensor](https://en.wikipedia.org/wiki/One-hot) of size `<1 x n_categories>`. When training we feed it to the network at every timestep - this is a design choice, it could have been included as part of initial hidden state or some other strategy." 
] }, { "cell_type": "code", "execution_count": 5, "metadata": { "collapsed": false, "nbpresent": { "id": "cf311809-10bf-40f7-87e1-1952342f7f35" } }, "outputs": [], "source": [ "# One-hot vector for category\n", "def make_category_input(category):\n", " li = all_categories.index(category)\n", " tensor = torch.zeros(1, n_categories)\n", " tensor[0][li] = 1\n", " return Variable(tensor)\n", "\n", "# One-hot matrix of first to last letters (not including EOS) for input\n", "def make_chars_input(chars):\n", " tensor = torch.zeros(len(chars), n_letters)\n", " for ci in range(len(chars)):\n", " char = chars[ci]\n", " tensor[ci][all_letters.find(char)] = 1\n", " tensor = tensor.view(-1, 1, n_letters)\n", " return Variable(tensor)\n", "\n", "# LongTensor of second letter to end (EOS) for target\n", "def make_target(line):\n", " letter_indexes = [all_letters.find(line[li]) for li in range(1, len(line))]\n", " letter_indexes.append(n_letters - 1) # EOS\n", " tensor = torch.LongTensor(letter_indexes)\n", " return Variable(tensor)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "For convenience during training we'll make a `random_training_set` function that fetches a random (category, line) pair and turns them into the required (category, input, target) tensors." 
] }, { "cell_type": "code", "execution_count": 6, "metadata": { "collapsed": true }, "outputs": [], "source": [ "# Make category, input, and target tensors from a random category, line pair\n", "def random_training_set():\n", " category, line = random_training_pair()\n", " category_input = make_category_input(category)\n", " line_input = make_chars_input(line)\n", " line_target = make_target(line)\n", " return category_input, line_input, line_target" ] }, { "cell_type": "markdown", "metadata": { "nbpresent": { "id": "53fb987f-4f42-4bf8-81ae-280ebdd19aee" } }, "source": [ "# Training the Network\n", "\n", "In contrast to classification, where only the last output is used, we are making a prediction at every step, so we are calculating loss at every step.\n", "\n", "The magic of autograd allows you to simply sum these losses at each step and call backward at the end. But don't ask me why initializing loss with 0 works." ] }, { "cell_type": "code", "execution_count": 7, "metadata": { "collapsed": false, "nbpresent": { "id": "df50f546-6d02-4383-beab-90378f16576b" } }, "outputs": [], "source": [ "def train(category_tensor, input_line_tensor, target_line_tensor):\n", " hidden = rnn.init_hidden()\n", " optimizer.zero_grad()\n", " loss = 0\n", " \n", " for i in range(input_line_tensor.size()[0]):\n", " output, hidden = rnn(category_tensor, input_line_tensor[i], hidden)\n", " loss += criterion(output, target_line_tensor[i])\n", "\n", " loss.backward()\n", " optimizer.step()\n", " \n", " return output, loss.data[0] / input_line_tensor.size()[0]" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "To keep track of how long training takes I am adding a `time_since(t)` function which returns a human readable string:" ] }, { "cell_type": "code", "execution_count": 8, "metadata": { "collapsed": true }, "outputs": [], "source": [ "import time\n", "import math\n", "\n", "def time_since(t):\n", " now = time.time()\n", " s = now - t\n", " m = math.floor(s / 60)\n", " s -= m 
* 60\n", " return '%dm %ds' % (m, s)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Training is business as usual - call train a bunch of times and wait a few minutes, printing the current time and loss every `print_every` epochs, and keeping store of an average loss per `plot_every` epochs in `all_losses` for plotting later." ] }, { "cell_type": "code", "execution_count": 9, "metadata": { "collapsed": false, "nbpresent": { "id": "81fde336-785e-461b-a751-718a5f6bff88" }, "scrolled": true }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "0m 28s (5000 5%) 1.8674\n", "0m 53s (10000 10%) 2.4155\n", "1m 20s (15000 15%) 3.4203\n", "1m 45s (20000 20%) 1.3962\n", "2m 12s (25000 25%) 1.7427\n", "2m 38s (30000 30%) 2.9514\n", "3m 4s (35000 35%) 2.8836\n", "3m 31s (40000 40%) 1.6728\n", "3m 57s (45000 45%) 2.5014\n", "4m 22s (50000 50%) 1.9687\n", "4m 48s (55000 55%) 1.5595\n", "5m 16s (60000 60%) 2.3830\n", "5m 43s (65000 65%) 1.5155\n", "6m 10s (70000 70%) 1.7967\n", "6m 37s (75000 75%) 1.8564\n", "7m 3s (80000 80%) 1.9873\n", "7m 30s (85000 85%) 1.9569\n", "7m 56s (90000 90%) 1.7553\n", "8m 22s (95000 95%) 2.3103\n", "8m 48s (100000 100%) 1.7575\n" ] } ], "source": [ "n_epochs = 100000\n", "print_every = 5000\n", "plot_every = 500\n", "all_losses = []\n", "loss_avg = 0 # Zero every plot_every epochs to keep a running average\n", "learning_rate = 0.0005\n", "\n", "rnn = RNN(n_letters, 128, n_letters)\n", "optimizer = torch.optim.Adam(rnn.parameters(), lr=learning_rate)\n", "criterion = nn.CrossEntropyLoss()\n", "\n", "start = time.time()\n", "\n", "for epoch in range(1, n_epochs + 1):\n", " output, loss = train(*random_training_set())\n", " loss_avg += loss\n", " \n", " if epoch % print_every == 0:\n", " print('%s (%d %d%%) %.4f' % (time_since(start), epoch, epoch / n_epochs * 100, loss))\n", "\n", " if epoch % plot_every == 0:\n", " all_losses.append(loss_avg / plot_every)\n", " loss_avg = 0" ] }, { "cell_type": "markdown", "metadata": 
{}, "source": [ "# Plotting the Network\n", "\n", "Plotting the historical loss from all_losses shows the network learning:" ] }, { "cell_type": "code", "execution_count": 10, "metadata": { "collapsed": false, "scrolled": true }, "outputs": [ { "data": { "text/plain": [ "[]" ] }, "execution_count": 10, "metadata": {}, "output_type": "execute_result" }, { "data": { "image/png": "<base64-encoded PNG data omitted: matplotlib plot of all_losses (training loss curve)>
ONLh7qaQPKx7OMLMd\nJP1SUb9Q1WxJPass6ylpQZosw+jRo9W1a9dKy4qLi1VcXJxrswEAaHEmTZqkSZMmVVpWUlLSaMcz\nd1+1HZj9V9In7v7zap67VNJ+7t4/a9ltkrq5+/617HOApGnTpk3TgAEDVql9AACsTqZPn66ioiJJ\nKnL36Q2575wyDWZ2iaSHJc2S1FnSUEm7Stqn4vk/SFrP3ZO5GK6TNNLMLpN0o6Q9JR0mqcaAAQAA\nFKZcuyd6SJooqZekEkmvSdrH3SdXPL+upN7Jyu7+sZkdIOkqSadJ+kzSL9y96ogKAABQ4HKdp+H4\nOp4fXs2ypxQTQQEAgGaMe08AAIBUCBoAAEAqBA0AACAVggYAAJAKQQMAAEiFoAEAAKRC0AAAAFIh\naAAAAKkQNAAAgFQIGgAAQCoEDQAAIBWCBgAAkApBAwAASIWgAQAApFLQQUNpab5bAAAAEgUdNKxY\nke8WAACAREEHDcuX57sFAAAgQdAAAABSKeigge4JAAAKR0EHDcuW5bsFAAAgUdBBA5kGAAAKR0EH\nDdQ0AABQOAgaAABAKgUdNNA9AQBA4SjooIFCSAAACkdBBw1kGgAAKBwFHTRQ0wAAQOEo6KCBTAMA\nAIWjoIMGMg0AABQOggYAAJAKQQMAAEiFoAEAAKRS0EEDhZAAABSOgg4ayDQAAFA4cgoazOxcM3vJ\nzBaY2Rwzu9vMNk+x3VAze8XMFpnZF2Z2g5l1r2s7ggYAAApHrpmGnSVdK2mgpL0ktZX0mJmtUdMG\nZrajpImS/ippC0mHSdpB0vV1HYygAQCAwtEml5Xdff/sx2Z2nKS5kookPVPDZoMkfeTu4ysef2Jm\nf5F0Vl3HI2gAAKBwrGpNQzdJLmleLes8L6m3me0nSWbWU9Lhkh6sa+cUQgIAUDjqHTSYmUkaK+kZ\nd3+zpvXc/TlJx0i63cyWS/pS0nxJo+o6BpkGAAAKR07dE1VMUNQo7FjbSma2haSrJZ0v6TFJvST9\nSdJfJB1f27avvTZaBx3UtdKy4uJiFRcX17vRAAC0FJMmTdKkSZMqLSspKWm045m7576R2ThJB0ra\n2d1n1bHuTZI6uPsRWct2lPS0pF7uPqeabQZImlZUNE0vvzwg5/YBALC6mj59uoqKiiSpyN2nN+S+\nc+6eqAgYDpa0e10BQ4WOkkqrLCtX1EJYbRvSPQEAQOHIdZ6GCZKGSjpa0iIz61nx0yFrnUvMbGLW\nZvdL+qmZnWxmG1dkGa6W9KK7z67teBRCAgBQOHKtaThZkSF4ssry4ZJuqvh3L0m9kyfcfaKZdZI0\nUlHL8I2k/0o6p66DkWkAAKBw5DpPQ52ZCXcfXs2y8ZLGV7N6rQgaAAAoHNx7AgAApELQAAAAUino\noIFCSAAACkdBBw1kGgAAKBwFHTSsWCHVY+4pAADQCAo6aJDINgAAUCgKPmhYtizfLQAAABJBAwAA\nSImgAQAApELQAAAAUin4oGHp0ny3AAAASM0gaCDTAABAYSBoAAAAqRA0AACAVAgaAABAKgUfNFAI\nCQBAYSj4oIFMAwAAhYGgAQAApELQAAAAUinooKFNG4IGAAAKRUEHDe3aUQgJAEChKPiggUwDAACF\ngaABAACkQtAAAABSKeigoW1bggYAAApFQQcNFEICAFA4Cj5oINMAAEBhIGgAAACpEDQAAIBUCj5o\nWLIk360AAABSgQcNnTpJCxbkuxUAAEBqBkHDN9/kuxUAAEBqBkFDSUm+WwEAAKQcgwYzO9fMXjKz\nBWY2x8zuNrPNU2zXzswuNrOPzWypmX1oZsfVtV3nzgQNAAAUijY5rr+zpGslvVyx7R8kPWZm/dy9\ntpLFOyR9T9JwSR9I6qUUAUtS01BWJrVunWNLAQBAg8opaHD3/bMfV2QL5koqkvRMdduY2b6KYGMT\nd08qFGalOV7nzvF74UKpW7dcWgoAABr
aqtY0dJPkkubVss6BiszE2Wb2mZm9Y2Z/NLMOde28U6f4\nTTEkAAD5l2v3xHfMzCSNlfSMu79Zy6qbKDINSyUdImkdSX+W1F3SL2o7RpJpoK4BAID8q3fQIGmC\npC0k7VjHeq0klUs62t2/lSQzO13SHWY2wt1rnPPxmmtGS+qqESOktdeOZcXFxSouLl6FZgMA0DJM\nmjRJkyZNqrSspBG/aZu7576R2ThFt8PO7l5rfYKZ/UPSYHffPGtZX0kzJW3u7h9Us80ASdP+859p\n2nvvAbr3Xumgg3JuJgAAq53p06erqKhIkorcfXpD7jvnmoaKgOFgSbvXFTBUeFbSembWMWvZDxTZ\nh89q25CaBgAACkeu8zRMkDRU0tGSFplZz4qfDlnrXGJmE7M2u03S15L+bmb9zGwXSZdLuqG2rgkp\n7j3Rvj01DQAAFIJcMw0nS+oi6UlJX2T9HJG1Ti9JvZMH7r5I0t6KkRZTJd0s6V5Jv0xzwG7dyDQA\nAFAIcp2noc4gw92HV7PsXUk/zuVYia5dyTQAAFAICvreExKZBgAACkXBBw1kGgAAKAwFHzSQaQAA\noDAUfNBApgEAgMJA0AAAAFIp+KCB7gkAAApDwQcNZBoAACgMBR80dOsmLVsmLV2a75YAALB6K/ig\noWvX+E22AQCA/Cr4oKFbt/hNXQMAAPlV8EEDmQYAAApDwQcNZBoAACgMBR80kGkAAKAwFHzQ0Llz\n/CbTAABAfhV80NC6tdSlC5kGAADyreCDBim6KMg0AACQX80iaOjWjUwDAAD51iyCBjINAADkX7MI\nGsg0AACQf80iaOCmVQAA5F+zCBq4PTYAAPnXLIIGMg0AAORfswkayDQAAJBfzSJo6NZNWrBAKi/P\nd0sAAFh9NYugoUcPyV2aOzffLQEAYPXVLIKGPn3i96xZ+W0HAACrs2YRNGy4Yfz+5JP8tgMAgNVZ\nswgauneX1lyTTAMAAPnULIIGs8g2kGkAACB/mkXQIEVdA5kGAADyp9kEDWQaAADIr2YTNJBpAAAg\nv5pV0DBvnvTtt/luCQAAq6ecggYzO9fMXjKzBWY2x8zuNrPNc9h+RzNbYWbTc21oMuySbAMAAPmR\na6ZhZ0nXShooaS9JbSU9ZmZr1LWhmXWVNFHS47k2UspM8ERdAwAA+dEml5Xdff/sx2Z2nKS5kook\nPVPH5tdJulVSuaSDczmuJK23ntS6NZkGAADyZVVrGrpJcknzalvJzIZL2ljSBfU9UJs20vrrk2kA\nACBfcso0ZDMzkzRW0jPu/mYt620m6RJJO7l7eWxWPxtuSKYBAIB8qXfQIGmCpC0k7VjTCmbWStEl\ncZ67f5Asru8B+/Qh0wAAQL7UK2gws3GS9pe0s7t/WcuqnSVtJ2kbMxtfsaxV7MKWS9rH3Z+saePR\no0era9eu3z1+6y1p/vxiScX1aTYAAC3KpEmTNGnSpErLSkpKGu145u65bRABw8GSdnX3D+tY1yT1\nq7J4pKTdJf1U0sfuvqSa7QZImjZt2jQNGDDgu+V/+Ys0cqS0dGnUOAAAgMqmT5+uoqIiSSpy95yn\nOKhNTpdeM5ug+Jp/kKRFZtaz4qkSd19asc4lktZ392EeEcmbVfYxV9JSd38r18b26SOVlUlffJGZ\ntwEAADSNXEdPnCypi6QnJX2R9XNE1jq9JPVuiMZVlQQK1DUAAND0cgoa3L2Vu7eu5uemrHWGu/se\ntezjAncfUNPztUkmeProo/psDQAAVkWzufeEJK25ZmQb3qxxgCcAAGgszSpokKQtt5Rmzsx3KwAA\nWP0QNAAAgFSaZdDw0UfSokX5bgkAAKuXZhk0SDHREwAAaDrNLmjoVzFV1Btv5LcdAACsbppd0NCp\nk7TRRtQ1AADQ1Jpd0CBJW21F0AAAQFNrlkEDIygAAGh6zTZomDVLWrgw3y0BAGD10WyDBomZIQEA\naEr
NMmjo21cyo4sCAICm1CyDho4dpU02YdglAABNqVkGDZK0ww7SAw9IpaX5bgkAAKuHZhs0nHGG\n9N570u2357slAACsHppt0FBUJP3kJ9KYMVJZWb5bAwBAy9dsgwZJ+v3vpXfeke64I98tAQCg5WvW\nQcP220v77RfZhvLyfLcGAICWrVkHDZL0m9/EfA2PPJLvlgAA0LI1+6Bh8ODIOFx1Vb5bAgBAy9bs\ngwYzafRo6fHHpddfz3drAABouZp90CBJhx0mbbCBNHZsvlsCAEDL1SKChrZtpVGjpFtvlebOzXdr\nAABomVpE0CBJJ54Yv2++Ob/tAACgpWoxQcNaa0kHHCBNmpTvlgAA0DK1mKBBkoqLpWnTYnppAADQ\nsFpU0HDAAVKnTmQbAABoDC0qaFhjDemQQyJocM93awAAaFlaVNAgRRfF229Lr76a75YAANCytLig\nYe+9pe7dpX/+M98tAQCgZWlxQUPbtnETq8mT890SAABalhYXNEjSjjtKM2ZIixfnuyUAALQcLTJo\nGDxYKi2VXn453y0BAKDlyCloMLNzzewlM1tgZnPM7G4z27yObYaY2WNmNtfMSszsOTPbZ9WaXbut\ntoqhl88915hHAQBg9ZJrpmFnSddKGihpL0ltJT1mZmvUss0ukh6TtJ+kAZKekHS/mfXPvbnptG4t\nDRpE0AAAQENqk8vK7r5/9mMzO07SXElFkp6pYZvRVRb9xswOlnSgpEYbGDl4sDR+fMzXYNZYRwEA\nYPWxqjUN3SS5pHlpNzAzk9Q5l23qY/Bg6euvmVIaAICGUu+goeLiP1bSM+7+Zg6b/krSmpL+Vd9j\npzFwYGQY6KIAAKBhrEqmYYKkLSQdlXYDMzta0u8kHe7uX63CsevUrZu05ZYEDQAANJScahoSZjZO\n0v6Sdnb3L1Nuc5Sk6yUd5u5PpNlm9OjR6tq1a6VlxcXFKi4uTtXOwYOlZ5+t/rnFi6WZM6Xtt0+1\nKwAACs6kSZM0qcpdGktKShrteOY53tmpImA4WNKu7v5hym2KJf1N0pHu/kCK9QdImjZt2jQNGDAg\np/Zlu/VW6ZhjpM8+k9Zfv/JzZ54pXXGF9O9/Sz/9ab0PAQBAQZk+fbqKiookqcjdpzfkvnOdp2GC\npKGSjpa0yMx6Vvx0yFrnEjObmPX4aEkTJZ0haWrWNl0a5iXUbP/9pTZtpHvuqby8tFS65Rapa1fp\nZz9jEigAANLItabhZEldJD0p6YusnyOy1uklqXfW4xMktZY0vso2Y+vV4hystZa0xx7SXXdVXv7o\no9KcOdJDD0n9+0sHHijNndvYrQEAoHnLdZ6GOoMMdx9e5fHuuTaqIQ0ZIo0aFcMv1147lk2cGLNG\n/uhH0T2xwQbSf/8bt9UGAADVa5H3nsh28MFSebl0//3xeP586b77pGHDYkjm+utL3/se8zkAAFCX\nFh809OoVoyjuvjse/+tf0ooV0tChmXU224ygAQCAurT4oEGKLopHH5VGjJDOOkvad98IJhKbbSa9\n+27m8eTJ0kknNX07AQAoZKtF0HDooVJZWYyiGDlSuvHGys9XzTT885/S9ddLixY1bTsBAChk9Zrc\nqbnZeGPpo4+kddeNIZhVbbZZ1DokxZKvvBLL335biqGuAABgtcg0SDFCorqAQYqgQYpsQ2mp9Prr\n8XjmzKZpGwAAzcFqkWmoS3bQ0KmTtHRpPH4zl9twAQDQwhE0KAKFXr0q1zUMHkzQAABAttWme6Iu\nyQiKV16JGojBg2vvnliyRDr7bKkR7wsCAEBBIWiokIygmDFD2nZbaYstonhy8eLq13/4Yenyy2Ok\nBQAAqwOChgpJ0PDKK5mgwV16553q13/44fh9551N10YAAPKJoKHC5ptLCxfG0MskaJAyXRSPPy69\n9Vb82z2Chh49pCeekObNy0+bAQBoSgQNFZIRFFIEDZ07S717RzHk3
LlxD4vhwyNgmDlT+vzz6J4o\nLc3c1yKtb75p2LYDANAUCBoqfP/78btHj8wU01tsEUHDlVfGMMwXX5SefjqyDB07SkceGQWTVW+9\nXZtXX40bZL32WsO/BgAAGhNBQ4U11ojMwrbbxt0vpQgapk6Vxo+XfvUracstI7vwyCPS7rtLHTrE\nFNWPPip9+2264zz3XGQn7rij8V4LAACNgaAhyy9/KZ14YubxlltKX3wRt9Y+44wIHB58UJoyJW56\nJUXQsGyZ9NBD6Y6RTFF9zz0N23YAABobQUOWM86IICCRFEOOGBFdCsXFMR11WZm0337x3MYbx/0p\nbr013TFmzJB69pTeeEN6//2GbT8AAI2JoKEWRUXSr38tnXNOPG7XTrrwwsgyJDUQkvTzn0cG4ssv\na99fcl+LU0+Nrg2yDQCA5oSgoRbt2kkXXxx3vkwMH56ZoyExdGis+/e/176/d9+Ngsodd5T22Ue6\n++6GbzMAAI2FoKEBdO0qHXGEdMMNUf9Qk6SeoX9/6ZBDpOefl2bPbpo2AgCwqggaGsgJJ0gffhiT\nPdXklVekPn2ktdaSDjwwRmnkOscDAAD5QtDQQAYPlvr1k377W+mii6RrrpFWrKi8zowZ0jbbxL/X\nWSf+/fzzTd9WAADqg6ChgZhF0eRnn0njxsXwzexJn9wz97VIbLNNTPYEAEBzQNDQgI45Rvr006hT\nGDRIuvnmzHNffCF99VUm0yDFv994Y+WMBAAAhYigoZEcc0zMHPm//8XjpAgyO2jo319avrzmO2kC\nAFBICBoayZFHRpfF7bfH4yeflLp1kzbcMLNO//7xOwkoavPtt9KECbWPzgAAoDERNDSSddaJWSNv\nvjmmmL7yyphZMrmvhRRDNTfeuHJdg3v1+xs/Xho5Upo2rfbj3nxz3IETAICGRtDQiH72M+mll2IO\nhwMOiNkkq+rfP5NpmDo1hmP+/vcxCVSirEy67rr494wZNR/v3/+Wjj1Wuvba+rf566+5dTcAoHoE\nDY3oJz+JbMImm8S9KVq3XnmdbbaJoME9hmm2aiVdemnlkRWPPCJ9/LHUpYs0fXr1x5o7VzrllNi+\ntrkiajJnjnT66XFvjaOPzn17AEDLR9DQiNZYI+ZhePJJqXPn6tfZZpsYVfHGG3G77HPOiWzCGmtI\nP/6x9NFHUctQVBSzSFYXNLhLJ50UXR8XXSS9/HJu2YIvv4ybc91wgzRwYAQd2ZmOXMydK915Z+Vl\nCxZIixfXvl15ubRoUf2OCQBoGgQNjaxfP6l795qfT4ohTz89LpzHHRe35H70UalTJ2nPPeNeFyNG\nRODw2msrD9G84464+dWf/xwFmOXl0lNPpW/j2WdHhuK996Srr46A4bnncn6p3+3r8MOlhQszy4YM\nkU47rfJ6VYOSCROkzTeP0SSri/oGZgCQLwQNedanT3RhPP543Ja7R49Y3qNHdEssXBijLo46KiaG\nWrZMevvtzPYLF0qjR0cW4qc/jcLKPn2kyZPTHf/ZZ6N48g9/iGNuvXXcBvy//839tXz+eXTDuMfd\nPKWox3jhhci2ZK/XvXvlwOTpp2Mui4ceyv24NbnppnhPaiouzafFi6X11os2AkBzQdCQZ2aZuRtO\nOqnyc5tuGhf1hx6SOnbMrJfdRTFmjDR/vnTVVZn97blnJmhYtCiGfb78cnyL//hjadKk6IqYMkUa\nNUrabru4vbcUGYc99qhf0HDttXHL7zZtMvUY770XF8gPPsjMWfHEE9KSJZVrL5ICz2RCLPeo0cie\nICtXDz0Us3KmDaCa0tSpcd7+9KfCDGoAoDpt8t0ASDvtJM2bJ+2228rPbb555t+dO8fj6dOlYcOk\nN9+MYOH886WNNsqst8ce0o03RnHjiSdK990Xy1u1yszzYJa5WL3wQjyX2HNP6eSToy6iW7d0r2Hh\nwhjhcdJJ0mOPZUaEZI/2ePHFK
A5Nuk6mTs1s+/77ERTdf3+8Fy+8EPt7/fUYhVIf774bvy+9NF5T\nIXn22TgHr78e78euu+a7RQBQt5wyDWZ2rpm9ZGYLzGyOmd1tZpun2G43M5tmZkvN7F0zG1b/Jrc8\nF1wQF9TsORxqMmBABA3l5fFNfKONpDPPrLzO7rvH70MPjYvwnXdGQea4cRFAzJ0b3/TffDMu7gMH\nVt5+zz1j/1Om1N2esrKY0fK88yKr8ctfRp1GkmmYMSMmtOrRIwIBKS6SrVplgobXXosA5tJLY3+3\n3SadcUZkV154IQopc+UeQcPAgdH18/LLue+jMT37rLTXXlHzsipDZLGy88/PBIwAGpi7p/6R9JCk\nn0nqJ2lrSQ9I+ljSGrVss5GkbyVdLukHkkZKWiFp71q2GSDJp02b5qjs8svd11zTffx4d8l98uTq\n1+vbN54fM6Z+x9l4Y/dTT635+bIy97/8xb179ziO5P5//xfPXXGF+xpruJeWuu+9t/vBB7sfdJD7\nnnu6f/llrHvEEfH788/dx41zb9vWfdky9333jW3N3O+4I9a5//7q21Be7n7hhe6HHhr/zvb557Ht\nXXe5f//77kOGuE+d6n799e4ff1y/96ShlJW5d+vmfsEFcR5bt3afNSu/bWop5s2L837aafluCZA/\n06ZNc0kuaYDncI1P87NqG0vrSCqXtFMt61wm6bUqyyZJeqiWbQgaavD443HW2rVzP/74mtebMCEu\n+mVl9TvO8ce7b7FF9c/NmuW+447RjuHDo01z5qzcxrffdl9nHffzz3e/5BL3zp3dJ02K5158MX7f\nc4/7L37hvs02se2tt8byX/wiAoE+fTLBSLbycvdf/zoTsPznP5Wff+KJWP7WWxHcJOtJ7vvtV7/3\npKHMnBntePxx9wUL4n359a9z38+UKe4zZlT/XHl5/c99czZ1ary3m22W75YA+dOYQcOqFkJ2q2jY\nvFrWGSTp8SrLHpX0o1U89mopubX22mtLf/xjzeudckpmsqj62Gef6L74+OPKy5ctiyGUs2bFiIgb\nb4zujGTUh5QZRvrgg5k7ew4aFLUL118fBZ7bby/17BldFDNmZF7XoYdKZ50VozmSos7Hq356FLNm\nXnJJFBL27y9dcUXl5997L177JptEkectt0QXzcSJMYT1xRfr977U17ffZuahePbZmOhr4MCoUxk6\nNNqXS0Hku+/GNOVnnFH9c337xqia1c3778fv996L4lsADay+0YYkU3RPTKljvXcknV1l2X6SyiS1\nr2EbMg21OOWUlb9ZN7QFC9w7dHD/059WPnb79u51nZr113ffeuv41vfJJ7G/Vq3i8c9/Huv85Cfu\nu+8eWZNrrql+P7fdFtt8+WVm2TXXxLLLL4/HN90Uj994I7POmWdGt0RVpaWRQUmyDbffHu2cPbv2\n1/Pgg3Wvc9ll7qefXv1zQ4ZEe0pK3IcNcx8wIPPc5MnR/hdeqH3/iWXL3IuKogunY0f35cszzz3z\nTHQZtWkTWZ7S0nT7dHdfssT9hBPc//739NsUmjFj3Lt0ie6ua6/Nd2uA/CjI7glJf5b0oaRedaxX\n76Bhl1128QMPPLDSz2233dbgbzCqd8gh7oMGZR4nXQd/+Uvd2+6/f6y71lqZeoMf/jCWTZwYjy+4\nIC58kvuX7J+vAAAcmklEQVRTT1W/n9mz4/lbb43Hd9wR25x5ZmadZcvc11svE4y4Rw1FTd0QSRfJ\nqadmjn/zzTW/ls8+i4CnpoAgsdlm8XqrdgssXBiBluT+s5+5b7pp5XqR0lL3Hj3czzij9v0nzj03\ngoJrr419vvRSLJ89O+pBdtnF/aGH4rmnn063z8WL3ffZJ7ZZYw33Dz9Mt12hGTYsPrO77x6fwWyl\npe7bbRefw8MOc3/ggbw0EWhQt91220rXyV122aWwggZJ4yR9ImnDFOtOkXRllWXHSZpfyzZkGgr
A\nLbfEJ2TWLPdvvolvrkcdtXLRYXXOPTe23WOPzLITT4xlH30Ujx9+2L+rMygpqXlfW2/t/uMfR/1E\n27buRx+98oX50ksjY5FkA/r2df/lL6vfX2mpe79+/l3B3JZbRg1FTS65JNbdaqua15k1K/NaZs6s\n/Nydd8by88/PrPPPf1Ze5+STo36jrvd2ypQIdC65JIKlDh3cr7wynrv++ghu/ve/eH969qwcXNVk\nyZI4Tx07ut97r/uGG8b7neY85+L++90vuqhh91nVjju6H3NMZKHWWCNeW+Kdd+K9P+igCNy23rpx\n2wLkS0FlGioChk8lbZJy/UslvVpl2W2iELLgffNNXIjHjnU/++y4qHz2Wbptb789Pl3Z386ffNJ9\n6NDMxeh//4t1Nt209n2dfnqst+GGmYtlVV9/HQHFVVe5r1gR/x4/vuZ9vvyy+403RltGjaq+K8M9\nnt9ss8gEJKM93OMb/PrrZ7pNJk6M51u3XjkTc+yxEZi4x+uX3D/9tPI6//2vf1cg6u7+7bcrt2X+\nfPfevSOTkHQ77LprjB5xj2/Wu+ySWf/4490337zm9yB5fccdF8HHlCmx7P77oy0NndTbZZfIkPzv\nfw2732w9e0Zw9vrr8RoefTTz3N13+3ddXUmX1ty5jdeWbJddFscEcvHww/G3LVcFEzRImiBpvqSd\nJfXM+umQtc4lkiZmPd5I0kLFKIofSBohabmkvWo5DkFDgfjJT+Jbefv27r//ffrtkm91taX93WNo\n52GH1b7OvHkRcNTVP3/IIZF+fv99r3ZERU3+/e/KF/Irr4xhoosXuz/7bDx3++3xDT/p70+GjCZ1\nFcOGxQiQoqIIEhIrVrivvXZmdMSiRREgVLVihfv3vhfdFr/5Tab7IVFeHlmerl2jRiTxm9/EdgsW\nxDlKsg7u7vfd59+NIKlJMnS36gXtsMPce/XKBGjl5dHdU1sgVpv58yOgStu9le1vf3P/3e/qXm/B\ngtj/LbdEe9dfv/LIm4svznSXffpprPuvf+XWlvooLY3z1r595boboDbffhv/Z6obPVaXQgoayitq\nEar+HJu1zt8lTa6y3S6SpklaIuk9ST+r4zgEDQXiH/+IT8m660bffC7uust96dLa15kyxf3NN+vf\nvmzJvA5jx/p3BZhpzJ2budgsXBhzKEgRhAwfHt0GZWURkBQXxzfldu1iqGTfvnER6t07MiKnnea+\nySaVX1/aIseTTop127Z13377mI8jmVMiubhX/fafdPFcfHH8/uCDzHOLF0eK/rLLqj/e889HcFJd\nN07yTf322+NxEjwdcEDdr6M6ybnZYovKXVZVff55DJtMrFgRwUu7dpH5qs2MGZXf6+HD3fv3zzx/\n9NHRfZHYbLMo7G1sr7wS7eraNQpgswtX3SMwy6VgFS3Tt9/GF6TE009n/vbm+vkomKChqX4IGgrH\nvHlRjf6Pf+S7JXVbsiTauu66kW7PZZ6CLbeMdP64cVEXkEy6JLmfd16s85vfRF3HFVfEhT0pqExS\n3Q88kOmW+eKL2OaMM6I9adoyc2ZkE954I2o81l8/uhxuuSWyHNVNWFRSEu3t3DkK/Ko6+GD3wYNX\nXl5e7v6jH0VmpOpFLLHTTu677Rb/Li6O17XBBnW/juoMHx7v8V//Gu3NHg3jHu9l797+Xc3H44/H\n8nvuySyr6zOYBCZffRWPkxqPpKtnm22iriZx4okR9DW2a66JoOepp+IzlXyeEjvtlK72BPXz8cdN\nk1FaVT//eeXC86uuynz2H3sst30RNCCv6soWFJLhw+NTnWuR28iRkSHYbDP3ww+PZX/7W6T+k8LN\nJGuw1lqRvi8rizqLtdeOi0FJSdR8SHEBW7EiaiVOOKF+r+Xee2NfZvG6ago8ttkm1qsuhZ8Us779\nduXlSYbikUdqPn4y3HXy5MhIDBoUj3PtYy0vj8DpzDNj26p
dL+4RHG29dRSNDhwYWZ3y8shsbLed\n+8471z0p16WXxrf5pGbmtdeivUnXVocOkYVKJEFfEuBlW748sjZps1W1OfzwTIbjV7+K2qDkm+Oy\nZRGAbrBB/QtPly/PPQtYl7KyljM52OjR8ZnLrhN66in3996rfbtXX226L0tJd5pZdOW5R0HvwIFR\n8zVsWG77I2gAUkpmo0yKA9NKvqVKkYpPZP/hXL7cvVOnyhfb3/8+Hmd/Q9hoo+iHHDky/lilnX+h\nOqecEhmQ2tKTo0ZFG6r777J0aRQHjhiRWVZe7r7DDpFpqO1CtXRpBE3du8eFLumieOKJ3F7D9OmZ\n4MM9AoSddqq8Tu/eUWzrHhf5pJupVauogRg/vu4iyuOPj8xJorQ0ztell8YFouo3tmRK8+wun88/\nj6nMkzlFWreOz0Yuvvkm8xlKAqZzzonHScFr0iWXBDZSvE/1cc45UfDakBf5k0+OIbir6qOPom23\n3LLq+6qvwYMrf/7Ky6Ow+Ygjat/uqKPic79iReO3Mfl8SvFlwT2yYKNGRWFv587R3ZgWQQOQUmlp\nFFfmes+NOXPif8MOO9R+IT3kkMguJBfxDz+M7bKngR46NOoRpEiRN7YXX4wREDW1+4IL4o9f0l/6\n4IOeulD0nHNi3ZNOij+e7dtX/raexkUXxR+9pKgyGWmSjESZP9+/qylJ7LtvLOvYMTI4s2dnAoia\n7Lab+5FHVl62xx5xzpKsTdXRP/36RbBRXh7rrLNOzPkxblwEGMXFEazceWdkmk44ITIUtRk9Otr6\n1luZi8GDD8ZzX38dj5N5R5JM0Jprxnmqyfjx8fqqnuPy8shmSTGxV02+/TbeuzRZwy++iOxHq1a1\nZ5WWL68cYFe1eHHUcJjF+U/qc8rK6jciIPH111EwXN09ZJYsiYt9ci+X5csjwyRl/iYk9Tq9etX8\nf6asLD4LUgR2DeXdd6t/z/72t3i/e/aMz09JSabwOvkMJfVFaRA0ADkoKam5n742p59e97fozz5b\neTTCvfdWHrr35z/H/6y6JoNqKnPmRJ/65ZdH3cTGG8c3/TTp8E8+ifkpktc8YEDlSbTcI5g44IBI\noVZN9b/7btQyDBmSWZZ8w0/+CCYFX6++mlknyU5kH2uvvWovotxgg6g7yXbuufFN/5JLKnddJEaM\niO6mDTeM4+2/f+VzuWJFZqRMknnIzipVtXx5ZGeSbNeNN1ZOObtHYW0ykddZZ8XjI4+snCXJlgz7\nlWKocLa3347lrVpVziZde23ldS+4oPKFsza//W0U0EorzyeS7Y9/jHWqS/NnD+V98snIJO2xR2Qe\nkllgn3xy5e2efDIumtlZk3Hj4tv2okUR+A4Y4DUW5b7wQjz3hz/E45dfjsc9e8bcI+6ZQmkpRlpV\nJ/n8Se433FDze5CroUPj/19VxxwT53/YMPdtt81k215/PZ7fYQf3Aw9MfxyCBqAZWbQohpoWUkX8\n8OHxzalTpwgCskdZ5Lqfqhe3q66KC2P37vGNcuTIKNo86KBY3rNnZg6IxMYbZ4aSTZgQ3+arzr9x\nzz2ZbIR7FFGaVd9FsXhx/DWrOgV2kmHYeefojqlqypToMx4xIgLG6gKp5cvjQvPkkzGktUOHmoPS\n5Hhnnhm/t9228ggO98h8JMHPvvvGxSCZbbW6eVCSbqG2bVcefnfFFZH9GTEigpXlyzPTkn//+/HN\nu6QkAqMePWLd2vryFy+Oz8mpp0aNSfbw4WzLlkUfvBTnL3HXXRFUrb22f1ck7B5ZreQ19O4d66y1\nVuVam3vuycycevfdsezLLzPL+vSJ+p3u3ePutknxcbZktFcyX0nSrfXb30aRdGlpfC633bbyEOqq\nLr00sj+bbtqwI2yKinyluqBk9NUZZ0R7zCJzmd01khRmV1d/Ux2CBgCr5NVX44/OEUesWtHc2LHx\nRzz5Y/bppxGIjBwZfwh
Hjow+7K22ir7k666rPCtj4uij49uTe/xRTia/qs1HH8VfrOpulf7GG/Fc\n1Wmzk2nIpdpn/UzrmWe81vqDIUPim3BpabwH0sq3mL/wwsx8Eb16RXZk3rw4P9ddt/I+TzwxLiqn\nnRYBWHYf+x57RIFo8s34gQfiwrrFFnGBvvDC+Nbdrl3MndKnT9Qq1JRluv76uGi9/35kQXr0qL5W\n4uabMxfyJItUXh71PNttFxmN7Im13GPZySdHzcf8+dE1tMkmkUUYNSpef1I0OmhQ7O/cc+PzNXVq\nvM7u3aN2p7zcfc8946Ke3eWSdKe1bh3HGDYsLtTJN/dp0yJ4GDMmRhsNH55p+4svZgL9PfeMrNOx\nx8braQjl5ZmaqOyuwQ8+iGX33Zf5jK+9duXhwd98E9mfiy9eeb8TJsR5zkbQAGCVzZ696lNDJ99i\nk0K+IUPiwlfXHApVjRsXF7UlS6Kr5Kij6t4mKWCr2gXhHt/QpOpvKrbxxvFc9sRX9bVoUeVZP5cv\njwvVs89Gt0abNu5XXx3PPfBAHPff/668j2TGzSR1ngwH3H33qFvIvgguXhzdKr/+dWb9pAj3m2/i\neOPGxXvTt2/UY0gxB8fZZ0dWpHv3uFhnt2nUqKgJ+OyzCPS6dYvz2KlTZELcM7eXf/nl6OL64Q+j\n62DZssie7LtvdHt07RqBzEsvxfrJcNm6fPRRZDN69Yrg4f/+Ly7aSRsffDD2nd3Nlx0wzZwZ5+KP\nf8wsO+SQzBTx//pXvCcjRsR5a9s2goik2HnkyMxstEkh9Jgx8Z63bx8ZtGuuie2q1oL84Q8RsGX/\nf3rjjfgcjh9ffddLMrJKikxGomoXVp8+sU7V+VOGDYv3qWoQt/328T5kZw8JGgAUhKSQb9KkTEFj\nLgVaiWnT/LvMQNeuUXOQxkEHxbfAbKWlcXGoaUhmMsdEbcNLc/HDH0bxpHvsU4o/+tttFxeYpPuk\nvDyG9lXtpkouHqeeGr+TFP1tt8V+eveOmoQvv4yagmSdJDA45phYP7nQJUOCk5R9cXE8Xrgw6jza\ntKlcNDhmTLznrVrFxbF79/hGf/75ERQk7Vm+PLqbzj8/3vNu3eL1bbppJjh47rlMkPKrX0UXyaqO\nNigri8xTp07R9qpTrmc79NDoekr07Zu5n8yQIdG25AZ5AwfGxbVTp3htyXv7+eeRFUqOl9xr5o03\n4nVJlSccmzMnU+icFDV+9VV0JyRZjlatVp6ILRk5s956mWHd7pHNyL7r7XHHxXpVZ9NNan+yg7Ky\nssxxs4d2EzQAKBgbbBCp2w4dolCxPtmLFSvij91pp3mNXQ7VufjiuJBlX4iT+SSS+3ZUdfXV8XxS\nUb+qfvGLTJ3CCSdE7cDYsXEhyb4Y1KS8PC6ua68d72H2a3nzzSiWS4Z8duxYufDyooti2fjxUdiX\n3a0za1ZkK7Lnlpg6tfpixoULoyvkT3+q/WZxhx4aXRutWsVF76WXogti++3jdaxYEQHI+efH8pNO\nqvv1p5EEpHXNT/DHP2b6/pcvj4v+hAkRwCTf6pMg6IwzvFIB5eefx+PDD4/fU6ZkiizXWy9e3+LF\nsc8//zlzzNNPjy6OPn0yAdwFF0T3wezZcSE/9tgIHrKzTOPHR9A1alRm1tiknmH06JVfe9WC6/Ly\nyKJkjxBKRlbstVfsOzn3BA0ACkZy2/Pttqu+XiGt3XbLTNld3fC56iTf1pJ7OCRZhqq3wc42f36k\ngBvKddfFBWHBgigaPOusWP711+nrRZLbkNfUXz57dgRDJ51U+V4ln34aIynatIntc7kfTH389a9x\nnOxM0NKllV/nIYdErYVU/X1V6mPZssjE1HWL9qeeiuPOmJEZSTJ5cqYbrWvXTDr/rrti2RVXZLbf\nZJNYloysmDkzArnsYGWbbTKjeD77LJ4///wYjdSuXQRr66wT3R2J0tLMcN3ks33aafFZT
epBknvq\nSJW7M5YtqzmIvuKKylOqJ6/p3XcjY5S0gaABQMEYMya+Ka/qbInJ7dO7dEmfrUjGryfD4JJ5DmrK\nMjSGpGslud9HfY6dFOxVHb6aVllZ1FA09qyNy5fHBay24yT3RWmIrolcJTUm112XmXL8iy/iwtu5\ns/vee2fWnT8/MjFJd457pivg+eczy6ZOrTxq54QToktq+fLIIHTvHhftuXPjAv7DH0YmpuqIpJKS\nysWt++wT07q/9ZZ/181w7LHR3ZP2859kFpJRIxdeGBmr8vL4f9m+fbz+xgwaWgkAcnDOOdL770sb\nbrhq+xk8OH5vtZVklm6bLl2kLbeUXnhBWrZM+t3vpJ/8RNphh1VrSy623lpq3166/HKpd29p++1z\n38e228bvH/6wfm1o1Ur63vfid2Nq2zbe39qOs/fe8fvQQ6U2bRq3PVV17Bjn48UXpbffjs/HuutK\n7dpJl10mjR6dWbdbN2nyZGmjjTLLRo2K9QYNyizbbjtpvfUqP545U+rXT7rpJumii6SuXeP9P/xw\n6bXX4vcmm1RuW5cu0sCB0n/+E4/feUfq21fafHOpUyfpiSekO+6Qhg9P//n//vejbVOmxOPXX4/X\nbyadeqp0yy1Sz56p3756aeJTDKC5a9Mm/iCuquQP9dZb577dCy9I114rzZolPfjgqrclF23bStts\nExeq445L/wc/26BBcWFLAqfmbNNNpfPOk4YOzc/xBw6Unn46Khj69cucj1NOqXvboqL4qc0uu8Tv\nLbeU7rqrcqB36qnSnXdKZ59d/bZ77y1dc4307bfxWf3BDyIA23ZbaezYCHyHDau7nQkzabfdKgcN\nSdDWtat02GHp91VfZBoA5MU668Qf9iOOyG27gQPjm9+YMdLJJ8eFoqltt138/ulP67f9hhtK//tf\n/bIUhcZMOv98abPN8nP8gQOlt96SXnopvsk3tL59pQULpHvvXTkzNHCgVFKSyRxVtdde0vz50j//\nGUFN0r6iImnRImnffaX118+tPbvuKk2bJn31lfTee5Gpa0pkGgDkzYQJuW8zaJBUXh4Xq/POa/g2\npXHoofEHe1UyBQ2RrUFcuN2lN9+UfvazxjlGx441P9euXc3PDRwode4sjRsXj3/wg/idZDd+/vPc\n27LrrlJZmXTDDfE710zdqiLTAKBZ6dcvagnGjIl+5XzYYw/p0Uel1q3zc3xk9O2bCcAaI9OwKtq2\nje6EV1+NzFr37rH8oIOkSy6J37nafPOoW0gC7qbONBA0AGhWWreWPv44+pOBVq0y3TyFFjRI0UUh\nVW5bly7SuedGUJErs8g2zJoVRZ2dOzdIM1MjaADQ7DT2qAE0L4MGRYFu1REMhSApVEy6JhrCrrvG\n76bumpCoaQAANHOnnZYZkVJo+vaN2obddmu4fRI0AABQTz16xHwShcgshgg3pC22kI4+WjrkkIbd\nbxoEDQAANCNm0q235ufY9AwCAIBUCBoAAEAqBA0AACAVggYAAJAKQQMAAEiFoAEAAKRC0AAAAFIh\naAAAAKkQNAAAgFQIGgAAQCoEDQAAIBWCBjSJSZMm5bsJaECcz5aF84m0cg4azGxnM7vPzD43s3Iz\nOyjFNkPN7BUzW2RmX5jZDWbWvX5NRnPEH6WWhfPZsnA+kVZ9Mg1rSnpF0ghJXtfKZrajpImS/ipp\nC0mHSdpB0vX1ODYAAMiTnG+N7e6PSHpEkszMUmwySNJH7j6+4vEnZvYXSWflemwAAJA/TVHT8Lyk\n3ma2nySZWU9Jh0t6sAmODQAAGkjOmYZcuftzZnaMpNvNrEPFMe+TNKqWzTpI0ltvvdXYzUMTKSkp\n0fTp0/PdDDQQzmfLwvlsWbKunR0aet/mXmdZQs0bm5VLOsTd76tlnS0k/UfSFZIek9RL0p8kTXX3\n42vY5mhJt9a7YQAAYKi739aQO2yKoOEmSR3c/YisZ
TtKelpSL3efU802a0v6saSPJS2tdwMBAFj9\ndJC0kaRH3f3rhtxxo3dPSOooaXmVZeWKkRfVFlJWvMgGjY4AAFiNPNcYO63PPA1rmll/M9umYtEm\nFY97Vzz/BzObmLXJ/ZJ+amYnm9nGFVmGqyW96O6zV/kVAACAJpFz94SZ7SrpCa08R8NEd/+5mf1d\nUh933yNrm5GSTpa0saRvJP1X0jnu/uWqNB4AADSdVappAAAAqw/uPQEAAFIhaAAAAKkUXNBgZiPN\n7CMzW2JmL5jZ9vluE+pmZudV3MAs++fNKutcWHHDssVm9h8z2zRf7UVlaW5EV9f5M7P2ZjbezL4y\ns4Vm9m8z69F0rwLZ6jqnZvb3av7PPlRlHc5pATCzc83sJTNbYGZzzOxuM9u8mvUa/f9oQQUNZnak\nYhKo8yRtK+lVSY+a2Tp5bRjSekNST0nrVvzslDxhZmcrZgE9UXHDskWKc9suD+3Eymq9EV3K8zdW\n0gGSfippF0nrSbqzcZuNWqS5ueDDqvx/trjK85zTwrCzpGslDZS0l6S2kh4zszWSFZrs/6i7F8yP\npBckXZ312CR9JumsfLeNnzrP3XmSptfy/BeSRmc97iJpiaQj8t12flY6V+WSDsrl/FU8XiZpSNY6\nP6jY1w75fk2r+08N5/Tvku6qZRvOaYH+SFqn4jzslLWsSf6PFkymwczaSipSDMeUJHm8qscl/Shf\n7UJONqtIhX5gZrdkzd2xseJbTPa5XSDpRXFuC17K87edYrK47HXekTRLnONCtltFuvttM5tgZt2z\nnisS57RQdVNkj+ZJTft/tGCCBkXk1FpS1Wml5yjeDBS2FyQdp5j+O5mT4ykzW1Nx/lyc2+Yqzfnr\nKWl5xR+qmtZBYXlY0rGS9pB0lqRdJT1kZslMveuKc1pwKs7PWEnPuHtSN9Zk/0ebYhpprAbc/dGs\nh2+Y2UuSPpF0hKS389MqADVx939lPZxpZq9L+kDSbooJ/FCYJkjaQtKO+Th4IWUavpJUpoiGsvWU\nxHTTzYy7l0h6V9KmivNn4tw2V2nO32xJ7cysSy3roIC5+0eKv8NJxT3ntMCY2ThJ+0vazSvPqNxk\n/0cLJmhw9xWSpknaM1lWkYbZU4104w00HjPrpPjj80XFH6PZqnxuuygqgTm3BS7l+ZsmqbTKOj+Q\ntKGk55ussag3M9tA0tqSkosR57SAVAQMB0va3d1nZT/XlP9HC6174kpJ/zCzaZJekjRacZfMf+Sz\nUaibmf1RcXOyTyStL+kCSSsk/bNilbGSfmtm7ytueT5GMTLm3iZvLFZSUXuyqTJ3nt3EzPpLmufu\nn6qO8+fuC8zsBklXmtl8SQslXSPpWXd/qUlfDCTVfk4rfs5TDLebXbHeZYrs4KMS57SQmNkExXDY\ngyQtMrMko1Di7ksr/t00/0fzPXSkmqEkIype8BJF9LNdvtvET6rzNqniA7pEUY17m6SNq6xzvmJY\n0GLFH6ZN891ufr47N7sqhl6VVfm5Me35k9ReMZb8q4o/SHdI6pHv17a6/tR2TiV1kPSIImBYKulD\nSX+W9D3OaeH91HAeyyQdW2W9Rv8/yg2rAABAKgVT0wAAAAobQQMAAEiFoAEAAKRC0AAAAFIhaAAA\nAKkQNAAAgFQIGgAAQCoEDQAAIBWCBgAAkApBAwAASIWgAQAApPL/J59eIFK2fFMAAAAASUVORK5C\nYII=\n", "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "import matplotlib.pyplot as plt\n", "import matplotlib.ticker as ticker\n", "%matplotlib inline\n", "\n", "plt.figure()\n", "plt.plot(all_losses)" ] 
}, { "cell_type": "markdown", "metadata": {}, "source": [ "# Sampling the Network\n", "\n", "To sample we give the network a letter and ask what the next one is, feed that in as the next letter, and repeat until the EOS token.\n", "\n", "* Create tensors for input category, starting letter, and empty hidden state\n", "* Create a string `output_str` with the starting letter\n", "* Up to a maximum output length,\n", " * Feed the current letter to the network\n", " * Get the next letter from highest output, and next hidden state\n", " * If the letter is EOS, stop here\n", " * If a regular letter, add to `output_str` and continue\n", "* Return the final name\n", "\n", "*Note*: Rather than supplying a starting letter every time we generate, we could have trained with a \"start of string\" token and had the network choose its own starting letter." ] }, { "cell_type": "code", "execution_count": 13, "metadata": { "collapsed": false }, "outputs": [], "source": [ "max_length = 20\n", "\n", "# Generate given a category and starting letter\n", "def generate_one(category, start_char='A', temperature=0.5):\n", " category_input = make_category_input(category)\n", " chars_input = make_chars_input(start_char)\n", " hidden = rnn.init_hidden()\n", "\n", " output_str = start_char\n", " \n", " for i in range(max_length):\n", " output, hidden = rnn(category_input, chars_input[0], hidden)\n", " \n", " # Sample as a multinomial distribution\n", " output_dist = output.data.view(-1).div(temperature).exp()\n", " top_i = torch.multinomial(output_dist, 1)[0]\n", " \n", " # Stop at EOS, or add to output_str\n", " if top_i == EOS:\n", " break\n", " else: \n", " char = all_letters[top_i]\n", " output_str += char\n", " chars_input = make_chars_input(char)\n", "\n", " return output_str\n", "\n", "# Get multiple samples from one category and multiple starting letters\n", "def generate(category, start_chars='ABC'):\n", " for start_char in start_chars:\n", " print(generate_one(category, start_char))" 
] }, { "cell_type": "code", "execution_count": 14, "metadata": { "collapsed": false, "scrolled": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Riberkov\n", "Urtherdez\n", "Shimanev\n" ] } ], "source": [ "generate('Russian', 'RUS')" ] }, { "cell_type": "code", "execution_count": 15, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Gomen\n", "Ester\n", "Ront\n" ] } ], "source": [ "generate('German', 'GER')" ] }, { "cell_type": "code", "execution_count": 16, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Sandar\n", "Per\n", "Alvareza\n" ] } ], "source": [ "generate('Spanish', 'SPA')" ] }, { "cell_type": "code", "execution_count": 17, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Cha\n", "Hang\n", "Ini\n" ] } ], "source": [ "generate('Chinese', 'CHI')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The final versions of the scripts [in the Practical PyTorch repo](https://github.com/spro/practical-pytorch/tree/master/conditional-char-rnn) split the above code into a few files:\n", "\n", "* `data.py` (loads files)\n", "* `model.py` (defines the RNN)\n", "* `train.py` (runs training)\n", "* `generate.py` (runs `generate()` with command line arguments)\n", "\n", "Run `train.py` to train and save the network.\n", "\n", "Then run `generate.py` with a language to view generated names: \n", "\n", "```\n", "$ python generate.py Russian\n", "Alaskinimhovev\n", "Beranivikh\n", "Chamon\n", "```" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Exercises\n", "\n", "* Adjust the `temperature` argument to see how generation is affected \n", "* Try with a different dataset of category -> line, for example:\n", " * Fictional series -> Character name\n", " * Part of speech -> Word\n", " * Country -> City\n", "* Use a \"start of sentence\" token so that 
sampling can be done without choosing a start letter\n", "* Get better results with a bigger network" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Next**: [Translation with a Sequence to Sequence Network and Attention](https://github.com/spro/practical-pytorch/blob/master/seq2seq-translation/seq2seq-translation.ipynb)" ] } ], "metadata": { "anaconda-cloud": {}, "kernelspec": { "display_name": "Python [conda root]", "language": "python", "name": "conda-root-py" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.5.2" }, "nbpresent": { "slides": { "10393c05-7962-4245-9228-8b7db4eb79a1": { "id": "10393c05-7962-4245-9228-8b7db4eb79a1", "prev": "22628fc4-8309-4579-ba36-e5b01a841473", "regions": { "335fd672-4ee6-4b7c-a65f-3ecbf38305e1": { "attrs": { "height": 0.8, "width": 0.8, "x": 0.1, "y": 0.1 }, "content": { "cell": "cc294dae-dd8f-4288-8d3c-bb9fd3ad19bc", "part": "whole" }, "id": "335fd672-4ee6-4b7c-a65f-3ecbf38305e1" } } }, "22628fc4-8309-4579-ba36-e5b01a841473": { "id": "22628fc4-8309-4579-ba36-e5b01a841473", "prev": null, "regions": { "6cfa5157-02f6-48e3-8ce4-89641febbe59": { "attrs": { "height": 0.8, "width": 0.8, "x": 0.1, "y": 0.1 }, "content": { "cell": "9a73330c-27c1-4957-8e95-c3b42bc14a71", "part": "whole" }, "id": "6cfa5157-02f6-48e3-8ce4-89641febbe59" } } }, "2f34f0df-3ccc-4416-9d5d-cb4b075f539f": { "id": "2f34f0df-3ccc-4416-9d5d-cb4b075f539f", "prev": "3eb7f63f-04de-4f51-a240-38d5074bed6f", "regions": { "e25707b9-630e-4ece-9f66-bfcbc8342d76": { "attrs": { "height": 0.8, "width": 0.8, "x": 0.1, "y": 0.1 }, "content": { "cell": "df50f546-6d02-4383-beab-90378f16576b", "part": "whole" }, "id": "e25707b9-630e-4ece-9f66-bfcbc8342d76" } } }, "3eb7f63f-04de-4f51-a240-38d5074bed6f": { "id": "3eb7f63f-04de-4f51-a240-38d5074bed6f", "prev": 
"cc4bd43a-59ec-4127-b1d8-ebd30162207a", "regions": { "76282c28-a6ba-4a08-be6c-4f27f5b81ddf": { "attrs": { "height": 0.8, "width": 0.8, "x": 0.1, "y": 0.1 }, "content": { "cell": "53fb987f-4f42-4bf8-81ae-280ebdd19aee", "part": "whole" }, "id": "76282c28-a6ba-4a08-be6c-4f27f5b81ddf" } } }, "686bcaec-0623-4943-b227-f4e1c5975c4a": { "id": "686bcaec-0623-4943-b227-f4e1c5975c4a", "prev": "e4c6fc30-f833-4368-99fe-5297b99f1f14", "regions": { "659c021e-7f79-4612-aa7d-8f1c48f91f8b": { "attrs": { "height": 0.8, "width": 0.8, "x": 0.1, "y": 0.1 }, "content": { "cell": "4ff5f52a-2523-47f0-beba-f6c29d412e5f", "part": "whole" }, "id": "659c021e-7f79-4612-aa7d-8f1c48f91f8b" } } }, "964ac1b6-c781-47e8-89f0-1f593d473cd0": { "id": "964ac1b6-c781-47e8-89f0-1f593d473cd0", "prev": "cf8a3b4c-bbb5-4ef3-a8c6-43a1ad7eebc8", "regions": { "d22afc7e-4bbe-401e-9bd2-d12c4a103cf5": { "attrs": { "height": 0.8, "width": 0.8, "x": 0.1, "y": 0.1 }, "content": { "cell": "8ff6da45-57cd-46ca-b14a-3f560ce4d345", "part": "whole" }, "id": "d22afc7e-4bbe-401e-9bd2-d12c4a103cf5" } } }, "cc4bd43a-59ec-4127-b1d8-ebd30162207a": { "id": "cc4bd43a-59ec-4127-b1d8-ebd30162207a", "prev": "964ac1b6-c781-47e8-89f0-1f593d473cd0", "regions": { "1e6711af-7711-4579-ac7a-f893b0d86931": { "attrs": { "height": 0.8, "width": 0.8, "x": 0.1, "y": 0.1 }, "content": { "cell": "cf311809-10bf-40f7-87e1-1952342f7f35", "part": "whole" }, "id": "1e6711af-7711-4579-ac7a-f893b0d86931" } } }, "cf8a3b4c-bbb5-4ef3-a8c6-43a1ad7eebc8": { "id": "cf8a3b4c-bbb5-4ef3-a8c6-43a1ad7eebc8", "prev": "686bcaec-0623-4943-b227-f4e1c5975c4a", "regions": { "3b983e72-35fb-4d19-83b4-789a3394f61f": { "attrs": { "height": 0.8, "width": 0.8, "x": 0.1, "y": 0.1 }, "content": { "cell": "597a765d-634b-41a8-a0c6-be5c019da150", "part": "whole" }, "id": "3b983e72-35fb-4d19-83b4-789a3394f61f" } } }, "e4c6fc30-f833-4368-99fe-5297b99f1f14": { "id": "e4c6fc30-f833-4368-99fe-5297b99f1f14", "prev": "10393c05-7962-4245-9228-8b7db4eb79a1", "regions": { 
"98a6b3b6-d2db-4d8a-bb16-a4307ede4803": { "attrs": { "height": 0.8, "width": 0.8, "x": 0.1, "y": 0.1 }, "content": { "cell": "6a9d80df-1d38-4c41-849c-95e38da98cc7", "part": "whole" }, "id": "98a6b3b6-d2db-4d8a-bb16-a4307ede4803" } } }, "f1a487d8-4b0b-47df-988f-1161d66174b2": { "id": "f1a487d8-4b0b-47df-988f-1161d66174b2", "prev": "2f34f0df-3ccc-4416-9d5d-cb4b075f539f", "regions": { "2c817a32-203d-404b-8bf5-ba17f7d27034": { "attrs": { "height": 0.8, "width": 0.8, "x": 0.1, "y": 0.1 }, "content": { "cell": "81fde336-785e-461b-a751-718a5f6bff88", "part": "whole" }, "id": "2c817a32-203d-404b-8bf5-ba17f7d27034" } } } }, "themes": {} } }, "nbformat": 4, "nbformat_minor": 1 }

================================================
FILE: conditional-char-rnn/data.py
================================================
# Practical PyTorch: Generating Names with a Conditional Character-Level RNN
# https://github.com/spro/practical-pytorch

import glob
import unicodedata
import string
import random
import time
import math
import torch
from torch.autograd import Variable

# Preparing the Data

all_letters = string.ascii_letters + " .,;'-"
n_letters = len(all_letters) + 1 # Plus EOS marker
EOS = n_letters - 1

def unicode_to_ascii(s):
    return ''.join(
        c for c in unicodedata.normalize('NFD', s)
        if unicodedata.category(c) != 'Mn'
        and c in all_letters
    )

def read_lines(filename):
    lines = open(filename).read().strip().split('\n')
    return [unicode_to_ascii(line) for line in lines]

category_lines = {}
all_categories = []

for filename in glob.glob('../data/names/*.txt'):
    category = filename.split('/')[-1].split('.')[0]
    all_categories.append(category)
    lines = read_lines(filename)
    category_lines[category] = lines

n_categories = len(all_categories)

# Preparing for Training

def random_training_pair():
    category = random.choice(all_categories)
    line = random.choice(category_lines[category])
    return category, line

def make_category_input(category):
    li = all_categories.index(category)
    tensor = torch.zeros(1, n_categories)
    tensor[0][li] = 1
    return Variable(tensor)

def make_chars_input(chars):
    tensor = torch.zeros(len(chars), n_letters)
    for ci in range(len(chars)):
        char = chars[ci]
        tensor[ci][all_letters.find(char)] = 1
    tensor = tensor.view(-1, 1, n_letters)
    return Variable(tensor)

def make_target(line):
    letter_indexes = [all_letters.find(line[li]) for li in range(1, len(line))]
    letter_indexes.append(n_letters - 1) # EOS
    tensor = torch.LongTensor(letter_indexes)
    return Variable(tensor)

def random_training_set():
    category, line = random_training_pair()
    category_input = make_category_input(category)
    line_input = make_chars_input(line)
    line_target = make_target(line)
    return category_input, line_input, line_target

================================================
FILE: conditional-char-rnn/generate.py
================================================
# Practical PyTorch: Generating Names with a Conditional Character-Level RNN
# https://github.com/spro/practical-pytorch

import sys

if len(sys.argv) < 2:
    print("Usage: generate.py [language]")
    sys.exit()
else:
    language = sys.argv[1]

import torch
import torch.nn as nn
from torch.autograd import Variable

from data import *
from model import *

rnn = torch.load('conditional-char-rnn.pt')

# Generating from the Network

max_length = 20

def generate_one(category, start_char='A', temperature=0.5):
    category_input = make_category_input(category)
    chars_input = make_chars_input(start_char)
    hidden = rnn.init_hidden()

    output_str = start_char

    for i in range(max_length):
        output, hidden = rnn(category_input, chars_input[0], hidden)

        # Sample as a multinomial distribution
        output_dist = output.data.view(-1).div(temperature).exp()
        top_i = torch.multinomial(output_dist, 1)[0]

        # Stop at EOS, or add to output_str
        if top_i == EOS:
            break
        else:
            char = all_letters[top_i]
            output_str += char
            chars_input = make_chars_input(char)

    return output_str

def generate(category, start_chars='ABC'):
    for start_char in start_chars:
        print(generate_one(category, start_char))
generate(language)

================================================
FILE: conditional-char-rnn/model.py
================================================
import torch
import torch.nn as nn
from torch.autograd import Variable

# Creating the Network

class RNN(nn.Module):
    def __init__(self, category_size, input_size, hidden_size, output_size):
        super(RNN, self).__init__()
        self.category_size = category_size
        self.input_size = input_size
        self.hidden_size = hidden_size
        self.output_size = output_size

        self.i2h = nn.Linear(category_size + input_size + hidden_size, hidden_size)
        self.i2o = nn.Linear(category_size + input_size + hidden_size, output_size)
        self.o2o = nn.Linear(hidden_size + output_size, output_size)
        self.softmax = nn.LogSoftmax()

    def forward(self, category, input, hidden):
        input_combined = torch.cat((category, input, hidden), 1)
        hidden = self.i2h(input_combined)
        output = self.i2o(input_combined)
        output_combined = torch.cat((hidden, output), 1)
        output = self.o2o(output_combined)
        return output, hidden

    def init_hidden(self):
        return Variable(torch.zeros(1, self.hidden_size))

================================================
FILE: conditional-char-rnn/train.py
================================================
# Practical PyTorch: Generating Names with a Conditional Character-Level RNN
# https://github.com/spro/practical-pytorch

import glob
import unicodedata
import string
import random
import time
import math

import torch
import torch.nn as nn

from data import *
from model import *

# Training the Network

def train(category_tensor, input_line_tensor, target_line_tensor):
    hidden = rnn.init_hidden()
    optimizer.zero_grad()
    loss = 0

    for i in range(input_line_tensor.size()[0]):
        output, hidden = rnn(category_tensor, input_line_tensor[i], hidden)
        loss += criterion(output, target_line_tensor[i])

    loss.backward()
    optimizer.step()

    return output, loss.data[0] / input_line_tensor.size()[0]

def time_since(t):
    now = time.time()
    s = now - t
    m = math.floor(s / 60)
    s -= m * 60
    return '%dm %ds' % (m, s)

n_epochs = 100000
print_every = 5000
plot_every = 500
all_losses = []
loss_avg = 0 # Zero every plot_every epochs to keep a running average

hidden_size = 128
learning_rate = 0.0005

rnn = RNN(n_categories, n_letters, hidden_size, n_letters)
optimizer = torch.optim.Adam(rnn.parameters(), lr=learning_rate)
criterion = nn.CrossEntropyLoss()

start = time.time()

def save():
    torch.save(rnn, 'conditional-char-rnn.pt')

try:
    print("Training for %d epochs..." % n_epochs)
    for epoch in range(1, n_epochs + 1):
        output, loss = train(*random_training_set())
        loss_avg += loss

        if epoch % print_every == 0:
            print('%s (%d %d%%) %.4f' % (time_since(start), epoch, epoch / n_epochs * 100, loss))

        if epoch % plot_every == 0:
            all_losses.append(loss_avg / plot_every)
            loss_avg = 0

except KeyboardInterrupt:
    print("Saving before quit...")
    save()

================================================
FILE: data/names/Arabic.txt
================================================
Khoury Nahas Daher Gerges Nazari Maalouf Gerges Naifeh Guirguis Baba Sabbagh Attia Tahan Haddad Aswad Najjar Dagher Maloof Isa Asghar Nader Gaber Abboud Maalouf Zogby Srour Bahar Mustafa Hanania Daher Tuma Nahas Saliba Shamoon Handal Baba Amari Bahar Atiyeh Said Khouri Tahan Baba Mustafa Guirguis Sleiman Seif Dagher Bahar Gaber Harb Seif Asker Nader Antar Awad Srour Shadid Hajjar Hanania Kalb Shadid Bazzi Mustafa Masih Ghanem Haddad Isa Antoun Sarraf Sleiman Dagher Najjar Malouf Nahas Naser Saliba Shamon Malouf Kalb Daher Maalouf Wasem Kanaan Naifeh Boutros Moghadam Masih Sleiman Aswad Cham Assaf Quraishi Shalhoub Sabbag Mifsud Gaber Shammas Tannous Sleiman Bazzi Quraishi Rahal Cham Ghanem Ghanem Naser Baba Shamon Almasi Basara Quraishi Bata Wasem Shamoun Deeb Touma Asfour Deeb Hadad Naifeh Touma Bazzi Shamoun Nahas Haddad Arian Kouri Deeb Toma Halabi Nazari Saliba Fakhoury Hadad Baba Mansour Sayegh Antar Deeb Morcos Shalhoub Sarraf Amari Wasem Ganim Tuma Fakhoury Hadad Hakimi Nader Said Ganim Daher Ganem Tuma Boutros
Aswad Sarkis Daher Toma Boutros Kanaan Antar Gerges Kouri Maroun Wasem Dagher Naifeh Bishara Ba Cham Kalb Bazzi Bitar Hadad Moghadam Sleiman Shamoun Antar Atiyeh Koury Nahas Kouri Maroun Nassar Sayegh Haik Ghanem Sayegh Salib Cham Bata Touma Antoun Antar Bata Botros Shammas Ganim Sleiman Seif Moghadam Ba Tannous Bazzi Seif Salib Hadad Quraishi Halabi Essa Bahar Kattan Boutros Nahas Sabbagh Kanaan Sayegh Said Botros Najjar Toma Bata Atiyeh Halabi Tannous Kouri Shamoon Kassis Haddad Tuma Mansour Antar Kassis Kalb Basara Rahal Mansour Handal Morcos Fakhoury Hadad Morcos Kouri Quraishi Almasi Awad Naifeh Koury Asker Maroun Fakhoury Sabbag Sarraf Shamon Assaf Boutros Malouf Nassar Qureshi Ghanem Srour Almasi Qureshi Ghannam Mustafa Najjar Kassab Shadid Shamoon Morcos Atiyeh Isa Ba Baz Asker Seif Asghar Hajjar Deeb Essa Qureshi Abboud Ganem Haddad Koury Nassar Abadi Toma Tannous Harb Issa Khouri Mifsud Kalb Gaber Ganim Boulos Samaha Haddad Sabbag Wasem Dagher Rahal Atiyeh Antar Asghar Mansour Awad Boulos Sarraf Deeb Abadi Nazari Daher Gerges Shamoon Gaber Amari Sarraf Nazari Saliba Naifeh Nazari Hakimi Shamon Abboud Quraishi Tahan Safar Hajjar Srour Gaber Shalhoub Attia Safar Said Ganem Nader Asghar Mustafa Said Antar Botros Nader Ghannam Asfour Tahan Mansour Attia Touma Najjar Kassis Abboud Bishara Bazzi Shalhoub Shalhoub Safar Khoury Nazari Sabbag Sleiman Atiyeh Kouri Bitar Zogby Ghanem Assaf Abadi Arian Shalhoub Khoury Morcos Shamon Wasem Abadi Antoun Baz Naser Assaf Saliba Nader Mikhail Naser Daher Morcos Awad Nahas Sarkis Malouf Mustafa Fakhoury Ghannam Shadid Gaber Koury Atiyeh Shamon Boutros Sarraf Arian Fakhoury Abadi Kassab Nahas Quraishi Mansour Samaha Wasem Seif Fakhoury Saliba Cham Bahar Shamoun Essa Shamon Asfour Bitar Cham Tahan Tannous Daher Khoury Shamon Bahar Quraishi Ghannam Kassab Zogby Basara Shammas Arian Sayegh Naifeh Mifsud Sleiman Arian Kassis Shamoun Kassis Harb Mustafa Boulos Asghar Shamon Kanaan Atiyeh Kassab Tahan Bazzi Kassis Qureshi Basara 
Shalhoub Sayegh Haik Attia Maroun Kassis Sarkis Harb Assaf Kattan Antar Sleiman Touma Sarraf Bazzi Boulos Baz Issa Shamon Shadid Deeb Sabbag Wasem Awad Mansour Saliba Fakhoury Arian Bishara Dagher Bishara Koury Fakhoury Naser Nader Antar Gerges Handal Hanania Shadid Gerges Kassis Essa Assaf Shadid Seif Shalhoub Shamoun Hajjar Baba Sayegh Mustafa Sabbagh Isa Najjar Tannous Hanania Ganem Gerges Fakhoury Mifsud Nahas Bishara Bishara Abadi Sarkis Masih Isa Attia Kalb Essa Boulos Basara Halabi Halabi Dagher Attia Kassis Tuma Gerges Ghannam Toma Baz Asghar Zogby Aswad Hadad Dagher Naser Shadid Atiyeh Zogby Abboud Tannous Khouri Atiyeh Ganem Maalouf Isa Maroun Issa Khouri Harb Nader Awad Nahas Said Baba Totah Ganim Handal Mansour Basara Malouf Said Botros Samaha Safar Tahan Botros Shamoun Handal Sarraf Malouf Bishara Aswad Khouri Baz Asker Toma Koury Gerges Bishara Boulos Najjar Aswad Shamon Kouri Srour Assaf Tannous Attia Mustafa Kattan Asghar Amari Shadid Said Bazzi Masih Antar Fakhoury Shadid Masih Handal Sarraf Kassis Salib Hajjar Totah Koury Totah Mustafa Sabbagh Moghadam Toma Srour Almasi Totah Maroun Kattan Naifeh Sarkis Mikhail Nazari Boutros Guirguis Gaber Kassis Masih Hanania Maloof Quraishi Cham Hadad Tahan Bitar Arian Gaber Baz Mansour Kalb Sarkis Attia Antar Asfour Said Essa Koury Hadad Tuma Moghadam Sabbagh Amari Dagher Srour Antoun Sleiman Maroun Tuma Nahas Hanania Sayegh Amari Sabbagh Said Cham Asker Nassar Bitar Said Dagher Safar Khouri Totah Khoury Salib Basara Abboud Baz Isa Cham Amari Mifsud Hadad Rahal Khoury Bazzi Basara Totah Ghannam Koury Malouf Zogby Zogby Boutros Nassar Handal Hajjar Maloof Abadi Maroun Mifsud Kalb Amari Hakimi Boutros Masih Kattan Haddad Arian Nazari Assaf Attia Wasem Gerges Asker Tahan Fakhoury Shadid Sarraf Attia Naifeh Aswad Deeb Tannous Totah Cham Baba Najjar Hajjar Shamoon Handal Awad Guirguis Awad Ganem Naifeh Khoury Hajjar Moghadam Mikhail Ghannam Guirguis Tannous Kanaan Handal Khoury Kalb Qureshi Najjar Atiyeh Gerges 
Nassar Tahan Hadad Fakhoury Salib Wasem Bitar Fakhoury Attia Awad Totah Deeb Touma Botros Nazari Nahas Kouri Ghannam Assaf Asfour Sarraf Naifeh Toma Asghar Abboud Issa Sabbag Sabbagh Isa Koury Kattan Shamoon Rahal Kalb Naser Masih Sayegh Dagher Asker Maroun Dagher Sleiman Botros Sleiman Harb Tahan Tuma Said Hadad Samaha Harb Cham Atiyeh Haik Malouf Bazzi Harb Malouf Ghanem Cham Asghar Samaha Khouri Nassar Rahal Baz Kalb Rahal Gerges Cham Sayegh Shadid Morcos Shamoon Hakimi Shamoon Qureshi Ganim Shadid Khoury Boutros Hanania Antoun Naifeh Deeb Samaha Awad Asghar Awad Saliba Shamoun Mikhail Hakimi Mikhail Cham Halabi Sarkis Kattan Nazari Safar Morcos Khoury Essa Nassar Haik Shadid Fakhoury Najjar Arian Botros Daher Saliba Saliba Kattan Hajjar Nader Daher Nassar Maroun Harb Nassar Antar Shammas Toma Antar Koury Nader Botros Bahar Najjar Maloof Salib Malouf Mansour Bazzi Atiyeh Kanaan Bishara Hakimi Saliba Tuma Mifsud Hakimi Assaf Nassar Sarkis Bitar Isa Halabi Shamon Qureshi Bishara Maalouf Srour Boulos Safar Shamoun Ganim Abadi Koury Shadid Zogby Boutros Shadid Hakimi Bazzi Isa Totah Salib Shamoon Gaber Antar Antar Najjar Fakhoury Malouf Salib Rahal Boulos Attia Said Kassis Bahar Bazzi Srour Antar Nahas Kassis Samaha Quraishi Asghar Asker Antar Totah Haddad Maloof Kouri Basara Bata Antar Shammas Arian Gerges Seif Almasi Tuma Shamoon Khoury Hakimi Abboud Baz Seif Issa Nazari Harb Shammas Amari Totah Malouf Sarkis Naser Zogby Handal Naifeh Cham Hadad Gerges Kalb Shalhoub Saliba Tannous Tahan Tannous Kassis Shadid Sabbag Tahan Abboud Nahas Shamoun Dagher Botros Amari Maalouf Awad Gerges Shamoon Haddad Salib Attia Kassis Sleiman Maloof Maroun Koury Asghar Kalb Asghar Touma Ganim Rahal Haddad Zogby Mansour Guirguis Touma Maroun Tannous Hakimi Baba Toma Botros Sarraf Koury Sarraf Nassar Boutros Guirguis Qureshi Aswad Basara Toma Tuma Mansour Ba Naifeh Mikhail Amari Shamon Malouf Boutros Hakimi Srour Morcos Halabi Bazzi Abadi Shamoun Haddad Baz Baba Hadad Saliba Haddad 
Maalouf Bitar Shammas Totah Said Najjar Mikhail Samaha Boulos Kalb Shamon Shamoun Seif Touma Hajjar Hadad Atiyeh Totah Mansour Nazari Quraishi Ba Sarkis Gerges Shalhoub Nazari Issa Salib Shalhoub Nassar Guirguis Daher Hakimi Attia Cham Isa Hakimi Amari Boutros Sarraf Antoun Botros Haddad Tahan Bishara Shalhoub Safar Haik Tahan Seif Awad Antoun Atiyeh Samaha Assaf Guirguis Hadad Sayegh Khouri Asghar Tannous Maalouf Khouri Hajjar Abadi Ghanem Salib Botros Bitar Bishara Quraishi Boutros Aswad Srour Shamon Abboud Almasi Baba Tahan Essa Sabbag Issa Abadi Abboud Bazzi Nader Bahar Ghannam Asghar Gaber Sayegh Guirguis Srour Asghar Quraishi Sayegh Rahal Tahan Morcos Cham Kanaan Nahas Essa Mifsud Kouri Isa Saliba Asfour Guirguis Isa Bishara Assaf Naser Moghadam Kalb Baba Guirguis Naifeh Bitar Samaha Abboud Hadad Ghannam Hanania Shadid Totah Tahan Toma Maloof Botros Issa Deeb Nahas Khoury Sayegh Harb Said Guirguis Nader Harb Atiyeh Zogby Basara Nassar Kalb Khoury Mifsud Wasem Handal Ganim Harb Ganim Malouf Sayegh Khoury Sabbag Sabbag Boulos Malouf Gaber Shammas Fakhoury Halabi Haddad Asker Morcos Hanania Amari Kassab Malouf Khouri Moghadam Totah Maloof Atiyeh Abadi Baz Khoury Arian Handal Dagher Awad Atiyeh Arian Khoury Amari Attia Ganim Nader Dagher Sabbag Halabi Khouri Khouri Saliba Mifsud Koury Awad Bahar Mustafa Kassis Gaber Mifsud Bishara Asker Nahas Wasem Sleiman Bata Daher Antar Isa Ganim Rahal Toma Rahal Shamoun Maloof Hakimi Safar Gerges Hanania Koury Assaf Safar Gerges Ganim Morcos Awad Arian Tahan Sleiman Asker Boulos Koury Mifsud Sabbag Dagher Bazzi Mustafa Almasi Handal Isa Guirguis Sayegh Ganim Ghanem Toma Mustafa Basara Bitar Samaha Mifsud Tahan Issa Salib Khoury Hadad Haik Gaber Mansour Hakimi Ba Mustafa Gaber Kattan Koury Awad Maalouf Masih Harb Atiyeh Zogby Nahas Assaf Morcos Ganem Ganem Wasem Fakhoury Ghanem Salib Khouri Maloof Khouri Shalhoub Issa Najjar Kassis Mustafa Sayegh Kassis Hajjar Nader Sarkis Tahan Haddad Antar Sayegh Zogby Mifsud Kassab Hanania 
Bishara Shamoun Abboud Mustafa Sleiman Abadi Sarraf Zogby Daher Issa Nazari Shamon Tuma Asghar Morcos Mifsud Cham Sarraf Antar Ba Aswad Mikhail Kouri Mikhail Awad Halabi Moghadam Mikhail Naifeh Kattan Shammas Malouf Najjar Srour Masih Fakhoury Khouri Assaf Mifsud Malouf Abboud Shamoon Mansour Halabi Ganem Deeb Wasem Kalb Safar Tuma Fakhoury Toma Guirguis Kassab Nader Handal Baba Fakhoury Haik Guirguis Seif Almasi Shamon Ba Salib Zogby Koury Najjar Atiyeh Morcos Antar Awad Hadad Maroun Touma Almasi Kassis Arian Malouf Koury Sarraf Hadad Bata Tuma Sarkis Quraishi Gaber Abadi Nader Bazzi Ghannam Botros Deeb Awad Kattan Kanaan Sarraf Nahas Assaf Shadid Gaber Samaha Harb Samaha Zogby Atiyeh Mustafa Hanania Isa Almasi Bitar Fakhoury Moghadam Handal Seif Mustafa Rahal Antoun Kassab Bazzi Hadad Nader Tuma Basara Totah Nassar Seif Nassar Daher Daher Maalouf Rahal Quraishi Hadad Bahar Sabbag Halabi Tuma Antoun Boutros Gerges Bishara Baba Zogby Nahas Atiyeh Rahal Sabbagh Bitar Botros Tuma Ganim Handal Daher Boutros Khouri Maroun Mifsud Arian Safar Koury Deeb Shamoun Cham Asghar Morcos Tahan Salib Aswad Shadid Saliba Ganim Haik Kattan Antoun Hajjar Toma Toma Antoun Tahan Haik Kassis Shamoun Shammas Kassis Shadid Samaha Sarraf Nader Ganem Zogby Maloof Kalb Gerges Seif Nahas Arian Asfour Hakimi Ba Handal Abadi Harb Nader Asghar Sabbag Touma Amari Kanaan Hajjar Said Sarraf Haddad Mifsud Shammas Sleiman Asfour Deeb Kattan Naser Said Bishara Harb Morcos Sayegh Said Naser Aswad Seif Kouri Dagher Shamon Hadad Handal Tuma Shamon Hakimi Rahal Hadad Ghannam Almasi Daher Handal Malouf Mansour Sabbagh Sabbag Saliba Haddad Tahan Khoury Harb Ganim Mansour Ganem Handal Handal Antar Asfour Kouri Cham Masih Saliba Qureshi Daher Safar Assaf Harb Abboud Haik Ghannam Maalouf Daher Najjar Mifsud Daher Amari Saliba Kanaan Guirguis Atiyeh Sleiman Mikhail Arian Wasem Attia Nassar Cham Koury Baba Guirguis Morcos Quraishi Seif Sarkis Moghadam Ba Boutros Nader Gerges Salib Salib Guirguis Essa Guirguis 
Antoun Kassis Abboud Najjar Aswad Srour Mifsud Ghanem Bitar Ghannam Asghar Deeb Kalb Nader Srour Attia Shamon Bata Nahas Gerges Kanaan Kassis Sarkis Maloof Almasi Nassar Saliba Arian Ghanem Awad Naifeh Boutros Fakhoury Sabbag Antar Tahan Mustafa Almasi Shammas Totah Boutros Cham Shamon Ganim Ghanem Assaf Khoury Naifeh Bahar Quraishi Bishara Cham Asfour Ghannam Khoury Sayegh Hanania Maroun Kouri Sarkis Haik Basara Salib Shammas Fakhoury Nahas Ganim Botros Arian Shalhoub Hadad Mustafa Shalhoub Kassab Asker Botros Kanaan Gaber Bazzi Sayegh Nassar Kassis Fakhoury Kassis Amari Sarraf Mifsud Salib Samaha Mustafa Asfour Najjar Essa Naifeh Cham Sarraf Moghadam Fakhoury Assaf Almasi Asghar Nader Kalb Shamoun Gerges Wasem Morcos Nader Said Safar Quraishi Samaha Kassab Deeb Sarraf Rahal Naifeh Ba Nazari Ganim Arian Asker Touma Kassab Tahan Mansour Morcos Shammas Baba Morcos Isa Moghadam Ganem Baz Totah Nader Kouri Guirguis Koury Zogby Basara Baz Deeb Mustafa Shadid Awad Sarraf Quraishi Kanaan Tahan Ghannam Shammas Abboud Najjar Bishara Tuma Srour Mifsud Srour Hajjar Qureshi Bitar Hadad Almasi Wasem Abadi Maroun Baz Koury Ganem Awad Maalouf Mifsud Haik Sleiman Arian Seif Mansour Koury Kattan Koury Aswad Ba Rahal Zogby Bahar Fakhoury Samaha Sarraf Mifsud Antar Moghadam Botros Srour Sabbag Sayegh Rahal Attia Naifeh Saliba Mustafa Amari Issa Masih Khouri Haddad Kalb Bazzi Salib Hanania Shamoon Tuma Cham Antoun Wasem Kouri Ghanem Wasem Khoury Assaf Ganem Seif Nader Essa Shadid Botros Sleiman Bishara Basara Maalouf Issa Nassar Moghadam Ganim Kassis Antoun Said Khouri Salib Baz Sarkis Tuma Naifeh Najjar Asker Khouri Mustafa Najjar Sabbag Malouf Wasem Maalouf Gaber Said Zogby Bahar Hanania Shalhoub Abadi Handal Qureshi Kanaan Abboud Mifsud Touma Ganim Bishara Bazzi Gaber Haik Ghanem Sarraf Sarkis Mustafa Baz Kanaan Nazari Bahar Malouf Quraishi Kattan Arian Shadid Tuma Nader Khoury Safar Wasem Toma Haddad Quraishi Nassar Kanaan Gaber Haddad Rahal Koury Harb Mikhail Dagher Shadid 
Boutros Mikhail Khouri Nader Issa Harb Dagher Gerges Morcos Essa Fakhoury Tuma Kattan Totah Qureshi Nahas Bitar Tahan Daher Shammas Kouri Ganim Daher Awad Malouf Mustafa Aswad ================================================ FILE: data/names/Chinese.txt ================================================ Ang Au-Yong Bai Ban Bao Bei Bian Bui Cai Cao Cen Chai Chaim Chan Chang Chao Che Chen Cheng Cheung Chew Chieu Chin Chong Chou Chu Cui Dai Deng Ding Dong Dou Duan Eng Fan Fei Feng Foong Fung Gan Gauk Geng Gim Gok Gong Guan Guang Guo Gwock Han Hang Hao Hew Hiu Hong Hor Hsiao Hua Huan Huang Hui Huie Huo Jia Jiang Jin Jing Joe Kang Kau Khoo Khu Kong Koo Kwan Kwei Kwong Lai Lam Lang Lau Law Lew Lian Liao Lim Lin Ling Liu Loh Long Loong Luo Mah Mai Mak Mao Mar Mei Meng Miao Min Ming Moy Mui Nie Niu Ou-Yang Ow-Yang Pan Pang Pei Peng Ping Qian Qin Qiu Quan Que Ran Rao Rong Ruan Sam Seah See Seow Seto Sha Shan Shang Shao Shaw She Shen Sheng Shi Shu Shuai Shui Shum Siew Siu Song Sum Sun Sze Tan Tang Tao Teng Teoh Thean Thian Thien Tian Tong Tow Tsang Tse Tsen Tso Tze Wan Wang Wei Wen Weng Won Wong Woo Xiang Xiao Xie Xing Xue Xun Yan Yang Yao Yap Yau Yee Yep Yim Yin Ying Yong You Yuan Zang Zeng Zha Zhan Zhang Zhao Zhen Zheng Zhong Zhou Zhu Zhuo Zong Zou Bing Chi Chu Cong Cuan Dan Fei Feng Gai Gao Gou Guan Gui Guo Hong Hou Huan Jian Jiao Jin Jiu Juan Jue Kan Kuai Kuang Kui Lao Liang Lu Luo Man Nao Pian Qiao Qing Qiu Rang Rui She Shi Shuo Sui Tai Wan Wei Xian Xie Xin Xing Xiong Xuan Yan Yin Ying Yuan Yue Yun Zha Zhai Zhang Zhi Zhuan Zhui ================================================ FILE: data/names/Czech.txt ================================================ Abl Adsit Ajdrna Alt Antonowitsch Antonowitz Bacon Ballalatak Ballaltick Bartonova Bastl Baroch Benesch Betlach Biganska Bilek Blahut Blazek Blazek Blazejovsky Blecha Bleskan Blober Bock Bohac Bohunovsky Bolcar Borovka Borovski Borowski Borovsky Brabbery Brezovjak Brousil Bruckner Buchta Cablikova Camfrlova Cap Cerda Cermak 
Chermak Cermak Cernochova Cernohous Cerny Cerney Cerny Cerv Cervenka Chalupka Charlott Chemlik Chicken Chilar Chromy Cihak Clineburg Klineberg Cober Colling Cvacek Czabal Damell Demall Dehmel Dana Dejmal Dempko Demko Dinko Divoky Dolejsi Dolezal Doljs Dopita Drassal Driml Duyava Dvorak Dziadik Egr Entler Faltysek Faltejsek Fencl Fenyo Fillipova Finfera Finferovy Finke Fojtikova Fremut Friedrich Frierdich Fritsch Furtsch Gabrisova Gavalok Geier Georgijev Geryk Giersig Glatter Glockl Grabski Grozmanova Grulich Grygarova Hadash Hafernik Hajek Hajicek Hajkova Hana Hanek Hanek Hanika Hanusch Hanzlick Handzlik Hanzlik Harger Hartl Havlatova Havlice Hawlata Heidl Herback Herodes Hiorvst Hladky Hlavsa Hnizdil Hodowal Hodoval Holan Holub Homulka Hora Hovanec Hrabak Hradek Hrdy Hrula Hruska Hruskova Hudecek Husk Hynna Jaluvka Janca Janicek Jenicek Janacek Janick Janoch Janosik Janutka Jares Jarzembowski Jedlicka Jelinek Jindra Jirava Jirik Jirku Jirovy Jobst Jonas Kacirek Kafka Kafka Kaiser Kanak Kaplanek Kara Karlovsky Kasa Kasimor Kazimor Kazmier Katschker Kauphsman Kenzel Kerner Kesl Kessel Kessler Khork Kirchma Klein Klemper Klimes Kober Koberna Koci Kocian Kocian Kofron Kolacny Koliha Kolman Koma Komo Coma Konarik Kopp Kopecky Korandak Korycan Korycansky Kosko Kouba Kouba Koukal Koza Kozumplikova Kratschmar Krawiec Kreisinger Kremlacek Kremlicka Kreutschmer Krhovsky Krivan Krivolavy Kriz Kruessel Krupala Krytinar Kubin Kucera Kucharova Kudrna Kuffel Kupfel Kofel Kulhanek Kunik Kurtz Kusak Kvasnicka Lawa Linart Lind Lokay Loskot Ludwig Lynsmeier Macha Machacek Macikova Malafa Malec Malecha Maly Marek Marik Marik Markytan Matejka Matjeka Matocha Maxa/B Mayer Meier Merta Meszes Metjeka Michalovic Michalovicova Miksatkova Mojzis Mojjis Mozzis Molcan Monfort MonkoAustria Morava Morek Muchalon Mudra Muhlbauer Nadvornizch Nadwornik Navara Navratil Navratil Navrkal Nekuza Nemec Nemecek Nestrojil Netsch Neusser Neisser Naizer Novak Nowak Novotny Novy Novy Oborny Ocasek Ocaskova 
Oesterreicher Okenfuss Olbrich Ondrisek Opizka Opova Opp Osladil Ozimuk Pachr Palzewicz Panek Patril Pavlik Pavlicka Pavlu Pawlak Pear Peary Pech Peisar Paisar Paiser Perevuznik Perina Persein Petrezelka Petru Pesek Petersen Pfeifer Picha Pillar Pellar Piller Pinter Pitterman Planick Piskach Plisek Plisko Pokorny Ponec Ponec Prachar Praseta Prchal Prehatney Pretsch Prill Psik Pudel Purdes Quasninsky Raffel Rafaj Ransom Rezac Riedel Riha Riha Ritchie Rozinek Ruba Ruda Rumisek Ruzicka Rypka Rebka Rzehak Sabol Safko Samz Sankovsky Sappe Sappe Sarna Satorie Savchak Svotak Swatchak Svocak Svotchak Schallom Schenk Schlantz Schmeiser Schneider Schmied Schubert Schwarz Schwartz Sedmik Sedmikova Seger Sekovora Semick Serak Sherak Shima Shula Siegl Silhan Simecek Simodines Simonek Sip Sitta Skala Skeril Skokan Skomicka Skwor Slapnickova Slejtr Slepicka Slepica Slezak Slivka Smith Snelker Sokolik Soucek Soukup Soukup Spicka Spoerl Sponer Srda Srpcikova Stangl Stanzel Stary Staska Stedronsky Stegon Sztegon Steinborn Stepan Stites Stluka Stotzky StrakaO Stramba Stupka Subertova Suchanka Sula Svejda Svejkovsky Svoboda Tejc Tikal Tykal Till Timpe Timpy Toman Tomanek Tomasek Tomes Trampotova Trampota Treblik Trnkova Uerling Uhlik Urbanek Urbanek Urbanovska Urista Ustohal Vaca Vaculova Vavra Vejvoda Veverka Victor Vlach Vlach Vlasak Vlasek Volcik Voneve Votke Vozab Vrazel Vykruta Wykruta Waclauska Weichert Weineltk Weisener Wiesner Wizner Weiss Werlla Whitmire Widerlechner Wilchek Wondracek Wood Zajicek Zak Zajicek Zaruba Zaruba Zelinka Zeman Zimola Zipperer Zitka Zoucha Zwolenksy ================================================ FILE: data/names/Dutch.txt ================================================ Aalsburg Aalst Aarle Achteren Achthoven Adrichem Aggelen Agteren Agthoven Akkeren Aller Alphen Alst Altena Althuis Amelsvoort Amersvoort Amstel Andel Andringa Ankeren Antwerp Antwerpen Apeldoorn Arendonk Asch Assen Baarle Bokhoven Breda Bueren Buggenum Buiren Buren Can Cann Canne 
Daal Daalen Dael Daele Dale Dalen Laar Vliert Akker Andel Denend Aart Beek Berg Hout Laar See Stoep Veen Ven Venn Venne Vennen Zee Donk Haanraads Haanraats Haanrade Haanrath Haenraats Haenraets Hanraets Hassel Hautem Hautum Heel Herten Hofwegen Horn Hout Houte Houtem Houten Houttum Houtum Kan Kann Kanne Kappel Karl Kikkert Klein Klerk Klerken Klerks Klerkse Klerkx Klerx Kloet Kloeten Kloeter Koeman Koemans Kolen Kolijn Kollen Koning Kool Koole Koolen Kools Kouman Koumans Krantz Kranz Krusen Kuijpers Kuiper Kuipers Laar Langbroek Laren Lauwens Lauwers Leeuwenhoeck Leeuwenhoek Leeuwenhoek Lucas Lucassen Lyon Maas Maes Maessen Marquering Marqueringh Marquerink Mas Meeuwe Meeuwes Meeuwessen Meeuweszen Meeuwis Meeuwissen Meeuwsen Meisner Merckx Mertens Michel Middelburg Middlesworth Mohren Mooren Mulder Muyskens Nagel Nelissen Nifterick Nifterick Nifterik Nifterik Niftrik Niftrik Offermans Ogterop Ogtrop Oirschot Oirschotten Oomen Oorschot Oorschot Ophoven Otten Pander Panders Paulis Paulissen Peerenboom Peeters Peij Pender Penders Pennders Penner Penners Peter Peusen Pey Philips Prinsen Rademaker Rademakers Ramaaker Ramaker Ramakers Ramecker Rameckers Raske Reijnder Reijnders Reinder Reinders Reynder Reynders Richard Rietveld Rijnder Rijnders Robert Roggeveen Roijacker Roijackers Roijakker Roijakkers Romeijn Romeijnders Romeijnsen Romijn Romijnders Romijnsen Rompa Rompa Rompaeij Rompaey Rompaij Rompay Rompaye Rompu Rompuy Rooiakker Rooiakkers Rooijakker Rooijakkers Roosa Roosevelt Rossem Rossum Rumpade Rutten Ryskamp Samson Sanna Schenck Schermer Schneider Schneiders Schneijder Schneijders Schoonenburg Schoonraad Schoorel Schoorel Schoorl Schorel Schrijnemakers Schuyler Schwarzenberg Seeger Seegers Seelen Segers Segher Seghers Severijns Severins Sevriens Silje Simon Simonis Slootmaekers Smeets Smets Smit Smits Snaaijer Snaijer Sneiders Sneijder Sneijders Sneijer Sneijers Snell Snider Sniders Snijder Snijders Snyder Snyders Specht Spijker Spiker Ter Avest Teunissen 
Theunissen Tholberg Tillens Tunison Tunneson Vandale Vandroogenbroeck Vann ================================================ FILE: data/names/English.txt ================================================ Abbas Abbey Abbott Abdi Abel Abraham Abrahams Abrams Ackary Ackroyd Acton Adair Adam Adams Adamson Adanet Addams Adderley Addinall Addis Addison Addley Aderson Adey Adkins Adlam Adler Adrol Adsett Agar Ahern Aherne Ahmad Ahmed Aikman Ainley Ainsworth Aird Airey Aitchison Aitken Akhtar Akram Alam Alanson Alber Albert Albrighton Albutt Alcock Alden Alder Aldersley Alderson Aldred Aldren Aldridge Aldworth Alesbury Alexandar Alexander Alexnader Alford Algar Ali Alker Alladee Allam Allan Allard Allaway Allcock Allcott Alldridge Alldritt Allen Allgood Allington Alliott Allison Allkins Allman Allport Allsop Allum Allwood Almond Alpin Alsop Altham Althoff Alves Alvey Alway Ambrose Amesbury Amin Amner Amod Amor Amos Anakin Anderson Andersson Anderton Andrew Andrews Angus Anker Anley Annan Anscombe Ansell Anstee Anthony Antic Anton Antony Antram Anwar Appleby Appleton Appleyard Apsley Arah Archer Ardern Arkins Armer Armitage Armour Armsden Armstrong Arnall Arnett Arnold Arnott Arrowsmith Arscott Arthur Artliff Ashbridge Ashbrook Ashby Ashcroft Ashdown Ashe Asher Ashford Ashley Ashman Ashton Ashurst Ashwell Ashworth Askew Aslam Asom Aspey Aspin Aspinall Astbury Astle Astley Aston Atherley Atherstone Atherton Atkin Atkins Atkinson Attard Atter Atterbury Atterton Attewell Attrill Attwood Auberton Auborn Aubrey Austen Austin Auton Avenue Avery Aves Avis Awad Axon Aylett Ayley Ayliffe Ayling Aylott Aylward Ayres Ayton Aziz Bacon Bailey Bain Bainbridge Baines Bains Baird Baker Baldwin Bale Ball Ballantyne Ballard Bamford Bancroft Banks Banner Bannister Barber Barclay Barker Barlow Barnard Barnes Barnett Baron Barr Barrett Barron Barrow Barry Bartlett Barton Bass Bassett Batchelor Bate Bateman Bates Batt Batten Batty Baxter Bayliss Beadle Beal Beale Beamish Bean Bear Beattie Beatty 
Beaumont Beck Bedford Beech Beer Begum Bell Bellamy Benfield Benjamin Bennett Benson Bentley Berger Bernard Berry Best Bethell Betts Bevan Beveridge Bickley Biddle Biggs Bill Bing Bingham Binnington Birch Bird Bishop Bithell Black Blackburn Blackman Blackmore Blackwell Blair Blake Blakeley Blakey Blanchard Bland Bloggs Bloom Blundell Blythe Bob Boden Boland Bolton Bond Bone Bonner Boon Booth Borland Bostock Boulton Bourne Bouvet Bowden Bowen Bower Bowers Bowes Bowler Bowles Bowman Boyce Boyd Boyle Bracey Bradbury Bradley Bradshaw Brady Brain Braithwaite Bramley Brandrick Bray Breen Brelsford Brennan Brett Brewer Bridges Briggs Bright Bristow Britton Broadbent Broadhurst Broadley Brock Brook Brooke Brooker Brookes Brookfield Brooks Broomfield Broughton Brown Browne Browning Bruce Brunet Brunton Bryan Bryant Bryson Buchan Buchanan Buck Buckingham Buckley Budd Bugg Bull Bullock Burch Burden Burdett Burford Burge Burgess Burke Burland Burman Burn Burnett Burns Burr Burrows Burt Burton Busby Bush Butcher Butler Butt Butter Butterworth Button Buxton Byrne Caddy Cadman Cahill Cain Cairns Caldwell Callaghan Callow Calveley Calvert Cameron Campbell Cann Cannon Caplan Capper Carey Carling Carmichael Carnegie Carney Carpenter Carr Carrington Carroll Carruthers Carson Carter Cartwright Carty Casey Cashmore Cassidy Caton Cavanagh Cawley Chadwick Chalmers Chamberlain Chambers Chan Chance Chandler Chantler Chaplin Chapman Chappell Chapple Charge Charles Charlton Charnock Chase Chatterton Chauhan Cheetham Chelmy Cherry Cheshire Chester Cheung Chidlow Child Childs Chilvers Chisholm Chong Christie Christy Chung Church Churchill Clamp Clancy Clark Clarke Clarkson Clay Clayton Cleary Cleaver Clegg Clements Cliff Clifford Clifton Close Clough Clowes Coates Coburn Cochrane Cockburn Cockle Coffey Cohen Cole Coleman Coles Coll Collard Collett Colley Collier Collingwood Collins Collinson Colman Compton Conneely Connell Connelly Connolly Connor Conrad Conroy Conway Cook Cooke Cookson 
Coomber Coombes Cooper Cope Copeland Copland Copley Corbett Corcoran Core Corlett Cormack Corner Cornish Cornock Corr Corrigan Cosgrove Costa Costello Cotter Cotterill Cotton Cottrell Couch Coulson Coulter Court Cousin Cousins Cove Cowan Coward Cowell Cowie Cowley Cox Coyle Crabb Crabtree Cracknell Craig Crane Craven Crawford Crawley Creasey Cresswell Crew Cripps Crisp Crocker Croft Crofts Cronin Crook Crosby Cross Crossland Crossley Crouch Croucher Crow Crowe Crowley Crown Crowther Crump Cullen Cumming Cummings Cummins Cunningham Curley Curran Currie Curry Curtis Curwood Cutts D arcy Dacey Dack Dalby Dale Daley Dallas Dalton Daly Dalzell Damon Danby Dandy Daniel Daniells Daniels Danks Dann Darby Darbyshire Darcy Dardenne Darlington Darr Daugherty Davenport Davey David Davidson Davie Davies Davis Davison Davy Dawe Dawes Dawkins Dawson Day Dayman De ath Deacon Deakin Dean Deane Deans Debenham Deegan Deeley Deighton Delamarre Delaney Dell Dempsey Dempster Denby Denham Denis Denney Dennis Dent Denton Depp Dermody Derrick Derrien Dervish Desai Devaney Devenish Deverell Devine Devlin Devon Devonport Dewar Dexter Diamond Dibble Dick Dickens Dickenson Dicker Dickinson Dickson Dillon Dimmock Dingle Dipper Dixon Dobbin Dobbins Doble Dobson Docherty Docker Dodd Dodds Dodson Doherty Dolan Dolcy Dolman Dolton Donald Donaldson Donkin Donlan Donn Donnachie Donnelly Donoghue Donohoe Donovan Dooley Doolin Doon Doors Dora Doran Dorman Dornan Dorrian Dorrington Dougal Dougherty Doughty Douglas Douthwaite Dove Dover Dowell Dowler Dowling Down Downer Downes Downey Downie Downing Downs Downton Dowson Doyle Drabble Drain Drake Draper Drew Drewett Dreyer Driffield Drinkwater Driscoll Driver Drummond Drury Drysdale Dubois Duck Duckworth Ducon Dudley Duff Duffield Duffin Duffy Dufour Duggan Duke Dukes Dumont Duncan Dundon Dunford Dunkley Dunlop Dunmore Dunn Dunne Dunnett Dunning Dunsford Dupont Durand Durant Durber Durham Durrant Dutt Duval Duvall Dwyer Dyde Dyer Dyerson Dykes Dymond 
Dymott Dyson Eade Eadie Eagle Eales Ealham Ealy Eames Eansworth Earing Earl Earle Earley Easdale Easdown Easen Eason East Eastaugh Eastaway Eastell Easterbrook Eastham Easton Eastwood Eatherington Eaton Eaves Ebbs Ebden Ebdon Ebeling Eburne Eccles Eccleston Ecclestone Eccott Eckersall Eckersley Eddison Eddleston Eddy Eden Edeson Edgar Edge Edgell Edgerton Edgley Edgson Edkins Edler Edley Edlington Edmond Edmonds Edmondson Edmunds Edmundson Edney Edon Edwards Edwick Eedie Egan Egerton Eggby Eggison Eggleston Eglan Egleton Eglin Eilers Ekin Elbutt Elcock Elder Eldeston Eldridge Eley Elfman Elford Elkin Elkington Ellam Ellans Ellard Elleray Ellerby Ellershaw Ellery Elliman Elling Ellingham Elliot Elliott Ellis Ellison Elliston Ellrott Ellwood Elmer Elmes Elmhirst Elmore Elms Elphick Elsdon Elsmore Elson Elston Elstone Eltis Elven Elvin Elvins Elwell Elwood Elworthy Elzer Emberey Emberson Embleton Emerick Emerson Emery Emmanuel Emmerson Emmery Emmett Emmings Emmins Emmons Emmott Emms Emsden Endroe England English Ennis Ennos Enright Enticott Entwistle Epsom Epton Ernest Erridge Errington Errity Esan Escott Eskins Eslick Espley Essam Essan Essop Estlick Etchells Etheridge Etherington Etherton Ettrick Evans Evason Evenden Everdell Everett Everill Everitt Everson Everton Eveson Evison Evrard Ewart Ewin Ewing Ewles Exley Exon Exton Eyett Eyles Eyre Eyres Fabb Fagan Fagon Fahy Fairbairn Fairbrace Fairbrother Fairchild Fairclough Fairhurst Fairley Fairlie Fairweather Falconer Falk Fall Fallon Fallows Falsh Farge Fargher Farhall Farley Farmer Farnsworth Farnum Farnworth Farr Farrant Farrar Farre Farrell Farrelly Farren Farrer Farrier Farrington Farrow Faulkner Faust Fawcett Fawn Faye Fearn Fearnley Fearns Fearon Featherstone Feeney Feetham Felix Fell Fellmen Fellows Feltham Felton Fenlon Fenn Fenton Fenwick Ferdinand Fereday Ferguson Fern Fernandez Ferns Fernyhough Ferreira Ferrier Ferris Ferry Fewtrell Field Fielder Fielding Fields Fifield Finan Finbow Finch Findlay Findley 
Finlay Finn Finnegan Finney Finnigan Finnimore Firth Fischer Fish Fisher Fishlock Fisk Fitch Fitchett Fitton Fitzgerald Fitzpatrick Fitzsimmons Flack Flaherty Flanagan Flanders Flannery Flavell Flaxman Fleetwood Fleming Fletcher Flett Florey Floss Flower Flowers Floyd Flynn Foden Fogg Foley Fontaine Foran Forbes Ford Forde Fordham Foreman Forester Forman Forrest Forrester Forshaw Forster Forsyth Forsythe Forth Fortin Foss Fossard Fosse Foster Foston Fothergill Fotheringham Foucher Foulkes Fountain Fowler Fowley Fox Foxall Foxley Frame Frampton France Francis Franco Frankish Frankland Franklin Franks Frary Fraser Frazer Frederick Frederikson Freeburn Freedman Freeman Freestone Freeth Freight French Fretwell Frey Fricker Friel Friend Frith Froggatt Froggett Frost Frostick Froy Frusher Fryer Fulker Fuller Fulleron Fullerton Fulton Funnell Furey Furlong Furnell Furness Furnish Furniss Furse Fyall Gadsden Gaffney Galbraith Gale Gales Gall Gallacher Gallagher Galliford Gallo Galloway Galvin Gamble Gammer Gammon Gander Gandham Ganivet Garber Garbett Garbutt Garcia Gardener Gardiner Gardner Garland Garner Garrard Garratt Garrett Garside Garvey Gascoyne Gaskell Gately Gates Gaudin Gaumont Gauntlett Gavin Gaynor Geaney Geary Geeson Geldard Geldart Gell Gemmell Gene George Gerard Gerrard Geyer Gibb Gibbins Gibbon Gibbons Gibbs Giblin Gibson Gifford Gilbert Gilbey Gilchrist Gilder Giles Gilfillan Gilks Gill Gillam Gillan Gillard Gillen Gillespie Gillett Gillies Gilmartin Gilmore Gilmour Ginty Girdwood Girling Given Gladwell Glaister Glasby Glasgow Glass Gleave Gledhill Gleeson Glen Glencross Glenn Glennie Glennon Glew Glossop Glover Glynn Goble Godby Goddard Godden Godfrey Godwin Goff Gold Goldberg Golding Goldman Goldsmith Goldsworthy Gomez Gonzalez Gooch Good Goodacre Goodall Goodchild Goode Gooding Goodman Goodridge Goodson Goodwin Goodyear Gordon Goring Gorman Gosden Gosling Gough Gould Goulden Goulding Gourlay Govender Govier Gower Gowing Grady Graham Grainger Grange 
Granger Grant Graves Gray Grayson Greaves Green Greenall Greenaway Greene Greener Greenhill Greening Greenleaf Greenshields Greenslade Greensmith Greenway Greenwood Greer Gregory Greig Grenard Grennan Gresham Grey Grierson Griff Griffin Griffith Griffiths Griggs Grimes Grimshaw Grinham Grivet Grogan Groom Grose Grosvenor Grout Groves Grundy Guest Guilmard Guinard Gulley Gunby Gunn Gunning Gunston Gunter Guthrie Gutteridge Guttridge Hackett Hadden Haddock Hadfield Hagan Haggett Haigh Haine Haines Hale Halford Hall Hallam Hallett Halliday Halliwell Halstead Hamer Hamill Hamilton Hammond Hamnett Hampson Hampton Hancock Hand Handley Hanlon Hannam Hansen Hanson Harden Harding Hardwick Hardy Hargreaves Harker Harkness Harley Harlow Harman Harness Harper Harries Harrington Harris Harrison Harrop Harry Hart Hartley Harvey Harwood Haslam Hassan Hassani Hastings Hatch Hatton Hawes Hawker Hawkes Hawkins Hawkridge Hawley Haworth Hawtin Hayes Haynes Hayward Head Healey Healy Heath Heathcote Heather Heatley Heaton Hedley Hegney Helley Hellier Helm Hemingway Hemmings Henderson Hendry Heneghan Hennessy Henry Hepburn Hepples Herbert Heritage Heron Herron Hetherington Hewitt Hewlett Heywood Hibbert Hickey Hickman Hicks Higgins Higginson Higgs Hill Hills Hilton Hind Hinde Hindle Hindley Hinds Hine Hinton Hirst Hiscocks Hitchcock Hoare Hobbs Hobson Hocking Hodder Hodge Hodges Hodgkins Hodgkinson Hodgson Hodkinson Hodson Hogan Hogg Holden Holder Holding Holdsworth Hole Holgate Holl Holland Hollis Holloway Holman Holmes Holt Homer Hood Hook Hooper Hooton Hope Hopes Hopkins Hopkinson Hopwood Horn Horne Horner Horrocks Horton Hough Houghton Hoult Houlton Houston Howard Howarth Howden Howe Howell Howells Howes Howie Hoyle Hubbard Hudson Huggins Hughes Hull Hulme Hume Humphrey Humphreys Humphries Hunt Hunter Hurley Hurrell Hurst Hussain Hussein Hussey Hutchings Hutchins Hutchinson Hutchison Hutton Hyde Ianson Ibbotson Ibbs Ibrahim Iddon Iggleden Iles Ilett Illing Illingworth Ilsley Impey 
Imran Ingermann Ingham Ingle Ingleby Ingledew Inglefield Ingles Inglethorpe Ingram Inker Inman Innalls Innes Inson Ireland Ireson Ironman Ironmonger Irvin Irvine Irving Irwin Isaac Isaacs Isbill Isbitt Isgate Isherwod Isherwood Islam Isman Isnard Issac Ivory Izzard Jackman Jacks Jackson Jacob Jacobs Jacobson Jacques Jaffray Jagger Jakeman James Jameson Jamieson Janes Jansen Jardine Jarman Jarram Jarratt Jarrett Jarrold Jarvis Jasper Jebson Jeffcock Jefferies Jeffers Jefferson Jeffery Jefford Jeffrey Jeffreys Jeffries Jeffs Jems Jenas Jenkin Jenkins Jenkinson Jenks Jenkyns Jenner Jennings Jennison Jennson Jensen Jepson Jermy Jerome Jerry Jervis Jesson Jessop Jevons Jewell Jewers Jewett Jewitt Jewkes Jewson Jiggens Jobson Johannson Johansen Johanson John Johns Johnson Johnston Johnstone Jolley Jolly Jonas Jones Jonhson Jopson Jordan Jordison Jordon Joseph Joss Jourdan Jowett Jowitt Joyce Joynson Jubb Judd Judge Jukes Jupp Jury Kacy Kaddour Kamara Kampfner Kane Kanes Kapoor Karim Karne Karras Kassell Kaufman Kaul Kaur Kavanagh Kay Kaye Kayes Keable Keal Kealey Keane Kearney Kearns Kearsley Kearton Keating Keaveney Keay Keeble Keefe Keegan Keelan Keeler Keeley Keeling Keenan Keene Keetley Keffler Kehoe Keighley Keight Keilty Keir Keith Kelk Kell Kelland Kellems Kellie Kelliher Kelly Kelsall Kelsey Kelso Kemp Kempson Kempster Kendall Kendell Kendrick Kenley Kennard Kennedy Kenneford Kennell Kenneth Kennett Kenney Kenning Kenny Kenrick Kensington Kent Kentwood Kenward Kenworthy Kenyon Keogh Kerby Kernick Kerr Kerrell Kerridge Kerrigan Kerrighen Kerrison Kershaw Ketley Kett Kettell Ketteringham Kettlewell Keward Kewley Keys Keyte Keywood Khalid Khalifa Khalil Khan Kibblewhite Kidd Kiddle Kidman Kidner Kiely Kiernan Kilb Kilbee Kilbey Kilbride Kilburn Kilford Kill Killeen Killen Killick Killock Kilminster Kilmurry Kilnan Kilner Kilroy Kilshaw Kimber Kimble Kinch Kinchin Kinder King Kingdon Kinghorn Kingman Kings Kingscott Kingsley Kingston Kinnaird Kinnear Kinnersley 
Kinniburgh Kinnison Kinrade Kinsella Kinsey Kinsley Kipling Kirby Kirk Kirkbride Kirkbright Kirkby Kirkland Kirkman Kirkpatrick Kirkwood Kirtley Kirwan Kirwin Kitchen Kitchin Kitching Kitson Kitt Klam Klein Knab Knappett Knibb Knigge Knight Knightley Knighton Knights Knott Knowler Knowles Knox Knoxville Knuckles Knutt Koban Kolt Kone Kore Kouma Kram Kreyling Kristensen Kromberg Kruger Kumar Kurian Kurray Kydd Kyle Kysel Labbe Lacey Lacy Laing Laird Lake Lakey Lakin Lamb Lambert Lambton Lame Lamond Lancaster Lander Lane Lang Langdon Lange Langford Langley Langridge Langston Langton Lanham Laraway Large Larkin Larkings Larsen Larsson Last Latham Lathan Lathey Lattimore Laurie Laver Laverick Lavery Lawal Lawler Lawlor Lawn Lawrance Lawrence Lawrie Laws Lawson Lawther Lawton Laycock Layton Le tissier Leach Leadley Leahy Leake Leal Leary Leaver Leck Leckie Ledger Lee Leech Leedham Leek Leeming Lees Leese Leeson Legg Legge Leggett Leigh Leighton Leitch Leith Lendon Lenihan Lennard Lennon Lennox Leonard Leroy Leslie Lester Lethbridge Levann Levett Levin Levine Levy Lewin Lewington Lewins Lewis Lewry Leyland Leys Leyshon Liddell Liddle Lightfoot Lilley Lilly Lilwall Lincoln Lind Linden Lindo Lindop Lindsay Line Lines Linford Ling Linley Linsby Linton Lister Litchfield Little Littlewood Livermore Livingstone Llewellyn Lloyd Loat Lobb Lock Locke Lockett Lockhart Lockie Lockwood Lockyer Lodge Loft Lofthouse Loftus Logan Lohan Lois Lomas Lomax London Long Longhurst Longley Longworth Lonsdale Lopes Lopez Lord Loudon Loughran Louth Lovatt Love Lovegrove Lovell Lovelock Lovett Lovey Lowbridge Lowdon Lowe Lowes Lowis Lowndes Lowrie Lowry Lucas Luce Lucey Luckhurst Ludgrove Ludkin Ludlow Luke Luker Lumb Lumley Lumsden Lunn Lunt Luscombe Luttrell Luxton Lyall Lyes Lyme Lynas Lynch Lynes Lynn Lyon Lyons Mac Macarthur Macaulay Macdonald Mace Macfarlane Macgregor Machin Macintyre Mack Mackay Mackenzie Mackie Maclean Macleod Macmillan Macpherson Macrae Madden Maddocks Magee Maguire 
Maher Mahoney Main Mair Major Makin Malley Mallinson Malone Maloney Mangnall Mann Manning Mansell Mansfield Manson Markham Marks Marlow Marr Marriott Marsden Marsh Marshall Martin Martinez Martins Mason Masters Mather Mathers Matheson Mathews Matthams Matthews Maughan Mawson Maxwell May Maynard Mcarthur Mcauley Mcavoy Mcbain Mccabe Mccaffrey Mccall Mccallum Mccann Mccarthy Mccartney Mccluskey Mcclymont Mcconnell Mccormack Mccormick Mccourt Mcculloch Mccullough Mcdermott Mcdonagh Mcdonald Mcdonnell Mcdougall Mcelroy Mcewan Mcfadden Mcfarlane Mcgee Mcghee Mcgill Mcginty Mcgowan Mcgrady Mcgrath Mcgregor Mcgrory Mcguinness Mcguire Mcintosh Mcintyre Mckay Mckee Mckenna Mckenzie Mckeown Mckie Mclaren Mclaughlin Mclean Mclellan Mcleod Mcloughlin Mcmahon Mcmanus Mcmillan Mcnally Mcnamara Mcnaught Mcneil Mcneill Mcnulty Mcphail Mcphee Mcpherson Mcrae Mcshane Mctaggart Meadows Meakin Mears Melia Mellor Meredith Merritt Metcalf Metcalfe Michael Michel Middleton Miles Milford Mill Millar Millard Miller Millett Milligan Millington Mills Millward Milne Milner Milward Mistry Mitchell Moffat Mohamed Mohammed Molloy Molyneux Monaghan Montague Montgomery Moody Moon Mooney Moore Moorhouse Moran More Moreno Moreton Morgan Moriarty Morley Moroney Morris Morrison Morrow Mortimer Morton Moseley Moss Mottram Mould Muir Mullen Mulligan Mullins Mundy Munro Murphy Murray Murrell Mustafa Myatt Myers Nair Nairn Nandi Nanson Nanton Napier Napper Nartey Nash Nason Naughton Naumann Nayler Naylor Naysmith Neal Neale Neary Neave Neaverson Nedd Needham Neeson Negros Neighbour Neill Neilsen Neilson Neish Nelmes Nelms Nelson Nemeth Nero Nesbitt Ness Nessbert Nettleton Neville Nevins Nevis Newall Newberry Newbold Newbury Newby Newcombe Newell Newey Newham Newill Newington Newland Newlands Newman Newsham Newsome Newson Newstead Newton Neyland Nichol Nicholas Nicholl Nicholls Nichols Nicholson Nickel Nickolls Nicks Nicol Nicolas Nicoll Nicolson Nield Nielsen Nielson Nightingale Niles Nilsen Nineham 
Nisbet Nixon Noach Noakes Nobbs Noble Noggins Nokes Nolan Nood Noon Noonan Norbert Norburn Norbury Norcross Nord Norgate Norgrove Norm Norman Normington Norris Norsworthy North Northcott Norton Norville Norwood Notman Nott Nourse Nova Nowak Nowell Noyce Noyes Nugent Number Nunn Nurse Nurton Nutella Nutman Nutt Nuttall Oakes Oakey Oakley Oaks Oakton Oates Oatridge Oatway Obrien Ocallaghan Oconnell Oconnor Odam Oddie Oddy Odea Odell Odling Odonnell Odonoghue Odriscoll Oflynn Ogden Ogilvie Ogilvy Ogrady Ohalloran Ohara Okeefe Okey Okten Olan Oldfield Oldham Olding Oldland Oldroyd Olds Oleary Oliver Olivier Ollerhead Olley Oloughlin Olsen Olson Omalley Oman Oneil Oneill Opayne Openshaw Oram Orbell Orchard Oreilly Oriley Orman Orme Ormiston Ormond Ormsby Ormston Orrell Orritt Orton Orvis Orwin Osborn Osborne Osman Osmond Ostcliffe Ostler Osullivan Oswald Otoole Otten Otter Ottey Ottley Otton Ould Oulton Overall Overett Overfield Overing Overson Overton Owen Owens Owings Oxby Oxenham Oxley Oxtoby Pack Packard Packer Pagan Page Paige Pailing Paine Painter Paisley Palfrey Palfreyman Palin Pallett Palmer Panesar Pankhurst Pannell Parish Park Parker Parkes Parkin Parkins Parkinson Parks Parmar Parnaby Parnell Parr Parratt Parrott Parry Parsons Partington Partlett Partridge Pascoe Pasfield Paskell Passmore Patchett Patel Pateman Paterson Paton Patrick Patten Patterson Pattinson Pattison Patton Paul Pavot Pawson Payne Peace Peach Peacock Peake Peal Peaper Pearce Pears Pearson Peat Peck Pedley Peebles Peel Peers Pegg Peigne Pell Pelling Pemberton Pender Pendlebury Pendleton Penfold Penn Pennell Penney Pennington Percival Pereira Perez Perkin Perkins Perks Perowne Perrett Perrin Perrins Perry Peters Petersen Peterson Petrova Pett Petticrew Peyton Phelan Phelps Philip Philips Phillips Philpott Phipps Phoenix Pick Pickard Pickering Pickersgill Pickett Pickford Pickthall Picot Pierce Piercey Pierre Pigott Pike Pilkington Pillay Pinder Pine Pinkney Pinner Pinnock Pinsmail Pipe Piper 
Pitcher Pitchford Pitt Pitts Plant Plastow Platt Platts Pledger Plouvin Plumb Plummer Pocock Pointer Pole Pollard Pollock Polson Pomeroy Pomphrey Pond Pooke Poole Poon Pope Porter Potter Potts Poulter Poulton Pounder Povey Powell Power Powers Powis Powles Poyser Pratt Preece Prendergast Prentice Prescott Preston Prevost Price Prime Prince Pringle Prior Pritchard Privett Probert Procter Proctor Prosser Provan Pryor Pugh Pullen Purcell Purkis Purnell Purse Purvis Putt Pyle Quigley Quinlivan Quinn Quinnell Quinton Quirk Quirke Rackham Radcliffe Radford Radley Raeburn Rafferty Rahman Raine Rainey Rainford Ralph Ralston Ramm Rampling Ramsay Ramsden Ramsey Rand Randall Randle Ranger Rankin Ranks Rann Ransom Ranson Rapson Rashid Ratcliffe Raval Raven Ravenscroft Rawlings Rawlinson Rawsthorne Raymond Rayner Read Reade Reader Reading Readle Readman Reardon Reasbeck Reay Redden Redding Reddy Redfern Redhead Redin Redman Redmond Redwood Reed Rees Reese Reeve Reeves Regan Regent Rehman Reid Reilly Reisser Render Renna Rennalls Rennie Renshaw Renwick Reveley Reyes Reygan Reynolds Rhoades Rhodes Rhys Ricci Rice Rich Richards Richardson Riches Richman Richmond Richter Rick Rickard Rickards Rickett Ricketts Riddell Riddle Riddler Ridge Ridgway Ridgwell Ridle Ridley Rigby Rigg Rigley Riley Ring Ripley Rippin Riseborough Ritchie Rivers Rixon Roach Robb Robbins Robe Robert Roberts Robertson Robin Robins Robinson Robishaw Robotham Robson Roche Rochford Rockliffe Rodden Roden Rodger Rodgers Rodham Rodrigues Rodriguez Rodwell Roebuck Roff Roffey Rogan Rogers Rogerson Roles Rolfe Rollinson Roman Romans Ronald Ronflard Rook Rooke Roome Rooney Rootham Roper Ropple Roscoe Rose Rosenblatt Rosenbloom Ross Rosser Rossi Rosso Roth Rothery Rothwell Rouse Roussel Rousset Routledge Rowan Rowe Rowland Rowlands Rowley Rowlinson Rowson Royall Royle Rudd Ruff Rugg Rumbold Rumsey Ruscoe Rush Rushbrooke Rushby Rushton Russel Russell Russon Rust Rutherford Rutter Ryan Ryans Rycroft Ryder Sadiq Sadler 
Said Saleh Salisbury Sallis Salmon Salt Salter Sampson Samuel Samuels Sanchez Sanders Sanderson Sandison Sands Santos Sargent Saunders Savage Saville Sawyer Saxton Sayers Schmid Schmidt Schofield Scott Searle Seddon Seer Selby Sellars Sellers Senior Sewell Sexton Seymour Shackleton Shah Shakespeare Shand Shanks Shannon Sharkey Sharma Sharp Sharpe Sharples Shaughnessy Shaw Shea Shearer Sheehan Sheldon Shelton Shepherd Sheppard Sheridan Sherman Sherriff Sherry Sherwood Shields Shipley Short Shotton Showell Shuttleworth Silcock Silva Simmonds Simmons Simms Simon Simons Simpson Sims Sinclair Singh Singleton Sinha Sisson Sissons Skelly Skelton Skinner Skipper Slade Slater Slattery Sloan Slocombe Small Smallwood Smart Smit Smith Smithson Smullen Smyth Smythe Sneddon Snell Snelling Snow Snowden Snowdon Somerville South Southern Southgate Southwick Sparkes Sparrow Spears Speed Speight Spence Spencer Spicer Spiller Spinks Spooner Squire Squires Stacey Stack Staff Stafford Stainton Stamp Stanfield Stanford Stanley Stannard Stanton Stark Steadman Stedman Steel Steele Steer Steere Stenhouse Stephen Stephens Stephenson Sterling Stevens Stevenson Steward Stewart Stock Stocker Stockley Stoddart Stokes Stokoe Stone Stoppard Storer Storey Storr Stott Stout Strachan Strange Street Stretton Strickland Stringer Strong Stroud Stuart Stubbs Stuckey Sturgess Sturrock Styles Sugden Sullivan Summers Sumner Sunderland Sutherland Sutton Swain Swales Swan Swann Swanson Sweeney Sweeting Swift Sykes Sylvester Symes Symonds Taggart Tailor Tait Talbot Tallett Tamber Tang Tanner Tansey Tansley Tappin Tapping Tapscott Tarr Tarrant Tasker Tate Tatlock Tatlow Tatnell Taurel Tayler Taylor Teague Teal Teale Teasdale Tedd Telford Tell Tellis Tempest Templar Temple Templeman Templeton Tennant Terry Thackeray Thackray Thake Thatcher Thelwell Thirlwall Thirlway Thirlwell Thistlethwaite Thom Thomas Thomason Thompson Thoms Thomson Thonon Thorley Thorndyke Thorne Thornes Thornhill Thornley Thornton Thorp 
Thorpe Thurbon Thurgood Thurling Thurlow Thurman Thurston Tickner Tidmarsh Tierney Till Tillett Tilley Tilson Tilston Timberlake Timmins Timms Timney Timson Tindall Tindell Tinker Tinkler Tinsley Tipping Tippins Tips Tisdall Titmarsh Titmus Titmuss Titterington Toal Tobin Tocher Todd Tohill Toland Tolley Tollis Tolmay Tomas Tombs Tomes Tomkins Tomlin Tomlinson Tompkin Tompkins Toms Tong Tonge Tonks Tonner Toomer Toomey Topham Topley Topliss Topp Torney Torrance Torrens Torres Tosh Totten Toucet Tovar Tovey Towell Towers Towle Townend Towns Townsend Townsley Tozer Trafford Train Trainor Trattles Travers Travill Travis Traynor Treble Trennery Trent Treseder Trevor Trew Trickett Trigg Trimble Trinder Trollope Troon Trotman Trott Trueman Truman Trump Truscott Tuck Tucker Tuckey Tudor Tuffnell Tufnall Tugwell Tully Tunks Tunstall Turford Turke Turkington Turland Turnbull Turner Turney Turnham Turnock Turrell Turton Turvey Tuthill Tuttle Tutton Tweddle Twigg Twiggs Twine Tyler Tyman Tyne Tyrer Tyrrell Uddin Ullman Ullmann Ulyatt Umney Underdown Underhill Underwood Unsworth Unwin Upfield Upjohn Upsdell Upson Upton Urwin Utley Utterson Uttley Utton Uttridge Vale Valentine Vallance Vallins Vallory Valmary Vancoller Vane Vann Vanstone Vanwell Vardy Varey Varley Varndell Vass Vaughan Vaughn Veale Veasey Veevers Veitch Velds Venables Ventura Verdon Verell Verney Vernon Vicary Vicens Vickars Vickerman Vickers Vickery Victor Vikers Villiger Villis Vince Vincent Vine Viner Vines Viney Vinicombe Vinny Vinton Virgo Voakes Vockins Vodden Vollans Voyse Vyner Wade Wadham Waghorn Wagstaff Wain Wainwright Waite Wakefield Wakeford Wakeham Wakelin Waldron Wale Wales Walkden Walker Wall Wallace Waller Walling Wallis Walls Walmsley Walpole Walsh Walshe Walter Walters Walton Wane Wang Warburton Warby Ward Warden Wardle Ware Wareing Waring Warn Warner Warren Warriner Warrington Warwick Water Waterfield Waterhouse Wateridge Waterman Waters Waterson Watkins Watkinson Watling Watson Watt Watters 
Watts Waugh Wears Weasley Weaver Webb Webber Webster Weeks Weir Welch Weldon Weller Wellington Wellman Wells Welsh Welton Were Werner Werrett West Western Westgate Westlake Weston Westwell Westwood Whalley Wharton Wheatcroft Wheatley Wheeldon Wheeler Whelan Whitaker Whitby White Whiteford Whitehead Whitehouse Whitelaw Whiteley Whitfield Whitham Whiting Whitley Whitlock Whitmore Whittaker Whittingham Whittington Whittle Whittley Whitworth Whyte Wickens Wickham Wicks Widdows Widdowson Wiggins Wigley Wilcox Wild Wilde Wildman Wileman Wiles Wilkes Wilkie Wilkin Wilkins Wilkinson Wilks Wilkshire Will Willett Willetts Williams Williamson Willis Wills Willson Wilmot Wilson Wilton Wiltshire Winder Windsor Winfer Winfield Winman Winn Winship Winstanley Winter Wintersgill Winward Wise Wiseman Wither Withers Wolf Wolfe Wolstencroft Wong Wood Woodcock Woodford Woodhall Woodham Woodhams Woodhead Woodhouse Woodland Woodley Woods Woodward Wooldridge Woollard Woolley Woolnough Wootton Worgan Wormald Worrall Worsnop Worth Worthington Wotherspoon Wragg Wraight Wray Wren Wrench Wrenn Wrigglesworth Wright Wrightson Wyatt Wyer Yabsley Yallop Yang Yapp Yard Yardley Yarker Yarlett Yarnall Yarnold Yarwood Yasmin Yates Yeadon Yeardley Yeardsley Yeates Yeatman Yeldon Yeoman Yeomans Yetman Yeung Yoman Yomkins York Yorke Yorston Youlden Young Younge Younis Youssouf Yule Yusuf Zaoui ================================================ FILE: data/names/French.txt ================================================ Abel Abraham Adam Albert Allard Archambault Armistead Arthur Augustin Babineaux Baudin Beauchene Beaulieu Beaumont Bélanger Bellamy Bellerose Belrose Berger Béringer Bernard Bertrand Bisset Bissette Blaise Blanc Blanchet Blanchett Bonfils Bonheur Bonhomme Bonnaire Bonnay Bonner Bonnet Borde Bordelon Bouchard Boucher Brisbois Brodeur Bureau Caron Cavey Chaput Charbonneau Charpentier Charron Chastain Chevalier Chevrolet Cloutier Colbert Comtois Cornett Coté Coupe Courtemanche Cousineau Couture 
Daniau D'aramitz Daviau David Deforest Degarmo Delacroix De la fontaine Deniau Deniaud Deniel Denis De sauveterre Deschamps Descoteaux Desjardins Desrochers Desrosiers Dubois Duchamps Dufort Dufour Duguay Dupond Dupont Durand Durant Duval Émile Eustis Fabian Fabre Fabron Faucher Faucheux Faure Favager Favre Favreau Fay Félix Firmin Fontaine Forest Forestier Fortier Foss Fournier Gage Gagne Gagnier Gagnon Garcon Gardinier Germain Géroux Giles Girard Giroux Glaisyer Gosse Gosselin Granger Guérin Guillory Hardy Harman Hébert Herbert Herriot Jacques Janvier Jordan Joubert Labelle Lachance Lachapelle Lamar Lambert Lane Langlais Langlois Lapointe Larue Laurent Lavigne Lavoie Leandres Lebeau Leblanc Leclair Leclerc Lécuyer Lefebvre Lefévre Lefurgey Legrand Lemaire Lémieux Leon Leroy Lesauvage Lestrange Lévêque Lévesque Linville Lyon Lyon Maçon Marchand Marie Marion Martel Martel Martin Masson Masson Mathieu Mercier Merle Michaud Michel Monet Monette Montagne Moreau Moulin Mullins Noel Oliver Olivier Page Paget Palomer Pan Pape Paquet Paquet Parent Paris Parris Pascal Patenaude Paternoster Paul Pelletier Perrault Perreault Perrot Petit Pettigrew Pierre Plamondon Plourde Poingdestre Poirier Porcher Poulin Proulx Renaud Rey Reyer Richard Richelieu Robert Roche Rome Romilly Rose Rousseau Roux Roy Royer Salomon Salvage Samson Samuel Sargent Sarkozi Sarkozy Sartre Sault Sauvage Sauvageau Sauvageon Sauvageot Sauveterre Savatier Segal Sergeant Séverin Simon Solomon Soucy St martin St pierre Tailler Tasse Thayer Thibault Thomas Tobias Tolbert Traver Travere Travers Traverse Travert Tremblay Tremble Victor Victors Villeneuve Vincent Vipond Voclain Yount ================================================ FILE: data/names/German.txt ================================================ Abbing Abel Abeln Abt Achilles Achterberg Acker Ackermann Adam Adenauer Adler Adlersflügel Aeschelman Albert Albrecht Aleshire Aleshite Althaus Amsel Andres Armbrüster Armbruster Artz Aue Auer Augustin Aust 
Autenburg Auttenberg Baasch Bach Bachmeier Bäcker Bader Bähr Bambach Bauer Bauers Baum Baumann Baumbach Baumgärtner Baumgartner Baumhauer Bayer Beck Becke Beckenbauer Becker Beckert Behrend Behrends Beitel Beltz Benn Berg Berger Bergfalk Beringer Bernat Best Beutel Beyer Beyersdorf Bieber Biermann Bischoffs Blau Blecher Bleier Blumenthal Blumstein Bocker Boehler Boer Boesch Böhler Böhm Böhme Böhmer Bohn Borchard Bösch Bosch Böttcher Brahms Brand Brandt Brant Brauer Braun Braune Breiner Breisacher Breitbarth Bretz Brinkerhoff Brodbeck Brose Brotz Bruhn Brun Brune Buchholz Buckholtz Buhr Bumgarner Burgstaller Busch Carver Chevrolet Cline Dahl Denzel Derrick Diefenbach Dieter Dietrich Dirchs Dittmar Dohman Drechsler Dreher Dreschner Dresdner Dressler Duerr Dunkle Dunst Dürr Eberhardt Ebner Ebner Eckstein Egger Eichel Eilerts Engel Enns Esser Essert Everhart Fabel Faerber Falk Falkenrath Färber Fashingbauer Faust Feigenbaum Feld Feldt Fenstermacher Fertig Fiedler Fischer Flater Fleischer Foerstner Forst Förstner Foth Frank Franke Frei Freud Freudenberger Freund Fried Friedrich Fromm Frost Fuchs Fuhrmann Fürst Fux Gabler Gaertner Garb Garber Gärtner Garver Gass Gehrig Gehring Geier Geiger Geisler Geissler Geiszler Gensch Gerber Gerhard Gerhardt Gerig Gerst Gerstle Gerver Giehl Giese Glöckner Goebel Goldschmidt Gorman Gott Gotti Gottlieb Gottschalk Graner Greenberg Groos Gros Gross Groß Große Grosse Größel Großel Großer Grosser Grosz Grünewald Günther Gunther Gutermuth Gwerder Haas Haase Haber Habich Habicht Hafner Hahn Hall Halle Harman Hartmann Hase Hasek Hasenkamp Hass Hauer Haupt Hausler Havener Heidrich Heinrich Heinrichs Heintze Hellewege Heppenheimer Herbert Hermann Herrmann Herschel Hertz Hildebrand Hinrichs Hintzen Hirsch Hoch Hochberg Hoefler Hofer Hoffman Hoffmann Höfler Hofmann Hofmeister Holst Holtzer Hölzer Holzer Holzknecht Holzmann Hoover Horn Horn Horowitz Houk Hüber Huber Huff Huffman Huffmann Hummel Hummel Hutmacher Ingersleben Jaeger Jäger Jager Jans 
Janson Janz Jollenbeck Jordan Jund Jung Junge Kahler Kaiser Kalb Kalbfleisch Kappel Karl Kaspar Kassmeyer Kästner Katz Kaube Käufer Kaufer Kauffmann Kaufman Keil Keller Kempf Kerner Kerper Kerwar Kerwer Kiefer Kiefer Kirchner Kistler Kistner Kleid Klein Klossner Knef Kneib Kneller Knepp Knochenmus Knopf Knopp Koch Kock Koenig Koenigsmann Köhl Kohl Köhler Kohler Kolbe König Königsmann Kopp Kraemer Krämer Kramer Krantz Kranz Kraus Krause Krauss Krauß Krebs Kröger Kron Kruckel Krüger Krüger Kruger Kruse Kruse Küchler Kuhn Kundert Kunkel Kunkle Kuntz Kunze Kurzmann Laberenz Lafrentz Lafrenz Landau Lang Lange Langenberg Langer Larenz Laurenz Lauritz Lawerenz Lawrenz Lehmann Lehrer Leitner Leitz Leitzke Lenz Leverenz Lewerentz Lewerenz Lichtenberg Lieberenz Linden Loewe Lohrenz Lorentz Lorenz Lorenzen Loris Loritz Löwe Ludwig Luther Maas Maier Mandel Mann Markwardt Marquardt Marquering Marquerink Martell Martin Martz Mas Maurer Maus Mayer Meier Mein Meindl Meinhardt Meisner Meissner Melsbach Mendel Mendelsohn Mendelssohn Messer Messerli Messmann Messner Metz Metz Metzger Meyer Michel Mohren Möller Morgenstern Moser Mueller Muhlfeld Müller Nagel Neuman Neumann Nuremberg Nussbaum Nussenbaum Oberst Oelberg Ohme Oliver Oppenheimer Ott Otto Oursler Pahlke Papke Papp Paternoster Paul Paulis Pawlitzki Penzig Peter Peters Pfaff Pfenning Plank Pletcher Porsche Portner Prinz Protz Rademacher Rademaker Rapp Raske Raskob Raskop Raskoph Regenbogen Reier Reiher Reiter Rettig Reuter Reuter Richard Richter Rier Riese Ritter Rose Rosenberg Rosenberger Rosenfeld Rot Roth Rothbauer Rothenberg Rothschild Sachs Saller Saller Salomon Salzwedel Samuel Sander Sauber Schäfer Scheer Scheinberg Schenck Schermer Schindler Schirmer Schlender Schlimme Schlusser Schmeling Schmid Schmidt Schmitt Schmitz Schneider Schnoor Schnur Schoettmer Schräder Schrader Schreck Schreier Schröder Schröder Schroeder Schroeter Schröter Schubert Schuchard Schuchardt Schuchert Schuhart Schuhmacher Schuler Schult Schulte 
Schultes Schultheis Schultheiss Schultheiß Schultz Schultze Schulz Schulze Schumacher Schuster Schuttmann Schwangau Schwartz Schwarz Schwarzenegger Schwenke Schwinghammer Seelenfreund Seidel Senft Senft Sheinfeld Shriver Siegel Siegel Siekert Siemon Silverstein Simen Simmon Simon Simons Siskin Siskind Sitz Sitz Slusser Solberg Sommer Sommer Sommer Sommer Sonnen Sorg Sorge Spannagel Specht Spellmeyer Spitznogle Sponaugle Stark Stauss Steen Steffen Stein Steinmann Stenger Sternberg Steube Steuben Stieber Stoppelbein Stoppelbein Strand Straub Strobel Strohkirch Stroman Stuber Stueck Stumpf Sturm Suess Sulzbach Swango Switzer Tangeman Tanzer Teufel Tiedeman Tifft Tillens Tobias Tolkien Tresler Tritten Trumbauer Tschida Unkle Unruh Unterbrink Ursler Vann Van tonder Vieth Vogel Vogt Vogts Voigt Voigts Volk Voll Von brandt Von essen Von grimmelshausen Von ingersleben Vonnegut Von wegberg Voss Voß Wägner Wagner Wähner Wahner Waldfogel Waldvogel Walkenhorst Walter Walther Waltz Wang Warner Waxweiler Weber Wechsler Wedekind Weeber Wegener Wegner Wehner Wehunt Weigand Weiman Weiner Weiss Weiß Welter Wendel Wendell Werner Wernher West Westerberg Wetterman Wetzel Wexler Wieck Wiegand Wildgrube Winter Winther Winther Wirner Wirnhier Wirt Wirth Wolf Wolff Wolter Wörner Wörnhör Wruck Wyman Xylander Zellweger Zilberschlag Zimmerman Zimmermann ================================================ FILE: data/names/Greek.txt ================================================ Adamidis Adamou Agelakos Akrivopoulos Alexandropoulos Anetakis Angelopoulos Antimisiaris Antipas Antonakos Antoniadis Antonopoulos Antonopoulos Antonopoulos Arvanitoyannis Avgerinos Banos Batsakis Bekyros Belesis Bertsimas Bilias Blades Bouloukos Brisimitzakis Bursinos Calogerakis Calpis Chellos Christakos Christodoulou Christou Chrysanthopoulos Chrysanthopoulos Comino Close Close Close Close Close Close Close Close Dalianis Danas Dasios Demakis Demarchis Demas Demetrious Dertilis Diakogeorgiou Dioletis Dounias Dritsas 
Drivakis Eatros Egonidis Eliopoulos Forakis Fotopoulos Fourakis Frangopoulos Galanopoulos Garofalis Gavril Gavrilopoulos Georgeakopoulos Geracimos Gianakopulos Giannakopoulos Giannakos Glynatsis Gomatos Grammatakakis Gravari Hadjiyianakies Hagias Haritopoulos Honjas Horiatis Houlis Jamussa Kaglantge Kalakos Kalogeria Kaloxylos Kanavos Kapsimalles Karahalios Karameros Karkampasis Karnoupakis Katsourinis Kefalas Kokkali Kokoris Kolovos Konstantatos Kosmas Kotsilimbas Kotsiopoulos Kouches Koulaxizis Koumanidis Kourempes Kouretas Kouropoulos Kouros Koustoubos Koutsoubos Kreskas Kringos Kyritsis Laganas Leontarakis Letsos Liatos Lillis Lolos Louverdis Makricosta Malihoudis Maneates Manos Manoukarakis Matsoukis Mentis Mersinias Metrofanis Michalaras Milionis Missiakos Moraitopoulos Nikolaou Nomikos Paitakes Paloumbas Panayiotopoulos Panoulias Pantelakos Pantelas Papadelias Papadopulos Papageorge Papoutsis Pappayiorgas Paraskevopoulos Paraskos Paschalis Patrianakos Patselas Pefanis Petimezas Petrakis Pezos Phocas Pispinis Polites Polymenakou Poniros Protopsaltis Rallis Rigatos Rorris Rousses Ruvelas Sakelaris Sakellariou Samios Sardelis Sfakianos Sklavenitis Sortras Sotiris Spyridis Stamatas Stamatelos Stavropoulos Strilakos Stroggylis Tableriou Taflambas Tassioglou Telis Tsoumada Theofilopoulos Theohari Totolos Tourna Tsahalis Tsangaris Tselios Tsogas Vamvakidis Varvitsiotes Vassilikos Vassilopulos Vlahos Vourlis Xydis Zaloumi Zouvelekis ================================================ FILE: data/names/Irish.txt ================================================ Adam Ahearn Aodh Aodha Aonghuis Aonghus Bhrighde Bradach Bradan Braden Brady Bran Brannon Brian Callaghan Caomh Carey Casey Cassidy Cathain Cathan Cathasach Ceallach Ceallachan Cearbhall Cennetig Ciardha Clark Cleirich Cleirigh Cnaimhin Coghlan Coilean Collins Colman Conall Conchobhar Conn Connell Connolly Cormac Corraidhin Cuidightheach Curran Dúbhshlaine Dalach Daly Damhain Damhan Delaney Desmond Devin Diarmaid 
Doherty Domhnall Donnchadh Donndubhan Donnell Donoghue Donovan Doyle Dubhain Dubhan Duncan Eoghan Eoin Eoin Faolan Farrell Fearghal Fergus Finn Finnegan Fionn Flanagan Flann Flynn Gallchobhar Gerald Giolla Gorman Hayden Ivor John Kavanagh Keefe Kelly Kennedy Lennon Login Macclelland Macdermott Maceachthighearna Macfarland Macghabhann Maciomhair Macshuibhne Madaidhin Madden Maguire Mahoney Maille Malone Manus Maolmhuaidh Mathghamhain Maurice Mcguire Mckay Mclain Mcmahon Mcnab Mcneil Meadhra Michael Milligan Mochan Mohan Molloy Monahan Mooney Muirchertach Mullen Mulryan Murchadh Murphy Names Naoimhin Naomhan Neil Neville Nevin Niadh Niall Nolan Nuallan O'Boyle O'Brien O'Byrne O'Donnell O'Hannagain O'Hannigain O'Keefe O'Mooney O'Neal O'Boyle O'Bree O'Brian O'Brien O'Callaghann O'Connell O'Connor O'Dell O'Doherty O'Donnell O'Donoghue O'Dowd O'Driscoll O'Gorman O'Grady O'Hagan O'Halloran O'Hanlon O'Hara O'Hare O'Kane O'Keefe O'Keeffe O'Kelly O'Leary O'Loughlin O'Mahoney O'Mahony O'Malley O'Meara O'Neal O'Neill O'Reilly O'Rourke O'Ryan O'Shea O'Sullivan O'Toole Patrick Peatain Pharlain Power Quigley Quinn Quirke Raghailligh Reagan Register Reilly Reynold Rhys Riagain Riagan Riain Rian Rinn Roach Rodagh Rory Ruadh Ruadhain Ruadhan Ruaidh Samuel Scolaidhe Seaghdha Sechnall Seighin Shannon Sheehy Simon Sioda Sloan Sluaghadhan Suaird Sullivan Tadhg Tadhgan Taidhg Teagan Teague Tighearnach Tracey Treasach Whalen Whelan William ================================================ FILE: data/names/Italian.txt ================================================ Abandonato Abatangelo Abatantuono Abate Abategiovanni Abatescianni Abbà Abbadelli Abbascia Abbatangelo Abbatantuono Abbate Abbatelli Abbaticchio Abbiati Abbracciabene Abbracciabeni Abelli Abelló Abrami Abramo Acardi Accardi Accardo Acciai Acciaio Acciaioli Acconci Acconcio Accorsi Accorso Accosi Accursio Acerbi Acone Aconi Acqua Acquafredda Acquarone Acquati Adalardi Adami Adamo Adamoli Addario Adelardi Adessi Adimari Adriatico 
Affini Africani Africano Agani Aggi Aggio Agli Agnelli Agnellutti Agnusdei Agosti Agostini Agresta Agrioli Aiello Aiolfi Airaldi Airò Aita Ajello Alagona Alamanni Albanesi Albani Albano Alberghi Alberghini Alberici Alberighi Albero Albini Albricci Albrici Alcheri Aldebrandi Alderisi Alduino Alemagna Aleppo Alesci Alescio Alesi Alesini Alesio Alessandri Alessi Alfero Aliberti Alinari Aliprandi Allegri Allegro Alò Aloia Aloisi Altamura Altimari Altoviti Alunni Amadei Amadori Amalberti Amantea Amato Amatore Ambrogi Ambrosi Amello Amerighi Amoretto Angioli Ansaldi Anselmetti Anselmi Antonelli Antonini Antonino Aquila Aquino Arbore Ardiccioni Ardizzone Ardovini Arena Aringheri Arlotti Armani Armati Armonni Arnolfi Arnoni Arrighetti Arrighi Arrigucci Aucciello Azzarà Baggi Baggio Baglio Bagni Bagnoli Balboni Baldi Baldini Baldinotti Baldovini Bandini Bandoni Barbieri Barone Barsetti Bartalotti Bartolomei Bartolomeo Barzetti Basile Bassanelli Bassani Bassi Basso Basurto Battaglia Bazzoli Bellandi Bellandini Bellincioni Bellini Bello Bellomi Belloni Belluomi Belmonte Bencivenni Benedetti Benenati Benetton Benini Benivieni Benvenuti Berardi Bergamaschi Berti Bertolini Biancardi Bianchi Bicchieri Biondi Biondo Boerio Bologna Bondesan Bonomo Borghi Borgnino Borgogni Bosco Bove Bovér Boveri Brambani Brambilla Breda Brioschi Brivio Brunetti Bruno Buffone Bulgarelli Bulgari Buonarroti Busto Caiazzo Caito Caivano Calabrese Calligaris Campana Campo Cantu Capello Capello Capello Capitani Carbone Carboni Carideo Carlevaro Caro Carracci Carrara Caruso Cassano Castro Catalano Cattaneo Cavalcante Cavallo Cingolani Cino Cipriani Cisternino Coiro Cola Colombera Colombo Columbo Como Como Confortola Conti Corna Corti Corvi Costa Costantini Costanzo Cracchiolo Cremaschi Cremona Cremonesi Crespo Croce Crocetti Cucinotta Cuocco Cuoco D'ambrosio Damiani D'amore D'angelo D'antonio De angelis De campo De felice De filippis De fiore De laurentis De luca De palma De rege De santis De vitis Di 
antonio Di caprio Di mercurio Dinapoli Dioli Di pasqua Di pietro Di stefano Donati D'onofrio Drago Durante Elena Episcopo Ermacora Esposito Evangelista Fabbri Fabbro Falco Faraldo Farina Farro Fattore Fausti Fava Favero Fermi Ferrara Ferrari Ferraro Ferrero Ferro Fierro Filippi Fini Fiore Fiscella Fiscella Fonda Fontana Fortunato Franco Franzese Furlan Gabrielli Gagliardi Gallo Ganza Garfagnini Garofalo Gaspari Gatti Genovese Gentile Germano Giannino Gimondi Giordano Gismondi Giùgovaz Giunta Goretti Gori Greco Grillo Grimaldi Gronchi Guarneri Guerra Guerriero Guidi Guttuso Idoni Innocenti Labriola Làconi Laganà Lagomarsìno Lagorio Laguardia Lama Lamberti Lamon Landi Lando Landolfi Laterza Laurito Lazzari Lecce Leccese Leggièri Lèmmi Leone Leoni Lippi Locatelli Lombardi Longo Lupo Luzzatto Maestri Magro Mancini Manco Mancuso Manfredi Manfredonia Mantovani Marchegiano Marchesi Marchetti Marchioni Marconi Mari Maria Mariani Marino Marmo Martelli Martinelli Masi Masin Mazza Merlo Messana Micheli Milani Milano Modugno Mondadori Mondo Montagna Montana Montanari Monte Monti Morandi Morello Moretti Morra Moschella Mosconi Motta Muggia Muraro Murgia Murtas Nacar Naggi Naggia Naldi Nana Nani Nanni Nannini Napoleoni Napoletani Napoliello Nardi Nardo Nardovino Nasato Nascimbene Nascimbeni Natale Nave Nazario Necchi Negri Negrini Nelli Nenci Nepi Neri Neroni Nervetti Nervi Nespola Nicastro Nicchi Nicodemo Nicolai Nicolosi Nicosia Nicotera Nieddu Nieri Nigro Nisi Nizzola Noschese Notaro Notoriano Oberti Oberto Ongaro Orlando Orsini Pace Padovan Padovano Pagani Pagano Palladino Palmisano Palumbo Panzavecchia Parisi Parma Parodi Parri Parrino Passerini Pastore Paternoster Pavesi Pavone Pavoni Pecora Pedrotti Pellegrino Perugia Pesaresi Pesaro Pesce Petri Pherigo Piazza Piccirillo Piccoli Pierno Pietri Pini Piovene Piraino Pisani Pittaluga Poggi Poggio Poletti Pontecorvo Portelli Porto Portoghese Potenza Pozzi Profeta Prosdocimi Provenza Provenzano Pugliese Quaranta Quattrocchi 
Ragno Raimondi Rais Rana Raneri Rao Rapallino Ratti Ravenna Ré Ricchetti Ricci Riggi Righi Rinaldi Riva Rizzo Robustelli Rocca Rocchi Rocco Roma Roma Romagna Romagnoli Romano Romano Romero Roncalli Ronchi Rosa Rossi Rossini Rotolo Rovigatti Ruggeri Russo Rustici Ruzzier Sabbadin Sacco Sala Salomon Salucci Salvaggi Salvai Salvail Salvatici Salvay Sanna Sansone Santini Santoro Sapienti Sarno Sarti Sartini Sarto Savona Scarpa Scarsi Scavo Sciacca Sciacchitano Sciarra Scordato Scotti Scutese Sebastiani Sebastino Segreti Selmone Selvaggio Serafin Serafini Serpico Sessa Sgro Siena Silvestri Sinagra Sinagra Soldati Somma Sordi Soriano Sorrentino Spada Spanò Sparacello Speziale Spini Stabile Stablum Stilo Sultana Tafani Tamàro Tamboia Tanzi Tarantino Taverna Tedesco Terranova Terzi Tessaro Testa Tiraboschi Tivoli Todaro Toloni Tornincasa Toselli Tosetti Tosi Tosto Trapani Traversa Traversi Traversini Traverso Trucco Trudu Tumicelli Turati Turchi Uberti Uccello Uggeri Ughi Ungaretti Ungaro Vacca Vaccaro Valenti Valentini Valerio Varano Ventimiglia Ventura Verona Veronesi Vescovi Vespa Vestri Vicario Vico Vigo Villa Vinci Vinci Viola Vitali Viteri Voltolini Zambrano Zanetti Zangari Zappa Zeni Zini Zino Zunino ================================================ FILE: data/names/Japanese.txt ================================================ Abe Abukara Adachi Aida Aihara Aizawa Ajibana Akaike Akamatsu Akatsuka Akechi Akera Akimoto Akita Akiyama Akutagawa Amagawa Amaya Amori Anami Ando Anzai Aoki Arai Arakawa Araki Arakida Arato Arihyoshi Arishima Arita Ariwa Ariwara Asahara Asahi Asai Asano Asanuma Asari Ashia Ashida Ashikaga Asuhara Atshushi Ayabito Ayugai Baba Baisotei Bando Bunya Chiba Chikamatsu Chikanatsu Chino Chishu Choshi Daishi Dan Date Dazai Deguchi Deushi Doi Ebina Ebisawa Eda Egami Eguchi Ekiguchi Endo Endoso Enoki Enomoto Erizawa Eto Etsuko Ezakiya Fuchida Fugunaga Fujikage Fujimaki Fujimoto Fujioka Fujishima Fujita Fujiwara Fukao Fukayama Fukuda Fukumitsu Fukunaka 
Fukuoka Fukusaku Fukushima Fukuyama Fukuzawa Fumihiko Funabashi Funaki Funakoshi Furusawa Fuschida Fuse Futabatei Fuwa Gakusha Genda Genji Gensai Godo Goto Gushiken Hachirobei Haga Hagino Hagiwara Hama Hamacho Hamada Hamaguchi Hamamoto Hanabusa Hanari Handa Hara Harada Haruguchi Hasegawa Hasekura Hashimoto Hasimoto Hatakeda Hatakeyama Hatayama Hatoyama Hattori Hayakawa Hayami Hayashi Hayashida Hayata Hayuata Hida Hideaki Hideki Hideyoshi Higashikuni Higashiyama Higo Higoshi Higuchi Hike Hino Hira Hiraga Hiraki Hirano Hiranuma Hiraoka Hirase Hirasi Hirata Hiratasuka Hirayama Hiro Hirose Hirota Hiroyuki Hisamatsu Hishida Hishikawa Hitomi Hiyama Hohki Hojo Hokusai Honami Honda Hori Horigome Horigoshi Horiuchi Horri Hoshino Hosokawa Hosokaya Hotate Hotta Hyata Hyobanshi Ibi Ibu Ibuka Ichigawa Ichihara Ichikawa Ichimonji Ichiro Ichisada Ichiyusai Idane Iemochi Ienari Iesada Ieyasu Ieyoshi Igarashi Ihara Ii Iida Iijima Iitaka Ijichi Ijiri Ikeda Ikina Ikoma Imada Imagawa Imai Imaizumi Imamura Imoo Ina Inaba Inao Inihara Ino Inoguchi Inokuma Inoue Inouye Inukai Ippitsusai Irie Iriye Isayama Ise Iseki Iseya Ishibashi Ishida Ishiguro Ishihara Ishikawa Ishimaru Ishimura Ishinomori Ishiyama Isobe Isoda Isozaki Itagaki Itami Ito Itoh Iwahara Iwahashi Iwakura Iwasa Iwasaki Izumi Jimbo Jippensha Jo Joshuya Joshuyo Jukodo Jumonji Kada Kagabu Kagawa Kahae Kahaya Kaibara Kaima Kajahara Kajitani Kajiwara Kajiyama Kakinomoto Kakutama Kamachi Kamata Kaminaga Kamio Kamioka Kamisaka Kamo Kamon Kan Kanada Kanagaki Kanegawa Kaneko Kanesaka Kano Karamorita Karube Karubo Kasahara Kasai Kasamatsu Kasaya Kase Kashiwagi Kasuse Kataoka Katayama Katayanagi Kate Kato Katoaka Katsu Katsukawa Katsumata Katsura Katsushika Kawabata Kawachi Kawagichi Kawagishi Kawaguchi Kawai Kawaii Kawakami Kawamata Kawamura Kawasaki Kawasawa Kawashima Kawasie Kawatake Kawate Kawayama Kawazu Kaza Kazuyoshi Kenkyusha Kenmotsu Kentaro Ki Kido Kihara Kijimuta Kijmuta Kikkawa Kikuchi Kikugawa Kikui Kikutake Kimio Kimiyama 
Kimura Kinashita Kinoshita Kinugasa Kira Kishi Kiski Kita Kitabatake Kitagawa Kitamura Kitano Kitao Kitoaji Ko Kobayashi Kobi Kodama Koga Kogara Kogo Koguchi Koiso Koizumi Kojima Kokan Komagata Komatsu Komatsuzaki Komine Komiya Komon Komura Kon Konae Konda Kondo Konishi Kono Konoe Koruba Koshin Kotara Kotoku Koyama Koyanagi Kozu Kubo Kubota Kudara Kudo Kuga Kumagae Kumasaka Kunda Kunikida Kunisada Kuno Kunomasu Kuramochi Kuramoto Kurata Kurkawa Kurmochi Kuroda Kurofuji Kurogane Kurohiko Kuroki Kurosawa Kurusu Kusatsu Kusonoki Kusuhara Kusunoki Kuwabara Kwakami Kyubei Maeda Maehata Maeno Maita Makiguchi Makino Makioka Makuda Marubeni Marugo Marusa Maruya Maruyama Masanobu Masaoka Mashita Masoni Masudu Masuko Masuno Masuzoe Matano Matokai Matoke Matsuda Matsukata Matsuki Matsumara Matsumoto Matsumura Matsuo Matsuoka Matsura Matsushina Matsushita Matsuya Matsuzawa Mayuzumi Mazaki Mazawa Mazuka Mifune Mihashi Miki Mimasuya Minabuchi Minami Minamoto Minatoya Minobe Mishima Mitsubishi Mitsuharu Mitsui Mitsukuri Mitsuwa Mitsuya Mitzusaka Miura Miwa Miyagi Miyahara Miyajima Miyake Miyamae Miyamoto Miyazaki Miyazawa Miyoshi Mizoguchi Mizumaki Mizuno Mizutani Modegi Momotami Momotani Monomonoi Mori Moriguchi Morimoto Morinaga Morioka Morishita Morisue Morita Morri Moto Motoori Motoyoshi Munakata Munkata Muraguchi Murakami Muraoka Murasaki Murase Murata Murkami Muro Muruyama Mushanaokoji Mushashibo Muso Mutsu Nagahama Nagai Nagano Nagasawa Nagase Nagata Nagatsuka Nagumo Naito Nakada Nakadai Nakadan Nakae Nakagawa Nakahara Nakajima Nakamoto Nakamura Nakane Nakanishi Nakano Nakanoi Nakao Nakasato Nakasawa Nakasone Nakata Nakatoni Nakayama Nakazawa Namiki Nanami Narahashi Narato Narita Nataga Natsume Nawabe Nemoto Niijima Nijo Ninomiya Nishi Nishihara Nishikawa Nishimoto Nishimura Nishimuraya Nishio Nishiwaki Nitta Nobunaga Noda Nogi Noguchi Nogushi Nomura Nonomura Noro Nosaka Nose Nozaki Nozara Numajiri Numata Obata Obinata Obuchi Ochiai Ochida Odaka Ogata Ogiwara Ogura Ogyu 
Ohba Ohira Ohishi Ohka Ohmae Ohmiya Oichi Oinuma Oishi Okabe Okada Okakura Okamoto Okamura Okanao Okanaya Okano Okasawa Okawa Okazaki Okazawaya Okimasa Okimoto Okita Okubo Okuda Okui Okuma Okuma Okumura Okura Omori Omura Onishi Ono Onoda Onoe Onohara Ooka Osagawa Osaragi Oshima Oshin Ota Otaka Otake Otani Otomo Otsu Otsuka Ouchi Oyama Ozaki Ozawa Ozu Raikatuji Royama Ryusaki Sada Saeki Saga Saigo Saiki Saionji Saito Saitoh Saji Sakagami Sakai Sakakibara Sakamoto Sakanoue Sakata Sakiyurai Sakoda Sakubara Sakuraba Sakurai Sammiya Sanda Sanjo Sano Santo Saromi Sarumara Sasada Sasakawa Sasaki Sassa Satake Sato Satoh Satoya Sawamatsu Sawamura Sayuki Segawa Sekigawa Sekine Sekozawa Sen Senmatsu Seo Serizawa Shiba Shibaguchi Shibanuma Shibasaki Shibasawa Shibata Shibukji Shichirobei Shidehara Shiga Shiganori Shige Shigeki Shigemitsu Shigi Shikitei Shikuk Shima Shimada Shimakage Shimamura Shimanouchi Shimaoka Shimazaki Shimazu Shimedzu Shimizu Shimohira Shimon Shimura Shimuzu Shinko Shinozaki Shinozuka Shintaro Shiokawa Shiomi Shiomiya Shionoya Shiotani Shioya Shirahata Shirai Shiraishi Shirane Shirasu Shiratori Shirokawa Shiroyama Shiskikura Shizuma Shobo Shoda Shunji Shunsen Siagyo Soga Sohda Soho Soma Someya Sone Sonoda Soseki Sotomura Suenami Sugai Sugase Sugawara Sugihara Sugimura Sugisata Sugita Sugitani Sugiyama Sumitimo Sunada Suzambo Suzuki Tabuchi Tadeshi Tagawa Taguchi Taira Taka Takabe Takagaki Takagawa Takagi Takahama Takahashi Takaki Takamura Takano Takaoka Takara Takarabe Takashi Takashita Takasu Takasugi Takayama Takecare Takeda Takei Takekawa Takemago Takemitsu Takemura Takenouchi Takeshita Taketomo Takeuchi Takewaki Takimoto Takishida Takishita Takizawa Taku Takudo Takudome Tamazaki Tamura Tamuro Tanaka Tange Tani Taniguchi Tanizaki Tankoshitsu Tansho Tanuma Tarumi Tatenaka Tatsuko Tatsuno Tatsuya Tawaraya Tayama Temko Tenshin Terada Terajima Terakado Terauchi Teshigahara Teshima Tochikura Togo Tojo Tokaji Tokuda Tokudome Tokuoka Tomika Tomimoto Tomioka 
Tommii Tomonaga Tomori Tono Torii Torisei Toru Toshishai Toshitala Toshusai Toyama Toyoda Toyoshima Toyota Toyotomi Tsubouchi Tsucgimoto Tsuchie Tsuda Tsuji Tsujimoto Tsujimura Tsukada Tsukade Tsukahara Tsukamoto Tsukatani Tsukawaki Tsukehara Tsukioka Tsumemasa Tsumura Tsunoda Tsurimi Tsuruga Tsuruya Tsushima Tsutaya Tsutomu Uboshita Uchida Uchiyama Ueda Uehara Uemura Ueshima Uesugi Uetake Ugaki Ui Ukiyo Umari Umehara Umeki Uno Uoya Urogataya Usami Ushiba Utagawa Wakai Wakatsuki Watabe Watanabe Watari Watnabe Watoga Yakuta Yamabe Yamada Yamagata Yamaguchi Yamaguchiya Yamaha Yamahata Yamakage Yamakawa Yamakazi Yamamoto Yamamura Yamana Yamanaka Yamanouchi Yamanoue Yamaoka Yamashita Yamato Yamawaki Yamazaki Yamhata Yamura Yanagawa Yanagi Yanagimoto Yanagita Yano Yasuda Yasuhiro Yasui Yasujiro Yasukawa Yasutake Yoemon Yokokawa Yokoyama Yonai Yosano Yoshida Yoshifumi Yoshihara Yoshikawa Yoshimatsu Yoshinobu Yoshioka Yoshitomi Yoshizaki Yoshizawa Yuasa Yuhara Yunokawa ================================================ FILE: data/names/Korean.txt ================================================ Ahn Baik Bang Byon Cha Chang Chi Chin Cho Choe Choi Chong Chou Chu Chun Chung Chweh Gil Gu Gwang Ha Han Ho Hong Hung Hwang Hyun Jang Jeon Jeong Jo Jon Jong Jung Kang Kim Ko Koo Ku Kwak Kwang Lee Li Lim Ma Mo Moon Nam Ngai Noh Oh Pae Pak Park Ra Rhee Rheem Ri Rim Ron Ryom Ryoo Ryu San Seo Seok Shim Shin Shon Si Sin So Son Song Sook Suh Suk Sun Sung Tsai Wang Woo Yang Yeo Yeon Yi Yim Yoo Yoon You Youj Youn Yu Yun ================================================ FILE: data/names/Polish.txt ================================================ Adamczak Adamczyk Andrysiak Auttenberg Bartosz Bernard Bobienski Bosko Broż Brzezicki Budny Bukoski Bukowski Chlebek Chmiel Czajka Czajkowski Dubanowski Dubicki Dunajski Dziedzic Fabian Filipek Filipowski Gajos Gniewek Gomolka Gomulka Gorecki Górka Górski Grzeskiewicz Gwozdek Jagoda Janda Janowski Jaskolski Jaskulski Jedynak Jelen Jez Jordan Kaczka 
Kaluza Kamiński Kasprzak Kava Kedzierski Kijek Klimek Kosmatka Kowalczyk Kowalski Koziol Kozlow Kozlowski Krakowski Król Kumiega Lawniczak Lis Majewski Malinowski Maly Marek Marszałek Maslanka Mencher Miazga Michel Mikolajczak Mozdzierz Niemczyk Niemec Nosek Nowak Pakulski Pasternack Pasternak Paszek Piatek Piontek Pokorny Poplawski Róg Rudaski Rudawski Rusnak Rutkowski Sadowski Salomon Serafin Sienkiewicz Sierzant Sitko Skala Slaski Ślązak Ślusarczyk Ślusarski Smolák Sniegowski Sobol Sokal Sokolof Sokoloff Sokolofsky Sokolowski Sokolsky Sówka Stanek Starek Stawski Stolarz Szczepanski Szewc Szwarc Szweda Szwedko Walentowicz Warszawski Wawrzaszek Wiater Winograd Winogrodzki Wojda Wojewódka Wojewódzki Wronski Wyrick Wyrzyk Zabek Zawisza Zdunowski Zdunowski Zielinski Ziemniak Zientek Żuraw ================================================ FILE: data/names/Portuguese.txt ================================================ Abreu Albuquerque Almeida Alves Araújo Araullo Barros Basurto Belo Cabral Campos Cardozo Castro Coelho Costa Crespo Cruz D'cruz D'cruze Delgado De santigo Duarte Estéves Fernandes Ferreira Ferreiro Ferro Fonseca Franco Freitas Garcia Gaspar Gomes Gouveia Guerra Henriques Lobo Machado Madeira Magalhães Maria Mata Mateus Matos Medeiros Melo Mendes Moreno Nunes Palmeiro Paredes Pereira Pinheiro Pinho Ramires Ribeiro Rios Rocha Rodrigues Romão Rosario Salazar Santana Santiago Santos Serafim Silva Silveira Simões Soares Souza Torres Vargas Ventura ================================================ FILE: data/names/Russian.txt ================================================ Ababko Abaev Abagyan Abaidulin Abaidullin Abaimoff Abaimov Abakeliya Abakovsky Abakshin Abakumoff Abakumov Abakumtsev Abakushin Abalakin Abalakoff Abalakov Abaleshev Abalihin Abalikhin Abalkin Abalmasoff Abalmasov Abaloff Abalov Abamelek Abanin Abankin Abarinoff Abarinov Abasheev Abashev Abashidze Abashin Abashkin Abasov Abatsiev Abaturoff Abaturov Abaza Abaziev Abbakumov Abbakumovsky Abbasov 
Abdank-Kossovsky Abdeev Abdildin Abdrahimoff Abdrahimov Abdrahmanoff Abdrahmanov Abdrakhimoff Abdrakhimov Abdrakhmanoff Abdrakhmanov Abdrashitoff Abdrashitov Abdrazakoff Abdrazakov Abdulaev Abdulatipoff Abdulatipov Abdulazizoff Abdulazizov Abdulbasiroff Abdulbasirov Abdulbekoff Abdulbekov Abdulgapuroff Abdulgapurov Abdulgaziev Abdulhabiroff Abdulhabirov Abdulin Abdulkadyroff Abdulkadyrov Abdulkhabiroff Abdulkhabirov Abdulladjanov Abdulladzhanoff Abdulladzhanov Abdullaev Abdullin Abduloff Abdulov Abdulrahmanoff Abdulrahmanov Abdulrakhmanoff Abdulrakhmanov Abdurahmanoff Abdurahmanov Abdurakhmanoff Abdurakhmanov Abegyan Abel Abeldyaev Abelev Abelman Abelmazoff Abelmazov Abels Abelsky Abeltsev Abelyan Aberson Abertasov Abesadze Abezgauz Abgaryan Abibulaev Abidoff Abidov Abih Abikh Abisaloff Abisalov Abitoff Abitov Abjaliloff Abjalilov Abkin Ablaev Ablesimoff Ablesimov Abletsoff Abletsov Ableuhoff Ableuhov Ableukhoff Ableukhov Abloff Ablov Ablyakimoff Ablyakimov Ablyazov Aboev Aboff Aboimoff Aboimov Abolihin Abolikhin Abolin Abolins Abov Abovin Abovyan Aboyantsev Abragam Abragamson Abrahimoff Abrahimov Abrajevich Abrakhimoff Abrakhimov Abramchikoff Abramchikov Abramchuk Abrameitsev Abramenko Abramenkoff Abramenkov Abramkoff Abramkov Abramoff Abramov Abramovich Abramovitch Abramovsky Abramowich Abramowitch Abramowsky Abramson Abramtchikoff Abramtchikov Abramtchuk Abramtsev Abramyan Abraroff Abrarov Abrashin Abrashitov Abrasimoff Abrasimov Abrazhevich Abrikosoff Abrikosov Abrosimoff Abrosimov Abroskin Abrosoff Abrosov Abrukov Absalyamoff Absalyamov Absattaroff Absattarov Abubakiroff Abubakirov Abubekeroff Abubekerov Abudihin Abudikhin Abugoff Abugov Abuhoff Abuhov Abukhoff Abukhov Abuladze Abulgatin Abulhanoff Abulhanov Abulkhanoff Abulkhanov Abulmambetoff Abulmambetov Abushenko Abutaliev Abuzoff Abuzov Abylgaziev Abyshev Abyzgiddin Abyzoff Abyzov Abzaev Abzgildin Abzhaliloff Abzhalilov Abzyaparoff Abzyaparov Adabash Adabashian Adabir Adadurov Adaikin Adaksin Adam 
Adamenko Adamiants Adamishin Adamoff Adamov Adamovich Adamovitch Adams Adamski Adamsky Adamson Adamyan Adamyants Adamyuk Adarchenko Adaryukov Adashev Adashevski Adashevsky Adashik Adelfinski Adelfinsky Adelgeim Adelhanoff Adelhanov Adelhanyan Adelkhanoff Adelkhanov Adelkhanyan Adelson Adelung Aden Ader Aderihin Aderikhin Aderkas Adibekoff Adibekov Adiev Adigamoff Adigamov Adiloff Adilov Adjaloff Adjalov Adjemoff Adjemov Adjemyan Adjubei Adler Adlerberg Adleroff Adlerov Admakin Admoni Adno Ado Adodin Adoduroff Adodurov Adoff Adohin Adokhin Adolf Adomaitis Adoniev Adonts Adoratski Adoratsky Adov Adriankin Adrianoff Adrianov Adriyanoff Adriyanov Adroff Adrov Aduloff Adulov Adushkin Adyan Adylov Adyrhaev Adyrkhaev Adzhaloff Adzhalov Adzhemoff Adzhemov Adzhemyan Adzhubei Aedonitsky Agababoff Agababov Agababyan Agabekoff Agabekov Agadjanoff Agadjanov Agadjanyan Agadzhanoff Agadzhanov Agadzhanyan Agaev Agafonoff Agafonov Agahanyan Agaigeldiev Agakhanyan Agakoff Agakov Agalakoff Agalakov Agalaradze Agalaroff Agalarov Agaloff Agalov Agaltsoff Agaltsov Agamiroff Agamirov Agamirzyan Agamoff Agamov Aganbegyan Aganoff Aganov Agapeev Agaphonoff Agaphonov Agapiev Agapitoff Agapitov Agapkin Agapochkin Agapoff Agaponoff Agaponov Agapotchkin Agapov Agarev Agarin Agarkoff Agarkov Agaryshev Agasaroff Agasarov Agashin Agatoff Agatov Agatyev Agayan Agayants Agdaroff Agdarov Ageenko Ageenkov Ageev Ageevets Ageichev Ageichik Ageikin Ageitchev Ageitchik Agenosoff Agenosov Ageshin Aggeev Agibaloff Agibalov Agilera Agin Agishev Agitshtein Aglinskas Agliullin Agnivtsev Agoev Agol Agoshkoff Agoshkov Agrachev Agramoff Agramov Agranat Agranenko Agranoff Agranov Agranovich Agranovitch Agranovski Agranovsky Agranowich Agranowitch Agranowski Agranowsky Agrashev Agratchev Agratin Agrba Agrenev Agrest Agrikoff Agrikov Agroskin Agudoff Agudov Agulian Agulnik Agumaa Agureev Agurski Agursky Agutin Aguzaroff Aguzarov Agzamoff Agzamov Aivazovski Aivazovsky Ajaev Ajiganoff Ajiganov Ajinoff Ajinov Ajnikoff 
Ajnikov Ajogin Akimov Albanov Albats Albedinsky Albert Albertini Albinesku Albitsky Albov Alchangyan Alcheka Alchevsky Alchin Alchubaev Alferaki Alferiev Alferov Alfimov Alfionov Alfonsky Alfonsov Alftan Alhimenko Alhimov Alianaki Alianov Alkov Alkvist Alman Almedingen Almetiev Almetov Almondinov Almuhametov Almut Almyashkin Alper Alperovich Alpert Alshansky Alshevsky Alshibaya Alshits Alshtut Alsky Altentaller Alter Altfater Altman Altshtein Altshuler Altshuller Alybin Alymov Alypov Alyrchikov Alytsky Amelin Amelkin Amelyakin Amerhanov Amet-Han Ametistov Andreenko Andreev Andreevsky Andreichenko Andreichev Andreichik Andreichin Andreichuk Andreiko Andreli Andreyak Andreyanov Androhanov Androkhanov Andronchik Andronikov Andronnikov Andronov Andropov Androsenko Androsik Androsov Androsyuk Androvsky Andruhov Andruhovich Andrukhov Andrukhovich Andruschenko Andrusenko Andrushkevich Andrushko Andrusiv Andrusiw Andrusov Andruzsky Andryuhin Andryuk Andryukov Andryunin Andryuschenko Andryushin Anedchenko Anekshtein Anert Anikanov Anikeev Anikiev Anikin Anikst Anikushin Animitsa Anin Anipkin Anisemenok Anisfeld Anisihin Anisikhin Anisimkin Anisimov Aniskin Anisovets Anisovich Anistratenko Anodin Anofriev Anoprienko Anopriev Anorin Anoskov Anosov Antohin Antonchenko Antonchenkov Antonts Antontsev Antonyuk Antopolsky Antoschenko Antoschin Antoshevsky Antoshin Antoshkin Antropov Antufiev Antushevsky Antyshev Antyufeev Antyuganov Antyuhov Antyushin Anuchin Anufrienko Anufriev Anuprienko Anuriev Anurin Anurov Anutriev Anzimirov Anzonger Aparin Arapov Araslanov Arbudu Arbuzov Arsky Artemev Artemiev Artenov Artibyakin Artischev Artizov Artobolevsky Artseulov Artyuhin Artyuhov Artyukhin Artyukhov Artyushin Artyushkov Asfandiyarov Astrahankin Astrahansky Astrahantsev Astrakhankin Astrakhansky Astrakhantsev Astratov Astronomov Astrov Astsaturov Astyrev Asylmuratov At'Kov Atabekov Atabekyan Atabiev Ataev Atajahov Atajakhov Atalian Atalikov Atallahanov Atallakhanov Atamanchuk 
Atamanenko Atamanov Atamanyuk Atamoglanov Atanasyan Atanov Atarskih Atarskikh Atazhahov Atazhakhov Ateev Atepko Atiskov Atlanov Atlantov Atlas Atlasov Atopov Atramov Atroshenko Atvilov Atyashev Atyashkin Atyasov Atyurievsky Atyushov Auerbach Auerbah Auerbakh August Augustoff Augustov Auktsionek Aulov Aurov Aushev Auslender Autlev Auzan Avaev Avagimoff Avagimov Avak'Yan Avakoff Avakov Avakshin Avakyan Avaliani Avalishvili Avalov Avalyan Avanesov Avanesyan Avash Avatyan Avchenko Avchinnikov Avdakoff Avdakov Avdeeff Avdeenko Avdeev Avdeichikov Avdienko Avdiev Avdievsky Avdiewski Avdiyants Avdiyski Avdiysky Avdonin Avdoshin Avduevsky Avduewski Avduloff Avdulov Avdyukov Avdyunin Avdyushin Avelan Avelichev Avelitchev Aven Avenarius Averbah Averbakh Averbuch Averbuh Averbukh Averchenko Averchev Averianoff Averianov Averichkin Averin Averintsev Averitchkin Averkiev Averkin Averkoff Averkov Averkovich Averkovitch Averochkin Averotchkin Avertchenko Avertchev Averyanov Avetisov Avetisyan Avetyan Avgustoff Avgustov Avhadiev Avhimovich Avhimovitch Avik Avilkin Avilov Avinov Avinovitski Avinovitsky Avkhadiev Avkhimovich Avkhimovitch Avksentiev Avksentievski Avksentievsky Avladeev Avlov Avlukov Avraamov Avramchik Avramenko Avramov Avramtchik Avranek Avrorin Avrorov Avrov Avrus Avrutin Avrutsky Avryasov Avseenko Avsenev Avsyuk Avtaev Avtamonov Avtandilov Avtchenko Avtchinnikov Avtokratov Avtomovich Avtomovitch Avtonomov Avtorhanov Avtorkhanov Avtsin Avtsyn Avtuhov Avtukhov Avturhanov Avturkhanov Avvakumoff Avvakumov Avzalov Awaeff Awagimoff Awak'Yan Awakoff Awakshin Awakyan Awaliani Awalishwili Awaloff Awalyan Awanesov Awanesyan Awash Awatyan Awchenko Awchinnikoff Awdakoff Awdeeff Awdeenko Awdeichikoff Awdieff Awdienko Awdiewsky Awdiyants Awdiyski Awdiysky Awdonin Awdoshin Awduewski Awduewsky Awduloff Awdyukoff Awdyunin Awdyushin Awelan Awelicheff Awelitcheff Awen Awenarius Awerbah Awerbakh Awerbuh Awerbukh Awercheff Awerchenko Awerianoff Awerichkin Awerin Awerintsev Aweritchkin 
Awerkieff Awerkin Awerkoff Awerkowich Awerkowitch Awerochkin Awerotchkin Awertcheff Awertchenko Aweryanoff Awetisoff Awetisyan Awetyan Awgustoff Awhadieff Awhimowich Awik Awilkin Awiloff Awinoff Awinowitski Awinowitsky Awkhadieff Awkhimovich Awkhimovitch Awksentiev Awksentiewski Awksentiewsky Awladeeff Awloff Awlukoff Awraamoff Awramchik Awramenko Awramoff Awramtchik Awranek Awroff Awrorin Awroroff Awrus Awrutin Awrutsky Awryasoff Awseenko Awseneff Awsyuk Awtaeff Awtamonoff Awtandiloff Awtchenko Awtchinnikoff Awtokratoff Awtomovich Awtomovitch Awtonomoff Awtorhanoff Awtorkhanoff Awtsin Awtsyn Awtuhoff Awtukhoff Awturhanoff Awturkhanoff Awwakumoff Awzaloff Azhaev Azhiganoff Azhiganov Azhinoff Azhinov Azhnikoff Azhnikov Azhogin Babadei Babadjan Babadjanoff Babadjanov Babadjanyan Babadzhan Babadzhanoff Babadzhanov Babadzhanyan Babaev Babaevsky Babahanov Babaitsev Babak Babakhanoff Babakhanov Babakin Babakov Babakulov Baban Babanin Babanoff Babanov Babansky Babarin Babarykin Babashoff Babashov Babaskin Babayan Babayants Babchenko Babel Babenchikoff Babenchikov Babenko Babenkoff Babenkov Babentsev Babenyshev Babeshkin Babeshko Babetoff Babetov Babich Babichenko Babichev Babienko Babikoff Babikov Babilyas Babin Babinich Babinoff Babinov Babintsev Babitsky Babiy Babkeev Babkin Babkoff Babkov Babloev Bablumyan Babochkin Baboshin Baboshkin Babosoff Babosov Babst Babuh Babuhin Babukh Babukhin Baburin Baburkin Baburoff Baburov Babusenko Babushkin Babutski Babutsky Babynin Babyuk Bachaev Bachaldin Bachelis Bacherikoff Bacherikov Bachev Bachilo Bachinski Bachinsky Bachish Bachmanoff Bachmanov Bachuk Bachurin Bachyanskas Badaev Badalbeili Badalov Badalyants Badamshin Badanin Badanoff Badanov Badelin Bader Baderski Baderskoff Baderskov Badersky Badeschenkov Badich Badikoff Badikov Badmaev Badoev Badoff Badov Badridze Badukin Badyaev Badyagin Badyashin Badych Badygin Badyin Badykshanoff Badykshanov Badylkin Badyunoff Badyunov Baer Baev Baevski Baevsky Baewski Baewsky Bag Bagachev 
Bagaev Bagai-Ool Bagalei Bagalin Bagandaliev Bagaryakoff Bagaryakov Bagaryatsky Bagashev Bagaturiya Bagautdinov Bagdasaroff Bagdasarov Bagdasaryan Bagdatiev Baggovut Bagimoff Bagimov Bagin Baginoff Baginov Bagiroff Bagirov Bagishaev Bagishvili Baglaenko Baglai Baglanoff Baglanov Bagler Bagmet Bagmevski Bagmevsky Bagmewski Bagmewsky Bagmut Bagomaev Bagrak Bagramoff Bagramov Bagramyan Bagration Bagretsoff Bagretsov Bagrich Bagrintsev Bagritch Bagroff Bagrov Bagryanski Bagryansky Bagryantsev Bah Bahanoff Bahanov Baharev Bahchivandji Bahchivandzhi Baheloff Bahelov Bahin Bahir Bahlaev Bahlulzade Bahmat Bahmatoff Bahmatov Bahmetev Bahmetiev Bahmetoff Bahmetov Bahmin Bahmutoff Bahmutov Bahmutsky Baholdin Bahorin Bahovkin Bahovtsev Bahrah Bahrushin Bahshiev Bahtadze Bahtchivandji Bahtchivandzhi Bahtiaroff Bahtiarov Bahtiev Bahtigareev Bahtin Bahtinoff Bahtinov Bahtiyaroff Bahtiyarov Bahtizin Bahtoff Bahtov Bahturin Bahurin Bahusov Bahuta Bahvaloff Bahvalov Baibakoff Baibakov Baibikoff Baibikov Baiborodoff Baiborodov Baiburin Baiburski Baibursky Baiburtyan Baichenko Baichikoff Baichikov Baichoroff Baichorov Baidachny Baidak Baidakoff Baidakov Baidalin Baidavletoff Baidavletov Baidin Baidjanoff Baidjanov Baidukoff Baidukov Baidyuk Baidzhanoff Baidzhanov Baier Baigildeev Baigozin Baiguloff Baigulov Baigushev Baiguzin Baiguzoff Baiguzov Baikaloff Baikalov Baikin Baikin Baikoff Baikov Baikovski Baikovsky Baikowski Baikowsky Baimakoff Baimakov Baimiev Bair Bairak Bairamkuloff Bairamkulov Bairamukoff Bairamukov Bairashevski Bairashevsky Bairashewski Bairashewsky Bairov Baisak Baisaroff Baisarov Baiseitoff Baiseitov Baishev Baistryuchenko Baistryutchenko Baitalsky Baitchenko Baitchikoff Baitchikov Baitchoroff Baitchorov Baiteryakoff Baiteryakov Baitin Baitoff Baitov Bajaev Bajan Bajanoff Bajanov Bajenin Bajenoff Bajenov Bajev Bajin Bajinoff Bajinov Bajoff Bajov Bajukoff Bajukov Bajutkin Bak Baka Bakadoroff Bakadorov Bakaev Bakai Bakaleiko Bakaleinik Bakaleinikoff Bakaleinikov 
Bakalinsky Bakaloff Bakalov Bakanchuk Bakanoff Bakanov Bakastoff Bakastov Bakatin Bakeev Bakerkin Bakh Bakhanoff Bakhanov Bakharev Bakhchivandji Bakhchivandzhi Bakheloff Bakhelov Bakhin Bakhir Bakhlaev Bakhlulzade Bakhmat Bakhmatoff Bakhmatov Bakhmetev Bakhmetiev Bakhmetoff Bakhmetov Bakhmin Bakhmutoff Bakhmutov Bakhmutski Bakhmutsky Bakholdin Bakhorin Bakhovkin Bakhovtsev Bakhrakh Bakhrushin Bakhshiev Bakhtadze Bakhtchivandji Bakhtchivandzhi Bakhtiaroff Bakhtiarov Bakhtiev Bakhtigareev Bakhtin Bakhtinoff Bakhtinov Bakhtiyaroff Bakhtiyarov Bakhtizin Bakhtoff Bakhtov Bakhturin Bakhurin Bakhusoff Bakhusov Bakhuta Bakhvaloff Bakhvalov Bakiev Bakihanoff Bakihanov Bakikhanoff Bakikhanov Bakin Bakinoff Bakinov Bakiroff Bakirov Bakis Bakitski Bakitsky Bakkarevich Bakkarevitch Baklagin Baklan Baklanoff Baklanov Baklashoff Baklashov Baklastoff Baklastov Baklund Baklykoff Baklykov Bakmeister Bakoff Bakoni Bakotin Bakov Bakradze Bakrymoff Bakrymov Baksaraev Bakshandaev Bakshanski Bakshansky Baksheev Bakshtanovski Bakshtanovsky Bakshtanowski Bakshtanowsky Bakshtein Bakst Bakulev Bakulin Bakum Bakun Bakunin Bakunoff Bakunov Bakunovets Bakunts Bakuridze Bakurinsky Bakuroff Bakurov Bakushinsky Bakusoff Bakusov Balabaev Balabai Balaban Balabanoff Balabanov Balabas Balabko Balabolkin Balabudkin Balabuev Balabuha Balabukha Balaev Balagul Balagula Balaguroff Balagurov Balahnin Balahonov Balahonsky Balahontsev Balahovski Balahovsky Balahowski Balahowsky Balakaev Balakhnin Balakhonoff Balakhonov Balakhonsky Balakhontsev Balakhovski Balakhovsky Balakhowski Balakhowsky Balakin Balakirev Balakleevski Balakleevsky Balakshin Balalaev Balamutenko Balamykin Balanchivadze Balanda Balandin Balandyuk Balanev Balanovski Balanovsky Balanowski Balanowsky Balarev Balasanyan Balashev Balashoff Balashov Balasoglo Balavensky Balavin Balawensky Balawin Balayan Balazovski Balazovsky Balazowski Balazowsky Barabanov Barabanschikov Barabash Barabashev Barabolya Baraboshkin Barakin Barakov Barakovsky Baraks 
Baram Baramidze Barandych Baranenko Baranetsky Barankin Barannikov Barazbiev Barazgov Bas'Holov Bashkov Basov Basovsky Bass Bassin Bastanov Bastian Basto Bastrygin Basygysov Basyrov Basyuk Batchaev Batchaldin Batchelis Batcherikov Batchev Batchilo Batchinsky Batchish Batchmanoff Batchmanov Batchuk Batchurin Batchyanskas Batsanov Batsev Batsevich Batskaev Batsman Batsura Batsyn Batuev Batugin Batuhtin Batukov Batunov Batura Baturin Baturkin Baturov Bauer Baukin Baulin Baum Bauman Baumgarten Baushev Bausov Bautin Bauze Bavarin Bavidoff Bavidov Bavilin Bavin Bavtrukevich Bavtrukevitch Bavykin Bawarin Bawidov Bawilin Bawin Bawtrukevich Bawtrukevitch Bawvykin Bazaev Bazai Bazanov Bazarbaev Bazarevich Bazarhandaev Bazarov Bazen Bazetskov Bazhaev Bazhan Bazhanov Bazhenin Bazhenov Bazhev Bazhin Bazhinov Bazhov Bazhukov Bazhutkin Bazikov Bazil Bazilev Bazilevich Bazilevitch Bazilevsky Bazili Baziner Bazjin Bazovski Bazovsky Bazowski Bazowsky Bazulev Bazulin Bazunov Bazylev Bazylnikov Bazyuta Bazzhin Beh Behmetiev Behoev Behteev Behtenev Behterev Behtin Behtold Bei-Bienko Beider Beilin Beilis Beilshtein Beiman Bein Beinenson Beizerov Bekh Bekhmetiev Bekhoev Bekhteev Bekhtenev Bekhterev Bekhtin Bekhtold Bekk Bekkarevich Bekker Beklemeshev Beklemischev Beklemishev Beklenischev Bekleshev Bekleshov Beklov Bekmahanov Bekman Bekmurzov Beknazar-Yuzbashev Bekoryukov Bekov Bekovich-Cherkassky Bekrenev Bekshansky Bekshtrem Bektabegov Bektemirov Bektimirov Bektuganov Bekuh Bekyashev Belbaev Belchenko Belchenkov Belchikov Belchuk Beldy Belgibaev Belgov Belich Belichenko Belichev Belik Belikin Belikov Belikovetsky Belikovich Belilovsky Belimov Belin Belinder Belinskij Belinsky Belishko Belitsky Belkov Belman Belnikov Belnov Beloborodov Belobrov Belobrovkin Beloded Belodubrovsky Beloenko Beloglazov Belogolovkin Belogolovy Belogorsky Belogrud Belogubov Belohin Belohvostikov Belokhin Belokhvostikov Belorossov Belorusov Belorussov Beloschin Beloselsky Beloshapka Beloshapkin Beloshapkov 
Beloshitsky Belosludtsev Belosohov Belostotsky Belosvet Belotelov Belotserkovets Belotserkovsky Belotsitsky Belotsvetov Belous Belousko Belousov Belov Belovol Beloyartsev Belshtein Belsky Beltov Beltsev Beltsov Beltyukov Belyaninov Belyavin Belyavsky Belyusov Benevolensky Berezansky Berezin Berezinsky Berezitsky Berezitzky Berezkin Bereznev Bereznevich Bereznikov Bereznitsky Bereznitzky Berezovaya Berezovikov Berezovoi Berezovsky Berezutsky Berezutzky Berezyuk Besschetny Bessogonov Bessonov Bestemyanov Bestolov Bestov Bestujev Bestujev-Lada Bestujev-Ryumin Bestuzhev Bestuzhev-Lada Bestuzhev-Ryumin Bezruchenkov Bezrukavnikov Bezrukih Bezrukikh Bezrukov Bezubyak Bezuglov Bezugly Bezumov Bezusko Bezyazykov Bezyuk Bezyzvestnyh Bezyzvestnykh Bibichev Bibin Bibishev Bibitinsky Bibler Bilalov Bilbasov Bilderling Bildin Bilenkin Bilenko Bilenshtein Bilibin Bilichenko Bilihodze Bilik Bilimovich Bilinsky Biljo Bill Billert Billevich Bilmus Bilonog Bilov Bilyaev Bilyarsky Bilyk Bim Bim-Bad Bimbas Bindyukov Binevich Binshtok Bir Biragov Birentsveig Birger Birich Birilev Birin Birk Birkenberg Birkin Birman Birnbaum Biron Birshtein Birut Biryukov Biryukovich Biryulev Biryulin Biryuzov Bass Bass Chaadaev Chabanov Chabanov Chabrov Chabrov Chadin Chadin Chadov Chadov Chadovich Chadovich Chadrantsev Chadrantsev Chaganov Chagin Chajegov Chajengin Chaldymov Chaleev Chalov Chalovsky Chaly Chalyh Chalykh Chalyshev Chamov Chamushev Chanchikov Changli Chanov Chanturia Chanyshev Chapko Charkin Charnetsky Charnolusky Charoshnikov Chartorijsky Chartorizhsky Charuhin Charukhin Charushin Charushkin Charykov Chazov Cheh Chehanov Cheharin Chehladze Chehlakovsky Chehluev Chehoev Chehonin Chehov Chehovich Chehovsky Chekachev Chekh Chekhanov Chekharin Chekhladze Chekhlakovsky Chekhluev Chekhoev Chekhonin Chekhov Chekhovich Chekhovsky Chekin Chekis Chekletsov Cheklyanov Chekmarev Chekmasov Chekmenev Chekmezov Chekoev Chekomasov Chekonov Chekvin Chepaksin Cheparev Chepasov Chepchyak Chepel Chepelkin 
Chepelyanov Chepik Chepikov Chepin Chepko Cheplakov Chepraga Cheptsov Cheptygmashev Chepulyanis Chepurenko Chepurin Chepurkovsky Chepurnov Chepurnoy Chepurnyh Chepurov Chepygin Cherchen Cherchesov Chernin Chernov Chernovisov Chernovol Cherov Cherpakov Chershintsev Chersky Chertakov Chertischev Chertkov Chertkovsky Chertok Chertolyas Chertorijsky Chertorinsky Chertoritsky Chertorizhsky Chertorogov Chertov Chertushkin Chertykov Cheruhin Cherukhin Cherushov Cheryshev Chevtzoff Chihachev Chihanchin Chijevsky Chijik Chijikov Chijov Chikanov Chikhachev Chikhanchin Chikichev Chikin Chikirev Chikishev Chikomasov Chikov Chikulaev Chikun Chikurov Chikviladze Chizhevsky Chizhik Chizhikov Chizhov Chkhartishvili Chkheidze Chkhenkeli Chkhikvadze Chugaev Chugainov Chugreev Chuguev Chugunov Chuhadjyan Chuhalov Chuhanov Chuharev Chuhin Chuhlomin Chuhlomsky Chuhlov Chuhman Chuhmantsev Chuhnin Chuhnov Chuhnovsky Chuho Chuhonkin Chuhontsev Chuhraev Chuhray Chuhrov Chukhadzhyan Chukhalov Chukhanov Chukharev Chukhin Chukhlomin Chukhlomsky Chukhlov Chukhman Chukhmantsev Chukhnin Chukhnov Chukhnovsky Chukho Chukhonkin Chukhontsev Chukhraev Chukhray Chukhrov Churnosov Chursalov Churshukov Chursin Chursinov Churuksaev Churyukin Chusov Chusovitin Chuta Chutchenko Chutchev Chutchikov Chutko Chuvahin Chuvailov Chuvaldin Chuvanov Chuvashev Chuvashov Chuvatkin Chuvilev Chuvilkin Chuvilo Chuvilyaev Chuvstvin Chuvyrov Chyrgal-Ool Ciurlionis Dabahov Dagaev Dahaev Dahin Dahno Dahnov Dahov Dakhaev Dakhin Dakhno Dakhnov Dakhov Dan'Ko Dan'Shin Danchenko Danchuk Danich Danichenko Danichkin Danilchenko Danilchuk Daniltsev Danilyak Danilyan Danilyuk Danin Danisevich Dankin Dankov Dankuldinets Dannenberg Danshin Dantsig Dantsiger Danyarov Danyukov Danyushevsky Dar'In Dar'Kin Daraev Daragan Darakov Darchiashvili Darchiev Darchinyants Dardyk Dardyrenko Darenkov Darevsky Dargevich Dargomyjsky Darichev Darinsky Darjaev Darkov Darkshevich Darminov Darsigov Darsky Daryalov Dasaev Datdeev Dats Datsenko Daty Daue 
Dauengauer Daugelo Daugule Daugulis Dauman Daunene Daursky Daushev Dautov Dav David Davidchuk Davidenko Davidenkov Davidov Davidovich Davidson Davidyants Davidyuk Davidzon Davitashvili Davlatov Davlertgareev Davletgaraev Davletkildeev Davletov Davletshin Davletyarov Davlyatov Davydchenko Davydchenkov Davydenko Davydenkov Davydkin Davydov Davydovich Defabr Dehanov Dehant Dehtyar Dehtyar Dehtyarenko Dehtyarev Demeshko Demetkin Demetr Demich Demichev Demidenko Demidoff Demidov Demidovich Demihov Demin Deminov Demirchyan Demirhanov Demishev Deniskin Denisov Denisovsky Derchansky Derfel Derfelden Deribas Deribin Deribo Derich Deriglazov Deripaska Derjavets Derjavin Derkach Derkachenko Derkachev Derkovsky Derman Dermelev Dernov Dertynov Derunov Deryabin Deryabkin Deryagin Deryugin Deryujinsky Deryujkov Deryushev Deryuzhinsky Deryuzhkov Derzhavets Derzhavin Deshesko Deshevyh Deshkin Desnitsky Destunis Desyatchikov Desyatkin Desyatkov Desyatnichenko Desyatnikov Desyatov Desyatovsky Desyatskov Detengof Detinko Detkov Detsenko Deulenko Deulin Deyanov Didarov Didenko Diderihs Didevich Didichenko Didigov Didkovsky Didrikil Diduh Didychenko Dienko Diev Digurov Dijbak Dijin Dijur Dik Dikansky Dikarev Dikarevsky Dikih Dikikh Dikolenko Dikov Dikovenko Dikovsky Dikson Dikul Dikusar Dikushin Diky Divaev Divakov Divavin Diveev Divilkovsky Divin Divinets Divnich Divnov Divov Diyajev Dizhbak Dizhin Dizhur Djabrailov Djabruev Djahaya Djahbarov Djakson Djaldjireev Djamaldinov Djanaev Djanakavov Djanashia Djanashiya Djangirli Djanibekov Djankezov Djanumov Djarimov Djatdoev Djatiev Djavahishvili Djejela Djeladze Djelepov Djemal Djemilev Djevetsky Djibladze Djibuti Djigarhanyan Djigit Djikaev Djikovich Djincharadze Djindo Djirin Djisev Djugashvili Djumabaev Djumaev Djumagaliev Djumaniyazov Djunkovsky Djunusov Djura Djuro Djuromsky Dmitrochenko Dmitrov Dmitrovsky Dmohovsky Dmokhovsky Dmuhovsky Dmukhovsky Dneprov Dnishev Dobrajansky Dobreitser Dobrenkov Dobretsky Dobretsov Dobridnyuk Dobrik 
Dobrinsky Dobritsky Dobrivsky Dobriyan Dobrjansky Dobrodeev Dobrohotov Dobrojanov Dobroklonsky Dobrolensky Dobrolyubov Dobromyslov Dobronos Dobronravov Dobropolsky Dobroserdov Dobroslavin Dobrosotsky Dobrotin Dobrotvorsky Dobrotvortsev Dobrov Dobrovolsky Dobrovsky Dobrushin Dobrushkin Dobrusin Dobryakov Dobryansky Dobrynchenko Dobrynin Dobrynsky Dobryshev Dobryshin Dobujinsky Dobulevich Dobuzhinsky Dobychin Dodin Dodolev Dodonov Doev Doga Dogadaev Dogadin Dogadkin Dogadov Dogel Dogilev Dogmarov Dogujiev Doguzov Doich Doikov Doinikov Doino Dojdikov Dojin Donchak Donchenko Dontsov Dopiro Dorofeev Dovator Dovbyschuk Doveiko Dovetov Dovgaev Dovgalev Dovgalevsky Dovgan Dovgel Dovgello Dovgolevsky Dovgopoly Dovgun Dovgusha Dovgyallo Dovjenko Dovjuk Dovladbegyan Dovlatov Dovlatyan Dovnar Dovydenko Dovzhenko Dovzhuk Dozmorov Dozorny Dozortsev Drojdin Drojjin Drojjinov Drozdenko Drozdetsky Drozdkov Drozdov Drozdovsky Dubakin Dubasov Dubatkov Dubatolov Dubelir Dubelt Duben Dubenetsky Dubenkov Dubensky Dubentsov Dubik Dubin Dubina Dubinin Dubinkin Dubinovsky Dubinsky Dubitsky Dubko Dubkoff Dubkov Dublin Dublyansky Dubman Dubnikov Dubnitsky Dubnov Dubnyakov Dubrouski Dubrov Dubrovin Dubrovo Dubrovsky Dubrowski Dubrowsky Dudchik Dudnakov Dudnik Dudnikov Dudochkin Dudorov Dudunov Dudurich Durakov Durasov Durdin Durdyev Durgaryan Durkin Durmanov Durmashkin Durnev Durnopeiko Durnov Durnovo Durnovtsev Duronov Durov Duryagin Durylin Dutikov Dutov Dyachkov Dyachkovsky Dyakov Dyo Dzhabrailov Dzhabruev Dzhahaya Dzhahbarov Dzhakson Dzhaldzhireev Dzhamaldinov Dzhanaev Dzhanakavov Dzhanashia Dzhanashiya Dzhangirli Dzhanibekov Dzhankezov Dzhanumov Dzharimov Dzhatdoev Dzhatiev Dzhavahishvili Dzhavakhishvili Dzheladze Dzhelepov Dzhemal Dzhemilev Dzhevetsky Dzhezhela Dzhibladze Dzhibuti Dzhigarhanyan Dzhigit Dzhikaev Dzhikovich Dzhincharadze Dzhindo Dzhirin Dzhisev Dzhugashvili Dzhumabaev Dzhumaev Dzhumagaliev Dzhumaniyazov Dzhunkovsky Dzhunusov Dzhura Dzhuro Dzhuromsky Eberg Ebergard 
Eberling Eberman Ebers Ebert Ebralidze Ebsvort Ebzeev Efanov Egamberdiev Eganov Eganyan Egarmin Eger Egerev Egershtrom Eggert Egiazarov Egiazaryan Egides Egides Egin Egipko Egishev Egle Eglevsky Eglevsky Egof Egolin Egorenko Egorenkov Egorichev Egorihin Egorin Egorkin Egorov Eidelman Eidelnant Eidelstein Eideman Eides Eidinov Eidlin Eidman Eifman Eig Eigin Eihe Eihenbaum Eihengolts Eihenvald Eihfeld Eihmans Eihvald Eijvertin Eikalovich Eikhe Eikhenbaum Eikhengolts Eikhenvald Eikhfeld Eikhmans Eikhvald Eilenkrig Eiler Eimontov Eindorf Eingorn Eirih Eizen Eizenstein Eizhvertin Ekaterininsky Ekelchik Ekimov Ekin Elachich Elagin Elanchik Elanin Elansky Elapov Elashkin Elatontsev Elebaev Elehin Elenin Elensky Elentuh Elepin Elepov Elesin Eletskih Eletsky Elez Elgin Eliasberg Eliashberg Eliasov Elinson Eliovitch Elisman Eltsin Emanov Emchenko Emelianenko Emelianenkov Emelianov Emelin Emelyantsev Emeshin Emets Emkov Emlin Emohonov Emtsov Emyashev Emyshev En'Ko En'Kov Enchev Enden Endogurov Endolov Endzelin Eneev Enenko Engalychev Engel Engelgard Engelgardt Engelke Engelmeier Engelsberg Engibarov Engman Engver Enik Enikeev Enikolopov Enileev Enin Enman Enner Ennikeev Enns Enohin Ens Enshin Entin Entov Entov Ents Entus Enukidze Enyagin Eremchenko Eremkin Eremushkin Erenkov Erepov Eretsky Eretzky Erjenkov Eroschenko Eroschenkov Eroshenko Eroshevsky Eroshin Eroshkevich Eroshkin Eroshov Eruhimovich Erunov Erusalimchik Erusalimsky Eruzalimchik Erzhenkov Es'Kin Es'Kov Esaulov Esenchuk Esenin Esenkov Esennikov Esikov Esimontovsky Esin Esionov Esipenko Esipov Esipovich Esmansky Esmonsky Estafiev Esyp Evald Evarestov Evdakov Evdokimov Evdoshenko Evelson Eventov Evers Eversman Everstov Evert Evranov Evsiukov Evstafiev Evstafiev Evstifeev Evstigneev Evstratov Evsyukov Evsyutin Fabelinsky Fabr Fabri Fabrichnikov Fabrichnov Fabrichny Fabrikant Faddeev Fadeechev Fadeev Fadin Fadyaev Fadyuhin Fadzaev Faen Favorsky Fazilov Fazleev Fazlov Fazylzyanov Fedchenko Fedchenkov Fedosov Fedotenko 
Fedotiev Fedotkin Fedotko Fedotov Fedotovskih Felkerzam Feschenko Feschuk Filipchenko Filipchuk Filipenko Filipiev Filipkov Filipov Filipovich Filipovsky Filippenko Filippenkov Filippishin Filippkin Filippov Filippovich Fin Finagin Finchuk Finenko Fingrut Finik Finkel Finkelshtein Finkelson Finko Finn Finochkin Finogeev Finogenov Finoshin Finov Finsky Fintiktikov Finyagin Finyutin Fiohin Fiokhin Fionin Fionov Fisichev Fisik Fiskin Fistal Fisun Fofanov Foht Fominov Fomintsev Fominyh Forer Forsh Forshteter Fortov Fortunatov Fortunov Fortygin Fotiadi Fotiev Fotinov Foya Frolandin Frolenkov Frolkov Frolov Frolovsky Froltsov Frolushkin From Froman Fromberg Frontov Froyanov Frukalov Frumin Frumkin Frunze Frush First Gach Gachegov Gachev Gachinsky Gafarov Gafin Gafiyatullin Gaft Gafurov Gaganov Gagarin Gagarinov Gagarinsky Gagemeister Gagen Gagentorn Gagiev Gagin Gagonin Gagrin Gagulin Gaguliya Gagut Galda Galdin Galdus Galeev Galei Galena Galenkov Galenkovsky Galenovich Galepa Galerkin Galetsky Galev Galevko Galevsky Galkin Galkin-Vraskoi Galko Galkov Galkovsky Galkovsky Galkus Gall Gall Gallai Galler Galli Gallinger Gallutdinov Gallyamov Galochkin Galoganov Galstyan Galteev Gansky Gasanov Gaschenkov Gasfort Gashibayazov Gashkin Gashkov Gasho Gasich Gasilin Gasilov Gasinov Gaskoin Gaskov Gasman Gasnikov Gasparov Gasparyan Gaspirovich Gassan Gasselblat Gassiy Gastello Gastev Gastfreind Gasvitsky Gasymov Gasyukov Gatashov Gataullin Gateev Gatiev Gatilov Gatin Gatiyatullin Gatovsky Gatsak Gatsenko Gatsuk Gatsukov Gatsunaev Gatturov Gau Gaubrich Gaudasinsky Gauer Gauk Gaur Gayanov Gayazov Gayulsky Geft Gefter Geftler Gehfenbaum Gehman Geht Gehtman Gerasimov Gerasimovich Gerasimovsky Geroeff Geroev Gerojev Gerschcovich Gershkovich Gershkovitsh Geshtovt Gess Gesse Gessen Gest Gesti Get'Man Geta Getelmaher Getie Getling Getman Getmanchuk Getmanenko Getmanov Gets Getselev Getsen Getsov Getta Getya Gladchenko Gladenkov Gladilin Gladilschikov Gladkih Gladkikh Gladkov Gladky 
Gladshtein Gladston Gladtsin Gladun Gladysh Gladyshev Glagolev Glagolevskii Glagolevsky Glasko Glasov Glavak Glavatskih Glavatsky Glavchev Glavin Glavinsky Glaz Glazachev Glazanov Glazatov Glazaty Glazenap Glaziev Glazkov Glazman Glaznev Glazov Glazovsky Glazunov Glazychev Glazyrin Glebov Glebovich Glebovitsky Gleizer Glek Glezarov Glezer Glezerman Glezmer Glubokovsky Glubotsky Gludin Gluharev Gluhih Gluhman Gluhonkov Gluhotko Gluhov Glukharev Glukhih Glukhman Glukhonkov Glukhotko Glukhov Glumov Gluskin Glusov Glussky Gluz Gluzman Gluzsky Golobokih Golobokov Goloborodko Goloborodov Golochevsky Golofaev Golofastov Golofeev Goloha Golohvastov Gololobov Golomolzin Golomovzy Goloschapov Goloschekin Goloschuk Golosenin Golosenko Goloshov Goloskokov Golosnenko Golosov Golosovker Golostenov Golota Golotik Golotyuk Goloulin Goloushev Goloushin Golov Golovach Golovachev Golovan Golovanchikov Golovanets Golovanov Golovanyov Golovatov Golovatsky Golovaty Golovei Golovenchenko Golovenkin Golovenok Goloveshkin Goloveshko Goncharuk Gorbach Gorbachev Gorbachevsky Gorbenko Gorbikov Gorbman Gorbov Gorbovsky Gorbulin Gorbulsky Gorbunov Gorbunov-Posadov Gorbushin Gorbuzenko Gorchak Gorchakov Gorchakovsky Gorcharenko Gorchatov Gorchilin Gorchinsky Gorchkhanov Gordasevich Gordeenko Gordeev Gordeichik Gordon Gordopolov Gordov Gordusenko Gordyagin Gordyushin Gorfinkel Gorfunkel Gorlov Gorski Gorskih Gorskikh Gorskin Gorskov Gorsky Gorst Gorstkin Gorsun Gortikov Gortyshov Govallo Govendyaev Govoretsky Govorin Govorkov Govorov Govoruhin Govorun Govorushin Govyadin Govyrin Graifer Grakovich Gramatke Gramberg Gramenitsky Grametsky Graminovsky Grammatikov Grammatin Granat Granatkin Granberg Grandberg Granik Granikov Granin Granitov Grankin Grankov Granov Granovsky Gransky Grant Grib Gribachev Gribakin Gribalev Gribanov Gribanovsky Gribashev Gribenkin Gribin Gribkov Gribnov Griboedov Gribov Gribovsky Gributsky Gridchin Gridnev Grigolyuk Grigoraschuk Grigorchikov Grigorenko Grigorevsky 
Grigoriadi Grigoriev Grigoriev Grigorishin Grigorov Grigorovich Grischenko Grischuk Grizodubov Grobivker Grobovsky Grodensky Grodetsky Grodko Grodsky Grodzensky Groer Grohar Groholsky Grohotov Grohov Grohovsky Groisberg Groisman Groizman Grojantsev Grokhar Grokholsky Grokhotov Grokhov Grokhovsky Gromyhalin Gromyko Gronsky Gropyanov Grosfeld Groshev Groshikov Groshkov Groshopf Groshovkin Grositsky Groskov Grosov Gross Grosse Grossgeim Grosshopf Grossman Grosu Grosul Grot Grotus Groundon Groza Grozdov Grozhantsev Grozovsky Gruschak Grusha Grushelevsky Grushenko Grushetsky Grushevenko Grushevoi Grushevsky Grushi Grushihin Grushikhin Grushin Grushinsky Grushka Grushko Gudarenko Gudenko Gudenok Gudev Gudilin Gudim Gudima Gudimov Gudjabidze Gudkov Gudojnik Gudoshin Gudov Gudovich Gudovsky Gudtsov Gudvan Gudymenko Gudymo Gudz Gudzenko Guio Gujavin Gujo Gujov Gujva Gujvin Guk Gukasov Guketlev Gukov Gukovsky Gul Gulaev Gulai Gulak Gulamov Gulaya Gulbinsky Gulchenko Gulchinsky Guldenbalk Guldin Guldreih Guleichik Gulenko Gulenkov Gulentsov Gulevich Gulevsky Gulia Gulichev Gulidov Guliev Gulimov Gulin Gulishambarov Gulkevich Gulkin Gulko Gulshin Gultyaev Gulyaev Gulyak Gulyakov Gulyansky Gulyaschih Gulyashko Gulyga Guzairov Guzanov Guzatov Guzeev Guzei Guzenko Guzenkov Guzev Guzevatov Guzichenko Guzik Guzilov Guzner Guznischev Guzov Guzovatsker Guzovsky Guzun Gzovsky Habalov Habarin Habarov Habarovsky Habelashvili Habibulaev Habibulin Habibullaev Habibullin Habichev Habin Habirov Habitsov Habov Habriev Hachapuridze Hachatur'Yan Hachaturov Hachaturyan Hachirov Hadarin Hadartsev Hadjiev Hadjula Hadonov Haesh Hafizov Hagajeev Hagondokov Hagur Hagurov Hahaev Hahalev Hahanyan Hahulin Hahva Haibullin Haidakin Haidin Haidukov Haidurov Haikin Haikov Hailov Haimi Hain Hainadsky Hairetdinov Hairov Hairulin Hairullin Hairullov Hairutdinov Hairyuzov Hait Hait Haitov Haitsin Hajkasimov Hakamada Hakhaev Hakhalev Hakhanyan Hakhulin Hakhva Hakimov Hakmaza Haladjan Haladzhan Halaev Halansky 
Halaphaev Halapkhaev Halatnikov Halatov Halatyan Haldei Haldoyanidi Haleev Halenkov Halepsky Haletsky Halevin Halevinsky Halfin Halichevsky Halifman Halikov Halileev Halilov Halilulin Halimov Halin Halip Halipov Halitov Haliulin Haliullin Halkechev Halkin Halkiopov Hallyev Halo Haluev Haluga Halutin Halyapin Halyavin Halyavkin Halymbadja Halymbadzha Halyuta Hamadeev Hamadullin Hamaev Hamatnurov Hamatov Hamchiev Hamenkov Hamidulin Hamidullin Hamikoev Hamitov Hamitsev Hamitski Hamlov Hamraev Hamukov Hamzin Han Hanaev Hanafiev Hanahu Hanakhu Hananaev Hanbikov Hanchuk Handirov Handjaevsky Handjyan Handohin Handokhin Handorin Handrikov Handrilov Handruev Handurin Handzhaevsky Handzhyan Haneev Hanenko Hanenya Hanetsky Hanevich Hangurian Hanifatullin Hanikyan Hanin Hanjiev Hanjin Hanjin Hanjonkov Hankeev Hankoev Hannanov Hanok Hanov Hantimerov Hantsev Hantuev Hanukov Hanutin Hanykov Hanyutin Hanzhiev Hanzhin Hanzhin Hanzhonkov Hapachev Hapaev Hapchaev Hapitsky Hapkov Hapov Haprov Hapsirokov Haptahaev Haptakhaev Hapy Harabornikov Haradurov Haradze Haraev Harahinov Harakhinov Haraman Harash Haratyan Haraz Harchenko Harchenkov Harchev Harchevnikov Harchikov Hardaev Hardin Harebov Harev Harharov Harik Harin Harinov Harionovsky Harisov Harito Hariton Haritonenko Haritonov Haritoshkin Harkevich Harkharov Harkin Harkov Harkovchuk Harkovsky Harlachev Harlamov Harlampovich Harlanov Harlap Harlashenkov Harlashkin Harlinsky Harlov Harmansky Harms Harnikov Hartukov Harybin Haryuchi Haryukov Hasabov Hasaev Hasainov Hasanov Hasbulatov Haschenko Haschev Hashaba Hashachih Hasiev Hasis Haskin Haslavsky Hasminsky Hasnulin Hasyanov Hataevich Hatagov Hatin Hatit Hatkevich Hatkov Hatmullin Hatov Hatskevich Hatukaev Hatuntsev Haustov Haustovich Hautiev Havanov Havin Havinson Havkin Havkunov Havrichev Havronin Havroshin Havroshkin Hayaletdinov Hayaliev Hayutin Hazan Hazanov Hazanovich Hazbulatov Hazeev Haziahmetov Haziev Hazipov Hazov Hegai Heifets Helashvili Helimsky Helkvist Helvas Henkin 
Hentov Her Heraskov Herheulidzev Herovets Hersonsky Heruvimov Hesin Hetagurov Heveshi Hevrolin Hidirov Hidiyatullin Hihich Hij Hijny Hijnyak Hijnyakov Hil Hilchevsky Hilkov Hilyuk Himenko Himich Himichev Himonenko Hinchin Hinchuk Hinich Hirikilis Hisametdinov Hisamutdinov Hismatullin Hismatulov Histyaev Hitarov Hitrenko Hitrin Hitrinsky Hitro Hitrov Hitrovo Hitruk Hitrun Hityaev Hizh Hizhny Hizhnyak Hizhnyakov Hizriev Hlamov Hlebanov Hlebnikov Hlebodarov Hlebovich Hlestkov Hlestov Hlevniuk Hlgatyan Hlobystin Hlobystov Hlopetsky Hlopiev Hlopin Hlopkin Hlopkov Hloponin Hlopotin Hlopotnya Hlopov Hludeev Hludov Hlupin Hlusov Hlutkov Hlybov Hlynov Hlypovka Hlystov Hlystun Hlyupin Hlyzov Hodorovich Holboev Holeva Holin Holkin Holkin Holmansky Holminov Holmogorov Holmogortsev Holmov Holmsky Holod Holodilin Holodilov Holodkov Holodkovsky Holodny Holodnyh Holodnykh Holodov Holodovsky Holoevsky Holomeev Holomenko Holopov Holoshevsky Holoshin Holschevnikov Holschigin Holshevnikov Holstov Holuev Holyavin Holyuchenko Holzakov Homa Homaiko Homar Homatsky Homenko Homentovsky Homeriki Homich Homichenko Hominsky Homsky Homuha Homusko Homutnikov Homutov Homyakov Hon Honenev Honov Horalya Horanov Horev Horhordin Horkin Horkov Horobryh Horohorkin Horos Horoshavin Horoshavtsev Horoshev Horoshevsky Horoshih Horoshilov Horoshiltsev Horoshkevich Horoshko Horujenko Horujev Horujy Horuzhenko Horuzhev Horuzhy Horvat Hot Hoteev Hotetovsky Hotimsky Hramchenkov Hramov Hramtsov Huajev Huako Hubaev Hubiev Hublaryan Hubulava Hubutiya Hudabirdin Hudaiberdin Hudainatov Hudekov Hudiev Hudik Hudilainen Hudkov Hudoinatov Hudojnikov Hudokormov Hudoleev Hudolei Hudonogov Hudoshin Hudyaev Hudyak Hudyakov Hudyh Hudyshkin Hugaev Hujev Hujin Hulhachiev Humaryan Hunagov Hundanov Hunov Huramshin Huranov Huraskin Hurdey Hurinov Huroshvili Hurtov Hurtsilava Huzangai Huzin Huziyatov Ignatyuk Ilyahin Ilyakhin Ilyasov Ilyuhin Ilyukhin Ilyumjinov Ilyunin Ilyushin Ilyushkin Ilyutenko Imamaliev Imamutdinov Imanov 
Imatkulov Imbulgin Imedoev Imendaev Imenin Imeretinsky Imerlishvili Imnadze Imniaminov Imshenetsky Imukov Isaakidis Isachenko Isachenkov Isachenok Isadjanov Isaenko Isaev Isaevich Isagaliev Isaichenkov Isaichev Isaichikov Isaikin Isaiko Isaikov Isakov Isakovich Isakovsky Isanbet Isangulov Isanin Isasev Isayan Iskaev Iskakov Iskandarov Iskandaryan Iskander Iskenderov Iskin Iskortsev Iskos Iskoz Iskra Iskritsky Iskrov Iskujin Iskyul Islakaev Islambekov Islamov Islamshin Islanov Islavin Isleniev Islon Islyamov Ivanov Ivchenko Jaba Jabin Jabinsky Jabitsky Jablochkin Jablochkov Jablokov Jablonovsky Jablonowsky Jablonsky Jablontsev Jablontzev Jablovsky Jaboev Jabotinsky Jabrev Jabrov Jabsky Jaburov Jabykin Jachevsky Jachikov Jachmenev Jachmenkov Jachmentsev Jachnik Jadaev Jadan Jadanov Jadanovsky Jadenov Jadin Jadkevich Jadne Jadov Jadovsky Jadrennikov Jadrihinsky Jadrikhinsky Jadrov Jadryshnikov Jafaev Jafarov Jafrakov Jagafarov Jagalin Jaganov Jagello Jageman Jagfarov Jagich Jaglintsev Jagoda Jagodin Jagodinsky Jagodnikov Jagofarov Jagovenko Jagubov Jagubsky Jagudin Jagujinsky Jagunov Jagupa Jagupets Jagutkin Jagutyan Jaguzhinsky Jagya Jahaev Jahimovich Jahin Jahlakov Jahnenko Jahno Jahnyuk Jahontov Jahot Jaikbaev Jaimov Jaitsky Jaivoronok Jakhaev Jakhimovich Jakhin Jakhlakov Jakhnenko Jakhno Jakhnyuk Jakhontov Jakhot Jakimchik Jakimchuk Jakimenko Jakimets Jakimov Jakimovich Jakimovsky Jakimychev Jakir Jaklashkin Jakmon Jakon Jakov Jakov Jakovchenko Jakovchuk Jakovenko Jakovets Jakovichenko Jakovkin Jakovlenko Jakovlev Jakovuk Jakshibaev Jakshin Jakub Jakuba Jakubchik Jakubenko Jakubik Jakubonis Jakubov Jakubovich Jakubovsky Jakunchikov Jakunichev Jakunin Jakunkin Jakunov Jakupov Jakurin Jakuschenko Jakush Jakushev Jakushevich Jakushin Jakushkin Jakushkov Jakushov Jakutin Jakutkin Jalagin Jalamov Jalchevsky Jalilo Jalkovsky Jalnin Jalovenko Jalovets Jalovoi Jalunin Jalybin Jam Jamaletdinov Jamaltdinov Jambaev Jamburg Jamilov Jaminsky Jamlihanov Jamlikhanov Jamoida 
Jamoido Jamov Jampolsky Jamschikov Jamskov Jamsuev Jan Janaev Janaki Janalov Janaslov Janbarisov Jandarbiev Jandarov Jandiev Jandr Jandulsky Jandutkin Janek Janenko Jangarber Jangel Janibekov Janimov Janin Janishevsky Janishin Janitsky Janjul Jankelevich Jankevich Jankilevsky Jankilovich Jankin Jankis Janko Jankov Jankov Jankovsky Janochkin Janov Janover Janovich Janovitsky Janovka Janovsky Janowich Janpolsky Janshin Janshole Janzhul Japaskurt Japondych Japparov Jardetsky Jarihin Jarikhin Jarikov Jarinov Jarkih Jarkikh Jarkov Jarkovsky Jarmuhamedov Jarmukhamedov Jarnikov Jarnov Jarov Jarovtsev Jarsky Jaruev Jashkov Jatkov Jatsenko Jatsevich Jatskevich Jatskov Jatskovsky Jatsuba Jatsun Jatsunov Jatsyk Jatsyshin Jatzenko Jatzevich Jatzkevich Jatzkov Jatzkovsky Jatzuba Jatzun Jatzunov Jatzyk Jatzyshin Javoronkov Javoronok Javoronsky Javrid Jbankov Jbanov Jdakaev Jdan Jdankin Jdanko Jdankov Jdanov Jdanovich Jdanovsky Jebelev Jebit Jebo Jebrovsky Jebryakov Jechkov Jedrinsky Jegin Jeglov Jegulin Jegunov Jeimo Jejel Jejera Jekov Jekulin Jelaev Jeldakov Jelehovsky Jelekhovsky Jelezko Jeleznikov Jeleznov Jelezny Jeleznyak Jeleznyakov Jelezov Jelezovsky Jeleztsov Jeliba Jelnin Jelnov Jelobinsky Jelohovtsev Jelokhovtsev Jeltouhov Jeltoukhov Jeltov Jeltuhin Jeltukhin Jeltyannikov Jeludev Jeludkov Jelvakov Jelyabov Jelyabovsky Jelyabujsky Jemaitis Jemaldinov Jemchugov Jemchujnikov Jemchujny Jemlihanov Jemlikhanov Jemoitel Jemuhov Jemukhov Jendarov Jenin Jenovach Jeravin Jerbin Jerdev Jerebin Jerebko Jerebovich Jerebtsov Jerebyatiev Jerihin Jerikhin Jernakov Jernevsky Jernokleev Jernosek Jernov Jernovoy Jeromsky Jeronkin Jeryapin Jerzdev Jestkov Jestovsky Jeurov Jevahov Jevaikin Jevakhov Jevanov Jeverjeev Jevlakov Jevolojnov Jgutov Jiboedov Jidelev Jidenko Jidilev Jidilin Jidkih Jidkikh Jidkin Jidkov Jidomirov Jigachev Jigailo Jigailov Jigalev Jigalin Jigalkin Jigalov Jiganov Jigarev Jigily Jigin Jigmytov Jigulenkov Jigulin Jigulsky Jigultsov Jigun Jigunov Jiharev Jiharevitch 
Jija Jijchenko Jijemsky Jijikin Jijilev Jijin Jijnov Jikharev Jikharevitch Jikin Jikov Jilchikov Jilenko Jilenkov Jilin Jilinsky Jilis Jilkin Jilnikov Jilov Jiltsov Jilyaev Jilyakov Jilyardy Jilyuk Jimailov Jimerin Jimila Jimirov Jimulev Jinkin Jinov Jirdetsky Jirenkin Jirikov Jiril Jirinovsky Jiritsky Jirkevich Jirkov Jirmunsky Jirnikov Jirnov Jirnyakov Jiro Jirov Jiryakov Jitarev Jitenev Jitetsky Jitin Jitinev Jitinkin Jitkov Jitluhin Jitlukhin Jitnik Jitnikov Jitny Jitomirsky Jituhin Jitukhin Jivaev Jivago Jivilo Jivin Jivkovich Jivlyuk Jivoderov Jivokini Jivoluk Jivopistsev Jivotenko Jivotinsky Jivotovsky Jivov Jivulin Jizdik Jiznevsky Jiznyakov Jjenov Jloba Jluktov Jmaev Jmakin Jmakov Jmelkov Jminko Jmotov Jmudsky Jmulev Jmuro Jogin Jogov Johin Johov Jokhin Jokhov Jokin Jolkov Jolobov Jolovan Joltovsky Joludev Jongolovich Jorin Jorjev Jornyak Jorov Jovnerik Jovnir Jovtun Jovtyak Juchenko Juchkov Judaev Judahin Judakhin Judakov Judanov Judashkin Judasin Judelevich Judenich Judenkov Judin Judinsky Juditsky Judkin Judkov Judkovich Judochkin Judolovich Judovich Judushkin Juferev Juferov Jufit Jufryakov Jugai Jugin Jugov Juhanaev Juhimenko Juhimuk Juhma Juhman Juhnev Juhnin Juhno Juhotsky Juhov Juhtanov Juhtman Juhvidov Juikov Jujlev Jujnev Juk Jukalov Jukhanaev Jukhimenko Jukhimuk Jukhma Jukhman Jukhnev Jukhnin Jukhno Jukhotsky Jukhov Jukhtanov Jukhtman Jukhvidov Jukov Jukovets Jukovich Jukovin Jukovsky Julebin Julev Julidov Julyabin Jumenko Jun Junda Junin Junusov Juon Jupanenko Jupikov Jura Jurakovsky Juravel Juravkov Juravlenko Juravliov Juravov Juravsky Jurba Jurbenko Jurbin Jurihin Jurikhin Jurin Jurkin Jurko Jurkov Jurkovsky Jurman Juromsky Jurov Juruli Jushman Juzeev Juzefov Juzefovich Juzgin Juzhakov Juzhalin Juzhanov Juzhenko Juzhilin Juzin Juzva Juzvikov Juzvishin Juzvyuk Jvachkin Jvanetsky Jvirblis Jvykin Kabachev Kabachnik Kabaev Kabaidze Kabak Kabakchi Kabakov Kabalevsky Kabalin Kabalkin Kabaloev Kabanov Kabashkin Kabatsky Kaberman Kaberov Kabes 
Kabeshov Kabirov Kabisha Kabitsin Kabitsky Kabjihov Kablahov Kablits Kablov Kablukov Kabulahin Kabulov Kaburneev Kabuzan Kabysh Kabyshev Kabytov Kachaev Kachainik Kachalin Kachalkin Kachalov Kachalovsky Kachan Kachanov Kachanovsky Kacharov Kacharyants Kachemaev Kachenovsky Kachimov Kachin Kachinsky Kachioni Kachkaev Kachkov Kachnov Kachur Kachurin Kalabekov Kalaberda Kalabin Kalabuhov Kalabukhov Kalachev Kalachihin Kalachikhin Kalachinsky Kalachov Kalaev Kalaganov Kalaichev Kalaida Kalaidjan Kalaidovich Kalakin Kalakutsky Kalamanov Kalambetov Kalamkaryan Kalandarov Kalandinsky Kalashnik Kalashnikov Kalatin Kalatsky Kalautov Kaldin Kaldybaev Kaledin Kaleev Kalekin Kalenik Kalenov Kalentiev Kaleri Kaleshin Kalesnik Kaletin Kaletkin Kaletsky Kalganov Kalgashkin Kaliashvili Kaliberda Kalievsky Kalihanov Kalihman Kalihov Kalikhanov Kalikhman Kalikhov Kalikyan Kalimahi Kalimakhi Kalimulin Kalimullin Kalin Kalina Kalinchenko Kalinchuk Kalinich Kalinichenko Kalinichev Kalinin Kalinka Kalinkin Kalinko Kalinnikov Kalinochkin Kalinov Kalinovich Kalinovsky Kalinsky Kalintsev Kalinushkin Kalishevsky Kalishewsky Kalisov Kalistratov Kalita Kaliteevsky Kalitievsky Kalitin Kalitinkin Kalitinsky Kalitkin Kalitvin Kalitvintsev Kaliyants Kallash Kallik Kaloshin Kamenetzky Kartaev Kartajev Kartalov Kartamyshev Kartashev Kartashevsky Kartashkin Kartashov Kartavenko Kartavtsev Kartazhev Karteshkin Kartomyshev Kartoshkin Kartovenko Kartoziya Kartunov Kartushin Kartuzov Kartvelin Kartyshov Kats Katsan Katsarev Katsari Katsebin Katsenelenbaum Katsenellenbogen Katsepov Katsev Katsevman Katsibin Katsis Katsman Katsnelson Katsovsky Katsuba Katsukov Katsur Katz Katzan Katzarev Katzari Katzebin Katzenelenbaum Katzenellenbogen Katzepov Katzev Katzevman Katzibin Katzis Katzman Katznelson Katzovsky Katzuba Katzukov Katzur Kaufman Ladyjensky Ladyjets Ladyjnikov Ladyzhensky Ladyzhets Ladyzhnikov Lajentsev Lajintsev Lapaev Lapakov Lapegin Lapenko Lapenkov Lapidus Lapikov Lapin Lapinsky Lapinus Lapir 
Lapisov Lapitsky Lapkin Lapochkin Lappo Laps Lazhentsev Lazhintsev Lebed Lebedenko Lebedev Lebedevich Lebedich Lebedinets Lebedinsky Lebedintsev Lebedkin Lebedyansky Lebereht Lebeshev Lebidko Lebin Lebinson Leboperov Lebsky Lebzak Lebzyak Leiba Leibe Leibenzon Leiberov Leibin Leibkin Leibov Leibovich Leibovsky Leichik Leiferkus Leihtenbergsky Leikam Leikin Leikisman Leiko Leiman Leimon Lein Leipunsky Leites Leitis Leitman Leiviman Leizarenko Leizerman Lejankin Lejankov Lejava Lejebokov Lejenko Lejepekov Lejikov Lejnev Lejnin Leonenko Lepehin Lepekhin Lepihin Lepihov Lepikhin Lepikhov Lermontov Lerner Levichev Levish Levit Levitan Levitansky Levite Levitin Levitis Levitov Levitsky Levitsky Levitt Levitzky Lewitckyj Lezdinysh Lezhankin Lezhankov Lezhava Lezhebokov Lezhenko Lezhepekov Lezhikov Lezhnev Lezhnin Lezjov Lezov Lgov Li Lianozov Liberman Liberzon Libkin Libman Libreht Libson Libusov Lichagin Lichenko Lichintser Lichkanovsky Lichko Lichkov Lichkun Lichkus Lichman Lichnov Liders Lidorenko Lidval Liepa Ligachev Ligin Ligorner Ligostaev Lih Lihachev Lihanov Liharev Lihobaba Lihobabin Lihodedov Lihodeev Lihodei Liholat Liholobov Lihomanov Lihonosov Lihosherstov Lihov Lihovidov Lihovskih Lihovtsev Lihtenshtedt Lihtenshtein Lihtentul Lihterman Lihtin Lihvantsev Likh Likhachev Likhanov Likharev Likhobaba Likhobabin Likhodedov Likhodeev Likhodei Likholat Likholobov Likhomanov Likhonosov Likhosherstov Likhov Likhovidov Likhovskikh Likhovtsev Likhtenshtedt Likhtenshtein Likhtentul Likhterman Likhtin Likhvantsev Likin Likov Likum Likunov Likutov Lileev Liliental Lilov Lilyin Lim Limanov Limansky Limar Limarev Limarov Limitovsky Limonov Limorenko Limoshin Lischenko Lischuk Lishansky Lishin Lishtovny Lishtva Litovchenko Livadin Livadny Livanov Livansky Liven Liventsev Liventsov Livenzon Liverovsky Livnev Livshin Livshitz Livson Lizander Lizogub Lizorkin Lizunov Loder Lodkin Lodochnikov Lody Lodyagin Lodygin Lodyjensky Loenko Loevsky Loh Lohanin Lohanov Lohin Lohmatikov 
Lohno Lohov Lohtin Lohvitsky Loi Loifman Loiko Loiter Loitzyansky Lojchenko Lojkin Lokh Lokhanin Lokhanov Lokhin Lokhmatikov Lokhno Lokhov Lokhtin Lokhvitsky Los Losenko Losev Losik Lositsky Loskov Loskutov Loson Lossky Losyukov Lotarev Lotkov Lotman Lotorev Lotosh Lotsmanov Lotter Loza Lozben Lozhchenko Lozhkin Lozin Lozinsky Lozivets Lozovoy Lozovsky Lubsky Lubushkin Lubutin Lubutov Luferov Luha Luhmanov Luhovitsky Luhvich Lukha Lukhmanov Lukhovitsky Lukhvich Lupachev Lupalenko Lupan Lupandin Lupanenko Lupanov Lupehin Lupei Lupekhin Lupenko Lupenkov Lupichev Lupin Lupov Luppa Luppian Luppol Luppov Luptsov Lurie Luriya Luskanov Luspekaev Luss Lustgarten Lut Lutchenko Lutchenkov Lutfullin Lutkov Lutkovsky Lutohin Lutoshkin Lutoshnikov Lutov Lutovich Lutovinov Luts Lutsev Lychagin Lychakov Lychanaya Lychev Lygach Lygin Lyhin Lyjenkov Lyjin Lykasov Lykhin Lykin Lykoshin Lykosov Lykov Lymar Lymarev Lyndin Lyndyaev Lyrschikov Lysak Lysakov Lysansky Lysenko Lysenkov Lysenny Lysev Lysihin Lysikhin Lysikov Lyskin Lysko Lyskov Lysov Lystsov Lysy Lysyakov Lysyansky Lysyh Lysykh Lysyuk Lytkin Lyvin Lyzhenkov Lyzhin Lyzin Lyzlov Mahachev Mahaev Mahagonov Mahalin Mahalov Mahankov Mahanov Maharov Mahin Mahinov Mahinya Mahlai Mahlinsky Mahlov Mahmudov Mahmutov Mahnenko Mahnev Mahno Mahonin Mahonov Mahorin Mahortov Mahotin Mahotkin Mahov Mahovikov Mahro Mahrov Mahrovsky Mahtiev Mahurov Mahutov Makaseev Makferson Makhachev Makhaev Makhagonov Makhalin Makhalov Makhankov Makhanov Makharov Makhin Makhinov Makhinya Makhlai Makhlinsky Makhlov Makhmudov Makhmutov Makhnenko Makhnev Makhno Makhonin Makhonov Makhorin Makhortov Makhotin Makhotkin Makhov Makhovikov Makhro Makhrov Makhrovsky Makhtiev Makhurov Makhutov Makshakov Maksheev Maksimchenko Maksimchik Maksimchikov Maksimchuk Maksimov Maksimovich Maksimovsky Maksimtsev Maksimychev Maksimyuk Maksin Maksinev Maksumov Maksutov Maksyuta Maksyutenko Maksyutov Makuha Makuhin Makukha Makukhin Makul Makulov Makunin Makurov Makusev Makushev 
Makushkin Makushok Makyshev Mar'In Marchanukov Marchenko Marchenkov Marchenov Marchuk Marchukov Marfelev Marfin Marfunin Marfusalov Marhanov Marhasin Marhinin Marholenko Marievsky Markhanov Markhasin Markhinin Markholenko Marks Markus Markushev Markushevich Markushin Marlovetsky Marmazov Maron Marov Marr Marshak Marshalko Marshansky Marshev Marsky Martemyanov Martens Martidi Martin Martinenas Martinkus Martinovsky Martinson Martirosov Martkovich Martos Martov Martoyas Martsenko Martsenkov Martsenyuk Martsevich Martsinkovsky Martyanchik Martyanov Martynenko Martynenkov Martynov Martynovsky Martynyuk Martyshevsky Martyshin Martyshkin Martyshko Martyshov Martysyuk Martyuk Martyushev Martyushin Martyushov Martzenko Martzenkov Martzenyuk Martzevich Martzinkovsky Maruk Marunin Maruschak Maruschenko Marusev Marushkin Marushko Marusin Marutenkov Marutsky Maryanov Maryanovsky Maryashev Maryasov Marychev Maryenko Maryltsev Maryshev Marysyuk Maryushkin Maryutin Matasoff Matasov Matsaev Matsak Matsakov Matsevich Matseyovsky Matsiev Matsievich Matsievsky Matsigura Matskevich Matsko Matskov Matskovsky Matsnev Matsotsky Matsuev Matsukevich Matsyuk Matzaev Matzak Matzakov Matzevich Matzeyovsky Matziev Matzievich Matzievsky Matzigura Matzkevich Matzko Matzkov Matzkovsky Matznev Matzotsky Matzuev Matzukevich Matzyuk Medved Meshkovsky Michkov Michudo Michurin Mih Mihailenko Mihailets Mihailichenko Mihailidi Mihailin Mihailitsyn Mihailov Mihailovich Mihailovsky Mihailushkin Mihailutsa Mihailyants Mihailychev Mihailyuk Mihalchenko Mihalchev Mihalchuk Mihaleiko Mihalenkov Mihalev Mihalevich Mihalevsky Mihalitsin Mihalkin Mihalkov Mihalkov Mihalkovsky Mihalsky Mihaltsev Mihaltsov Mihalushkin Mihalychev Mihasenko Miheenkov Miheev Miheikin Mihel Mihels Mihelson Mihelyus Mihersky Mihilev Mihin Mihlin Mihmel Mihnenko Mihnev Mihnevich Mihno Mihnov Mihoels Mikh Mikhailenko Mikhailets Mikhailichenko Mikhailidi Mikhailin Mikhailitsyn Mikhailjants Mikhailjuk Mikhailov Mikhailovich Mikhailovsky 
Mikhailushkin Mikhailutsa Mikhailyants Mikhailychev Mikhailyuk Mikhalchenko Mikhalchev Mikhalchuk Mikhaleiko Mikhalenkov Mikhalev Mikhalevich Mikhalevsky Mikhalitsin Mikhalkin Mikhalkov Mikhalkov Mikhalkovsky Mikhalsky Mikhaltsev Mikhaltsov Mikhalushkin Mikhalychev Mikhasenko Mikheenkov Mikheev Mikheikin Mikhel Mikhels Mikhelson Mikhelyus Mikhersky Mikhilev Mikhin Mikhlin Mikhmel Mikhnenko Mikhnev Mikhnevich Mikhno Mikhnov Mikhoels Minchenkov Minchev Mindadze Mindel Mindeli Mindiashvili Mindibekov Minding Mindlin Mindovsky Mindra Mineev Minenko Minenkov Minervin Minevich Mingaleev Mingalev Mingazetdinov Mingazov Mingrelsky Minh Miniahhmetov Minih Minin Minitsky Minjurenko Minkevich Minkin Minko Minkov Minkov Minkovich Minniahmetov Minniakhmetov Minnibaev Minnihanov Minnikhanov Minnikov Minnubaev Minnulin Minov Minovalov Minovitsky Minovitzky Minskoi Mints Mintskovsky Mintz Minuhin Minukhin Minushkin Minyaev Minyaichev Minyajetdinov Minyar Minyazhetdinov Minyukov Minyushev Miodushevsky Mischenko Mischenkov Mischihin Mischikhin Mischuk Mitskevich Mitsukov Mkervali Mkrtchan Mkrtchyants Mkrtumov Mkrtumyan Mlachnev Mladentsev Mlechin Mliss Mlodik Mlotkovsky Mlynnik Mnatsakanov Mnatsakanyan Mndjoyan Mndoyants Mniszech Mnogogreshny Mnuskin Mochalin Mochalov Mochalsky Mochalygin Mochanov Mochanovsky Mocharov Mochtarev Mochulov Mochulsky Mochutkovsky Model Modenov Moderah Modestov Modin Modyaev Modylevsky Modzalevsky Modzko Mogila Mogilensky Mogilev Mogilevich Mogilevsky Mogilevtsev Mogilner Mogilnichenko Mogilnikov Mogilnitsky Mogilny Mogilyuk Moguchev Moh Mohnachev Mohnatkin Mohnatsky Mohorov Mohosoev Mohov Moisinovich Mojaev Mojaikin Mojaiskov Mojaisky Mojar Mojarenko Mojarov Mojarovsky Mojartsev Mojeiko Mojin Mokeenkov Mokeev Mokerov Mokh Mokhnachev Mokhnatkin Mokhnatsky Mokhorov Mokhosoev Mokhov Mokievsky Mokin Moklyachenko Mokretsov Mokrinsky Mokritsky Mokronosov Mokrousov Mokrov Mokrushev Mokry Mokryak Mokshin Molcanovs Molchadsky Molchanov Molchansky Moldovanov 
Moldovyan Moletotov Molev Molevich Molin Mollaev Moller Mollo Molnovetsky Molochko Molochkov Molochnikov Molodchinin Molodensky Molodin Molodkin Molodojenov Molodtsov Molodyh Molodykh Molojavy Molokanov Molokov Molokovsky Molorodov Moloshnikov Molostov Molostvov Molotilov Molotkov Molotov Molov Moltenskoi Molvo Molyakov Molyavin Molyavinsky Mombelli Momdji Momdjyan Momotov Mordakov Mordashov Mordasov Mordberg Mordin Mordinov Mordkin Mordkovich Mordovin Mordovtsev Morduhovich Mordvin Mordvinoff Mordvinov Mordvintsev Mordyukov Morehin Moreinis Morekhin Morenets Morengeim Morenshildt Morev Morjin Morjitsky Morozov Morozovsky Morzhin Morzhitsky Moschenko Moshcovitsh Mosheev Moshenko Moshenkov Moshetov Moshin Moshkarkin Moshkarnev Moshkin Moshkov Moshkovich Moshkovsky Moshkunov Moshnikov Moshninov Moshnyaga Moshnyakov Moshonkin Movchan Movchun Movsaev Movsarov Movsesov Movshovich Movsumadze Mozhaev Mozhaikin Mozhaiskov Mozhaisky Mozhar Mozharenko Mozharov Mozharovsky Mozhartsev Mozheiko Mozhin Mravin Mravinsky Mrelashvili Mrevlishvili Mryhin Mstislavets Mstislavsky Mubarakshin Mubaryakov Mudrak Mudrik Mudrov Muijel Muizhel Mujdabaev Mujikov Mujitskih Mujitskikh Mujjavlev Mujkaterov Mukanov Mukaseev Mukasey Mukin Mukke Muklevich Mukomel Mukov Mukovozov Muksinov Muksunov Mukubenov Mukusev Muladjanov Mulatov Muldashev Mulenkov Mulerman Mulin Mulinov Mullayanov Muller Multah Multakh Multyh Multykh Mulyarchik Mulyavin Mulyukov Mumdjian Mumladze Mun Munaev Munasipov Munchaev Munehin Munekhin Munin Munsky Munster Munte Muntyan Munyabin Murchenko Murogov Muromtsev Muromtsov Murov Mursalimov Murtazaliev Murtazin Muru Murychev Murygin Murylev Musabaev Musaev Musahanov Musahanyants Musakov Musalatov Musalimov Musalnikov Musatov Musavirov Muzafarov Muzalevskih Muzalevskikh Muzalevsky Muzarev Muzenitov Muzgin Muzhdabaev Muzhikov Muzhitskih Muzhitskikh Muzhkaterov Muzhzhavlev Muzipov Muzrukov Muzychenko Muzychka Muzyka Muzykantov Muzykantsky Muzykin Muzylev Muzyrya Muzyukin Myachkov 
Myatishkin Myatlev Nahabtsev Nahamkin Nahamkis Nahapetov Nahimov Nahmanovich Nahodkin Nahushev Nahutin Nakhabtsev Nakhamkin Nakhamkis Nakhapetov Nakhimov Nakhmanovich Nakhodkin Nakhushev Nakhutin Nasakin Nasedkin Nasetkin Nasibullaev Nasibullin Nasikan Nasikovsky Naslednikov Nasledov Nasonov Nasretdinov Nasrullaev Nasrutdinov Nastavin Nastogunin Nastoyaschy Nasybullin Nasyrov Natalenko Natalushko Natapov Natareev Natashkin Natho Natochin Nazarkin Nazarko Nejdanov Nejentsev Nejinsky Nejlukto Nelyubin Nelyubov Nesgovorov Nesis Neskorodev Neskoromny Neskrebin Neskuchaev Neslyuzov Nesmachko Nesmachnov Nesmelov Nesmeyanov Nesselrode Nessen Nessler Nesvetaev Nevedomsky Nevejin Nevelskoi Nevelsky Neverkovets Neverkovich Neverov Neverovsky Neveselov Nevezhin Nevitsky Nezabytovsky Nezamai Nezametdinov Nezamutdinov Nezavitin Nezhdanov Nezhentsev Nezhinsky Nezhlukto Nezlin Neznamov Neznanov Nezvigin Nijegorodov Nijegorodtsev Nijevyasov Nijinsky Nizhegorodov Nizhegorodtsev Nizhevyasov Nizhinsky Nosach Noschenko Nosenko Nosihin Nosik Nosikov Noskin Nosko Noskov Noskovsky Nouvel Novohatsky Novokhatsky Novosadov Novosadsky Novoselitsky Novoselov Novoselski Novoselsky Novoseltsev Novoshinsky Novosilsky Novosiltsev Novosiltsov Novotortsev Nudatov Nugaev Nugaibekov Nugumanov Nuikin Nujdin Numerov Nunuev Nuraliev Nurdinov Nureev Nurgaleev Nurgaliev Nurgalin Nurhamitov Nuridjanov Nuriev Nurislamov Nurjanov Nurkaev Nurmuhamedov Nurmuhametov Nurok Nurov Nursubin Nurtdinov Nuruchev Nurullin Nuryaev Nuryshev Nurzat Nusberg Nusinov Nusuev Nutrihin Nyago Nyamin Nyashin Nymmik Nyrko Nyrkov Nyrtsev Nyuhalov Nyuhtilin Nyuren Nyurnberg Oboldin Obolensky Obolonsky Obolsky Obolyaninov Oborin Oborkin Oborotov Obuh Obuhov Obukovkin Obydennikov Obydennov Obyedkin Obyedkov Obysov Olenev Olenew Omarjanov Omarov Omashev Omegov Omelchenko Omelianovsky Omelichev Omelin Omelko Omelkov Omelyanenko Omelyansky Omelyuk Omischenko Omoloev Onchukov Ondrikov Onegin Onenko Onikov Onilov Onischenko Onischuk 
Onishko Onkov Onopko Onoprienko Onopriev Onoshkin Ontikov Onuchin Onufrienko Onufriev Onufrievich Onusaitis Onyky Oom Osipenko Otain Otchenashenko Otdelnov Otellin Otiev Otlesnov Otletov Otlivschikov Otmahov Otmakhov Otov Otradnov Otrohov Otrokhov Otroshenko Ots Otsing Otstavnoi Otstavnov Ott Ottyasov Otyaev Otyutsky Overchuk Padalka Padalkin Paderin Padkin Paduchev Padva Padylin Pagaev Pagiev Pahalchuk Paharev Paharkov Pahmutov Paholkov Pahomov Pahrin Pahtanov Pahtel Pahunov Paidoverov Paidyshev Paikin Paimuhin Paimukhin Paimullin Pain Paivin Pajinsky Pajitnov Pajukov Pakhalchuk Pakharev Pakharkov Pakhmutov Pakholkov Pakhomov Pakhrin Pakhtanov Pakhtel Pakhunov Paradiz Paradjanov Paradzhanov Paradzinsky Paragulgov Parahin Parakhin Parakin Paramonov Paramoshin Paramoshkin Paranin Paraschenko Paraskun Parasyuk Parenago Parensky Parensov Pashkov Pasternak Pastreiter Pats Patsalo Patsev Patsevich Patsiorkovsky Patskevich Patsyna Patz Patzalo Patzev Patzevich Patziorkovsky Patzkevich Patzyna Pavelko Pavelyev Pavin Pavkin Pavlenko Pavlenkov Pavlenok Pavlichenko Pavlin Pavlinov Pavlinsky Pavlischev Pavlishin Pavluhin Pavlukhin Pavlunin Pavlunovsky Pavlusenko Pavlusha Pavlushin Pavlychev Pavlyuchenko Pavlyuchkov Pavlyuchuk Pavlyuk Pavlyukov Pavlyukovsky Pavlyushkevich Pavsky Pawluk Pazdnikov Pazhinsky Pazhitnov Pazhukov Pazi Pazuhin Pazukhin Pazy Pazyun Pechagin Pechatkin Pechatnov Pechenejsky Pechenev Pechenezhsky Pechenin Pechenkin Pecheny Pecheritsa Pecherkin Pechernikov Pechersky Pechinin Pechinkin Pechkovsky Pechkurov Pechnikov Pechuev Pechurkin Peftiev Pehotin Pehterev Pehtin Pekhotin Pekhterev Pekhtin Peleev Pelevin Pelih Pelin Pellenen Peller Pelman Pelmenev Pelsh Pelshe Peltser Peltsman Pelyushenko Pen'Kovsky Pen Pendik Pendyuhov Pendyurin Penev Penkin Penkov Penkovsky Pensky Pentin Pentsak Penyaev Penzin Pepelyaev Peresada Persov Petlenko Petrov Petsyk Petsyuha Pieha Pietsuh Piffer Piradov Pirashkov Pirin Pirogov Pirojenko Pirojkov Pirozhenko Pirozhkov Pirsky 
Pirtskhalava Piruev Pirumov Pirushkin Piruzyan Pirzadyan Piskarenkov Piskarev Pisklov Piskoppel Piskorsky Piskotin Piskovoy Piskulov Piskun Piskunov Piskus Pismanik Pismenny Pismensky Pismichenko Pistolkors Pitaevsky Pitatelev Pitel Pitenin Piterskih Piterskikh Pitersky Pitkevich Pitomets Pituhin Pitukhin Piunov Plichko Pliev Pligin Plihin Plikhin Plimak Pliner Plis Plisetsky Pliska Pliskanovsky Plisov Pliss Plitman Plotnicky Plotnitsky Pochechikin Pochekin Pocheshev Pochevalov Pochinkov Pochinkovsky Pochinok Pochinsky Pochitalin Pochivalov Pochkaev Pochkailo Pochkin Pochkunov Pochtarev Pochtennyh Pochuev Pochupailov Podolinsky Podsevalov Podshibihin Podshibikhin Podshivalov Podsizertsev Podstavka Podsvirov Podsyadlo Pogosov Pogosyan Pohilchuk Pohilenko Pohilevich Pohilko Pohis Pohitonov Pohlebaev Pohlebkin Pohmel'Nyh Pohmelkin Pohodeev Pohodin Pohodun Pohojaev Poholkov Pohvisnev Pohvoschev Pokhilchuk Pokhilenko Pokhilevich Pokhilko Pokhis Pokhitonov Pokhlebaev Pokhlebkin Pokhmel'Nyh Pokhmelkin Pokhodeev Pokhodin Pokhodun Pokhojaev Pokholkov Pokhvisnev Pokhvoschev Polibin Poliev Polikanov Polikarpov Polikashkin Polilov Polivanov Polivka Polivkin Polivoda Ponafidin Ponagushin Ponarovsky Ponasov Ponedelkov Ponedelnik Ponidelko Ponikarov Ponikarovsky Poninsky Ponizov Ponizovsky Ponkratov Ponomarenko Ponomarev Ponomarkov Ponosov Pontekorvo Pontovich Pontryagin Pontyushenko Ponurovsky Ponyatkov Ponyatovsky Poogelman Por Porai-Koshits Poret Poretsky Poretzky Porfiriev Porfirov Porhun Porhunov Porkhun Porkhunov Porodnya Poroh Porohin Porohnya Porohov Porohovschikov Porokh Porokhin Porokhnya Porokhov Porokhovschikov Porosenkov Poroshin Poroskov Porosyuk Poroykov Porshenkov Porshnev Portnenko Portnikov Portnov Portnoy Portnyagin Portnyakov Portsevsky Portsienko Portugalsky Portyanik Portyankin Portyanko Portyansky Porublev Porus Porva Porval Poryadin Poryvaev Poryvay Poshehonov Poshekhonov Poshevnev Poshibalov Poshiklov Poshlyakov Poshumensky Poshutilin Postemsky Postnikov 
Potseiko Potseluev Potsepkin Potsyapun Poyarkov Poyasnik Pribylov Pribylovsky Pribylsky Pribytkov Pridannikov Pridchenko Pridvorov Pridybailo Priemyhov Priemykhov Priezjaev Prigara Prigarin Prigoda Prigojin Prigojy Prigorodov Prigorovsky Prigov Prigozhin Prigozhy Priimkov Prik Prikazchikov Priklonsky Prikupets Privalihin Privalikhin Privalov Privorotsky Priymak Prokofiev Prokoshev Prokoshin Prokoshkin Prokudin Prokuronov Prokurorov Prolubnikov Promyslov Prygoda Puscharovsky Puschin Pushkov Pyankov Pyankovsky Pyanochenko Pyanov Pyavchenko Pyavko Pyhov Pyhteev Pyhtin Pyjev Pyjiev Pyjikov Pyjov Pyl Pylev Pylin Pylnev Pylyaev Pypin Pyrchenko Pyrchenkov Pyriev Pyrikov Pyrin Pyrkov Pyrlin Pyschev Pyshin Pyshkin Pyslar Pytalev Pytel Pytov Pytsky Rahalsky Rahamimov Rahil Rahimbaev Rahimov Rahletsky Rahletzky Rahlevsky Rahlin Rahmail Rahmanin Rahmaninov Rahmanov Rahmatulin Rahmatullin Rahmetov Rahmilovich Rahov Rahvalov Raich Raifeld Raifikesht Raih Raihelgauz Raihelson Raihert Raihlin Raihman Raikevich Raikh Raikhelgauz Raikhelson Raikhert Raikhlin Raikhman Raikin Raikov Raikovsky Raimanov Raimov Rainbagin Rainov Raisky Raiter Raitses Raitsin Raizer Raizman Rakhalsky Rakhamimov Rakhil Rakhimbaev Rakhimov Rakhletsky Rakhletzky Rakhlevsky Rakhlin Rakhmail Rakhmanin Rakhmaninov Rakhmanov Rakhmatulin Rakhmatullin Rakhmetov Rakhmilovich Rakhov Rakhvalov Rapota Razygrin Rebinder Rehbinder Rekemchuk Rekitar Rekke Reks Rekshinsky Rekun Rekunkov Rekunov Reles Remaev Rembeza Remchukov Remenny Rementsov Remeslo Remez Remezentsev Remezov Remih Remikh Remin Remishevsky Remizov Remmer Remmert Remnev Rempel Rempler Remyannikov Ren'Kas Renard Rendino Rengarten Renkas Renkevich Renne Rennenkampf Renov Renovants Renskov Rents Renzyaev Ribakov Ribopier Richardson Richman Richter Rifkind Riga Rigert Rigin Rihman Rihter Rikhman Rikhter Rishitnik Rivel Riverov Rivkin Rivkind Rivman Rjanitsin Rjanov Rjavin Rjavinsky Rjeshevsky Rjeshotarsky Rjeussky Rjevsky Robakidze Robkanov Robustov Rochegov 
Rochev Rogachev Rogachevsky Rogal Rogalev Rogalnikov Roganov Roganovich Rogashkov Rogatkin Rogatko Rogatsky Rohatsevich Rohin Rohlin Rohmanov Rohmistrov Rokhatsevich Rokhin Rokhlin Rokhmanov Rokhmistrov Rosenbloom Rotai Rotar Rotaru Rotast Rotenberg Rotermel Rotgang Rotin Rotmistrov Rotov Rotshild Rotshtein Rott Routiyainen Rovbel Rovensky Rovinsky Rovkov Rovkovsky Rovner Rovnev Rovnin Rovnyansky Rozenbloom Rozenblum Ruhimovich Ruhledev Ruhlin Ruhlov Ruhlyada Ruhlyadko Ruhtoev Rujenkov Rujentsov Rujilo Rujitsky Rujje Rujnikov Rukhimovich Rukhledev Rukhlin Rukhlov Rukhlyada Rukhlyadko Rukhtoev Ruslanov Rusov Russkih Russkikh Russkin Russov Rustamov Rustikov Rusu Rusyaev Rutberg Rutenburg Rutkevich Rutkovsky Rutman Ruts Rutshtein Ruzaev Ruzaikin Ruzakov Ruzankin Ruzanov Ruzavin Ruzhenkov Ruzhentsov Ruzhilo Ruzhitsky Ruzhnikov Ruzimatov Ruzin Ruzsky Ryjak Ryjakov Ryjankov Ryjanov Ryjenko Ryjenkov Ryjev Ryjih Ryjik Ryjikh Ryjikov Ryjkin Ryjko Ryjkov Ryjkovsky Ryjov Ryjy Ryzhak Ryzhakov Ryzhankov Ryzhanov Ryzhenko Ryzhenkov Ryzhev Ryzhey Ryzhih Ryzhik Ryzhikh Ryzhikov Ryzhkin Ryzhko Ryzhkov Ryzhkovsky Ryzhov Ryzhy Rzhanitsin Rzhanov Rzhavin Rzhavinsky Rzheshevsky Rzheshotarsky Rzheussky Rzhevsky Researcher Sai Saidbaev Saidulaev Saidullaev Saifitdinov Saifulaev Saifulin Saifullin Saifulov Saifutdinov Saigin Saigutin Saihanov Saikin Saiko Saikov Sailotov Saitanov Saitiev Saitov Sak Saker Sakiev Sakin Sakiyaev Sakov Sakovich Saks Saksagansky Sakson Saksonov Sakulin Sakun Sapojinsky Sapojnikov Sapon Savinov Scetintsev Scheblykin Schebrov Schepansky Schepatov Schepelev Schepetkov Schepin Schepiorko Schepitsky Schepkin Schepotkin Schepotyev Schepovskih Schetchikov Schetinin Schetinkin Schevaev Schevelev Schirovsky Schitov Schits Schitsyn Schkrebitko Schugorev Sen Senkov Senkovsky Senyagin Senyakovich Senyavin Senyukov Sepelev Sepp Serchuk Serechenko Sereda Seredavin Seredin Seredinin
Seredkin Serednitsky Serednyakov Seredohov Seredov Seredyuk Serejin Serejkin Serejnikov Serjantov Sert Shadhan Shadkhan Shadsky Shadura Shadursky Shadyev Shaer Shaev Shaevich Shah Shah-Nazaroff Shahanov Shahansky Shahbanov Shahbazov Shahbazyan Shahgildyan Shahin Shahkalamyan Shahlamov Shahlevich Shahlin Shahmaev Shahmagon Shahmametiev Shahmatov Shahmin Shahnarovich Shahnazarov Shahnazaryan Shahnazaryants Shahno Shahnovich Shahnovsky Shahorin Shahov Shahovskoi Shahovsky Shahpaev Shahrai Shahtin Shahtmeister Shahurin Shahurov Shahvorostov Shaiahmetov Shaidakov Shaidarov Shaidenko Shaidullin Shaidurov Shaiewich Shaihmurzin Shaihutdinov Shaikevich Shaikhmurzin Shaikhutdinov Shaikin Shaikov Shaimardanov Shaimiev Shain Shain Shainsky Shainurov Shaitan Shaitanov Shakh Shakhanov Shakhansky Shakhbanov Shakhbazov Shakhbazyan Shakhgildyan Shakhin Shakhkalamyan Shakhlamov Shakhlevich Shakhlin Shakhmaev Shakhmagon Shakhmametiev Shakhmatov Shakhmin Shakhnarovich Shakhnazarov Shakhnazaryan Shakhnazaryants Shakhno Shakhnovich Shakhnovsky Shakhorin Shakhov Shakhovskoi Shakhovsky Shakhpaev Shakhrai Shakhtin Shakhtmeister Shakhurin Shakhurov Shakhvorostov Shalabanov Shalaev Shalagaev Shalagin Shalaginov Shalahonov Shalai Shalamanov Shalamov Shalashilin Shalashov Shalavin Shaldenkov Shaldybin Shalenkov Shalganov Shalgin Shalikov Shalimo Shalimov Shalin Shalitkin Shalko Shalmanov Shalnev Shalnikov Shalnov Shalonin Shalov Shalunov Shalyapin Shalygin Shalyto Shalyugin Shamaev Shamahov Shamakhov Shamakin Shamanaev Shamanin Shamankov Shamanov Shamardin Shamarin Shamaro Shambazov Shamburkin Shamgulov Shamilyan Shamin Shamkov Shammazov Shamonin Shamota Shamov Shamraev Shamrai Shamro Shamrun Shamsetdinov Shamshev Shamshin Shamshurin Shamshurov Shamsiev Shamsudinov Shamsutdinov Shamurin Shamuzafarov Shan'Gin Shanaev Shananykin Shanaurin Shangareev Shangin Shazzo Shel Shelting Sheludchenko Sheludko Sheludshev Sheludyakov Sheluhin Sheluntsov Shelyag Shelyakin Shelyuh Shen Shenaev Shenagin 
Shendalev Shendel Shenderovich Shendrik Shenfeld Shenfeldt Shenfer Shengeliya Shening Shenk Shenkarev Shenker Shenkovets Shennikov Shenshin Shentel Shenterev Shenyavsky Sherman Shevtsov Shinkaruk Shiraev Shirdov Shirikov Shirin Shirinkin Shirinsky-Shikhmatov Shirinyan Shirinyants Shirko Shirkov Shirkovets Shirle Shirmankin Shirmanov Shirnin Shirvindt Tal Talagaev Talalaev Talalai Talalihin Talalikhin Talalykin Talambum Talankin Talanov Talapa Talashkevich Talbaev Talberg Taldykin Talian Taliev Talikov Talipov Talitskih Talitskikh Talitsky Talkov Talkovsky Tallat Taller Talmi Talmin Talmud Talov Taloverov Talovirko Talpin Talroze Taltangov Taltskov Talvik Talvir Talyantsev Talygin Talypin Talyzin Talzi Tamaev Tamanin Tamanyan Tamarchenko Taube Tchaadaev Tchaganov Tchagin Tchajegov Tchajengin Tchaldymov Tchaleev Tchalov Tchalovsky Tchaly Tchalyh Tchalykh Tchalyshev Tchamov Tchamushev Tchanchikov Tchangli Tchanov Tchanturia Tchanyshev Tchapko Tcharkin Tcharnetsky Tcharnolusky Tcharoshnikov Tchartorijsky Tchartorizhsky Tcharuhin Tcharukhin Tcharushin Tcharushkin Tcharykov Tchazov Tcheh Tchehanov Tcheharin Tchehladze Tchehlakovsky Tchehluev Tchehoev Tchehonin Tchehov Tchehovich Tchehovsky Tchekachev Tchekh Tchekhanov Tchekharin Tchekhladze Tchekhlakovsky Tchekhluev Tchekhoev Tchekhonin Tchekhov Tchekhovich Tchekhovsky Tchekin Tchekis Tchekletsov Tcheklyanov Tchekmarev Tchekmasov Tchekmenev Tchekmezov Tchekoev Tchekomasov Tchekonov Tchekvin Tetekin Tetelmin Teterev Teterichev Teterin Teterkin Teteruk Tets Tettenborn Teumin Teunaev Tihankin Tihenko Tihin Tihmenev Tihobaev Tihobrazov Tihodeev Tihomirnov Tihomirov Tihonchuk Tihonenko Tihonin Tihonitsky Tihonkih Tihonov Tihonravov Tihotsky Tihov Tihvinsky Tihy Tikhankin Tikhenko Tikhin Tikhmenev Tikhobaev Tikhobrazov Tikhodeev Tikhomirnov Tikhomirov Tikhonchuk Tikhonenko Tikhonin Tikhonitsky Tikhonkih Tikhonov Tikhonravov Tikhotsky Tikhov Tikhvinsky Tikhy Timaev Timakin Timakov Timarevsky Timashev Timashov Timashuk Time 
Timerbaev Timerbulatov Timerhanov Timin Timirev Timirgazeev Timiryazev Timiskov Timkachev Timkaev Timkin Timkov Timlin Timonin Timonkin Timonnikov Timonov Tkachuk Tobias Tobiash Tobolev Tobolin Tobolkin Tobulinsky Todorov Todorovsky Todorsky Todriya Togoev Togulev Togunov Toguzov Toichkin Toidze Toien Toka Tokaev Tokar Tokarchuk Tokarenko Tokarev Tokarovsky Tokarsky Tokin Tokmachev Tokmagambetov Tokmakov Tokombaev Tokovoi Toktahunov Toktakhunov Tokunov Tolboev Tolbuhin Tolvinsky Tomaev Toman Tomanov Tomas Tomashenko Tomashev Tomashevsky Tomashov Tomashpolsky Tomashuk Tomeev Tomjevsky Tovarovsky Tovbich Tovbin Tovkan Tovma Tovstoles Tovstolit Tovstolujsky Tovstonogov Tovstuha Tovstukha Tovstyh Tovstykh Tovstyko Tovuu Troeglazov Troekurov Troepolsky Troilin Troinin Troinitsky Troitsky Tromonin Tron Tronin Tronko Tronye Tropin Tropinin Tropinov Tropinsky Tropko Tropp Truchanow Truhachev Truhanov Truhanovsky Truhin Truhnin Trukhachev Trukhanov Trukhanovsky Trukhin Trukhnin Tsagadaev Tsagareli Tsagolov Tsagunov Tsah Tsahilov Tsai Tsaizer Tsakh Tsakhilov Tsakul Tsakunov Tsalaban Tsaliev Tsalikov Tsalko Tsallagov Tsalyhin Tsalykhin Tsanava Tsander Tsann-Kay-Si Tsapaev Tsapelik Tsapenko Tsapin Tsapko Tsaplin Tsaplinsky Tsapov Tsarakov Tsaran Tsaregorodtsev Tsaregradsky Tsarek Tsarenko Tsarenkov Tsarev Tsarevsky Tsarik Tsarikaev Tsarkov Tsarsky Tsayukov Tseboev Tsebrikov Tsederbaum Tsegoev Tsehanovich Tsehansky Tsehmistrenko Tsei Tseidler Tseiger Tseimen Tseiner Tseitlin Tseizik Tsekhanovich Tsekhansky Tsekhmistrenko Tsel'Ko Tselibeev Tselikov Tselikovsky Tselischev Tselobenok Tselovalnikov Tselovalnov Tseluiko Tsenin Tsenkovsky Tsevlonsky Tsevlovsky Tsidilin Tsidilkovsky Tsigal Tsigelnik Tsigleev Tsigler Tsigra Tsiolkovsky Tsipushtanov Tsiulev Tsval Tsvei Tsveiba Tsvelev Tsvelihovsky Tsvelikhovsky Tsvelyuh Tsvelyukh Tsverkun Tsvetaev Tsvetkov Tsvetnov Tsvetov Tsvibak Tsvigun Tsvilgnev Tsvirko Tsvylev Tsyavlovsky Tsyrba Tsyrulik Tsys Tsytovich Tsyurko Tsyurupa Tubelsky 
Tubinov Tubli Tubolkin Tuboltsev Tubylov Tudorovsky Tueshev Tuev Tugaev Tugai Tuganaev Tuganbaev Tuganov Tugarinov Tugarov Tugolukov Tugov Tugujekov Tugushev Tuguz Tuikin Tuikov Tuinov Tujikov Tujilin Tujilkin Tukabaev Tukmanov Tuktarov Tukumtsev Tukvachinsky Tulaev Tulaikin Tulaikov Tulakov Tulchinsky Tulebaev Tuleev Tulikov Tulin Tulinov Tulintsev Tulkin Tulnikov Tulohonov Tulov Tultsev Tulub Tulubensky Tulumbasov Tulupov Tulya Tulyakov Tumaev Tumanov Tumanovsky Tumansky Tumanyan Tumarkin Tumashev Tumasiev Tumbakov Tumenov Tumilovich Tumin Tumko Tumolsky Tumov Tumunbayarov Tundykov Tuneev Tunev Tungusov Tuniev Tunik Tunkin Tunnikov Tupalo Tupihin Tupikhin Tupikin Tupikov Tupolev Tuporshin Tur Turaev Turanov Turarov Turashev Turatbekov Turbai Turbanov Turbin Turchak Turchaninov Turchenko Turchin Turetskov Turetsky Turgenev Turik Turintsev Turischev Turiyansky Turkestanov Turkevich Turkin Turko Turkov Turkul Turlak Turlapov Turlov Turmanov Turmilov Turmov Turno Turov Turoverov Turovsky Turovtsev Turpaev Turpyatko Tursky Tursunov Turta Turtsevich Turtygin Turubanov Turuhin Turukhin Turulo Turunov Turupanov Turushev Turusin Turusov Turutin Turyanov Turyansky Tuvin Tuzin Tuzov Tzagadaev Tzagareli Tzagolov Tzagunov Tzah Tzahilov Tzai Tzaizer Tzakh Tzakhilov Tzakunov Tzalaban Tzaliev Tzalikov Tzalko Tzallagov Tzalyhin Tzalykhin Tzander Tzann-Kay-Si Tzapaev Tzapelik Tzapenko Tzapin Tzapko Tzaplin Tzaplinsky Tzapov Tzarakov Tzaran Tzaregorodtsev Tzaregradsky Tzarek Tzarenko Tzarenkov Tzarev Tzarevsky Tzarik Tzarikaev Tzarkov Tzarsky Tzayukov Tzeboev Tzebrikov Tzederbaum Tzegoev Tzehanovich Tzehansky Tzehmistrenko Tzei Tzeidler Tzeiger Tzeimen Tzeiner Tzeitlin Tzeizik Tzekhanovich Tzekhansky Tzekhmistrenko Tzel'Ko Tzelibeev Tzelikov Tzelikovsky Tzelischev Tzelobenok Tzelovalnikov Tzelovalnov Tzeluiko Tzenin Tzenkovsky Tziolkovsky Tzipushtanov Uchaev Uchaikin Uchitel Uchuev Uchuvatkin Uemlyanin Uemov Ufimkin Ufimov Ufimtsev Uhabin Uhanov Uhin Uhobotin Uhov Uhovsky Uhtomsky 
Ujentsev Ujinov Ujva Ujvak Ujvy Ukhabin Ukhanov Ukhin Ukhobotin Ukhov Ukhovsky Ukhtomsky Useev Usenko Usievich Usik Usikov Usiskin Uss Ustenko Ustilovsky Ustimenko Ustimkin Ustimov Ustimovich Ustinchenko Ustinkin Ustinov Ustinovich Ustkachkintsev Ustryalov Ustvolsky Ustyantsev Ustynyuk Ustyugov Ustyujanin Ustyuzhanin Usynin Usyskin Utkin Utochkin Utoplov Utrobin Utropov Utugunov Utulov Utyaganov Utyashev Utyugov Uzakov Uzbekov Uzdenov Uzenya Uzhentsev Uzhinov Uzhva Uzhvak Uzhvy Uzky Uzlov Uzov Uzunov V'Unnikov V'Yugin V'Yuhin V'Yun V'Yunkov V'Yunov V'Yurkov Vaarandi Vabbe Vadbolski Vadbolsky Vadeev Vadin Vadkovski Vadkovsky Vadovski Vadovsky Vaganoff Vaganov Vagapoff Vagapov Vagarshyan Vagin Vaginoff Vaginov Vagizoff Vagizov Vagner Vagnoryus Vagnyuk Vagramenko Vaidanovich Vaidanovitch Vaigant Vaikin Vaikule Vaiman Vaimer Vainberg Vainberger Vaindrah Vaindrakh Vainer Vainonen Vainrub Vainshtein Vainshtok Vainson Vainunas Vaipan Vaisberg Vaiserman Vaiserman Vaisero Vaisfeld Vaisman Vaisner Vaistuh Vaitsehovsky Vaitsekhovsky Vajenin Vajnichy Vajorov Vajov Vakanya Vakar Vakichev Vakilov Vakitchev Vakker Vaks Vaksberg Vaksel Vakser Vaksman Vakulenchuk Vakulenko Vakulentchuk Vakulich Vakulin Vakulitch Vakulko Vakulov Vakulovski Vakulovsky Vakulski Vakulsky Val Val Valaev Valberh Valchikovski Valchikovsky Valchitski Valchitsky Valchuk Valdaev Valden Valdenberg Valdes Valdin Valdman Valednitsky Valeev Valendik Valentei Valentik Valentinov Valentinovich Valentinovitch Valentsev Valetov Valetto Valev Valevin Valevsky Valiahmetov Valiakhmetov Valiev Valihanov Valikhanov Valikov Valishin Valitov Valitsky Valiullin Valk Valkevich Valkevitch Valkin Valko Valkov Valkovoy Vallah Vallakh Vallander Valmasov Valmus Valnev Valov Valovoi Valshin Valtchikovski Valtchikovsky Valtchitski Valtchitsky Valtchuk Valter Valters Valts Valtuh Valuev Valy Valyaev Valyanov Valyavski Valyavsky Valyushkin Valyushkis Vampilov Van-Puteren Vanag Vanchagov Vanchugov Vanchurov Vandalkovsky Vandyshev 
Vanechkin Vanetchkin Vangengeim Vanichev Vaniev Vanifatiev Vanin Vanitchev Vanja Vanjula Vanke Vankov Vannikov Vannovsky Vansheidt Vanshenkin Vanshtein Vanslov Vansovich Vanstein Vantchagov Vantchugov Vantchurov Vanteev Vantenkov Vantorin Vanyashin Vanyat Vanykin Vanyukov Vanyushin Vanzha Vanzhula Varaev Varakin Varaksin Varakuta Vasianov Vasiliev Vasilievsky Vasin Vasindin Vaskin Vaskov Vaskovsky Vaskovtsev Vasserman Vassoevich Vasyaev Vasyagin Vasyakin Vasyankin Vasyanovich Vasyatkin Vasyuchkov Vasyuk Vasyukevich Vasyukov Vasyurin Vasyutin Vasyutinsky Vasyutsky Vasyutynsky Vavakin Vaver Vavich Vavilin Vavilov Vavkin Vavra Vavravsky Vavrovsky Vavulin Vazhenin Vazhnichy Vazhorov Vazhov Vazyaev Vazyulin Vedeneev Vedenin Vedenisov Vedenkin Vedenkov Vedenov Vedensky Vedenyapin Vederman Vedernikov Vedev Vedihov Vedikhov Vedinyapin Vedischev Vedrinsky Vedrov Vedyaev Vedyakin Vedyashkin Vedyaskin Veledeev Veletsky Velgus Velichansky Velichinsky Velichkin Velichko Velichkovsky Velidov Veligjanin Veligodsky Veligorsky Veligura Velihov Velikanov Velikhov Velikih Velikin Velikopolsky Velikorechanin Velikorechin Velikorodny Velikorussov Velikov Velikson Veliky Velio Vellansky Veller Velli Velmukin Velovsky Velsh Velsovsky Veltischev Veltistov Veltman Velts Velyaminov Velyashev Veprentsev Veprentsov Veprev Veprik Vepryushkin Verba Verbenko Verber Verbin Verbitsky Verpeto Verre Versan Verstakov Verstin Verstovsky Vertegel Vertelko Vertiev Vertinsky Vertiprahov Vertkin Vertkov Vertman Vertogradov Vertogradsky Vertyankin Verushkin Veselago Veselenko Veseliev Veselitsky Veselitsky Veselkin Veselkov Veselov Veselovsky Vesich Vesin Vesner Vesnik Vesnin Vesninov Vesnitsky Vesnovsky Vestfrid Vestman Vestov Vielgorsky Vihansky Viharev Vihert Vihirev Vihlyaev Vihnovich Vihorev Vihrev Vihrov Vijonsky Vikhansky Vikharev Vikhert Vikhirev Vikhlyaev Vikhnovich Vikhorev Vikhrev Vikhrov Vikuliev Vikulin Vikulov Vil Vilbreht Vilbushevich Vilchek Vilchepolsky Vilchinsky Vilchitsky Vilchur Vild 
Vildanov Vilde Vilenchik Vilensky Vilesov Vilgelminin Viliev Vilimaa Vilin Vilinbahov Vilke Vilken Vilkitsky Vilkov Vilkovsky Villamov Ville Villevalde Villiam Vilm Vilmont Vilonov Vilson Vilutis Vilyamovsky Vilyams Vilyunas Vinarov Vinaver Vinberg Vinchevsky Vinchi Vinchugov Vinding Vindman Viner Vingilevsky Vingovatov Vingranovsky Vinichenko Vinidiktov Vinitsky Vinius Vinkler Vinnichenko Vinnik Vinnikov Vinnitsky Vins Vinsgeim Vinter Vinterfeldt Vintergalter Vintikov Vintov Vinyarsky Vipper Virachev Viranovsky Virehovsky Virekhovsky Virenius Virgasov Virichev Viridarsky Virkovsky Viron Viroslavsky Virvitsiotti Viselov Visilkin Viskhanov Viskov Viskovatov Vislobokov Visloguzov Visly Visnap Visnapu Vispovatyh Vispovatykh Vistchinsky Vistitsky Vitmer Vitorgan Vitorsky Vitoshkin Vitoshnov Vitov Vitovoi Vitram Vitrik Vitruk Vitryansky Vizhonsky Vlasenko Vlasenkov Vlasevich Vlasievsky Volsky Vorogushin Voronichev Voronihin Voronikhin Voronin Vorotnikov Vorotnikov Vozdvijensky Vozgilevich Vozgov Vozianov Vozilov Vozlyubleny Vozmitel Voznesensky Voznitsin Voznov Voznyak Vozovik Vyacheslavov Vyahirev Vyakhirev Vyakkerev Vyalba Vyalbe Vyalko Vyalkov Vyalov Vyaltsev Vyaltsin Vyalushkin Vyalyh Vyalykh Vyatkin Vyatkovsky Vyazalov Vyazankin Vyazikov Vyazmikin Vyazmin Vyazmitinov Vyaznikov Vyaznikovtsev Vyazov Vyazovchenko Vyazovoy Vybornov Vyborny Vyborov Vydrin Vyglovsky Vygodin Vygodovsky Vygotsky Vygovsky Vygran Vyguzov Vyhodtsev Vyjletsov Vyjutovich Vykhodtsev Vylegjanin Vylko Vylkov Vylomov Vyltsan Vymenets Vypirailenko Vypolzov Vyrenkov Vyrodkov Vyrodov Vyrubov Vyrupaev Vyschepan Vyschipan Vyshegorodtsev Vyshemirsky Vysheslavtsev Vyshinsky Vyshkovsky Vyshkvarko Vyshnegradsky Vyskrebtsov Vyslouh Vysochin Vysokin Vysokinsky Vysokosov Vysokov Vysotskih Vysotsky Vystavkin Vyucheisky Vyuchnov Vyvodtsev Vyzhletsov Vyzhutovich Yablochkin Yablochkov Yablokov Yablonovsky Yablonowsky Yablonsky Yablontsev Yablontzev Yablovsky Yabrov Yaburov Yachevsky Yachikov Yachmenev Yachmenkov 
Yachmentsev Yachnik Yadne Yadov Yadrennikov Yadrihinsky Yadrikhinsky Yadrov Yadryshnikov Yafaev Yafarov Yafrakov Yagafarov Yaganov Yagello Yageman Yagfarov Yagich Yaglintsev Yagoda Yagodin Yagodinsky Yagodnikov Yagofarov Yagovenko Yagubov Yagubsky Yagudin Yagujinsky Yagunov Yagupa Yagupets Yagutkin Yagutyan Yaguzhinsky Yagya Yahaev Yahimovich Yahin Yahlakov Yahnenko Yahno Yahnyuk Yahontov Yahot Yaikbaev Yaimov Yaitsky Yakhaev Yakhimovich Yakhin Yakhlakov Yakhnenko Yakhno Yakhnyuk Yakhontov Yakhot Yakimchik Yakimchuk Yakimenko Yakimets Yakimov Yakimovich Yakimovsky Yakimychev Yakir Yaklashkin Yakob Yakobi Yakobson Yakon Yakov Yakovchenko Yakovchuk Yakovenko Yakovets Yakovichenko Yakovkin Yakovlenko Yakovlev Yakovuk Yakshibaev Yakshin Yakub Yakuba Yakubchik Yakubenko Yakubik Yakubonis Yakubov Yakubovich Yakubovsky Yakunchikov Yakunichev Yakunin Yakunkin Yakunov Yakupov Yakurin Yakuschenko Yakush Yakushev Yakushevich Yakushevich Yakushin Yakushkin Yakushkov Yakushov Yakutin Yakutkin Yalamov Yalchevsky Yalovenko Yalovets Yalovoi Yalunin Yam Yamaletdinov Yamaltdinov Yambaev Yamburg Yamilov Yaminsky Yamlihanov Yamlikhanov Yamov Yampolsky Yamschikov Yamskov Yan Yanaev Yanaki Yanalov Yanaslov Yanbarisov Yandarbiev Yandiev Yandulsky Yandutkin Yanek Yanenko Yangarber Yangel Yanibekov Yanin Yanishevsky Yanishin Yanitsky Yanjul Yankelevich Yankevich Yankilevsky Yankilovich Yankin Yankis Yanko Yankov Yankov Yankovsky Yanochkin Yanov Yanover Yanovich Yanovitsky Yanovka Yanovsky Yanowich Yanpolsky Yanshin Yanshole Yanson Yansons Yanushevsky Yanvarev Yanzhul Yanzinov Yapaskurt Yapondych Yapparov Yatsenko Yatsevich Yatskevich Yatskov Yatskovsky Yatsuba Yatsun Yatsunov Yatsyk Yatsyshin Yatzenko Yatzevich Yatzkevich Yatzkov Yatzkovsky Yatzuba Yatzun Yatzunov Yatzyk Yatzyshin Yepishev Yudaev Yudahin Yudakhin Yudakov Yudanov Yudashkin Yudasin Yudelevich Yudenich Yudenkov Yudin Yudinsky Yuditsky Yudkin Yudkov Yudkovich Yudochkin Yudolovich Yudovich Yudushkin Yufa Yuferev Yuferov Yufit 
Yufryakov Yugai Yugin Yugov Yuhanaev Yuhimenko Yuhimuk Yuhma Yuhman Yuhnev Yuhnin Yuhno Yuhotsky Yuhov Yuhtanov Yuhtman Yuhvidov Yujakov Yujalin Yujanov Yujenko Yujilin Yukalov Yukhanaev Yukhimenko Yukhimuk Yukhma Yukhman Yukhnev Yukhnin Yukhno Yukhotsky Yukhov Yukhtanov Yukhtman Yukhvidov Yuschak Yuschenko Yushenkov Yushin Yushkevich Yushkin Yushkov Yushmanov Yushnevsky Yuskevich Yuzeev Yuzefov Yuzefovich Yuzgin Yuzhakov Yuzhalin Yuzhanov Yuzhenko Yuzhilin Yuzin Yuzva Yuzvikov Yuzvishin Yuzvyuk Zabrodin Zabrovsky Zasedatelev Zasetsky Zaskanov Zasko Zaskokin Zaslavets Zaslavsky Zasluev Zasoba Zasodimsky Zasosov Zastavsky Zastrojny Zastrozhny Zasuha Zasuhin Zasukha Zasukhin Zasulich Zasursky Zasyad'Ko Zasyadko Zasypkin Zavatsky Zavodchikov Zavodnov Zavodov Zavodskoi Zavoisky Zavolokin Zavolokov Zavorin Zavorohin Zavorokhin Zavoruev Zavrajnov Zeifert Zelenenkov Zelenetsky Zelenev Zelenevsky Zelenin Zelenkevich Zelenkin Zelenko Zelenkov Zelenkov Zelenoi Zelenov Zelenovsky Zelensky Zelent Zelentsov Zeleny Zenbitsky Zenchenko Zenger Zenilov Zenin Zenischev Zenkevich Zenkin Zenkov Zenkovich Zenkovsky Zenzinov Zhaba Zhabin Zhabinsky Zhabitsky Zhaboev Zhabotinsky Zhabrev Zhabsky Zhabykin Zhadaev Zhadan Zhadanov Zhadanovsky Zhadenov Zhadin Zhadkevich Zhadovsky Zhagalin Zhaivoronok Zhakmon Zhakov Zhalagin Zhalilo Zhalkovsky Zhalnin Zhalybin Zhamoida Zhamoido Zhamsuev Zhandarov Zhandr Zhanimov Zhardetsky Zharihin Zharikhin Zharikov Zharinov Zharkih Zharkikh Zharkov Zharkovsky Zharmuhamedov Zharmukhamedov Zharnikov Zharnov Zharov Zharovtsev Zharsky Zharuev Zhashkov Zhatkov Zhavoronkov Zhavoronok Zhavoronsky Zhavrid Zhbankov Zhbanov Zhdakaev Zhdan Zhdankin Zhdanko Zhdankov Zhdanov Zhdanovich Zhdanovsky Zhebelev Zhebit Zhebo Zhebrovsky Zhebryakov Zhechkov Zhedrinsky Zhegin Zheglov Zhegulin Zhegunov Zheimo Zhekov Zhekulin Zhelaev Zheldakov Zhelehovsky Zhelekhovsky Zhelezko Zheleznikov Zheleznov Zhelezny Zheleznyak Zheleznyakov Zhelezov Zhelezovsky Zheleztsov Zheliba Zhelnin 
Zhelnov Zhelobinsky Zhelohovtsev Zhelokhovtsev Zheltouhov Zheltoukhov Zheltov Zheltuhin Zheltukhin Zheltyannikov Zheludev Zheludkov Zhelvakov Zhelyabov Zhelyabovsky Zhelyabuzhsky Zhemaitis Zhemaldinov Zhemchugov Zhemchujnikov Zhemchujny Zhemlihanov Zhemlikhanov Zhemoitel Zhemuhov Zhemukhov Zhendarov Zhenin Zhenovach Zheravin Zherbin Zherdev Zherebin Zherebko Zherebovich Zherebtsov Zherebyatiev Zherihin Zherikhin Zhernakov Zhernevsky Zhernokleev Zhernosek Zhernov Zhernovoy Zheromsky Zheronkin Zheryapin Zherzdev Zhestkov Zhestovsky Zheurov Zhevahov Zhevaikin Zhevakhov Zhevanov Zheverzheev Zhevlakov Zhevolozhnov Zhezhel Zhezhera Zhgutov Zhiboedov Zhidelev Zhidenko Zhidilev Zhidilin Zhidkih Zhidkikh Zhidkin Zhidkov Zhidomirov Zhigachev Zhigailo Zhigailov Zhigalev Zhigalin Zhigalkin Zhigalov Zhiganov Zhigarev Zhigily Zhigin Zhigmytov Zhigulenkov Zhigulin Zhigulsky Zhigultsov Zhigun Zhigunov Zhiharev Zhiharevitch Zhikharev Zhikharevitch Zhikin Zhikov Zhilchikov Zhilenko Zhilenkov Zhilin Zhilinsky Zhilis Zhilkin Zhilnikov Zhilov Zhiltsov Zhilyaev Zhilyakov Zhilyardy Zhilyuk Zhimailov Zhimerin Zhimila Zhimirov Zhimulev Zhinkin Zhinov Zhirdetsky Zhirenkin Zhirikov Zhiril Zhirinovsky Zhiritsky Zhirkevich Zhirkov Zhirmunsky Zhirnikov Zhirnov Zhirnyakov Zhiro Zhirov Zhiryakov Zhitarev Zhitenev Zhitetsky Zhitin Zhitinev Zhitinkin Zhitkov Zhitluhin Zhitlukhin Zhitnik Zhitnikov Zhitny Zhitomirsky Zhituhin Zhitukhin Zhivaev Zhivago Zhivilo Zhivin Zhivkovich Zhivlyuk Zhivoderov Zhivokini Zhivoluk Zhivopistsev Zhivotenko Zhivotinsky Zhivotovsky Zhivov Zhivulin Zhizdik Zhizha Zhizhchenko Zhizhemsky Zhizhikin Zhizhilev Zhizhin Zhizhnov Zhiznevsky Zhiznyakov Zhloba Zhluktov Zhmaev Zhmakin Zhmakov Zhmelkov Zhminko Zhmotov Zhmudsky Zhmulev Zhmuro Zhogin Zhogov Zhohin Zhohov Zhokhin Zhokhov Zhokin Zholkov Zholobov Zholovan Zholtovsky Zholudev Zhongolovich Zhorin Zhornyak Zhorov Zhorzhev Zhovnerik Zhovnir Zhovtun Zhovtyak Zhuchenko Zhuchkov Zhuikov Zhuk Zhukov Zhukovets Zhukovich Zhukovin 
Zhukovsky Zhulebin Zhulev Zhulidov Zhulyabin Zhumenko Zhun Zhunda Zhunin Zhunusov Zhupanenko Zhupikov Zhura Zhurakovsky Zhuravel Zhuravkov Zhuravlenko Zhuravlev Zhuravliov Zhuravov Zhuravsky Zhurba Zhurbenko Zhurbin Zhurihin Zhurikhin Zhurin Zhurkin Zhurko Zhurkov Zhurkovsky Zhurman Zhuromsky Zhurov Zhuruli Zhushman Zhuzhlev Zhuzhnev Zhvachkin Zhvanetsky Zhvirblis Zhvykin Zihanov Zimaev Zimakin Zimakov Zimarev Zimarin Zimatsky Zimenkov Zimin Zimitsky Zimnitsky Zimnuhov Zimny Zimonin Zimovets Zimyanin Zinatullin Zinchenko Zinchuk Zinder Zinevich Zingarevich Zinger Zingerman Zingman Zinich Zinin Zinkevich Zinkovsky Zinkovsky Zinnatov Zinnurov Zinov Zinoviev Zinovin Zinyuhin Zis Zitev Zitserman Ziyakov Ziyatdinov Ziyazov Zobanov Zobkob Zobnin Zobov Zogalev Zolin Zolkin Zoloev Zolotai Zolotar Zolotarev Zolotarevsky Zolotarsky Zolotavin Zolotdinov Zolotenkov Zolotilin Zolotkov Zolotnitsky Zolotnitzky Zozrov Zozulya Zukerman ================================================ FILE: data/names/Scottish.txt ================================================ Smith Brown Wilson Campbell Stewart Thomson Robertson Anderson Macdonald Scott Reid Murray Taylor Clark Ross Watson Morrison Paterson Young Mitchell Walker Fraser Miller Mcdonald Gray Henderson Hamilton Johnston Duncan Graham Ferguson Kerr Davidson Bell Cameron Kelly Martin Hunter Allan Mackenzie Grant Simpson Mackay Mclean Macleod Black Russell Marshall Wallace Gibson Kennedy Gordon Burns Sutherland Stevenson Munro Milne Watt Murphy Craig Wood Muir Wright Mckenzie Ritchie Johnstone Sinclair White Mcmillan Williamson Dickson Hughes Cunningham Mckay Bruce Millar Crawford Mcintosh Douglas Docherty King Jones Boyle Fleming Mcgregor Aitken Christie Shaw Maclean Jamieson Mcintyre Hay Lindsay Alexander Ramsay Mccallum Whyte Jackson Mclaughlin Hill ================================================ FILE: data/names/Spanish.txt ================================================ Abana Abano Abarca Abaroa Abascal Abasolo Abel Abelló 
Aberquero Abreu Acosta Agramunt Aiza Alamilla Albert Albuquerque Aldana Alfaro Alvarado Álvarez Alves Amador Andreu Antúnez Aqua Aquino Araújo Araullo Araya Arce Arechavaleta Arena Aritza Armando Arreola Arriola Asis Asturias Avana Azarola Banderas Barros Basurto Bautista Bello Belmonte Bengochea Benitez Bermúdez Blanco Blanxart Bolívar Bonaventura Bosque Bustillo Busto Bustos Cabello Cabrera Campo Campos Capello Cardona Caro Casales Castell Castellano Castillion Castillo Castro Chavarría Chavez Colón Costa Crespo Cruz Cuéllar Cuevas D'cruz D'cruze De la cruz De la fuente Del bosque De leon Delgado Del olmo De santigo Díaz Dominguez Duarte Durante Echevarría Echeverría Elizondo Escamilla Escárcega Escarrà Esparza Espina Espino Espinosa Espinoza Estévez Etxebarria Etxeberria Félix Fernández Ferrer Fierro Flores Fonseca Franco Fuentes Gallego Gallo García Garrastazu Garza Gaspar Gebara Gomez Gonzales Gonzalez Grec Guadarrama Guerra Guerrero Gutiérrez Gutierrez Hernandez Herrera Herrero Hierro Holguín Huerta Ibáñez Ibarra Iñíguez Iturburua Jaso Jasso Jimenez Jordà Juárez Lobo Lopez Losa Loyola Machado Macías Maradona María Marino Márquez Martell Martí Martínez Martinez Mas Mata Mateu Medina Melendez Méndez Mendoza Menendez Merlo Michel Mingo Moles Molina Montero Morales Moralez Moreno Narváez Nieves Noguerra Núñez Obando Ochoa Ojeda Ola Oleastro Olguin Oliver Olmos Oquendo Orellana Oriol Ortega Ortiz Palomo Paredes Pavia Peláez Peña Pérez Perez Petit Picasso Porra Porras Prieto Puerta Puga Puig Quinones Quintana Quirós Ramírez Ramos Rana Rendón Rey Reyes Rios Rivera Rivero Robledo Robles Rocha Rodríguez Rodriquez Roig Rojas Rojo Roldán Romà Romà Romero Rosa Rosales Rubio Ruiz Sala Salamanca Salazar Salcedo Salinas Sanchez Sandoval San nicolas Santana Santiago Santillian Santos Sastre Sepúlveda Sierra Silva Soler Solo Solos Soto Suárez Suero Tapia Terrazas Tomàs Torres Tos Tosell Toset Travieso Trujillo Ubina Urbina Ureña Valdez Valencia Varela Vargas Vásquez Vázquez 
Vega Vela Vela Velazquez Ventura Vicario Vilaró Villa Villalobos Villanueva Villaverde Viola Viteri Vivas Vives Ybarra Zabala Zambrano Zamorano Zapatero Zavala Zubizarreta Zuñiga ================================================ FILE: data/names/Vietnamese.txt ================================================ Nguyen Tron Le Pham Huynh Hoang Phan Vu Vo Dang Bui Do Ho Ngo Duong Ly An an Bach Banh Cao Chau Chu Chung Chu Diep Doan Dam Dao Dinh Doan Giang Ha Han Kieu Kim La Lac Lam Lieu Luc Luong Luu Ma Mach Mai Nghiem Phi Pho Phung Quach Quang Quyen Ta Thach Thai Sai Thi Than Thao Thuy Tieu To Ton Tong Trang Trieu Trinh Truong Van Vinh Vuong Vuu ================================================ FILE: glove-word-vectors/glove-word-vectors.ipynb ================================================ { "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "![](https://i.imgur.com/eBRPvWB.png)\n", "\n", "# Practical PyTorch: Exploring Word Vectors with GloVe\n", "\n", "When working with words, dealing with the huge but sparse domain of language can be challenging. Even for a small corpus, your neural network (or any type of model) needs to support many thousands of discrete inputs and outputs.\n", "\n", "Besides the raw number of words, the standard technique of representing words as one-hot vectors (e.g. \"the\" = `[0 0 0 1 0 0 0 0 ...]`) does not capture any information about relationships between words.\n", "\n", "Word vectors address this problem by representing words in a multi-dimensional vector space. This can bring the dimensionality of the problem from hundreds-of-thousands to just hundreds. Plus, the vector space is able to capture semantic relationships between words in terms of distance and vector arithmetic.\n", "\n", "![](https://i.imgur.com/y4hG1ak.png)\n", "\n", "There are a few techniques for creating word vectors. The word2vec algorithm predicts words in a context (e.g. what is the most likely word to appear in \"the cat ? 
the mouse\"), while GloVe vectors are based on global counts across the corpus — see [How is GloVe different from word2vec?](https://www.quora.com/How-is-GloVe-different-from-word2vec) on Quora for a more thorough explanation.\n", "\n", "In my opinion the best feature of GloVe is that multiple sets of pre-trained vectors are easily [available for download](https://nlp.stanford.edu/projects/glove/), so that's what we'll use here.\n", "\n", "## Recommended reading\n", "\n", "* https://blog.acolyer.org/2016/04/21/the-amazing-power-of-word-vectors/\n", "* https://blog.acolyer.org/2016/04/22/glove-global-vectors-for-word-representation/\n", "* https://levyomer.wordpress.com/2014/04/25/linguistic-regularities-in-sparse-and-explicit-word-representations/" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Installing torchtext\n", "\n", "The torchtext package is not currently on PyPI or Conda, but it's easy to install manually:\n", "\n", "```\n", "git clone https://github.com/pytorch/text pytorch-text\n", "cd pytorch-text\n", "python setup.py install\n", "```" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Loading word vectors\n", "\n", "Torchtext includes functions to download GloVe (and other) embeddings." ] }, { "cell_type": "code", "execution_count": 1, "metadata": {}, "outputs": [], "source": [ "import torch\n", "import torchtext.vocab as vocab" ] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Loaded 400000 words\n" ] } ], "source": [ "glove = vocab.GloVe(name='6B', dim=100)\n", "\n", "print('Loaded {} words'.format(len(glove.itos)))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The returned `GloVe` object includes attributes:\n", "- `stoi` _string-to-index_ returns a dictionary of words to indexes\n", "- `itos` _index-to-string_ returns an array of words by index\n", "- `vectors` returns the actual vectors. 
To get a word vector, look up the word's index with `stoi`, then use that index into `vectors`:" ] }, { "cell_type": "code", "execution_count": 3, "metadata": { "collapsed": true, "scrolled": true }, "outputs": [], "source": [ "def get_word(word):\n", " return glove.vectors[glove.stoi[word]]" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Finding closest vectors\n", "\n", "Going from word → vector is easy enough, but to go from vector → word takes more work. Here I'm (naively) calculating the distance for each word in the vocabulary, and sorting based on that distance:\n", "\n", "Anyone with a suggestion for optimizing this, please let me know!" ] }, { "cell_type": "code", "execution_count": 4, "metadata": { "collapsed": true }, "outputs": [], "source": [ "def closest(vec, n=10):\n", " \"\"\"\n", " Find the closest words for a given vector\n", " \"\"\"\n", " all_dists = [(w, torch.dist(vec, get_word(w))) for w in glove.itos]\n", " return sorted(all_dists, key=lambda t: t[1])[:n]" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "This will return a list of `(word, distance)` tuples. 
Here's a helper function to print that list:" ] }, { "cell_type": "code", "execution_count": 5, "metadata": { "collapsed": true }, "outputs": [], "source": [ "def print_tuples(tuples):\n", " # Use `t` rather than `tuple` to avoid shadowing the builtin\n", " for t in tuples:\n", " print('(%.4f) %s' % (t[1], t[0]))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now using a known word vector we can see which other vectors are closest:" ] }, { "cell_type": "code", "execution_count": 6, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "(0.0000) google\n", "(3.0772) yahoo\n", "(3.8836) microsoft\n", "(4.1048) web\n", "(4.1082) aol\n", "(4.1165) facebook\n", "(4.3917) ebay\n", "(4.4122) msn\n", "(4.4540) internet\n", "(4.4651) netscape\n" ] } ], "source": [ "print_tuples(closest(get_word('google')))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Word analogies with vector arithmetic\n", "\n", "The most interesting feature of a well-trained word vector space is that certain semantic relationships (beyond just closeness of words) can be captured with regular vector arithmetic. 
\n", "\n", "![](https://i.imgur.com/d0KuM5x.png)\n", "\n", "(image borrowed from [a slide from Omer Levy and Yoav Goldberg](https://levyomer.wordpress.com/2014/04/25/linguistic-regularities-in-sparse-and-explicit-word-representations/))" ] }, { "cell_type": "code", "execution_count": 6, "metadata": { "collapsed": false }, "outputs": [], "source": [ "# In the form w1 : w2 :: w3 : ?\n", "def analogy(w1, w2, w3, n=5, filter_given=True):\n", " print('\\n[%s : %s :: %s : ?]' % (w1, w2, w3))\n", " \n", " # w2 - w1 + w3 = w4\n", " closest_words = closest(get_word(w2) - get_word(w1) + get_word(w3))\n", " \n", " # Optionally filter out given words\n", " if filter_given:\n", " closest_words = [t for t in closest_words if t[0] not in [w1, w2, w3]]\n", " \n", " print_tuples(closest_words[:n])" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The classic example:" ] }, { "cell_type": "code", "execution_count": 7, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "\n", "[king : man :: queen : ?]\n", "(4.0811) woman\n", "(4.6916) girl\n", "(5.2703) she\n", "(5.2788) teenager\n", "(5.3084) boy\n" ] } ], "source": [ "analogy('king', 'man', 'queen')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now let's explore the word space and see what stereotypes we can uncover:" ] }, { "cell_type": "code", "execution_count": 8, "metadata": { "collapsed": false, "scrolled": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "\n", "[man : actor :: woman : ?]\n", "(2.8133) actress\n", "(5.0039) comedian\n", "(5.1399) actresses\n", "(5.2773) starred\n", "(5.3085) screenwriter\n", "\n", "[cat : kitten :: dog : ?]\n", "(3.8146) puppy\n", "(4.2944) rottweiler\n", "(4.5888) puppies\n", "(4.6086) pooch\n", "(4.6520) pug\n", "\n", "[dog : puppy :: cat : ?]\n", "(3.8146) kitten\n", "(4.0255) puppies\n", "(4.1575) kittens\n", "(4.1882) pterodactyl\n", "(4.1945) scaredy\n", "\n", "[russia : moscow :: 
france : ?]\n", "(3.2697) paris\n", "(4.6857) french\n", "(4.7085) lyon\n", "(4.9087) strasbourg\n", "(5.0362) marseille\n", "\n", "[obama : president :: trump : ?]\n", "(6.4302) executive\n", "(6.5149) founder\n", "(6.6997) ceo\n", "(6.7524) hilton\n", "(6.7729) walt\n", "\n", "[rich : mansion :: poor : ?]\n", "(5.8262) residence\n", "(5.9444) riverside\n", "(6.0283) hillside\n", "(6.0328) abandoned\n", "(6.0681) bungalow\n", "\n", "[elvis : rock :: eminem : ?]\n", "(5.6597) rap\n", "(6.2057) rappers\n", "(6.2161) rapper\n", "(6.2444) punk\n", "(6.2690) hop\n", "\n", "[paper : newspaper :: screen : ?]\n", "(4.7810) tv\n", "(5.1049) television\n", "(5.3818) cinema\n", "(5.5524) feature\n", "(5.5646) shows\n", "\n", "[monet : paint :: michelangelo : ?]\n", "(6.0782) plaster\n", "(6.3768) mold\n", "(6.3922) tile\n", "(6.5819) marble\n", "(6.6524) image\n", "\n", "[beer : barley :: wine : ?]\n", "(5.6021) grape\n", "(5.6760) beans\n", "(5.8174) grapes\n", "(5.9035) lentils\n", "(5.9454) figs\n", "\n", "[earth : moon :: sun : ?]\n", "(6.2294) lee\n", "(6.4125) kang\n", "(6.4644) tan\n", "(6.4757) yang\n", "(6.4853) lin\n", "\n", "[house : roof :: castle : ?]\n", "(6.2919) stonework\n", "(6.3779) masonry\n", "(6.4773) canopy\n", "(6.4954) fortress\n", "(6.5259) battlements\n", "\n", "[building : architect :: software : ?]\n", "(5.8369) programmer\n", "(6.8881) entrepreneur\n", "(6.9240) inventor\n", "(6.9730) developer\n", "(6.9949) innovator\n", "\n", "[boston : bruins :: phoenix : ?]\n", "(3.8546) suns\n", "(4.1968) mavericks\n", "(4.6126) coyotes\n", "(4.6894) mavs\n", "(4.6971) knicks\n", "\n", "[good : heaven :: bad : ?]\n", "(4.3959) hell\n", "(5.2864) ghosts\n", "(5.2898) hades\n", "(5.3414) madness\n", "(5.3520) purgatory\n", "\n", "[jordan : basketball :: woods : ?]\n", "(5.8607) golf\n", "(6.4110) golfers\n", "(6.4418) tournament\n", "(6.4592) tennis\n", "(6.6560) collegiate\n" ] } ], "source": [ "analogy('man', 'actor', 'woman')\n", "analogy('cat', 'kitten', 
'dog')\n", "analogy('dog', 'puppy', 'cat')\n", "analogy('russia', 'moscow', 'france')\n", "analogy('obama', 'president', 'trump')\n", "analogy('rich', 'mansion', 'poor')\n", "analogy('elvis', 'rock', 'eminem')\n", "analogy('paper', 'newspaper', 'screen')\n", "analogy('monet', 'paint', 'michelangelo')\n", "analogy('beer', 'barley', 'wine')\n", "analogy('earth', 'moon', 'sun') # Interesting failure mode\n", "analogy('house', 'roof', 'castle')\n", "analogy('building', 'architect', 'software')\n", "analogy('boston', 'bruins', 'phoenix')\n", "analogy('good', 'heaven', 'bad')\n", "analogy('jordan', 'basketball', 'woods')" ] } ], "metadata": { "anaconda-cloud": {}, "kernelspec": { "display_name": "Python [default]", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.5.2" } }, "nbformat": 4, "nbformat_minor": 2 } ================================================ FILE: reinforce-gridworld/helpers.py ================================================ def interpolate(i, v_from, v_to, over): return (v_from - v_to) * max(0, (1 - i / over)) + v_to class SlidingAverage: def __init__(self, name, steps=100): self.name = name self.steps = steps self.t = 0 self.ns = [] self.avgs = [] def add(self, n): self.ns.append(n) if len(self.ns) > self.steps: self.ns.pop(0) self.t += 1 if self.t % self.steps == 0: self.avgs.append(self.value) @property def value(self): if len(self.ns) == 0: return 0 return sum(self.ns) / len(self.ns) def __str__(self): return "%s=%.4f" % (self.name, self.value) def __gt__(self, value): return self.value > value def __lt__(self, value): return self.value < value ================================================ FILE: reinforce-gridworld/reinforce-gridworld.ipynb ================================================ { "cells": [ { "cell_type": "markdown", 
"metadata": {}, "source": [ "![](https://i.imgur.com/eBRPvWB.png)\n", "\n", "# Practical PyTorch: Playing GridWorld with Reinforcement Learning (Policy Gradients with REINFORCE)\n", "\n", "In this project we'll teach a neural network to navigate through a dangerous grid world.\n", "\n", "![](http://i.imgur.com/XNGB7sr.gif)\n", "\n", "Training uses [policy gradients](http://www.scholarpedia.org/article/Policy_gradient_methods) via the REINFORCE algorithm and a simplified Actor-Critic method. A single network calculates both a policy to choose the next action (the actor) and an estimated value of the current state (the critic). Rewards are propagated through the graph with PyTorch's [`reinforce` method](http://pytorch.org/docs/autograd.html?highlight=reinforce#torch.autograd.Variable.reinforce)." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Resources\n", "\n", "* [*The* Reinforcement learning book from Sutton & Barto](http://incompleteideas.net/sutton/book/the-book-2nd.html)\n", "* [The REINFORCE paper from Ronald J. Williams (1992)](http://www-anw.cs.umass.edu/~barto/courses/cs687/williams92simple.pdf)\n", "* [Scholarpedia article on policy gradient methods](http://www.scholarpedia.org/article/Policy_gradient_methods)\n", "* [A Lecture from David Silver (of UCL, DeepMind) on policy gradients](http://www0.cs.ucl.ac.uk/staff/d.silver/web/Teaching_files/pg.pdf)\n", "* [The REINFORCE PyTorch example this tutorial is based on](https://github.com/pytorch/examples/tree/master/reinforcement_learning)\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Requirements\n", "\n", "The main requirements are PyTorch (of course), plus numpy, matplotlib, and IPython for animating the states." 
] }, { "cell_type": "code", "execution_count": 1, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Populating the interactive namespace from numpy and matplotlib\n" ] }, { "name": "stderr", "output_type": "stream", "text": [ "/Users/sean/anaconda3/lib/python3.5/site-packages/IPython/core/magics/pylab.py:161: UserWarning: pylab import has clobbered these variables: ['random']\n", "`%matplotlib` prevents importing * from pylab and numpy\n", " \"\\n`%matplotlib` prevents importing * from pylab and numpy\"\n" ] } ], "source": [ "import numpy as np\n", "from itertools import count\n", "import random\n", "\n", "import torch\n", "import torch.nn as nn\n", "import torch.nn.functional as F\n", "import torch.optim as optim\n", "import torch.autograd as autograd\n", "from torch.autograd import Variable\n", "\n", "import matplotlib.mlab as mlab\n", "import matplotlib.pyplot as plt\n", "import matplotlib.animation\n", "from IPython.display import HTML\n", "%pylab inline\n", "\n", "from helpers import *" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## The Grid World, Agent and Environment\n", "\n", "First we'll build the training environment, which is a simple square grid world with various rewards and a goal. If you're just interested in the training code, skip down to [building the actor-critic network](#Actor-Critic-network)." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### The Grid\n", "\n", "The **Grid** class keeps track of the grid world: a 2d array of empty squares, plants, and the goal.\n", "\n", "![](https://i.imgur.com/kss8W95.png)\n", "\n", "Plants are randomly placed, with values from -1 to 0.5 (mostly poisonous); if the agent lands on one, that value is added to the agent's health. The agent's goal is to reach the goal square, placed in one of the corners. 
As the agent moves around it gradually loses health so it has to move with purpose.\n", "\n", "The agent can see a surrounding area `VISIBLE_RADIUS` squares out from its position, so the edges of the grid are padded by that much with negative values. If the agent \"falls off the edge\" it dies instantly." ] }, { "cell_type": "code", "execution_count": 2, "metadata": { "collapsed": true }, "outputs": [], "source": [ "MIN_PLANT_VALUE = -1\n", "MAX_PLANT_VALUE = 0.5\n", "GOAL_VALUE = 10\n", "EDGE_VALUE = -10\n", "VISIBLE_RADIUS = 1\n", "\n", "class Grid():\n", " def __init__(self, grid_size=8, n_plants=15):\n", " self.grid_size = grid_size\n", " self.n_plants = n_plants\n", " \n", " def reset(self):\n", " padded_size = self.grid_size + 2 * VISIBLE_RADIUS\n", " self.grid = np.zeros((padded_size, padded_size)) # Padding for edges\n", " \n", " # Edges\n", " self.grid[0:VISIBLE_RADIUS, :] = EDGE_VALUE\n", " self.grid[-1*VISIBLE_RADIUS:, :] = EDGE_VALUE\n", " self.grid[:, 0:VISIBLE_RADIUS] = EDGE_VALUE\n", " self.grid[:, -1*VISIBLE_RADIUS:] = EDGE_VALUE\n", " \n", " # Randomly placed plants\n", " for i in range(self.n_plants):\n", " plant_value = random.random() * (MAX_PLANT_VALUE - MIN_PLANT_VALUE) + MIN_PLANT_VALUE\n", " ry = random.randint(0, self.grid_size-1) + VISIBLE_RADIUS\n", " rx = random.randint(0, self.grid_size-1) + VISIBLE_RADIUS\n", " self.grid[ry, rx] = plant_value\n", " \n", " # Goal in one of the corners\n", " S = VISIBLE_RADIUS\n", " E = self.grid_size + VISIBLE_RADIUS - 1\n", " gps = [(E, E), (S, E), (E, S), (S, S)]\n", " gp = gps[random.randint(0, len(gps)-1)]\n", " self.grid[gp] = GOAL_VALUE\n", " \n", " def visible(self, pos):\n", " y, x = pos\n", " return self.grid[y-VISIBLE_RADIUS:y+VISIBLE_RADIUS+1, x-VISIBLE_RADIUS:x+VISIBLE_RADIUS+1]" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### The Agent\n", "\n", "The **Agent** has a current position and a health. 
All this class does is update the position based on an action (up, right, down, or left) and add the small negative `STEP_VALUE` to its health at every time step, so that it eventually starves if it doesn't reach the goal.\n", "\n", "The world-based effects on the agent's health are handled by the Environment below." ] }, { "cell_type": "code", "execution_count": 3, "metadata": { "collapsed": true }, "outputs": [], "source": [ "START_HEALTH = 1\n", "STEP_VALUE = -0.02\n", "\n", "class Agent:\n", " def reset(self):\n", " self.health = START_HEALTH\n", "\n", " def act(self, action):\n", " # Move according to action: 0=UP, 1=RIGHT, 2=DOWN, 3=LEFT\n", " y, x = self.pos\n", " if action == 0: y -= 1\n", " elif action == 1: x += 1\n", " elif action == 2: y += 1\n", " elif action == 3: x -= 1\n", " self.pos = (y, x)\n", " self.health += STEP_VALUE # Gradually getting hungrier" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### The Environment\n", "\n", "The **Environment** encapsulates the Grid and Agent, and handles the bulk of the logic of assigning rewards when the agent acts. If the agent lands on a plant, goal, or edge, its health is updated accordingly. Plants are removed from the grid (set to 0) when \"eaten\" by the agent. Every time step there is also a slight negative health penalty so that the agent must keep finding plants or reach the goal to survive.\n", "\n", "The Environment's main function is `step(action)` → `(state, reward, done)`, which updates the world state with a chosen action and returns the resulting state, and also returns a reward and whether the episode is done. 
The state it returns is what the agent will use to make its action predictions, which in this case is the visible grid area (flattened into one dimension), the current agent health (to give it some \"self-awareness\"), and the agent's normalized position.\n", "\n", "The episode is considered done if won or lost - won if the agent reaches the goal square (where `value == GOAL_VALUE`) and lost if the agent dies from falling off the edge, eating too many poisonous plants, or getting too hungry (`agent.health <= 0`).\n", "\n", "In this experiment the environment only returns a single reward at the end of the episode (to make it more challenging). Values from plants and the step penalty are implicit - they might cause the agent to live longer or die sooner, but they aren't included in the final reward.\n", "\n", "The Environment also keeps track of the grid and agent states for each step of an episode, for visualization." ] }, { "cell_type": "code", "execution_count": 4, "metadata": { "collapsed": true }, "outputs": [], "source": [ "class Environment:\n", " def __init__(self):\n", " self.grid = Grid()\n", " self.agent = Agent()\n", "\n", " def reset(self):\n", " \"\"\"Start a new episode by resetting grid and agent\"\"\"\n", " self.grid.reset()\n", " self.agent.reset()\n", " c = math.floor(self.grid.grid_size / 2)\n", " self.agent.pos = (c, c)\n", " \n", " self.t = 0\n", " self.history = []\n", " self.record_step()\n", " \n", " return self.visible_state\n", " \n", " def record_step(self):\n", " \"\"\"Add the current state to history for display later\"\"\"\n", " grid = np.array(self.grid.grid)\n", " grid[self.agent.pos] = self.agent.health * 0.5 # Agent marker faded by health\n", " visible = np.array(self.grid.visible(self.agent.pos))\n", " self.history.append((grid, visible, self.agent.health))\n", " \n", " @property\n", " def visible_state(self):\n", " \"\"\"Return the visible area surrounding the agent, and current agent health\"\"\"\n", " visible = self.grid.visible(self.agent.pos)\n", " y, x = 
self.agent.pos\n", " yp = (y - VISIBLE_RADIUS) / self.grid.grid_size\n", " xp = (x - VISIBLE_RADIUS) / self.grid.grid_size\n", " extras = [self.agent.health, yp, xp]\n", " return np.concatenate((visible.flatten(), extras), 0)\n", " \n", " def step(self, action):\n", " \"\"\"Update state (grid and agent) based on an action\"\"\"\n", " self.agent.act(action)\n", " \n", " # Get reward from where agent landed, add to agent health\n", " value = self.grid.grid[self.agent.pos]\n", " self.grid.grid[self.agent.pos] = 0\n", " self.agent.health += value\n", " \n", " # Check if agent won (reached the goal) or lost (health reached 0)\n", " won = value == GOAL_VALUE\n", " lost = self.agent.health <= 0\n", " done = won or lost\n", " \n", " # Rewards at end of episode\n", " if won:\n", " reward = 1\n", " elif lost:\n", " reward = -1\n", " else:\n", " reward = 0 # Reward will only come at the end\n", "\n", " # Save in history\n", " self.record_step()\n", " \n", " return self.visible_state, reward, done" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Visualizing History\n", "\n", "To visualize an episode the `animate(history)` function uses Matplotlib to plot the grid state and agent health over time, and turns the resulting frames into an HTML5 video." 
] }, { "cell_type": "code", "execution_count": 5, "metadata": { "collapsed": false }, "outputs": [], "source": [ "def animate(history):\n", " frames = len(history)\n", " print(\"Rendering %d frames...\" % frames)\n", " fig = plt.figure(figsize=(6, 2))\n", " fig_grid = fig.add_subplot(121)\n", " fig_health = fig.add_subplot(243)\n", " fig_visible = fig.add_subplot(244)\n", " fig_health.set_autoscale_on(False)\n", " health_plot = np.zeros((frames, 1))\n", "\n", " def render_frame(i):\n", " grid, visible, health = history[i]\n", " # Render grid\n", " fig_grid.matshow(grid, vmin=-1, vmax=1, cmap='jet')\n", " fig_visible.matshow(visible, vmin=-1, vmax=1, cmap='jet')\n", " # Render health chart\n", " health_plot[i] = health\n", " fig_health.clear()\n", " fig_health.axis([0, frames, 0, 2])\n", " fig_health.plot(health_plot[:i + 1])\n", "\n", " anim = matplotlib.animation.FuncAnimation(\n", " fig, render_frame, frames=frames, interval=100\n", " )\n", "\n", " plt.close()\n", " display(HTML(anim.to_html5_video()))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Testing the Environment\n", "\n", "Let's test what we have so far with a quick simulation:" ] }, { "cell_type": "code", "execution_count": 6, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "[ 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0.375\n", " 0.375]\n", "Rendering 6 frames...\n" ] }, { "data": { "text/html": [ "" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "env = Environment()\n", "env.reset()\n", "print(env.visible_state)\n", "\n", "done = False\n", "while not done:\n", " _, _, done = env.step(2) # Down\n", "\n", "animate(env.history)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Actor-Critic network\n", "\n", "Value-based reinforcement learning methods like Q-Learning try to predict the expected reward of the next state(s) given an action. 
In contrast, a policy method tries to directly choose the best action given a state. Policy methods are conceptually simpler but training can be tricky - due to the high variance of rewards, it can easily become unstable or just plateau at a local minimum.\n", "\n", "Combining a value estimation with the policy helps regularize training by establishing a \"baseline\" reward that learns alongside the actor. Subtracting a baseline value from the rewards essentially trains the actor to perform \"better than expected\".\n", "\n", "In this case, both actor and critic (baseline) are combined into a single neural network with 5 outputs: the probabilities of the 4 possible actions, and an estimated value.\n", "\n", "The input layer `inp` transforms the environment state, $(radius*2+1)^2$ squares plus the agent's health and position, into an internal state. The output layer `out` transforms that internal state to probabilities of possible actions plus the estimated value." ] }, { "cell_type": "code", "execution_count": 7, "metadata": { "collapsed": false }, "outputs": [], "source": [ "class Policy(nn.Module):\n", " def __init__(self, hidden_size):\n", " super(Policy, self).__init__()\n", " \n", " visible_squares = (VISIBLE_RADIUS * 2 + 1) ** 2\n", " input_size = visible_squares + 1 + 2 # Plus agent health, y, x\n", " \n", " self.inp = nn.Linear(input_size, hidden_size)\n", " self.out = nn.Linear(hidden_size, 4 + 1, bias=False) # For both action and expected value\n", "\n", " def forward(self, x):\n", " x = x.view(1, -1)\n", " x = F.tanh(x) # Squash inputs\n", " x = F.relu(self.inp(x))\n", " x = self.out(x)\n", " \n", " # Split last five outputs into scores and value\n", " scores = x[:,:4]\n", " value = x[:,4]\n", " return scores, value" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Selecting actions\n", "\n", "To select actions we treat the output of the policy as a multinomial distribution over actions, and sample from that to choose a single action. 
Thanks to the [REINFORCE](https://webdocs.cs.ualberta.ca/~sutton/williams-92.pdf) algorithm we can calculate gradients for discrete action samples by calling `action.reinforce(reward)` at the end of the episode.\n", "\n", "To encourage exploration in early episodes, here's one weird trick: apply dropout to the action scores, before softmax. Randomly masking some scores will cause less likely scores to be chosen. The dropout percent gradually decreases from 30% to 5% over the first 200k episodes." ] }, { "cell_type": "code", "execution_count": 8, "metadata": { "collapsed": false }, "outputs": [], "source": [ "DROP_MAX = 0.3\n", "DROP_MIN = 0.05\n", "DROP_OVER = 200000\n", "\n", "def select_action(e, state):\n", " drop = interpolate(e, DROP_MAX, DROP_MIN, DROP_OVER)\n", " \n", " state = Variable(torch.from_numpy(state).float())\n", " scores, value = policy(state) # Forward state through network\n", " scores = F.dropout(scores, drop, True) # Dropout for exploration\n", " scores = F.softmax(scores)\n", " action = scores.multinomial() # Sample an action\n", "\n", " return action, value" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Playing through an episode\n", "\n", "A single episode is the agent moving through the environment from start to finish. We keep track of the chosen action and value outputs from the model, and resulting rewards to reinforce at the end of the episode." 
] }, { "cell_type": "code", "execution_count": 9, "metadata": { "collapsed": true }, "outputs": [], "source": [ "def run_episode(e):\n", " state = env.reset()\n", " actions = []\n", " values = []\n", " rewards = []\n", " done = False\n", "\n", " while not done:\n", " action, value = select_action(e, state)\n", " state, reward, done = env.step(action.data[0, 0])\n", " actions.append(action)\n", " values.append(value)\n", " rewards.append(reward)\n", "\n", " return actions, values, rewards" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Using REINFORCE with a value baseline\n", "\n", "The policy gradient method is similar to regular supervised learning, except we don't know the \"correct\" action for any given state. Plus we are only getting a single reward at the end of the episode. To give rewards to past actions we fake history by copying the final reward (and possibly intermediate rewards) back in time with a discount factor:\n", "\n", "![](https://i.imgur.com/IoXMuCb.png)\n", "\n", "Then for every time step, we use `action.reinforce(reward)` to encourage or discourage those actions.\n", "\n", "We will use the value output of the network as a baseline, and use the difference of the reward and the baseline with `reinforce`. The value estimate itself is trained to be close to the actual reward with an MSE loss." 
] }, { "cell_type": "code", "execution_count": 10, "metadata": { "collapsed": false }, "outputs": [], "source": [ "gamma = 0.9 # Discounted reward factor\n", "\n", "mse = nn.MSELoss()\n", "\n", "def finish_episode(e, actions, values, rewards):\n", " \n", " # Calculate discounted rewards, going backwards from end\n", " discounted_rewards = []\n", " R = 0\n", " for r in rewards[::-1]:\n", " R = r + gamma * R\n", " discounted_rewards.insert(0, R)\n", " discounted_rewards = torch.Tensor(discounted_rewards)\n", "\n", " # Use REINFORCE on chosen actions and associated discounted rewards\n", " value_loss = 0\n", " for action, value, reward in zip(actions, values, discounted_rewards):\n", " reward_diff = reward - value.data[0] # Treat critic value as baseline\n", " action.reinforce(reward_diff) # Try to perform better than baseline\n", " value_loss += mse(value, Variable(torch.Tensor([reward]))) # Compare with actual reward\n", "\n", " # Backpropagate\n", " optimizer.zero_grad()\n", " nodes = [value_loss] + actions\n", " gradients = [torch.ones(1)] + [None for _ in actions] # No gradients for reinforced values\n", " autograd.backward(nodes, gradients)\n", " optimizer.step()\n", " \n", " return discounted_rewards, value_loss" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "With everything in place we can define the training parameters and create the actual Environment and Policy instances. We'll also use a SlidingAverage helper to keep track of average rewards over time." 
] }, { "cell_type": "code", "execution_count": 11, "metadata": { "collapsed": false }, "outputs": [ ], "source": [ "hidden_size = 50\n", "learning_rate = 1e-4\n", "weight_decay = 1e-5\n", "\n", "log_every = 1000\n", "render_every = 20000\n", "\n", "env = Environment()\n", "policy = Policy(hidden_size=hidden_size)\n", "optimizer = optim.Adam(policy.parameters(), lr=learning_rate)#, weight_decay=weight_decay)\n", "\n", "reward_avg = SlidingAverage('reward avg', steps=log_every)\n", "value_avg = SlidingAverage('value avg', steps=log_every)\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Finally, we run a bunch of episodes and wait for some results. The average final reward will help us track whether it's learning. This took about an hour on a 2.8GHz CPU to get some reasonable results." ] }, { "cell_type": "code", "execution_count": 12, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "[epoch=0] reward avg=-1.0000 value avg=5.6570\n", "[epoch=1000] reward avg=-0.9740 value avg=2.7766\n", "[epoch=2000] reward avg=-0.9600 value avg=1.5183\n", "[epoch=3000] reward avg=-0.9420 value avg=1.5846\n", "[epoch=4000] reward avg=-0.9640 value avg=1.4439\n", "[epoch=5000] reward avg=-0.9420 value avg=1.5593\n", "[epoch=6000] reward avg=-0.9360 value avg=1.7154\n", "[epoch=7000] reward avg=-0.9240 value avg=1.8695\n", "[epoch=8000] reward avg=-0.9180 value avg=1.9271\n", "[epoch=9000] reward avg=-0.8700 value avg=2.2150\n", "[epoch=10000] reward avg=-0.8620 value avg=2.2781\n", "[epoch=11000] reward avg=-0.8320 value avg=2.5636\n", "[epoch=12000] reward avg=-0.7760 value avg=2.6614\n", "[epoch=13000] reward avg=-0.7640 value avg=2.7424\n", "[epoch=14000] reward avg=-0.7060 value avg=2.8898\n", "[epoch=15000] reward avg=-0.6620 value avg=3.1093\n", "[epoch=16000] reward avg=-0.6300 value avg=3.0002\n", "[epoch=17000] reward avg=-0.6100 value avg=3.0183\n", "[epoch=18000] reward avg=-0.5580 value avg=3.0212\n", 
"[epoch=19000] reward avg=-0.5680 value avg=2.9777\n", "[epoch=20000] reward avg=-0.5220 value avg=2.9321\n", "Rendering 14 frames...\n" ] }, { "data": { "text/html": [ "" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" }, { "name": "stdout", "output_type": "stream", "text": [ "[epoch=21000] reward avg=-0.4800 value avg=3.0787\n", "[epoch=22000] reward avg=-0.4660 value avg=3.1405\n", "[epoch=23000] reward avg=-0.3340 value avg=3.2157\n", "[epoch=24000] reward avg=-0.4140 value avg=3.0677\n", "[epoch=25000] reward avg=-0.2980 value avg=3.2760\n", "[epoch=26000] reward avg=-0.3500 value avg=3.0733\n", "[epoch=27000] reward avg=-0.3220 value avg=3.0263\n", "[epoch=28000] reward avg=-0.3160 value avg=2.9758\n", "[epoch=29000] reward avg=-0.3180 value avg=3.0533\n", "[epoch=30000] reward avg=-0.2880 value avg=2.8530\n", "[epoch=31000] reward avg=-0.2500 value avg=3.0794\n", "[epoch=32000] reward avg=-0.2780 value avg=3.0500\n", "[epoch=33000] reward avg=-0.3060 value avg=2.8212\n", "[epoch=34000] reward avg=-0.2440 value avg=2.9480\n", "[epoch=35000] reward avg=-0.1440 value avg=3.0938\n", "[epoch=36000] reward avg=-0.2100 value avg=3.1951\n", "[epoch=37000] reward avg=-0.1140 value avg=3.1707\n", "[epoch=38000] reward avg=-0.2060 value avg=3.0702\n", "[epoch=39000] reward avg=-0.1800 value avg=2.9726\n", "[epoch=40000] reward avg=-0.1040 value avg=3.0987\n", "Rendering 73 frames...\n" ] }, { "data": { "text/html": [ "" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" }, { "name": "stdout", "output_type": "stream", "text": [ "[epoch=41000] reward avg=-0.2180 value avg=2.8552\n", "[epoch=42000] reward avg=-0.1860 value avg=2.9884\n", "[epoch=43000] reward avg=-0.1800 value avg=2.8004\n", "[epoch=44000] reward avg=-0.1400 value avg=2.9231\n", "[epoch=45000] reward avg=-0.1200 value avg=2.8050\n", "[epoch=46000] reward avg=-0.1180 value avg=3.0405\n", "[epoch=47000] reward avg=-0.1220 value avg=2.8643\n", "[epoch=48000] 
reward avg=-0.0820 value avg=2.8232\n", "[epoch=49000] reward avg=-0.0760 value avg=2.7912\n", "[epoch=50000] reward avg=-0.1360 value avg=2.8212\n", "[epoch=51000] reward avg=-0.1120 value avg=2.8823\n", "[epoch=52000] reward avg=-0.0880 value avg=2.8519\n", "[epoch=53000] reward avg=-0.0640 value avg=2.8911\n", "[epoch=54000] reward avg=-0.0760 value avg=2.8220\n", "[epoch=55000] reward avg=-0.1200 value avg=2.5774\n", "[epoch=56000] reward avg=-0.1060 value avg=2.7156\n", "[epoch=57000] reward avg=-0.0460 value avg=2.7477\n", "[epoch=58000] reward avg=-0.0540 value avg=2.7149\n", "[epoch=59000] reward avg=-0.0420 value avg=2.8103\n", "[epoch=60000] reward avg=-0.0020 value avg=2.8103\n", "Rendering 51 frames...\n" ] }, { "data": { "text/html": [ "" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" }, { "name": "stdout", "output_type": "stream", "text": [ "[epoch=61000] reward avg=0.0140 value avg=2.8072\n", "[epoch=62000] reward avg=0.0120 value avg=2.7167\n", "[epoch=63000] reward avg=0.0100 value avg=2.7989\n", "[epoch=64000] reward avg=-0.0340 value avg=2.6266\n", "[epoch=65000] reward avg=-0.0780 value avg=2.4347\n", "[epoch=66000] reward avg=-0.0680 value avg=2.6278\n", "[epoch=67000] reward avg=-0.0480 value avg=2.5624\n", "[epoch=68000] reward avg=-0.0540 value avg=2.4876\n", "[epoch=69000] reward avg=-0.0120 value avg=2.5917\n", "[epoch=70000] reward avg=-0.0160 value avg=2.5240\n", "[epoch=71000] reward avg=-0.0800 value avg=2.3887\n", "[epoch=72000] reward avg=0.0340 value avg=2.6404\n", "[epoch=73000] reward avg=-0.0160 value avg=2.5540\n", "[epoch=74000] reward avg=-0.0220 value avg=2.4573\n", "[epoch=75000] reward avg=-0.0300 value avg=2.4484\n", "[epoch=76000] reward avg=-0.0720 value avg=2.3868\n", "[epoch=77000] reward avg=0.0160 value avg=2.5220\n", "[epoch=78000] reward avg=-0.0520 value avg=2.5630\n", "[epoch=79000] reward avg=0.0280 value avg=2.3931\n", "[epoch=80000] reward avg=-0.0380 value avg=2.2635\n", "Rendering 
33 frames...\n" ] }, { "data": { "text/html": [ "" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" }, { "name": "stdout", "output_type": "stream", "text": [ "[epoch=81000] reward avg=0.0500 value avg=2.4029\n", "[epoch=82000] reward avg=-0.0540 value avg=2.2796\n", "[epoch=83000] reward avg=0.0160 value avg=2.3000\n", "[epoch=84000] reward avg=-0.0380 value avg=2.3727\n", "[epoch=85000] reward avg=0.0040 value avg=2.4260\n", "[epoch=86000] reward avg=-0.0460 value avg=2.1574\n", "[epoch=87000] reward avg=-0.0080 value avg=2.3031\n", "[epoch=88000] reward avg=-0.0840 value avg=2.0344\n", "[epoch=89000] reward avg=-0.0760 value avg=2.0799\n", "[epoch=90000] reward avg=-0.0680 value avg=2.0726\n", "[epoch=91000] reward avg=-0.0260 value avg=2.2026\n", "[epoch=92000] reward avg=-0.0640 value avg=2.0950\n", "[epoch=93000] reward avg=-0.0600 value avg=1.9660\n", "[epoch=94000] reward avg=-0.1100 value avg=1.8462\n", "[epoch=95000] reward avg=-0.0680 value avg=1.9794\n", "[epoch=96000] reward avg=-0.0300 value avg=2.0902\n", "[epoch=97000] reward avg=-0.0440 value avg=2.0968\n", "[epoch=98000] reward avg=-0.0340 value avg=2.0948\n", "[epoch=99000] reward avg=-0.0340 value avg=1.9508\n", "[epoch=100000] reward avg=-0.0480 value avg=1.9128\n", "Rendering 19 frames...\n" ] }, { "data": { "text/html": [ "" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" }, { "name": "stdout", "output_type": "stream", "text": [ "[epoch=101000] reward avg=-0.0200 value avg=1.8954\n", "[epoch=102000] reward avg=-0.0080 value avg=2.0457\n", "[epoch=103000] reward avg=-0.0360 value avg=2.0535\n", "[epoch=104000] reward avg=-0.0520 value avg=1.9996\n", "[epoch=105000] reward avg=0.0020 value avg=2.1378\n", "[epoch=106000] reward avg=0.0340 value avg=2.1125\n", "[epoch=107000] reward avg=0.0420 value avg=2.1620\n", "[epoch=108000] reward avg=0.0400 value avg=2.2281\n", "[epoch=109000] reward avg=0.0360 value avg=2.2380\n", "[epoch=110000] reward 
avg=0.0000 value avg=2.1415\n", "[epoch=111000] reward avg=0.0580 value avg=2.1649\n", "[epoch=112000] reward avg=-0.0160 value avg=2.2506\n", "[epoch=113000] reward avg=0.0120 value avg=2.2178\n", "[epoch=114000] reward avg=-0.0360 value avg=1.8879\n", "[epoch=115000] reward avg=0.0600 value avg=2.1395\n", "[epoch=116000] reward avg=0.0060 value avg=2.1164\n", "[epoch=117000] reward avg=-0.0400 value avg=1.9164\n", "[epoch=118000] reward avg=-0.0160 value avg=1.8714\n", "[epoch=119000] reward avg=0.0140 value avg=2.0096\n", "[epoch=120000] reward avg=0.0360 value avg=2.0119\n", "Rendering 64 frames...\n" ] }, { "data": { "text/html": [ "" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" }, { "name": "stdout", "output_type": "stream", "text": [ "[epoch=121000] reward avg=0.0660 value avg=2.0845\n", "[epoch=122000] reward avg=0.0760 value avg=2.0736\n", "[epoch=123000] reward avg=-0.0140 value avg=1.7696\n", "[epoch=124000] reward avg=-0.1140 value avg=1.7106\n", "[epoch=125000] reward avg=-0.1720 value avg=1.5944\n", "[epoch=126000] reward avg=-0.2460 value avg=1.3580\n", "[epoch=127000] reward avg=-0.0560 value avg=1.7342\n", "[epoch=128000] reward avg=-0.0860 value avg=1.5200\n", "[epoch=129000] reward avg=-0.1580 value avg=1.5148\n", "[epoch=130000] reward avg=-0.1100 value avg=1.5950\n", "[epoch=131000] reward avg=-0.1940 value avg=1.4144\n", "[epoch=132000] reward avg=-0.2380 value avg=1.4147\n", "[epoch=133000] reward avg=-0.1900 value avg=1.3975\n", "[epoch=134000] reward avg=-0.2000 value avg=1.3473\n", "[epoch=135000] reward avg=-0.2560 value avg=1.2985\n", "[epoch=136000] reward avg=-0.1800 value avg=1.3935\n", "[epoch=137000] reward avg=-0.2300 value avg=1.2835\n", "[epoch=138000] reward avg=-0.2440 value avg=1.3503\n", "[epoch=139000] reward avg=-0.2300 value avg=1.2931\n", "[epoch=140000] reward avg=-0.2180 value avg=1.3109\n", "Rendering 9 frames...\n" ] }, { "data": { "text/html": [ "" ], "text/plain": [ "" ] }, "metadata": 
{}, "output_type": "display_data" }, { "name": "stdout", "output_type": "stream", "text": [ "[epoch=141000] reward avg=-0.2580 value avg=1.2084\n", "[epoch=142000] reward avg=-0.2340 value avg=1.2617\n", "[epoch=143000] reward avg=-0.1680 value avg=1.3163\n", "[epoch=144000] reward avg=-0.1480 value avg=1.3855\n", "[epoch=145000] reward avg=-0.1720 value avg=1.2902\n", "[epoch=146000] reward avg=-0.1840 value avg=1.3111\n", "[epoch=147000] reward avg=-0.1320 value avg=1.3029\n", "[epoch=148000] reward avg=-0.1760 value avg=1.2370\n", "[epoch=149000] reward avg=-0.1800 value avg=1.3052\n", "[epoch=150000] reward avg=-0.2460 value avg=1.2406\n", "[epoch=151000] reward avg=-0.1580 value avg=1.2520\n", "[epoch=152000] reward avg=-0.1240 value avg=1.3091\n", "[epoch=153000] reward avg=-0.2140 value avg=1.1697\n", "[epoch=154000] reward avg=-0.2220 value avg=1.2158\n", "[epoch=155000] reward avg=-0.1260 value avg=1.3016\n", "[epoch=156000] reward avg=-0.1500 value avg=1.2398\n", "[epoch=157000] reward avg=-0.1260 value avg=1.2868\n", "[epoch=158000] reward avg=-0.2060 value avg=1.2502\n", "[epoch=159000] reward avg=-0.1500 value avg=1.2797\n", "[epoch=160000] reward avg=-0.1060 value avg=1.3336\n", "Rendering 56 frames...\n" ] }, { "data": { "text/html": [ "" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" }, { "name": "stdout", "output_type": "stream", "text": [ "[epoch=161000] reward avg=-0.0940 value avg=1.2735\n", "[epoch=162000] reward avg=-0.1060 value avg=1.3275\n", "[epoch=163000] reward avg=-0.0540 value avg=1.3247\n", "[epoch=164000] reward avg=0.1500 value avg=2.1876\n", "[epoch=165000] reward avg=0.5620 value avg=2.5559\n", "[epoch=166000] reward avg=0.6060 value avg=2.4327\n", "[epoch=167000] reward avg=0.5480 value avg=2.2885\n", "[epoch=168000] reward avg=0.5560 value avg=2.3068\n", "[epoch=169000] reward avg=0.6060 value avg=2.1906\n", "[epoch=170000] reward avg=0.5940 value avg=2.1082\n", "[epoch=171000] reward avg=0.5860 value 
avg=2.1375\n", "[epoch=172000] reward avg=0.5660 value avg=2.1844\n", "[epoch=173000] reward avg=0.6360 value avg=1.9815\n", "[epoch=174000] reward avg=0.6700 value avg=1.9800\n", "[epoch=175000] reward avg=0.6340 value avg=2.0020\n", "[epoch=176000] reward avg=0.7060 value avg=1.8904\n" ] } ], "source": [ "e = 0\n", "\n", "# Train until the running average reward reaches 0.75\n", "while reward_avg < 0.75:\n", "    actions, values, rewards = run_episode(e)\n", "    final_reward = rewards[-1]\n", "\n", "    # Compute discounted returns and update the policy and value networks\n", "    discounted_rewards, value_loss = finish_episode(e, actions, values, rewards)\n", "\n", "    # Track running averages of final reward and value loss\n", "    reward_avg.add(final_reward)\n", "    value_avg.add(value_loss.data[0])\n", "\n", "    if e % log_every == 0:\n", "        print('[epoch=%d]' % e, reward_avg, value_avg)\n", "\n", "    # Periodically render the latest episode as an animation\n", "    if e > 0 and e % render_every == 0:\n", "        animate(env.history)\n", "\n", "    e += 1" ] }, { "cell_type": "code", "execution_count": 13, "metadata": { "collapsed": false, "scrolled": false }, "outputs": [ { "data": { "image/png": "(base64 PNG plot data elided)", "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "image/png": "(base64 PNG plot data elided)
svtRFm0CDj7bOv24MHA/PmJez0i\nIkp/rDQk0SefyLf7wkL5ht+ypVQadu6UZglAtidPSn+DRDlxAvjmm+DQsHo1cPhw4l6XiIjSG0ND\nktTUADNnSgdIo0sX/0oDIIECSGwTxddfA9XVwaFBa2Dp0sS9LhERpTeGhiRZuBD49lvgmmusfV27\nWn0aTGgwFYdEdoZctAioX1/6MRglJUDjxsCXXybudYmIKL0xNCTJK68AHToAw4db+7p2lUqDvXnC\nhIdEVhoWLQL69wfq1bP2ZWfLjJSLFyfudYmIKL0xNLjwxBPAZ5/F/vjqauC114CJE4Es2994ly7A\ntm3Sf8GEhXr1gPz8xIeGwYOD9w8ezNBAREShMTS48Ic/AM88E/vjP/oI2LcPuPZa//1du0o/AsAK\nDUBi52rYu1eaROz9GYzBg4GtW4FduxLz2kRElN4YGiI4fRrYvdtaLyIWL78scyH06uW/v0sX62fT\nPAEkdq6GVatk27dv8H2DBsmW/RqIiMgJQ0MEe/bIyId164AjR6J//JEjwLvvSpUhcN6FoiIgN1d+\nTlalwax3ccYZwfd17ChNI4FNFOvXAwMGyKJaRERUdzE0RFBRIVutZW6DaH30kazpYB81YWRnyzoU\n9eoBrVpZ+wsLge3bYzrdiDZvloDSoEHwfUo592uYPRtYtgzYsCEx50REROmBoSECExoA+eCM1po1\nQF6e8zd7QPo1mNkgjX79pF9BNE0ic+dKMHn9dZm8KZTNm/2bRQKZ0GD6WgBWWLL/XRARUd3D0BCB\nWYGyV6/YQsPGjcB3vhP6/lGj/IdhAsCYMVJtiKbz5VtvycJTEyYAbdtKdcDJpk2hAwwg/RoOHLBW\n4ASA5ctla/4uiIiobmJoiKCiQioFQ4bE1hlywwZrcSon990HvPSS/76cHOCWW6QDpVM/gldflXkf\n7MrLZbbJsjJp7jDrXATavDlyaACszpA1NVZoYKWBiKhuY2iIoLxcvrn37y8jD06ejO7xGzaErzSE\nMnmyNDMEBgoAeOAB4Omng8+zXTuguFg6NO7eHfy448fluHDNE3l5EnJMv4bNm4GjR63XiIXW/s0d\nRESUnqIKDUqp25RS3yilDvn+LFRKXRzhMaOUUkuVUieUUuuUUjfU7pSTq6JCQkO/fsCpU8DKle4f\ne/SofNCGqzSEUlgIjB0L/O//+n/gbtsGrF0bPCTTPhV1QYFzaNiyRbbhKg2Af2dI05+hV6/YKw1X\nXikVFSIiSm/RVhq2AbgPQH8AAwDMBfCuUqqH08FKqc4A3gMwB0AfAE8A+JtS6sIYzzfpKirkw7h3\nbxntENivobISePZZ5+GYZnhjLJUGALj9dgkp//mPtW/OHNnu2CFNB4CEip07rdCQn+88QdPmzbJ1\nExqWLZOZLJcvB9q0kXkdYg0NK1fG1h+EiIi8JarQoLV+X2s9S2u9UWu9QWv9CwBHAAwJ8ZDbAWzS\nWt+rtV6rtX4KwBsAptbutJPHNE80bCiLOpl+DVoDpaXSHDBlCvD448GPNUMUYw0No0fLY+0dImfP\nlpEW1dUyuyMgHRerqvxDg1OlYfNmmRfCrKQZyqBB0jSycqVUGvr0keeOtXli1y5ZrIuIiNJbzH0a\nlFJZSqlrADQC8HmIw4YACOzH/yGAc2J93WQzlQZA+jUsWybDKEePBiZNAoYNA666SvoYVFX5P3bj\nRqBJE/mmHousLAkkr78uk0xpLaHh/PPlftNEYT7MI4WGTZuATp2kYhJOv35yzJdfSqWhd28JThUV\n0fdNOHFCqjHbtlmVESIiSk9RhwalVE+l1GEAJwE8DWC81rosxOFtAQQWyncBaKaUqh/tayfbkSPy\np21bud2vH7B0qXzz3roV+OADGeb4wAPywf3mm/6PN50gA2eCjMZNN8njX3hBOmLu2gXc4OsVYiaA\nCgwNBQVy3seO+T9XpJETR
qNG0odh9mwJGqbScOxY9LNimmaSkyedgwwREaWPnBgeUwbpn9AcwFUA\n/qmUGhkmOMRs6tSpaN68ud++iRMnYuLEifF+KUemDd+EhtGjgRYtgB/9CLj3XmtWxbPOkvueeEJW\nsjQiDbd0Iy9P5l7461/l23/9+sAVV0iYCFdpAKQ60amT9VybNzsvVOVk8GDgxRfl5969gYMHrddq\n2tT9+dv7Vmzdav1dEhFR7ZWWlqK0tNRv36FDhxL2elGHBq31KQC+Ln74Sik1GMDdkP4LgSoAFATs\nKwBQqbWOOHhx2rRp6N+/f7SnGDcmNJgP49695YPYyV13AZdfLstOmw/mjRuBq6+u/XncfjswfTrw\nyCMyEVTjxnJO9tDQvLlUCAArNOzaZYUGraVq4DSdtZNBg6SDZ04O0KOHNfKiogLo3l1+3rdPQk04\ngaHBaUluIiKKjdMX6WXLlmHAgAEJeb14zNOQBSBUU8PnAEYH7LsIoftAeIr5Bu/m2/Gll8r8B3/+\ns9w+eVI+JGPtBGk3ZIgElt27gQsukH1FRf7NE/YFr0xosDcHHDggfQvcNE8A1od7SYlUN8zfgQlS\nFRWysNa8eeGfZ9cuaV5p1IidIYmI0l208zQ8pJQaoZTq5Ovb8HsA5wJ4yXf/75VS/7A95BkAXZRS\nDyulipVSd0CaNB6L1wUkUkWFfGC2aBH52Oxs4M47gddek+GPW7ZIx7/aNk8A8qF7u6+OY0KDffns\nwNBgOl7aQ4MZbhluYie7M8+UD/reveV2s2bSHGOC1FdfScfP1avDP8+uXVKN6NxZQhQREaWvaCsN\n+QD+AenXMBsyV8NFWuu5vvvbAuhgDtZabwFwKYALAHwNGWr5X1rrECsjeIsZbum2I+PNN0vIeOYZ\na+2GeFQaAJkh8oMPZIlqIHxoyM2VVTPtoSHckthOcnKA3/8euO02ua2UvIapNJippSMFgV27pGNm\nx44MDURE6S6qPg1a68kR7r/JYd88SLhIO/bhlm40bw7ceKOEhqZNJUBEmhPBrZwc4GLb3Jv20LBz\np7VmhBE4wdPmzXJO9iW4I7nrLv/bbdtalYZYQoNZz4KIiNIT154Iw0whHY0f/Ug6Sz7+uDQFZCXo\nb7ioCDh0SIZABlYagOC5GsyS2LUZ/mnmagCiDw2dOrHSQESU7hgawjDNE9EoLpaKwM6d8WuacGIq\nGGvXyhoX7dv73x8YGiItie2GaZ6oqpLVNPPyoqs07NtnLX5FRETph6EhjGibJ4y775ZtPDpBhmJC\ngyn5B55n4KJVa9ZIoKkN0zxRViaLd11yiYzgOH069GPslQaA1QYionTG0BDC6dPyoRvLZEQXXQR8\n//vAmDHxPy/DhIYlS2Qbrnni4EGZxrlXr9q9Ztu20vRiFp8aM0b+nuxrUkyYAMyaJT9XVclQT1Np\nABgaiIjSWSwzQtYJe/bIkMlYKg1ZWTL0MpEaNpROjeFCg7kGs5x3z561e8127WSSqDlzpHJgQsi2\nbdLH4sABWSejVStpojGhpaBAmk+ysjhXAxFROmOlIYRoJnZKlcJCCQSNGgVP7ZyfL1WA/fvlmOxs\nmaipNszfxUcfyfwNgdUDE05WrJCtGb2Rn2+trslKAxFR+mJoCCFw3QkvKiyUYNCuXfCoiALf5N27\nd8uHeHGxDAGtDVPN2L1bQkOzZjLM1Ck0aG2FBnMunKuBiCi9MTSEYEJDQeDKGR5i+jUEjpwA/KeS\nXrGi9v0Z7M8JWDNFduhgBQFTYTh8WJoh7JUGQEJDbZonHn0U+OMfY388ERHVDkNDCFu3SmCoVy/V\nZxJaUZFsnfpd2BetildoyM0FWreWn01osFcPVq4Ehg2Tn1eskNdu0cKqcEQzV8Pnn/sfu3EjcP/9\nsmhXTU3tr4WIiKLH0BDChg2JnWchHkylwSk0NGsmgWfZMhk9EY/QYF6rQQPr78aEBq0lKFx
yiTRZ\nmNBgr9R07Bh5iCYAHD8uI1BGj5ZFtgDgF7+QWTH37LE6fxIRUXIxNISQ7qFBKfnAnjNHbscrNLRt\nK4tZ5fjG3ZjQsHOnFU5695YZIwNDQ6dOMr+DfYimk/fes2a6vOUWYOlS4NVXgWnTgJYtgfffj8+1\nEBFRdBgaQtiwAejWLdVnEV640ABIE8WyZUCTJtbkSrV1zz3AAw9Ytzt2lBEaX3wht3v1kj9OlQYz\nudTHH4d/jdJSYOBA4MUXZejqZZfJyI/Jk4HvfpehgYgoVRgaHBw8COzd6/1Kw3e+A5x9tvxxkp8v\nzQY9e8ZvDYyLLwa+9z3rthl2+cEHQOPGEk5695bprU2/EKNrV+Cqq4Bf/xo4ccL5+Q8elFAwaZIc\ne+ed0in1oYekunHppVJ5iFStICKi+GNocLBhg2y9HhoaN5Zv+KGmhzadIePVNOGkg28h9JkzrXDS\nq5f0W9iwIXj0ye9+J00ZTz3l/Hxvvw1UVwNXXy23H3sMmD8fGDdObl98sTS9fPBBYq6HiIhCY2hw\nYEJDIteOSAbzgZ3I0FBYKB/i5eXWjJP2mScDQ0P37tLM8NBDUlUI9MorwKhR1jDS3Fxg+HBrHorW\nrYEhQ9hEQUSUCgwNDjZskA+nFi1SfSa1k4xKQ26u9QFvwkKzZkDnzvKz0zwXpnni4Yf991dUAHPn\nAhMnhn/NSy+VfhFVVbU6dSIiihJDg4N0GDnhRlGRTB+dyNAAWP0a7K9jfnYKDe3aAT/5iTQ9rFlj\n7f/LX+R8r7wy/OuNGSMTSH3+eWzne+KEPJ6IiKLD0OAgU0LDFVfI0tl5eYl9HRMa7M0SZvKnUDNq\n/uxn8rhbb5XJmhYsAP7wB+DnP5cFr8Ixa2hEMyW11sBzz8k8Ei1byt/JZZcB06dzsigiIrcYGhxk\nSmjIzQX69Uv863TpIuHAHhBGjZImnlDDQRs2BJ59VsLCH/8IXHcdcM45EhoiadhQFugyq2i6MX++\nBJQTJ4AHH5TXPHQIuP56GdpJRESRcWnsAIcPy/wCmRAakuUnP7FGOxgXXCAf6oELadmddx5w883A\nfffJLJKffWZNGhVJfn50oWHBAgkas2dLEwgA3HWXDFedO1fOwzj3XKnS3H23++cnIqoLGBoCbNwo\nW4YG9/LynJtAwgUG49FHpV/Df/93dBNQRRsaFi6UURcmMBjDhwNvvmnd3rkTmDfPmjiLiIgsbJ4I\nYIZben02yEzRqpV8oI8fH93jogkNWkunyaFDg+8bNkxW3ty+XW7PnSvb2qzGSUSUqRgaAmzYIB3l\nInXGo9SKJjSsWydTXYcKDQDwn//I1qzVwdBARBSMoSFApnSCzHTRhIaFC6WpxGm67YICqSotWCAV\niTlzpAPnzp2cB4KIKBBDQwCGhvRgQoPWkY9duFCGgzZv7nz/sGESGjZsALZtk5EcWsvPRERkYWgI\nwNCQHvLzZfjkkSPB982fL7NGHjggtxcudG6aMIYPl6W8335bRm9cf73sZxMFEZE/hgab/fuBHTtk\nfQTyNjNFdmATxeHDUimYOVOW8T5wAFi9WuaACGX4cJng6bHHgMGDgTPPlP0MDURE/jjk0ubTT2U7\ncmRKT4NcsIcG+8Ji994L7NsH/OpXwG9+AzRpIvvDVRq6d5d+DLt2yQRQ9evLpFRbtiTs9ImI0hIr\nDTZz5kjThJkWmbzLqdIwdy7wzDPAI48ADzwAfPe7wJNPSiAI1+SklDWKYvRo2XbqxEoDEVEghgab\nOXOsDw3ytrw8+bC3h4Yf/lBmc7ztNrnvuedkxc2hQyNPNHXBBbKq6ZAhcrtzZ4YGIqJADA0+O3YA\na9cyNKSL7GypIJjQcOgQUFYGTJkCZPn+V3foAHzyifRViOS22+Tfv359uc1KAxFRMPZp8DEzAY4a\nldLToCjY52pYu1a2xcX+x/Tv7+65cnKsJg9AQsO2bcD
p08FTTxMR1VWsNPjMmQP06QO0aZPqMyG3\nnEJDvEa+dOoEnDolkzwREZFgaIBM5DN3Lpsm0k1BgX9oKCy0RkvUVufOsmUTBRGRhaEB1kyA55+f\n6jOhaARWGgKbJmrDrLjJYZdERBaGBkjTRHY252dIN4kMDY0bywgNVhqIiCwMDQDmzQMGDgSaNk31\nmVA08vOBvXuB6mpg/fr4hgaAIyiIiAIxNABYvDj8NMPkTfn5Mv3zsmWyDkW8QwPnaiAi8lfnQ8O+\nfcDGjbLmAKUXM0Ry/nzZJqLSEO8+DWVl1vkSEaWbOh8avvxStgwN6cceGurXj//03506AVu3Bi+/\nffx47M/58MPApEnulvQmIvKaOh8aFi8GWrUCunRJ9ZlQtExoWLAA6NYt/pMwdeokzR72qar375cO\nkrNnx/acu3YB27dLdYuIKN0wNCyWKkOktQnIe5o0ARo0kA/yeDdNAFblYvt2a9+mTVJpePnl2J5z\nzx7ZfvJJ7c6NiCgV6nRo0NoKDZR+lLKqDYkIDYWFsrWHBvPzjBkyaiNapmphlmEnIkondTo0fPut\nfPNjaEhfiQwNbdoAubn+oWHHDtnu3w989ll0z6e1/H9r1kwqDezXQETppk6HhsWLZTtoUGrPg2KX\nyNCQlSXVhsDQ0KmT/HnrreDH1NQAmzc7P9/Ro9K0cfnlQHk5sG5d/M+ZiCiR6nRo+PJLefO3r25I\n6SWRoQGQ0GCqC4AEiKIi4IorgLfflpBg9+qrci5OC12Z/gxXXCGdNtlEQUTppk6HBvZnSH+FhUD7\n9kCLFol5/qKi4EpDYSFw5ZVARQXwxRf+x8+fL30d3n03+LlMf4YuXaS6xc6QRJRu6mxoOHUKWLKE\noSHd3XMP8OGHiXt+p9BQVCQziLZtG9xEsWiRbJ2aLkxoaNMGOO88qTRE06+hpgb4+9+lmYOIKBXq\nbGhYvx44dkzWnKD01aoV0LNn4p7fNE9oLX+2b5d9WVnAuHESDswH//HjwPLlQN++UkXYv9//uUzz\nROvWwKhRMmdDWZn7c3nzTWDyZGDmzLhcGhFR1OpsaKiokK0ZVkfkpKhIwuXBg0BlpXzLN/9nvvc9\n6fS4dq3cXrYMOH0a+N3vpCrw3nv+z7V7N9CypYzIGDYMyMmRxdLcqKkBfvMb+ZnrYRClxmefAe3a\nyXtCXVVnQ4P5FpiXl9rzIG8rKpLt9u1Wh0izb9QomVzKfPNftAho2BC48EJpvghsotizx+q42bgx\n0KuXNYInkrffBlaulAmttm6t1SURUYzKyuQL55o1qT6T1IkqNCilfqaUWqyUqlRK7VJKva2U6h7h\nMecqpWoC/pxWSqV0zMK+fTI5UKI60FFmMFWFHTusvg1mX6NG0jfBhIbFi4H+/aWSMH689LU4csR6\nrt27pT+DMWiQtfZJOKbKMHo0MHIkKw1EqVJZKduVK1N7HqkUbaVhBIC/ADgbwAUAcgF8pJRqGOFx\nGkA3AG19f9pprXeHf0hi7dsnpeKsOltrITfatZNwaa80tG9v3T9mjDQxHD4slYazz5b948fLuhWz\nZlnH2isNgHTCXbXKP1g4mTFD+kr8+tcyRJihgSg1Dh2S7apVqT2PVIrqI1NrPUZrPV1rvUZrvQLA\njQA6Ahjg4uF7tNa7zZ8YzjWuzMJDROHk5sooie3b5U9+PlCvnnX/mDEyxPKVV2QZbRMaunaV5gd7\np0WnSkNNjfSFCOeFF4ChQ4ERI6yVN4ko+VhpqH2fhhaQKsL+CMcpAF8rpXYqpT5SSg2t5evW2r59\nDA3kjhlBYeZosOvSRSZzeughuW0fwtu/v/83ksBKw5lnShNHpCaKsjIrjHTsCBw4IJUNIkouU2lg\naIiBUkoBeBzAAq316jCHlgOYAuBKAFcA2AbgU6VU31hfOx727ZPhekSRmLkazBwNgcaMkW//+flS\nCTB69JAPfDNcM7D
SkJMjwSJcZ8hTp2RlzW7d5LZ5fjZRECWfqTRs22YFiLompxaPfRrAmQCGhTtI\na70OgH2W/S+UUl0BTAVwQ7jHTp06Fc2bN/fbN3HiREycODGmE7bbvx8444xaPw3VAUVFMhFTTg4w\nZEjw/WPGANOmSTXAvsR6SYm8yVRUyGiJqqrgKcsHD5aREaFs2SLBobuvu7E9NCRyfgoiCnbokFQI\nV6+WKuLQlNfMgdLSUpSWlvrtO5TARBNTaFBKPQlgDIARWuvyGJ5iMSKEDQCYNm0a+vfvH8PTR7Zv\nHyd2InfMolW5uc7zeowYIZ1qhw/3319SIts1a6RZAfCvNADSr+Gxx6TpIvA+wFrUylQa2raV8MJ+\nDUTJV1kpv7Nr10oThRdCg9MX6WXLlmHAADddDaMXdWjwBYbLAZyrtY71rasvpNkiZdg8QW4VFcnk\nTubnQPXry+iGwA/9Ll0kaJSVyfwNgHOlAZApzS+5JPi516+XuSDM62ZnAx06sHmCKBUOHZLf4W7d\n6m6/hmjnaXgawLUAJgE4qpQq8P1pYDvmIaXUP2y371ZKjVVKdVVKnaWUehzAeQCejPR60czLHw2t\nOXqC3LMHhVAziBYVSXiwy80FvvMdCQ32dSfszjhD/h+G6tewbp08h31oMIddEqVGZSXQrJk0DTI0\nuHMbgGYAPgWw0/Zngu2YdgA62G7XA/AnAMt9j+sFYLTW+tNIL3b6dJRn59Lhw9JOzNBAbtiDglOl\nIZySEmme2LNH+jsE/p9TKvwkT+vXW/0ZjI4d2TxBlAqHDgHNmzM0uKa1ztJaZzv8+aftmJu01ufb\nbj+qte6mtW6stW6jtR6ttXY1436iQsO+fbJl8wS5YQ8N0a5VYkZQ7N4t/99yHBoEBw+WjpZPPBE8\nlHLdOqs/g8FKA1HynTwpf0xo2LPHqiDWJZ6eDzFRoYHrTlA0GjWSD/wmTaQ0GY2SEulEuWlTcH8G\n40c/Ai6/HPjpT6W/wvvvy/4TJ6SiEFhp6NQJ2LlTRmMQUXKY4ZameQKom9WGOhkaTKWBoYHcKiyM\nvmkCsEZQLFjgPDoCkKWyX35ZgkWPHsAjj8j+jRul/01gpaFjR9lvprUmosQzoaF5c5nxtX59hgbP\nOXUqMc9rKg1sniC3OnWyhk1Gw4SGtWtDVxqMDh2AyZMlYOzeLf0ZAOdKAxB7E0VVlSx+tWhRbI8n\nqovM1AfNmkkzY48edXMNijoZGvbtk57tTZok5vkp8zz2GPDnP0f/uKZNrX4QoSoNdmPHynbGDOnP\n0LRpcNgS1tn8AAAgAElEQVTo4OtmHGtomDtX/jA0ELlnQoOZb7C42JpHpS7xdGhIZPNEq1b+s/cR\nhdOtm7xJxMJUGyJVGgAJFiNGyCyR69fL6wb+P23YUJ4r1tDw5puy3bs3tscT1UX2Pg2ADIU21cC6\npE6GBs7RQMnUo4ds3VQaAFlWe/ZsYOnS4KYJw2m1S61lyupwTp0C3nlHfjZ9e4gossBKQ7du0q/o\n2LHUnVMqeDo0JLJ5gqGBkiWaSgMAjBsn/Q6++iq4E6ThNOzynXdkFkrzjcjJ/PlSYcjLY2ggikZl\nJVCvnjWJm/nd3LAhdeeUCp4ODYluniBKBhMa3FYaOnUCzLTxoSoNHTvKYlZ2X38NHD8evkf3m2/K\nY0eNYmggioaZ2MkwoaGuNVHUydDA5glKpmHDgF/8QlbBdGv8eNmGqjR07y6hobra2mc6ZYUKDTU1\nwFtvAVdcIcM8GRqI3DNTSButW0uIYKXBQ9g8QZmgQQPgt7+1Fq1y48YbgWuvBXr3dr6/pER+P+xv\nWOYbT6jQ8MUXQHm5hAY2TxBFJ7DSoJSEelYaPITNE1RXFRYCL70UOmiYJo+yMtlqHbnS8NZbQEGB\nLOfL0EAUncBKA8DQ4DmJqDScPi3LHLPSQOksPx9o0cIKDbt3y7oVffsCK1Y4rxA7Y
4bMA5GdLaXV\no0dlqmoiiiyw0gDUzWGXng4Niag0HDwob6gMDZTOlJJqgwkN5o1r/HgZHRG4kM66dXLMZZfJbfP/\nn9UGIndCVRrKy4EjR1JzTqlQ50IDV7ikTGFW0AQkFCglwzWB4CaK996TvhUXXCC3GRqIouNUaaiL\nwy49HRoS0TzBFS4pU5SUAGvWSOVs/XqZXvqssyQcrFjhf+x77wHnny8rdgIMDUTRClVpABgaPCMR\noYErXFKmKCmRfgzl5VJp6NZN+iuceaZ/peHgQZnUyTRNAAwNRNFyqjTk5QEtW9atfg2eDg1sniAK\nzT6CYv16ayKonj39Q8OHH0oAt4eGFi2ArCyGBiI3tJZKQ2BoAOreCIo6Fxr275cSbYMG8X9uomTq\n0kVWa129WsqjplTas6cs2VtTI7ffew/o08daHROQwNCqlf+iVadOOY+6IKrrjh6V36fA5gnAfwRF\nVZVU/zJZnQsNnNiJMkVOjgSFOXNk+mhTaejVS3pzf/ut/A7NnOlfZTAC52o491zgj39MzrkTpROz\nnku4SsO6dfK75/S7lklyUn0C4SSq0sCmCcoUJSXArFnys73SAAALFwL33iv/56+4Ivix9tBw+jSw\nZIm1IicRWcwKl06Vhm7dgF27ZJr4o0eBbdvk9yk7O7nnmCyerjQkqiMkKw2UKUpKZGne7GzgjDNk\nX2GhfCO68Ubpz/Dmm0D//sGPtYeG7dultGpvriAiEa7ScNZZsj37bODll6Xqt3lz8s4t2TxdaWBo\nIArPdIY84wzp3wDIfA0jRsiCVm++GXqlzLw8a56HjRtly46RRMHCVRr69gUWLwb69bMmVVu1Svo6\nZCJPVxri3Txx8iTw5ZcyJI0oE5jQEBgM3noLWL48dGAA/CsNJjSw0kAULFylAQAGDZI+Ru3aycik\ncMvTpztPVxriHRrmzJHEeNVV8X1eolQxoSFwCW1TdQjHvjy2mZyGlQaiYKbS0LRp+OOUskYvZSpP\nVxri3TzxxhtAcbHVBkWU7po2Ba65Brj00ugfm5cHHDgg4dxUGvbvt4Zq1jU1NdJ5lMNOKVBlJdCk\nibvOjYHzpGQaT4eGeFYaqquBd96RKoNS8XteolQrLQUuvDD6x+XlyQfkgQNSacjLk985860q05w6\nBUybFryYlzFjBjBsmDTtENkdOuTcn8HJWWcBa9fKZ04m8nRoiGelYe5ceXNk0wSRsE8lvXGj9P42\ntzON1sAPfwjccw/w/PPOx3z6qWynTpWhc0RGqNkgnfTsKSORMnU9Ck+HhnhWGt54A+jaVWbGIyIr\nNKxZI5NBZXJoePhh4Nln5ZqXLHE+Zv58YORIqUQ89FByz4+8LdpKA5C5/Ro8HRriVWk4dQp4+202\nTRDZmdCweLFsBw+WbaaNoHjrLeBnPwN+9SvgppucQ0NlJfD118APfiATYv3xj3VrPQEKL5pKQ5s2\nQH5+5vZr8HRoiFelYf58+fbEpgkiiwkNixbJ1oSGTKs0PPgg8N3vAg88AAwcKNNr79njf8zChdIR\ncuRI4H/+R4bO/fa3KTld8qBoKg2AVBtYaUiBeIWGDRukwtCvX3yejygT1Ksnoy++/BJo21amV2/c\nOLMqDStXAl99Bdx2m7wHDBwo+5cu9T9u3jz5dtitmyxoN3JkZs/qR9GJptIAZPYIijoRGvbtkzXP\nM3UucKJY5eXJqnxdu8pt+9wNoXzxhYxCSAfTp0sYGjNGbnfpIpPvBDZRmP4MpvmyZUvg4MHknit5\nVyyVhvXrZULBTOPp0BCvPg1cpIrImWmiMKEhcOVLJ3/7m5Twvf6GePq0rAVwzTVSVQGsaoM9NBw/\nLv06Royw9rVowdBAllgqDadPy9DLUNJ1PpA6ERq43gSRM/N7YebJz8uL3DyxapUMKfvmm8SeW219\n8gmwY4d0brQLDA2LF8v1jBxp7WvRQoZoEwEyu
qhxY/fHmxEUoX5H5s93V9XzIk+Hhng2TzA0EAUL\nrDREeiPTGli9Wn42HSi9avp06aNghpIaAwdKmCgvl9vz50vpuVcv65iWLWWuhkydoIeiU1UF1K/v\n/vgWLYABA4D333e+/6GHpAKejnM51InQwOYJImetW8vWbfPEjh1Sqs3O9nZoOHpUVvj8wQ+Ch1nb\nO0NqLWvSDB/u3+epRQvZZursmBSdaEMDAIwbB8ycGdyMt2oVMGuW/LxzZ3zOL5nqRGhgpYHIWWDz\nROvW4ZsnzDCyMWO8HRpee02Cw3XXBd/XsaNc55IlMhzz00+BSZP8jzGhgU0UdPq0DMc1/WLcGjdO\nOhl/8on//sceA9q3l+fbsSN+55ksng4N7AhJlFgDBsj8DOb3w1QaQnXSWrVKhiROmCClVa8Oz3z2\nWeCii4Azzgi+Tym57qeekgmfHnwQuPZa/2NatpQtO0NSVZVsow0NZ50lFbx33rH2VVQAL70E3HWX\nzAXCSkOcxSM0aM1KA1Eol10mFQNTws/LkzfJI0ecj1+1CujRAxg6VG6b2SRTqbrafyKd5ctlWOit\nt4Z+zMCBEnjuuw+4//7g+02lwR4a1qwB/vnP+JwzpQ/TvBBtaFBKqg3vvmutHPvUU7Js/a23SrWB\noSHO4tE8ceSIvKkwNBBFZvo4hOrXsGqVfIM64ww51gtNFK+/LkPczOqUzz0HFBQAY8eGfszdd8s3\nvt//3nlqeafmib//HbjhBuCvf43fuZP3xVppACQ0VFTI78mHH8r05LfeKpWswkI2T8RdPELD/v2y\nZfMEUWT2lS8DmZETZ54pH7Rnn+2N0LBihWxvuEH6KUyfDtx8s3yjC6VNG2mSCLUWTdOmQFaWf6Vh\n9245/oc/BD7+OH7nT95Wm9Bwzjky0+j990uIveACCaoAKw0JEY/QYN78WGkgisz8njj1Vdi+XTp2\nmTHoZ58tzROpnqSmrAwYNgzo3FnmWjh0CJg8uXbPmZUlk/nYQ8OePcD3vid9Ja66Cli3rnavQenB\nhIZoR08AMiJn7FjpbHvJJTKixzxPYSFDQ9zFo08DKw1E7oVrnjD9Buyh4cCB1K8GWVYmHRvfeUfe\nkC+6SKaLrq3ACZ5275Y1Ol59VT4MSktr/xrkfbWpNACyaupvfyvNaPbnaN9eQumxY7U/x2TydGhg\npYEouRo1kg/eUKGhUSOgUye5bVbFjKaJQms5ftkyCfS1rVJUV8sojpIS6an+zTfAK6/U7jmNwPUn\n9uyRUnOzZhJK0rE9mqIXa0dIo1s34Be/CG4ua99etulWbfB0aIhHpWHfPiAnR9ooiSg8pULP1bBq\nlfRnyPK9a7RoIR+epk9BOCdOSAfFnj2BIUOkMpCXB/TtW7vz3bRJ3idKSuR2x47x+4JgrzRoLZWG\nNm3kdrp2YqPo1bbSEEphoWwZGuIoXh0hW7UK3eGJiPzZZ4V88klZVnrXLukEaZomjJISaR6I5P77\ngSlTgO7dgdmzpS/E/ffL8MjazL9vXtuEhniyL1p15Ih844w1NJw+Ddx0U/yqIJQ8iQoNptKQbuHT\n06EhXpUGNk0QuWcWrTpxAvjlL2WIYXEx8PXXUmmwcxsa5s0Drr8eePttYPRoYNAgmSAKqF2fiLVr\npbmgbdvYnyMUe/PEnj2yzc+XbbShYckS4MUXZcTGDTdIh1JKD7XpCBlO06ZAkyasNMRVPCsNROSO\nWbTq3XflQ/M//5EP+KoqGUJmV1wsTQThlsmurpYmjP79/febqatrMwqhrEyCSyIqifbmid27ZWsq\nDUVFVrBy4/33JYS88ILMJ3H++fE/X0qMRFUagPQcQZHxoYGVBqLomOaJF1+UmR+HDpVpmSsrgREj\n/I8tKZHf040bQz/f6tXyxtuvn//+xo2lRFubSoMJDYlgb54wlQZ78wTg/g1/5kzg4ouBG28E/vQn\nqTzEa5p8S
qzadoQMp317Nk/EVbyaJ1hpIHKvdWsZkfDRR9IObzRpEnys+cBeuzb08331lWz79Am+\nr1s396Fh3z7gX/8CZsyQ21pLaCgudvf4aJnmCa2t0GCGpJrQ4OYNv7xcVtQcM0ZumyYOrmuRHlhp\n8JeT6hMIJ17zNLDSQOReXp60uTdsaPU7CKVNG/lwDdev4auvJBw0axZ8X7duMvwynBMnZO6FBQvk\nAzw3F9iyRUZFHTiQ2EpDVRVw/Lg0T7RsaQ2biyY0zJolzScXXyy3zZeY/futEELelcjQ0L498Pnn\n8X/eRIqq0qCU+plSarFSqlIptUsp9bZSqruLx41SSi1VSp1QSq1TSt3g5vXYPEGUfOb35cornT/o\n7ZSSb/qRQkNg04RhKg3h5mtYswaYPx/43e+kqaNBAxnVkciRE4D/olV79lhNE4D8vTRp4i40vP++\nTIRlAoJZQdNMPEfelujQsGNH6mdVjUa0zRMjAPwFwNkALgCQC+AjpVTDUA9QSnUG8B6AOQD6AHgC\nwN+UUhdGerHahoaaGvkmwuYJIvcKCmR7443ujg83gqKmRkZdhAsNhw9bHQ2dbNgg21tvlRU2J08G\nnnlGKhTZ2TKpUyLYl8c2EzvZuRlBUV0tzTyXXmrts1cayPuqqmRukpwE1OULC6WSlk5NVVGFBq31\nGK31dK31Gq31CgA3AugIYECYh90OYJPW+l6t9Vqt9VMA3gAwNdLr1TY0HDokb1qsNBC5d9550m/A\nbQ//khLp0+D0bWnTJgkF4UIDEL5fw4YN8gFufo/vukt+t//wB5lcKt5D4Qz7Spf2iZ0MN6FhwQK5\nftOfAWBoSDdVVYmpMgDpOStkbTtCtgCgAYT77z8EwOyAfR8COMfhWD+1DQ1m0hhWGojcy82VhZnc\nDmMsLpYP8V27gu8z/RVChQZTJQgXGtavt4ZnArIw1ZVXyuslqmkCCN88AUho2L7d+bHffAM8+CBw\n550yh4R95suGDaWJhaEhPZw8mfjQkE4jKGIODUopBeBxAAu01qvDHNoWQODbyS4AzZRSYb8j1LYj\npPmlZKWBKHHMB7dTE8VXX8kbY2Bp32jYEOjQIXKlwR4aAOCee2SbqJETQOzNE2vWSEh45BE5v9JS\na+pto1UrhoZ0kchKQ7t2sg2sNCxe7N3ZQ2vTSvM0gDMBDIvTuQQ5eXIqxo5t7rdv4sSJmDhxoqvH\nc7EqosTr2lXae8vKgFGj/O8L1wnS6N49cmgIfN4hQ4AHHgAuvzyGE3apQQP5sAjXPLFzpzSB2kPB\n8uWy3bw59HsPQ0P6SGRoqF9fOsjaQ8Py5cCFF0ql/eqrpd9OOKWlpSgNWHL10KFDCThbEVNoUEo9\nCWAMgBFa6/IIh1cAKAjYVwCgUmsdZh45AJiGGTP6hz8kDC6LTZR4ubkSHAIrDVpLaLj11vCP79Yt\n9LCzI0dkngPT98Hu17+O7XzdUkqaKLZt8193wigqko6Oe/f6VyHKyuTYcF9WWrXyX3abvKuqKnH9\nZgD/CZ62bQMuuUR+pyorZdK07hHGJzp9kV62bBkGDAjX1TB2UTdP+ALD5QDO01pvdfGQzwGMDth3\nkW9/WFpLio/Vvn3ybaFRo9ifg4giKy4OnuCpvFy+oQdOHx0o3LBLM9NkYPNEsrRsaU1z7dQ8AQQ3\nUaxdG7mvBSsN6SORlQZAwueLL0pFbuhQCQyffCL3ffNN4l43VtHO0/A0gGsBTAJwVClV4PvTwHbM\nQ0qpf9ge9gyALkqph5VSxUqpOwBcBeAxN69Zm34NnA2SKDmchl2uWiXbXr3CP7ZbN+DYMece5Ga4\nZapCQ4sWVtOJU/ME4BwaIvW1YGhIH4nsCAkADz0E3HefNLmddx7w4YfyO9OunTdDQ7TNE7dBRkt8\nGrD/JgD/9P3cDkAHc4fWeotS6lIA0wDcBWA7gP/SWgeOqHB06lTs/2CcDZI
oOUpKgG+/lQ9/U9kr\nK5Pf3TPOCP9Y+7BL80FsbNgANG+eupkTW7SQeSaA4NBQUCDtzfbQoLWEhkjdrhga0keiKw19+jhP\nsd63r/V/z0uiCg1a64iVCa31TQ775iH8XA4h1bbSwNBAlHhnnSUfmGvWAKYptaxM2mMjdeTq0kU6\nEq5fH9zh0Qy3TMQqlm60bGktWBQYXLKzZTilfdjljh3A0aOsNGSSRIeGUPr0AV56KfmvG4mnF6wC\nahcauCw2UXKceaZsV6yw9rldgbJePQkOCxYE3+c03DKZzFwN9nUn7AKHXZp+HZFCQ8uW8v5Umz5b\nlBypDA3bt1ujAL0io0MDKw1EydGkiXzwr1xp7Ytm2eo775RvVfbHA94JDYFNE4ZTaMjNjdwk06qV\nBIbDh+NznpQ4iR49EYqZEMxr/Ro8Hxqqq2N/LDtCEiVPz55WpaGyUjo2ug0Nt98uH7T33mvtO3ZM\nPpCdhlsmi5ngKdTkVE6hoWtX56qEHaeSTh+pqjR06yaTnzE0RCnWSsPRozLkq23b+J4PETnr1cuq\nFJgyvdvQUK8e8PDDwAcfAB9/LPtSPdwSiL7SUFbmbpZKhob0kejRE6FkZ0sQ91pnyIwNDe++K6uH\njR0b3/MhImc9e0p1Yf9+a/hlNNM8X3GFjFP/6U+B48dTP9wSiBwaOnWSaaZNcHAz3BJgaEgnqao0\nANJEwUpDlGINDdOnA8OGSTsrESVez56yXblSQkNRkfR1cEsp4PHHZTKlIUOk6tCkSeimgWSI1Dwx\nZgzQtCnw1FMSdLZuZWjINKkMDX36AKtXyzl4hedDQyx9GioqZA37H/wg/udDRM66d5e2fBMaYlmB\nctAgWaynqgp47jlp103VcEsgcqWheXNg8mTgr3+Vb4Rau7vupk2l/MyppL0v1aGhulqGMnuF50ND\nLJWG0lJZQGfChPifDxE5q1dPvmWvWBF7aACkb8SSJTKi4oYb4nuO0YoUGgDgrrukieLnP5fbbioN\nSnGuhnSRqtETANC7t2y91ERRm1UukyKW0DB9OnDppVZpkYiSo1cv6bi1fj1wxx2xP0/jxsBf/hK/\n84pVx47AuHHS1yKUzp2lP8Ybb8gQb7fDvBka0kOqOkICQLNm8lkWaTROMmVcpWHVKllZj00TRMnX\nsyewaJGUVGOtNHhJ/frA229Lh8dw7rlHttF0/GRoSA+pbJ4AgPfeizwteTJlXKXhrbeknXHMmMSc\nDxGF1quXtVplJoQGt845RxYb6tfP/WMYGtJDqkOD13g+NETbEbKsTNqBUtUGRVSXmREUTZoA7dun\n9lyS7eOPZQ0Nt1q1ArZsSdjpUJwwNPjLuOaJTZs4zJIoVTp1kv4IJSWpHfWQCtnZ0V2zWX+CvC2V\nHSG9iKGBiOImK0uGTUZTpq+r2DyRHlhp8Of55oloQsORI8Du3QwNRKn01lt8k3XDhAat615VJp2k\ncvSEF3k+NETTp2HTJtl27ZqYcyGiyDjU2Z1WreQD6fhxoFGjVJ8NOampkS+uDA2WjGqeMKGBlQYi\n8jpOJe195ksrQ4Ml40JDo0apnaueiMgNExo4lbR3mTUfGBosGRcaunRh+yAReR8rDd5nQgNHT1g8\nHxqi7dPApgkiSgcMDd538qRsWWmweDo0KBVdpWHjRnaCJKL0YBbDYmjwLjZPBPN0aMjOdh8aTp+W\n2dVYaSCidJCdLcGBocG7GBqCeTo05OS4Dw07d8o/MEMDEaWLVq2AiopUnwWFwtAQLGNCA4dbElG6\nGT4cmDnTWuSLvIUdIYN5OjRkZ7vvCLlpk/SB6Nw5oadERBQ3EybIInsrVqT6TMgJKw3BPB8a3FYa\nNm4ECguBBg0Se05ERPFy4YXSr+G111J9JuSEoyeCZUxo4HBLIko39eoB48dLaGAThfew0hDM06Eh\n2j4NDA1ElG6uvhpYvx74+uvQx/ztb8D
o0ck7JxIMDcE8HRqi7dPA0EBE6eb884G8vPBNFCtXAp9/\nzmpEsjE0BPN0aHBbaTh8GNizhxM7EVH6yc0FrrgC+Ne/QoeCQ4dkNczKyuSeW13H0RPBPB0a3PZp\n4HBLIkpnV14JbN4MrF3rfL8JCzt3Ju+ciB0hnXg6NLitNDA0EFE6GzBAtsuXO99vQkN5eXLOh4Sp\nNOTmpvY8vMTToSGaSkPjxkCbNok/JyKieGvdGmjfPnRoOHRItgwNyVVVJYGBKydbPB8a3HSE5JLY\nRJTueveOXGlg80RyVVWxaSKQp0NDNM0T7ARJROnMTWhgpSG5GBqCeTo0RNM8wf4MRJTOevcGvv3W\naoqwY/NEalRVceREIE+HBjeVhtOnpdcxQwMRpbPevWUbuA7FqVPAsWPS/MrmieQ6eZKVhkCeDg1u\n+jTs2CHHMDQQUTorLpZOd4FNFIcPy7ZzZ1Yako3NE8E8HxoiVRo43JKIMkG9ekCPHsGhwTRNFBcz\nNCQbQ0OwjAgNXBKbiDKBU2dI0wmypAQ4csSqPFDiMTQE83RocNOnYdMmoKiInVWIKP317i19Gmpq\nrH320ACw2pBM7AgZLO1Dw8aNbJogoszQu7dUE7ZssfbZmycAhoZkYkfIYJ4ODW46QnK4JRFlCjOC\nwt5EEVhp4AiK5GHzRDDPhwY3zRMMDUSUCdq2lSmlv/nG2ldZKe+FBQUyXb5TpeGzz0JPDEWxY2gI\n5unQEKl5orIS2LuXs0ESUWZQCujVy3+uhkOHgGbN5L527ZxDw803Aw8+mLzzrCsYGoJ5OjTYKw23\n3w68/rr//Zs3y5aVBiLKFJ06yfwzRmWlhAZAFrUKbJ7YuVMqrtu2Je8c6wqGhmCeDg05OdKnobwc\neOYZ4J13/O/fuFG2DA1ElCkKCoBdu6zblZVA8+bys1OlYf582TI0xN/Jkxw9EcjTocFUGv79b7m9\nbp3//Zs2AU2aSBsgEVEmCAwNpnkCkEpDYGhYsEC25eXu1uoh91hpCJYWocFUGNatA7S27ueS2ESU\naQoKZK2JI0fktr15ol274OaJ+fMlTNTUcGRFvDE0BPN0aMjJAQ4eBObMAc49V3559uyx7l+7FujW\nLXXnR0QUb/n5sjXVhkOH/JsnKiuBo0fl9sGDMmrimmvkNpso4ouhIZinQ0N2NrB/v/zD/fSnss80\nUWgtw5LMuGYiokxQUCDb3btlG9gRErCaKBYulPfCSZPk9vbtyTvPuoChIVjUoUEpNUIpNUMptUMp\nVaOUGhvh+HN9x9n/nFZK5Ud6rexs2fbpA4weLc0Q69fLvvJyYN8+hgYiyiwmNJhKQ2DzBGCFhgUL\n5Pj+/YGmTVlpiDdOIx0slkpDYwBfA7gDgI5wrKEBdAPQ1venndZ6d6QH5eTI9vLLgYYNgQ4drEqD\nmfykT59oTp2IyNvy8oCsrNDNE4AVGubPB0aMkC9UHTowNMQbp5EOlhPtA7TWswDMAgClouqCuEdr\nXRnNa5nQMG6cbLt39w8NzZpxdUsiyizZ2UCbNs6VhubNZVbI556T98PFi4FHH5X7GBrij80TwZLV\np0EB+FoptVMp9ZFSaqibB515JnD11UDfvnK7e3erecL0Z+DICSLKNPn5EhqqqoATJ6xKg1LA888D\nZWVAv35y//Dhcl9REUNDvDE0BEtGaCgHMAXAlQCuALANwKdKqb6RHtivH/Dqq1Yw6NZNQkNNjfQY\nZn8GIspEZq4Gs1iVqTQAwIQJwIYNwNNPA5MnW020rDTEH0NDsKibJ6KltV4HwD4t0xdKqa4ApgK4\nIdxjp06diuYmYkN+iU6cmIiNGydi7Vrg7rsTcspERClVUABs3eocGgDpnHf77f77OnSwqhP8oKs9\nrdPj77K0tBSlpaV++w6Z9dQTIOGhIYTFAIZFOmjatGno37///91ev16aKN55Bzh9mp0giSgzFRQA\nX35
phQbbd6eQOnSQ7Y4dwBlnJO7c6opTpyQ4eH30xMSJEzFx4kS/fcuWLcOAAQMS8nqpmqehL6TZ\nIiqdO0vnyNdflyaLnj3jf2JERKlmmifMF8bASoMTExrYRBEfVVWy9XqlIdmirjQopRoD+A6kcyMA\ndFFK9QGwX2u9TSn1ewDttdY3+I6/G8BmAKsANABwC4DzAFwY7Wvn5kqC/vJL6d/QuHG0z0BE5H0F\nBRIYzARPbkJDUZFsGRrig6HBWSzNEwMBfAKZe0ED+JNv/z8A3AyZh6GD7fh6vmPaAzgGYDmA0Vrr\nebGcsBlBwaYJIspUZirpDRtk66Z5okkToEULhoZ4YWhwFss8DZ8hTLOG1vqmgNuPAng0+lNz1r07\n8P77DA1ElLnMrJDr10uF1W27OkdQxA9DgzNPrz3hxCxQxdBARJnKHhqaNXM/H02HDlx/Il4YGpyl\nXWgYNEimlB44MNVnQkSUGKZ5Yv16d00TBisN8XPypGy9Pnoi2dIuNAwcKMvBmjnYiYgyTW4u0KqV\njNBcDSgAABJcSURBVKBw0wnSYGiIH1YanKVdaAD4j0hEmc9UG6INDXv3AsePJ+ac6hKGBmepmtyJ\niIjCKCiQNSaibZ4AgKlTgcJCYMAAYMyYxJxfpmNocMbQQETkQaYzZDSVhr59gXPPBT79VJpxd+0C\nvv994MknrcoFucPQ4CwtmyeIiDJdLKGhZUsJDGVlQHm5LPg3dy5w1lmyMrATrYEVK2p9uhmHHSGd\nMTQQEXmQCQ3RNE/YKQVcfTWwerU8x8MPOx83a5asGGwmkoqH6ur071fBSoMzhgYiIg+KpdLgJD8f\nuOMO4I03rGmp7RYulO1XX9XudeymTgUuvjh+z5cKDA3OGBqIiDzI9EGItdJgd8MNQFYW8OKLwfct\nXizb5ctr/zoAUFMDvPmmhJFjx+LznKnA0OCMoYGIyIPiVWkAgLw8YMIE4Nln5UPd0FoWAATiFxq+\n+gqoqJClpZcujc9zpgJDgzOGBiIiD2rfXrYtW8bn+aZMATZuBObMsfZt3AgcOAD06hUcGrZvl1AR\nrZkzJeg0bgx88UXtzjmVqqqkOpOdneoz8RaGBiIiD+rQAZgxA7jwwvg839ChQM+ewDPPWPtM08R/\n/RewZYssxw3Iz507A++8E/3rvP8+cNFFMuX/55/X8qRT6ORJjpxwwtBARORR3/te/MrjSgG33Qa8\n+y6wc6fsW7wY6NoVOO88ub1ypWw/+AA4fRp46aXoXmPPHnnOSy8FzjlHQkMs1QovqKpi04QThgYi\nojriuuvk2/Pf/y63Fy8GBg8GSkqAnByrieLDDyVkzJwJHD7s/vlnzZKQcPHFEhoqKoCtW+N/HfFQ\nVibVl/37ne9naHDG0EBEVEc0bw5MnAg89xxw4oR0Whw8WD4ce/SQ0FBdLRNC3XKLHDNjhvvnnzlT\nFhVs2xY4+2zZ59Umir/+Vc7t7betfUePApddJn9ee42hwQlDAxFRHXLbbbIS5qOPSigYNEj29+4t\noeHzz6W6cMstUi3417/cPe+pU1JpMGtd5OdL04cXO0OeOgWUlsrPb7xh7X/9dQk+SknlJd3nmkgE\nhgYiojpk4EBZyOp3v5ORAf36yf7evWU66VmzgNatgf79ZUbJWbNkhEV1tXw7D9XcMHeurHdx2WXW\nPtOvAZB5G37yE/lmX1kZ/XmvXBldU0k4c+bIuhw33wzMnm01UTz/PHDBBcC//w0sWQL87W/xeb1M\nwtBARFTHTJkiowN69QIaNZJ9vXvLh/KLL8qIjawsWezq1Cngqaeks+RttwFPPOH8nM8+C5x5poQS\nY8gQaQJ59llg1CjghReAK66QeSP++U//x//qV8B99zk/9/79UhF58knn+0+cAH77W2lecGP6dGmO\nefBB6fA5Ywawbh0wf74ECQqNoYGIqI6ZOBFo2tTqdwBIaABkoavvf
ld+bt8eGDEC+OUvgW+/lQqF\nmQzKrqJCRmVMmSKlfeOcc6RCMWUKcP31ctymTRIg/vxn67jqagkETz3lPItkaakEg1WrnK/n448l\ndLhpSjlyRKod110HtGsHDB8uTRQvvgi0aAGMGxf5OeoyhgYiojqmSRPgk0+AX//a2teuHdCqlfx8\n0UXW/p//XOZx+PprmVVy2TL5dm73/PPSB+AHP/Df37u3LNX9xz9K58t69YAzzpDnW7rUauqYP1+a\nQI4eleGegV54QbZr1zpfz/z5sn3tNf/91dXBx77zjgSTSZPk9ve/D3z0kVzDpElAgwbOr0GCoYGI\nqA4aMECCgqGUfMj37u2//6KLpG0/L0+aCI4eBdasse6vqZFAcPXVwbNX5uTIUt0/+Yl/BeKSS4Dc\nXKlOAPLNv0MHoG/f4A/+5cslYAwdKqHBad6HefOkmWX2bGDvXtlXUSGVkl/+0jru1CnplzFypExe\nBUhzSXW11ceBwmNoICIiAMBjj4Xv/DdggHz425soPv5YZpCcMsX96zRvDoweLWFBa/n2P26cBI/3\n3vPvm/DCCzISY+pU6XNRXu7/XEePSqi49155LjOE8tFHpWPmgw9K00d1tTTLfPEFcP/91uMLC4Fh\nwyQs9e/v/hrqKoYGIiICICMpzBBMJ82aAcXF/qHh2WelQ+WQIdG91vjxUiH46CNZ52L8eGkqOHZM\npqIGZIKll16S/ge9esm+wCaKL76QCsJVV0lfiddekyrD//6vhIOpU4G77pK+CzNmyAqcps+G8cor\nElzs1RByxtBARESuDRpkhYbdu+WDePLk6D9wx46Vpo0775S+FCNGyLwOAwbIB7/WUvXYuxe46Sag\nSxdp7ggMDfPmSdNJjx5SqZg7F/jv/5b+Ez/+sfSnmDAB+OYbCQZjxwafS8eO0teCImNoICIi1wYN\nkg/gkyelCpCVBVx7bfTP07atjK7YsEHW2MjJkf0TJkil4ZxzgB/+UCoIPXtKH4guXWT6Z7v58yVw\nZGVJtUIpOa8f/1j6WGRlSSVh+3bpS0G1w9BARESuDR4s/QOWL5cRB+PGyTf9WIwfL1v7MMcJE6QC\nkZUlTRf2jpHFxf6VhqoqmTxqxAi53aYNcP750mfixz+2jsvKkgmrqPZyUn0CRESUPvr0karA00/L\nvAl/+lPsz3X99TLs0j5dc+fOMpKhefPgJo+SEv9pn5cskfkbRo609j31lKy22aJF7OdFoTE0EBGR\naw0ayEiDF18Eiopk2uVY5ef7T/JkhPrALy6WkRrHjwMNG0rTRJMmMlTT6NZN/lBisHmCiIiiYkZY\n3HijrF+RLMXF0kFywwa5PWeO9H3I4dffpGFoICKiqAwdKk0HN96Y3NctKZFtWZksYPXxxzL3AiUP\n8xkREUVl0iSZ06Fr1+S+buvWMjxz7VqZxKljR5nDgZKHoYGIiKKSk2NNtpRsxcXAzJnAokXAX/4i\nQzEpedg8QUREaaOkRIZZ5udzrYhUYGggIqK0UVws25/+lCtSpgJDAxERpY0LLpDltqNZIIvih30a\niIgobQwYIMttU2qw0kBERESuMDQQERGRKwwNRERE5ApDAxEREbnC0EBERESuMDQQERGRKwwNRERE\n5ApDAxEREbnC0EBERESuMDQQERGRKwwNRERE5ApDAxEREbnC0EBERESuMDQkQWlpaapPIa54Pd6V\nSdcC8Hq8LJOuBci860mUqEODUmqEUmqGUmqHUqpGKTXWxWNGKaWWKqVOKKXWKaVuiO1001Om/Wfk\n9XhXJl0LwOvxsky6FiDzridRYqk0NAbwNYA7AOhIByulOgN4D8AcAH0APAHgb0qpC2N4bSIiIkqR\nnGgfoLWeBWAWACillIuH3A5gk9b6Xt/ttUqp4QCmAvg42tcnIiKi1EhGn4YhAGYH7PsQwDlJeG0i\nIiKKk6grDTFoC2BXwL5dAJopp
eprrU86PKYBAKxZsybR55YUhw4dwrJly1J9GnHD6/GuTLoWgNfj\nZZl0LUBmXY/ts7NBvJ9baR2xW0LoBytVA2Cc1npGmGPWAnhea/2wbd8lkH4OjZxCg1JqEoCXYz4x\nIiIiulZr/Uo8nzAZlYYKAAUB+woAVIaoMgDSfHEtgC0ATiTu1IiIiDJOAwCdIZ+lcZWM0PA5gEsC\n9l3k2+9Ia70PQFzTERERUR2yMBFPGss8DY2VUn2UUn19u7r4bnfw3f97pdQ/bA95xnfMw0qpYqXU\nHQCuAvBYrc+eiIiIkibqPg1KqXMBfILgORr+obW+WSn1AoBOWuvzbY8ZCWAagDMBbAfwG6319Fqd\nORERESVVrTpCEhERUd3BtSeIiIjIFYYGIiIicsVzoUEp9UOl1Gal1HGl1BdKqUGpPqdIlFI/U0ot\nVkpVKqV2KaXeVkp1dzjuN0qpnUqpY0qpj5VS30nF+UZLKfU/vsXJHgvYnzbXo5Rqr5SarpTa6zvf\nb5RS/QOO8fz1KKWylFK/VUpt8p3nBqXULxyO8+S1uFnwLtK5K6XqK6We8v1bHlZKvaGUyk/eVfid\nS8jrUUrl+DqAL1dKHfEd8w+lVLuA50iL63E49hnfMXcF7PfE9bj8v9ZDKfWuUuqg799okVKqyHa/\nJ67Fdy5hr0fJIIUnlVLbfL87q5RSUwKOqfX1eCo0KKWuBvAnAL8G0A/ANwA+VEq1TumJRTYCwF8A\nnA3gAgC5AD5SSjU0Byil7gNwJ4BbAQwGcBRybfWSf7ruKQltt0L+Lez70+Z6lFItAPwHwEkA3wXQ\nA8BPABywHZMu1/M/AKZAFowrAXAvgHuVUneaAzx+LWEXvHN57o8DuBTAlQBGAmgP4M3EnnZI4a6n\nEYC+AP4f5P1sPIBiAO8GHJcu1/N/lFLjIe93Oxzu9sr1RPq/1hXAfACrIefZC8Bv4T83kFeuBYj8\nbzMNMp3BJMh7wzQATyqlLrMdU/vr0Vp75g+ALwA8YbutIKMt7k31uUV5Ha0B1AAYbtu3E8BU2+1m\nAI4DmJDq8w1zHU0ArAVwPmTEzGPpeD0A/gDgswjHpMX1APg3gOcC9r0B4J9peC01AMZG8+/gu30S\nwHjbMcW+5xrstetxOGYggNMAitL1egAUAtgKCd+bAdwV8O/luesJ8X+tFDLqL9RjPHktYa5nBYCf\nB+xbAhmtGLfr8UylQSmVC2AAZAltAICWq5qN9FvcqgUkCe4HAKXUGZA1OOzXVglgEbx9bU8B+LfW\neq59Zxpez/cALFFKvaak+WiZUmqyuTPNrmchgNFKqW4AoJTqA2AYgJm+2+l0LX5cnvtAyKR09mPW\nQj7EPH19Pua94aDv9gCk0fUopRSAfwJ4RGvttDhQWlyP7zouBbBeKTXL977whVLqctthaXEtNgsB\njFVKtQcApdR5ALrBmhUyLtfjmdAA+XaeDefFrdom/3Ri4/vP+DiABVrr1b7dbSFvFGlzbUqpayCl\n1Z853J1u19MFskT7Wkj57n8B/Fkp9QPf/el0PX8A8C8AZUqpKgBLATyutX7Vd386XUsgN+deAKDK\nFyZCHeNJSqn6kH+/V7TWR3y72yK9rud/IOf7ZIj70+V68iGV1PsggftCAG8DeEspNcJ3TLpci/Ej\nAGsAbPe9N8wE8EOt9X9898flepIxjXRd8zRkEqthqT6RWPk6Aj0O4AKtdXWqzycOsgAs1lr/0nf7\nG6VUTwC3AUi3ScauhrRZXgNpi+0L4Aml1E7NCdM8SymVA+B1SCi6I8WnExOl1AAAd0H6Z6Q784X5\nHa31n30/L1dKDYW8L8xPzWnVyl2QfiaXQaoHIwE87XtvmBv2kVHwUqVhL6Stz2lxq4rkn070lFJP\nAhgDYJTWutx2VwWkf0a6XNsAAG0ALFNKVSulqgGcC+BuX4LdhfS6nnJIArdbA6Cj7+d0+vd5BMA
f\ntNava61Xaa1fhnR4MhWhdLqWQG7OvQJAPaVUszDHeIotMHQAcJGtygCk1/UMh7wvbLO9L3QC8JhS\napPvmHS5nr0ATiHy+0I6XAuUUg0A/A7APVrrmVrrlVrrpyFVyZ/6DovL9XgmNPi+0S4FMNrs85X6\nRyNBC2/Eky8wXA7gPK31Vvt9WuvNkH8U+7U1g6RCL17bbEhP4r4A+vj+LAHwEoA+WutNSK/r+Q+k\nw49dMYBvgbT792kECdd2NfD9LqfZtfhxee5LIW/29mOKIW/0IRfBSxVbYOgCYLTW+kDAIel0Pf8E\n0BvWe0IfSMfVRyCjkoA0uR7f582XCH5f6A7f+wLS5Fp8cn1/At8bTsP6nI/P9aSyB6hDj9AJAI4B\nuB4yZOSvAPYBaJPqc4tw3k9Dhu+NgKQ286eB7Zh7fdfyPcgH8jsA1gOol+rzd3mNgaMn0uZ6IJ3n\nTkK+jXeFlPcPA7gm3a4HwAuQ0uMYyLe88QB2A3goHa4FMmysDySQ1gD4se92B7fn7vt92wxgFKQq\n9h8A8712PZDm33chH0K9At4bctPtekIc7zd6wkvX4+L/2jjI8MrJvveFOwFUATjHa9fi8no+AbAc\nUhXuDOBGyOfprfG8nqRfuIu/mDsAbIEMs/ocwMBUn5OLc66BJLrAP9cHHPcAJJkfg/Ro/U6qzz2K\na5wLW2hIt+uBfMgu953rKgA3Oxzj+evxvXE85vvFPwr5QP1/AHLS4Vp8b2hOvy/Puz13APUh86Ls\nhYS/1wHke+16IKEu8D5ze2S6XU+I4zchODR44npc/l+7EcA63+/SMgCXefFa3FwPpHPn3wFs813P\nagB3x/t6uGAVERERueKZPg1ERETkbQwNRERE5ApDAxEREbnC0EBERESuMDQQERGRKwwNRPT/260D\nAQAAAABB/tYrDFAUASzSAAAs0gAALNIAACzSAAAs0gAALAHjP8Nes0BOAQAAAABJRU5ErkJggg==\n", "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "# Plot average reward and value loss\n", "plt.plot(np.array(reward_avg.avgs))\n", "plt.show()\n", "plt.plot(np.array(value_avg.avgs))\n", "plt.show()" ] }, { "cell_type": "markdown", "metadata": { "collapsed": true }, "source": [ "As you can see from the shape of the rewards graph, training these kinds of networks is a rollercoaster of luck." 
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Exercises\n", "\n", "* Uncomment the line in the Environment class that returns a reward every step - the agent tends to learn a bit quicker because the effects of eating plants are more immediately rewarded.\n", "* Try with a bigger grid size, bigger visible area, bigger network, etc.\n", "* Try with a recurrent network - it will train slower (in clock time) but often reaches higher values in fewer episodes.\n", "* Observe the effects of different learning rates and gamma values." ] } ], "metadata": { "anaconda-cloud": {}, "kernelspec": { "display_name": "Python [conda root]", "language": "python", "name": "conda-root-py" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.5.2" } }, "nbformat": 4, "nbformat_minor": 1 }
================================================ FILE: reinforce-gridworld/reinforce-gridworld.py ================================================
#!/usr/bin/env python

# Practical PyTorch: Playing GridWorld with Reinforcement Learning (Actor-Critic with REINFORCE)

# ## Resources

# ## Requirements

import numpy as np
from itertools import count
import random, math

import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim
import torch.autograd as autograd
from torch.autograd import Variable

from helpers import *

# Configuration

gamma = 0.9  # Discounted reward factor
hidden_size = 50
learning_rate = 1e-4
weight_decay = 1e-5

log_every = 1000
render_every = 20000

import sconce
job = sconce.Job('rl2', {
    'gamma': gamma,
    'learning_rate': learning_rate,
})
job.log_every = log_every
job.plot_every = 500

DROP_MAX = 0.3
DROP_MIN = 0.05
DROP_OVER = 200000

# ## The Grid World, Agent and Environment

# ### The Grid

MIN_PLANT_VALUE = -1
MAX_PLANT_VALUE = 0.5
GOAL_VALUE = 10
EDGE_VALUE = -10
VISIBLE_RADIUS = 1

class Grid():
    def __init__(self, grid_size=8, n_plants=15):
        self.grid_size = grid_size
        self.n_plants = n_plants

    def reset(self):
        padded_size = self.grid_size + 2 * VISIBLE_RADIUS
        self.grid = np.zeros((padded_size, padded_size))  # Padding for edges

        # Edges
        self.grid[0:VISIBLE_RADIUS, :] = EDGE_VALUE
        self.grid[-1*VISIBLE_RADIUS:, :] = EDGE_VALUE
        self.grid[:, 0:VISIBLE_RADIUS] = EDGE_VALUE
        self.grid[:, -1*VISIBLE_RADIUS:] = EDGE_VALUE

        # Randomly placed plants
        for i in range(self.n_plants):
            plant_value = random.random() * (MAX_PLANT_VALUE - MIN_PLANT_VALUE) + MIN_PLANT_VALUE
            ry = random.randint(0, self.grid_size-1) + VISIBLE_RADIUS
            rx = random.randint(0, self.grid_size-1) + VISIBLE_RADIUS
            self.grid[ry, rx] = plant_value

        # Goal in one of the corners
        S = VISIBLE_RADIUS
        E = self.grid_size + VISIBLE_RADIUS - 1
        gps = [(E, E), (S, E), (E, S), (S, S)]
        gp = gps[random.randint(0, len(gps)-1)]
        self.grid[gp] = GOAL_VALUE

    def visible(self, pos):
        y, x = pos
        return self.grid[y-VISIBLE_RADIUS:y+VISIBLE_RADIUS+1, x-VISIBLE_RADIUS:x+VISIBLE_RADIUS+1]

# ### The Agent

START_HEALTH = 1
STEP_VALUE = -0.02

class Agent:
    def reset(self):
        self.health = START_HEALTH

    def act(self, action):
        # Move according to action: 0=UP, 1=RIGHT, 2=DOWN, 3=LEFT
        y, x = self.pos
        if action == 0: y -= 1
        elif action == 1: x += 1
        elif action == 2: y += 1
        elif action == 3: x -= 1
        self.pos = (y, x)
        self.health += STEP_VALUE  # Gradually getting hungrier

# ### The Environment

class Environment:
    def __init__(self):
        self.grid = Grid()
        self.agent = Agent()

    def reset(self):
        """Start a new episode by resetting grid and agent"""
        self.grid.reset()
        self.agent.reset()
        c = int(self.grid.grid_size / 2)
        self.agent.pos = (c, c)
        self.t = 0
        self.history = []
        self.record_step()
        return self.visible_state

    def record_step(self):
        """Add the current state to history for display later"""
        grid = np.array(self.grid.grid)
        grid[self.agent.pos] = self.agent.health * 0.5  # Agent marker faded by health
        visible = np.array(self.grid.visible(self.agent.pos))
        self.history.append((grid, visible, self.agent.health))

    @property
    def visible_state(self):
        """Return the visible area surrounding the agent, and current agent health"""
        visible = self.grid.visible(self.agent.pos)
        y, x = self.agent.pos
        yp = (y - VISIBLE_RADIUS) / self.grid.grid_size
        xp = (x - VISIBLE_RADIUS) / self.grid.grid_size
        extras = [self.agent.health, yp, xp]
        return np.concatenate((visible.flatten(), extras), 0)

    def step(self, action):
        """Update state (grid and agent) based on an action"""
        self.agent.act(action)

        # Get reward from where agent landed, add to agent health
        value = self.grid.grid[self.agent.pos]
        self.grid.grid[self.agent.pos] = 0
        self.agent.health += value

        # Check if agent won (reached the goal) or lost (health reached 0)
        won = value == GOAL_VALUE
        lost = self.agent.health <= 0
        done = won or lost

        # Rewards at end of episode
        if won:
            reward = 1
        elif lost:
            reward = -1
        else:
            reward = 0  # Reward will only come at the end
            # reward = value  # Try this for quicker learning

        # Save in history
        self.record_step()

        return self.visible_state, reward, done

# ## Actor-Critic network

class Policy(nn.Module):
    def __init__(self, hidden_size):
        super(Policy, self).__init__()
        visible_squares = (VISIBLE_RADIUS * 2 + 1) ** 2
        input_size = visible_squares + 1 + 2  # Plus agent health, y, x

        self.inp = nn.Linear(input_size, hidden_size)
        self.out = nn.Linear(hidden_size, 4 + 1, bias=False)  # For both action and expected value

    def forward(self, x):
        x = x.view(1, -1)
        x = F.tanh(x)  # Squash inputs
        x = F.relu(self.inp(x))
        x = self.out(x)

        # Split last five outputs into scores and value
        scores = x[:, :4]
        value = x[:, 4]
        return scores, value

# ## Selecting actions

def select_action(e, state):
    drop = interpolate(e, DROP_MAX, DROP_MIN, DROP_OVER)
    state = Variable(torch.from_numpy(state).float())
    scores, value = policy(state)  # Forward state through network
    scores = F.dropout(scores, drop, True)  # Dropout for exploration
    scores = F.softmax(scores)
    action = scores.multinomial()  # Sample an action
    return action, value

# ## Playing through an episode

def run_episode(e):
    state = env.reset()
    actions = []
    values = []
    rewards = []
    done = False

    while not done:
        action, value = select_action(e, state)
        state, reward, done = env.step(action.data[0, 0])
        actions.append(action)
        values.append(value)
        rewards.append(reward)

    return actions, values, rewards

# ## Using REINFORCE with a value baseline

mse = nn.MSELoss()

def finish_episode(e, actions, values, rewards):
    # Calculate discounted rewards, going backwards from end
    discounted_rewards = []
    R = 0
    for r in rewards[::-1]:
        R = r + gamma * R
        discounted_rewards.insert(0, R)
    discounted_rewards = torch.Tensor(discounted_rewards)

    # Use REINFORCE on chosen actions and associated discounted rewards
    value_loss = 0
    for action, value, reward in zip(actions, values, discounted_rewards):
        reward_diff = reward - value.data[0]  # Treat critic value as baseline
        action.reinforce(reward_diff)  # Try to perform better than baseline
        value_loss += mse(value, Variable(torch.Tensor([reward])))  # Compare with actual reward

    # Backpropagate
    optimizer.zero_grad()
    nodes = [value_loss] + actions
    gradients = [torch.ones(1)] + [None for _ in actions]  # No gradients for reinforced values
    autograd.backward(nodes, gradients)
    optimizer.step()

    return discounted_rewards, value_loss

env = Environment()
policy = Policy(hidden_size=hidden_size)
optimizer = optim.Adam(policy.parameters(), lr=learning_rate, weight_decay=weight_decay)

reward_avg = SlidingAverage('reward avg', steps=log_every)
value_loss_avg = SlidingAverage('value loss avg', steps=log_every)

e = 0

while reward_avg < 1.0:
    actions, values, rewards = run_episode(e)
    final_reward = rewards[-1]

    discounted_rewards, value_loss = finish_episode(e, actions, values, rewards)

    job.record(e, final_reward)  # REMOVE
    reward_avg.add(final_reward)
    value_loss_avg.add(value_loss.data[0])

    if e % log_every == 0:
        print('[epoch=%d]' % e, reward_avg, value_loss_avg)

    e += 1
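The core of `finish_episode` above is the backwards pass that turns per-step rewards into discounted returns. Pulled out on its own (a standalone sketch, not tied to the network or torch code), it can be checked directly:

```python
# Discounted returns as computed in finish_episode: walk the reward list
# backwards, accumulating R = r + gamma * R, so earlier steps receive a
# geometrically discounted share of later rewards.
def discount_rewards(rewards, gamma=0.9):
    discounted = []
    R = 0.0
    for r in reversed(rewards):
        R = r + gamma * R
        discounted.insert(0, R)
    return discounted

# An episode that ends with the +1 "won" reward:
print(discount_rewards([0, 0, 1], gamma=0.9))  # [0.81, 0.9, 1.0]
```

This makes it easy to see why sparse end-of-episode rewards still reach early actions: every step gets `gamma^k` times the final reward, where `k` is its distance from the end.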
================================================ FILE: seq2seq-translation/images/attention-decoder-network.dot ================================================ digraph G { // Main styles nodesep=0.3; ranksep=0.15; node [shape=rect, fillcolor=darkorange, color=white, style=filled, fontsize=11, fontname="arial", height=0.2]; edge [color=gray, arrowsize=0.5]; // Layout {rank=same;input;prev_hidden;encoder_outputs} input -> embedding; embedding -> dropout; dropout -> embedded; embedded -> attn; prev_hidden -> attn; attn -> attn_softmax; attn_softmax -> attn_weights; attn_weights -> bmm; encoder_outputs -> bmm; bmm -> attn_applied; attn_applied -> attn_combine; embedded -> attn_combine; attn_combine -> relu -> gru; prev_hidden -> gru; gru -> out; gru -> hidden; out -> softmax; softmax -> output; {rank=same;output;hidden} // Layer nodes embedding [fillcolor=dodgerblue, fontcolor=white]; attn [fillcolor=dodgerblue, fontcolor=white]; attn_combine [fillcolor=dodgerblue, fontcolor=white]; bmm [fillcolor=dodgerblue, fontcolor=white]; gru [fillcolor=dodgerblue, fontcolor=white]; out [fillcolor=dodgerblue, fontcolor=white]; // Function nodes dropout [fillcolor=palegreen]; relu [fillcolor=palegreen]; softmax [fillcolor=palegreen]; attn_softmax [fillcolor=palegreen]; } ================================================ FILE: seq2seq-translation/images/decoder-network.dot ================================================ digraph G { // Main styles nodesep=0.3; ranksep=0.15; node [shape=rect, fillcolor=darkorange, color=white, style=filled, fontsize=11, fontname="arial", height=0.2]; edge [color=gray, arrowsize=0.5]; // Layout {rank=same;input;prev_hidden} input -> embedding; embedding -> relu; relu -> gru; prev_hidden -> gru; gru -> out; gru -> hidden; out -> softmax; softmax -> output; {rank=same;output;hidden} // Layer nodes embedding [fillcolor=dodgerblue, fontcolor=white]; gru [fillcolor=dodgerblue, fontcolor=white]; out [fillcolor=dodgerblue, fontcolor=white]; // Function nodes 
relu [fillcolor=palegreen]; softmax [fillcolor=palegreen]; } ================================================ FILE: seq2seq-translation/images/encoder-network.dot ================================================ digraph G { // Main styles nodesep=0.3; ranksep=0.15; node [shape=rect, fillcolor=darkorange, color=white, style=filled, fontsize=11, fontname="arial", height=0.2]; edge [color=gray, arrowsize=0.5]; // Layout {rank=same;input;prev_hidden} input -> embedding; embedding -> embedded; embedded -> gru; prev_hidden -> gru; gru -> output; gru -> hidden; embedding [fillcolor=dodgerblue, fontcolor=white]; gru [fillcolor=dodgerblue, fontcolor=white]; }
================================================ FILE: seq2seq-translation/masked_cross_entropy.py ================================================
import torch
from torch.nn import functional
from torch.autograd import Variable

def sequence_mask(sequence_length, max_len=None):
    if max_len is None:
        max_len = sequence_length.data.max()
    batch_size = sequence_length.size(0)
    seq_range = torch.range(0, max_len - 1).long()
    seq_range_expand = seq_range.unsqueeze(0).expand(batch_size, max_len)
    seq_range_expand = Variable(seq_range_expand)
    if sequence_length.is_cuda:
        seq_range_expand = seq_range_expand.cuda()
    seq_length_expand = (sequence_length.unsqueeze(1)
                         .expand_as(seq_range_expand))
    return seq_range_expand < seq_length_expand

def masked_cross_entropy(logits, target, length):
    """
    Args:
        logits: A Variable containing a FloatTensor of size
            (batch, max_len, num_classes) which contains the
            unnormalized probability for each class.
        target: A Variable containing a LongTensor of size
            (batch, max_len) which contains the index of the true
            class for each corresponding step.
        length: A Variable containing a LongTensor of size (batch,)
            which contains the length of each data in a batch.
    Returns:
        loss: An average loss value masked by the length.
    """
    length = Variable(torch.LongTensor(length)).cuda()

    # logits_flat: (batch * max_len, num_classes)
    logits_flat = logits.view(-1, logits.size(-1))
    # log_probs_flat: (batch * max_len, num_classes)
    log_probs_flat = functional.log_softmax(logits_flat)
    # target_flat: (batch * max_len, 1)
    target_flat = target.view(-1, 1)
    # losses_flat: (batch * max_len, 1)
    losses_flat = -torch.gather(log_probs_flat, dim=1, index=target_flat)
    # losses: (batch, max_len)
    losses = losses_flat.view(*target.size())
    # mask: (batch, max_len)
    mask = sequence_mask(sequence_length=length, max_len=target.size(1))
    losses = losses * mask.float()
    loss = losses.sum() / length.float().sum()
    return loss
================================================ FILE: seq2seq-translation/seq2seq-translation-batched.ipynb ================================================ { "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "![](https://i.imgur.com/eBRPvWB.png)\n", "\n", "# Practical PyTorch: Translation with a Sequence to Sequence Network and Attention\n", "\n", "In this project we will be teaching a neural network to translate from French to English.\n", "\n", "```\n", "[KEY: > input, = target, < output]\n", "\n", "> il est en train de peindre un tableau .\n", "= he is painting a picture .\n", "< he is painting a picture .\n", "\n", "> pourquoi ne pas essayer ce vin delicieux ?\n", "= why not try that delicious wine ?\n", "< why not try that delicious wine ?\n", "\n", "> elle n est pas poete mais romanciere .\n", "= she is not a poet but a novelist .\n", "< she not not a poet but a novelist .\n", "\n", "> vous etes trop maigre .\n", "= you re too skinny .\n", "< you re all alone .\n", "```\n", "\n", "... to varying degrees of success.\n", "\n", "This is made possible by the simple but powerful idea of the [sequence to sequence network](http://arxiv.org/abs/1409.3215), in which two recurrent neural networks work together to transform one sequence to another.
An encoder network condenses an input sequence into a single vector, and a decoder network unfolds that vector into a new sequence.\n", "\n", "To improve upon this model we'll use an [attention mechanism](https://arxiv.org/abs/1409.0473), which lets the decoder learn to focus over a specific range of the input sequence." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Sequence to Sequence Learning\n", "\n", "A [Sequence to Sequence network](http://arxiv.org/abs/1409.3215), or seq2seq network, or [Encoder Decoder network](https://arxiv.org/pdf/1406.1078v3.pdf), is a model consisting of two separate RNNs called the **encoder** and **decoder**. The encoder reads an input sequence one item at a time, and outputs a vector at each step. The final output of the encoder is kept as the **context** vector. The decoder uses this context vector to produce a sequence of outputs one step at a time.\n", "\n", "![](https://i.imgur.com/tVtHhNp.png)\n", "\n", "When using a single RNN, there is a one-to-one relationship between inputs and outputs. We would quickly run into problems with different sequence orders and lengths that are common during translation. Consider the simple sentence \"Je ne suis pas le chat noir\" → \"I am not the black cat\". Many of the words have a pretty direct translation, like \"chat\" → \"cat\". However the differing grammars cause words to be in different orders, e.g. \"chat noir\" and \"black cat\". There is also the \"ne ... pas\" → \"not\" construction that makes the two sentences have different lengths.\n", "\n", "With the seq2seq model, by encoding many inputs into one vector, and decoding from one vector into many outputs, we are freed from the constraints of sequence order and length. The encoded sequence is represented by a single vector, a single point in some N dimensional space of sequences. In an ideal case, this point can be considered the \"meaning\" of the sequence.\n", "\n", "This idea can be extended beyond sequences. 
Image captioning tasks take an [image as input, and output a description](https://arxiv.org/abs/1411.4555) of the image (img2seq). Some image generation tasks take a [description as input and output a generated image](https://arxiv.org/abs/1511.02793) (seq2img). These models can be referred to more generally as \"encoder decoder\" networks." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## The Attention Mechanism\n", "\n", "The fixed-length vector carries the burden of encoding the entire \"meaning\" of the input sequence, no matter how long that may be. With all the variance in language, this is a very hard problem. Imagine two nearly identical sentences, twenty words long, with only one word different. Both the encoders and decoders must be nuanced enough to represent that change as a very slightly different point in space.\n", "\n", "The **attention mechanism** [introduced by Bahdanau et al.](https://arxiv.org/abs/1409.0473) addresses this by giving the decoder a way to \"pay attention\" to parts of the input, rather than relying on a single vector. For every step the decoder can select a different part of the input sentence to consider.\n", "\n", "![](https://i.imgur.com/5y6SCvU.png)\n", "\n", "Attention is calculated using the current hidden state and each encoder output, resulting in a vector the same size as the input sequence, called the *attention weights*. These weights are multiplied by the encoder outputs to create a weighted sum of encoder outputs, which is called the *context* vector. The context vector and hidden state are used to predict the next output element.\n", "\n", "![](https://i.imgur.com/K1qMPxs.png)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Requirements\n", "\n", "You will need [PyTorch](http://pytorch.org/) to build and train the models, and [matplotlib](https://matplotlib.org/) for plotting training and visualizing attention outputs later. The rest are built-in Python libraries."
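The attention computation just described — normalize the attention energies into weights with a softmax, then take the weighted sum of encoder outputs — can be sketched without any PyTorch machinery. This is a pure-Python toy example; the energies and encoder outputs below are made up for illustration:

```python
import math

def attention_context(energies, encoder_outputs):
    # Softmax over attention energies -> attention weights (sum to 1)
    exps = [math.exp(e) for e in energies]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Context vector: weighted sum of the encoder output vectors
    size = len(encoder_outputs[0])
    context = [sum(w * h[k] for w, h in zip(weights, encoder_outputs))
               for k in range(size)]
    return weights, context

# Toy example: 3 encoder outputs of size 2, with rising "energy"
weights, context = attention_context([1.0, 2.0, 3.0], [[1, 0], [0, 1], [1, 1]])
print(sum(weights))  # sums to 1.0 (softmax)
```

Higher-energy encoder positions get proportionally more of the weight, so the context vector leans toward the outputs the decoder is "paying attention" to.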
] }, { "cell_type": "code", "execution_count": 1, "metadata": { "collapsed": true }, "outputs": [], "source": [ "import unicodedata\n", "import string\n", "import re\n", "import random\n", "import time\n", "import datetime\n", "import math\n", "import socket\n", "hostname = socket.gethostname()\n", "\n", "import torch\n", "import torch.nn as nn\n", "from torch.autograd import Variable\n", "from torch import optim\n", "import torch.nn.functional as F\n", "from torch.nn.utils.rnn import pad_packed_sequence, pack_padded_sequence#, masked_cross_entropy\n", "from masked_cross_entropy import *\n", "\n", "import matplotlib.pyplot as plt\n", "import matplotlib.ticker as ticker\n", "import numpy as np\n", "%matplotlib inline" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Here we will also define a constant to decide whether to use the GPU (with CUDA specifically) or the CPU. **If you don't have a GPU, set this to `False`**. Later when we create tensors, this variable will be used to decide whether we keep them on CPU or move them to GPU." 
] }, { "cell_type": "code", "execution_count": 2, "metadata": { "collapsed": true }, "outputs": [], "source": [ "USE_CUDA = True" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Loading data files\n", "\n", "The data for this project is a set of many thousands of English to French translation pairs.\n", "\n", "[This question on Open Data Stack Exchange](http://opendata.stackexchange.com/questions/3888/dataset-of-sentences-translated-into-many-languages) pointed me to the open translation site http://tatoeba.org/ which has downloads available at http://tatoeba.org/eng/downloads - and better yet, someone did the extra work of splitting language pairs into individual text files here: http://www.manythings.org/anki/\n", "\n", "The English to French pairs are too big to include in the repo, so download `fra-eng.zip`, extract the text file in there, and rename it to `data/eng-fra.txt` before continuing (for some reason the zipfile is named backwards). The file is a tab separated list of translation pairs:\n", "\n", "```\n", "I am cold. Je suis froid.\n", "```" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Similar to the character encoding used in the character-level RNN tutorials, we will be representing each word in a language as a one-hot vector, or giant vector of zeros except for a single one (at the index of the word). Compared to the dozens of characters that might exist in a language, there are many many more words, so the encoding vector is much larger. We will however cheat a bit and trim the data to only use a few thousand words per language." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Indexing words\n", "\n", "We'll need a unique index per word to use as the inputs and targets of the networks later. To keep track of all this we will use a helper class called `Lang` which has word → index (`word2index`) and index → word (`index2word`) dictionaries, as well as a count of each word (`word2count`). 
This class includes a function `trim(min_count)` to remove rare words once they are all counted." ] }, { "cell_type": "code", "execution_count": 3, "metadata": { "collapsed": true }, "outputs": [], "source": [ "PAD_token = 0\n", "SOS_token = 1\n", "EOS_token = 2\n", "\n", "class Lang:\n", " def __init__(self, name):\n", " self.name = name\n", " self.trimmed = False\n", " self.word2index = {}\n", " self.word2count = {}\n", " self.index2word = {0: \"PAD\", 1: \"SOS\", 2: \"EOS\"}\n", " self.n_words = 3 # Count default tokens\n", "\n", " def index_words(self, sentence):\n", " for word in sentence.split(' '):\n", " self.index_word(word)\n", "\n", " def index_word(self, word):\n", " if word not in self.word2index:\n", " self.word2index[word] = self.n_words\n", " self.word2count[word] = 1\n", " self.index2word[self.n_words] = word\n", " self.n_words += 1\n", " else:\n", " self.word2count[word] += 1\n", "\n", " # Remove words below a certain count threshold\n", " def trim(self, min_count):\n", " if self.trimmed: return\n", " self.trimmed = True\n", " \n", " keep_words = []\n", " \n", " for k, v in self.word2count.items():\n", " if v >= min_count:\n", " keep_words.append(k)\n", "\n", " print('keep_words %s / %s = %.4f' % (\n", " len(keep_words), len(self.word2index), len(keep_words) / len(self.word2index)\n", " ))\n", "\n", " # Reinitialize dictionaries\n", " self.word2index = {}\n", " self.word2count = {}\n", " self.index2word = {0: \"PAD\", 1: \"SOS\", 2: \"EOS\"}\n", " self.n_words = 3 # Count default tokens\n", "\n", " for word in keep_words:\n", " self.index_word(word)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Reading and decoding files\n", "\n", "The files are all in Unicode; to simplify, we will convert Unicode characters to ASCII, make everything lowercase, and trim most punctuation."
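To see the indexing scheme in action, here is a trimmed-down stand-in for the `Lang` class (only the counting/indexing part, renamed `MiniLang` so the snippet runs on its own):

```python
# Minimal stand-in for the Lang class: assign each new word the next index,
# starting after the reserved PAD/SOS/EOS tokens (indices 0-2).
class MiniLang:
    def __init__(self):
        self.word2index = {}
        self.word2count = {}
        self.n_words = 3  # PAD, SOS, EOS already taken

    def index_words(self, sentence):
        for word in sentence.split(' '):
            if word not in self.word2index:
                self.word2index[word] = self.n_words
                self.word2count[word] = 1
                self.n_words += 1
            else:
                self.word2count[word] += 1

lang = MiniLang()
lang.index_words('the cat sat on the mat')
print(lang.word2index['the'], lang.word2count['the'])  # 3 2
```

Repeated words keep their first-assigned index and only bump their count — which is exactly what `trim(min_count)` later uses to drop rare words.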
] }, { "cell_type": "code", "execution_count": 4, "metadata": { "collapsed": false }, "outputs": [], "source": [ "# Turn a Unicode string to plain ASCII, thanks to http://stackoverflow.com/a/518232/2809427\n", "def unicode_to_ascii(s):\n", " return ''.join(\n", " c for c in unicodedata.normalize('NFD', s)\n", " if unicodedata.category(c) != 'Mn'\n", " )\n", "\n", "# Lowercase, trim, and remove non-letter characters\n", "def normalize_string(s):\n", " s = unicode_to_ascii(s.lower().strip())\n", " s = re.sub(r\"([,.!?])\", r\" \\1 \", s)\n", " s = re.sub(r\"[^a-zA-Z,.!?]+\", r\" \", s)\n", " s = re.sub(r\"\\s+\", r\" \", s).strip()\n", " return s" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "To read the data file we will split the file into lines, and then split lines into pairs. The files are all English → Other Language, so if we want to translate from Other Language → English I added the `reverse` flag to reverse the pairs." ] }, { "cell_type": "code", "execution_count": 5, "metadata": { "collapsed": true }, "outputs": [], "source": [ "def read_langs(lang1, lang2, reverse=False):\n", " print(\"Reading lines...\")\n", "\n", " # Read the file and split into lines\n", "# filename = '../data/%s-%s.txt' % (lang1, lang2)\n", " filename = '../%s-%s.txt' % (lang1, lang2)\n", " lines = open(filename).read().strip().split('\\n')\n", "\n", " # Split every line into pairs and normalize\n", " pairs = [[normalize_string(s) for s in l.split('\\t')] for l in lines]\n", "\n", " # Reverse pairs, make Lang instances\n", " if reverse:\n", " pairs = [list(reversed(p)) for p in pairs]\n", " input_lang = Lang(lang2)\n", " output_lang = Lang(lang1)\n", " else:\n", " input_lang = Lang(lang1)\n", " output_lang = Lang(lang2)\n", "\n", " return input_lang, output_lang, pairs" ] }, { "cell_type": "code", "execution_count": 6, "metadata": { "collapsed": false }, "outputs": [], "source": [ "MIN_LENGTH = 3\n", "MAX_LENGTH = 25\n", "\n", "def filter_pairs(pairs):\n", " 
filtered_pairs = []\n", " for pair in pairs:\n", " if len(pair[0]) >= MIN_LENGTH and len(pair[0]) <= MAX_LENGTH \\\n", " and len(pair[1]) >= MIN_LENGTH and len(pair[1]) <= MAX_LENGTH:\n", " filtered_pairs.append(pair)\n", " return filtered_pairs" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The full process for preparing the data is:\n", "\n", "* Read text file and split into lines\n", "* Split lines into pairs and normalize\n", "* Filter to pairs of a certain length\n", "* Make word lists from sentences in pairs" ] }, { "cell_type": "code", "execution_count": 7, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Reading lines...\n", "Read 135646 sentence pairs\n", "Filtered to 25706 pairs\n", "Indexing words...\n", "Indexed 6999 words in input language, 4343 words in output\n" ] } ], "source": [ "def prepare_data(lang1_name, lang2_name, reverse=False):\n", " input_lang, output_lang, pairs = read_langs(lang1_name, lang2_name, reverse)\n", " print(\"Read %d sentence pairs\" % len(pairs))\n", " \n", " pairs = filter_pairs(pairs)\n", " print(\"Filtered to %d pairs\" % len(pairs))\n", " \n", " print(\"Indexing words...\")\n", " for pair in pairs:\n", " input_lang.index_words(pair[0])\n", " output_lang.index_words(pair[1])\n", " \n", " print('Indexed %d words in input language, %d words in output' % (input_lang.n_words, output_lang.n_words))\n", " return input_lang, output_lang, pairs\n", "\n", "input_lang, output_lang, pairs = prepare_data('eng', 'fra', True)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Filtering vocabularies\n", "\n", "To get something that trains in under an hour, we'll trim the data set a bit. First we will use the `trim` function on each language (defined above) to only include words that are repeated a certain amount of times through the dataset (this softens the difficulty of learning a correct translation for words that don't appear often)." 
] }, { "cell_type": "code", "execution_count": 8, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "keep_words 1717 / 6996 = 0.2454\n", "keep_words 1529 / 4340 = 0.3523\n" ] } ], "source": [ "MIN_COUNT = 5\n", "\n", "input_lang.trim(MIN_COUNT)\n", "output_lang.trim(MIN_COUNT)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Filtering pairs\n", "\n", "Now we will go back to the set of all sentence pairs and remove those with unknown words." ] }, { "cell_type": "code", "execution_count": 9, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Trimmed from 25706 pairs to 15896, 0.6184 of total\n" ] } ], "source": [ "keep_pairs = []\n", "\n", "for pair in pairs:\n", " input_sentence = pair[0]\n", " output_sentence = pair[1]\n", " keep_input = True\n", " keep_output = True\n", " \n", " for word in input_sentence.split(' '):\n", " if word not in input_lang.word2index:\n", " keep_input = False\n", " break\n", "\n", " for word in output_sentence.split(' '):\n", " if word not in output_lang.word2index:\n", " keep_output = False\n", " break\n", "\n", " # Remove if pair doesn't match input and output conditions\n", " if keep_input and keep_output:\n", " keep_pairs.append(pair)\n", "\n", "print(\"Trimmed from %d pairs to %d, %.4f of total\" % (len(pairs), len(keep_pairs), len(keep_pairs) / len(pairs)))\n", "pairs = keep_pairs" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Turning training data into Tensors\n", "\n", "To train we need to turn the sentences into something the neural network can understand, which of course means numbers. Each sentence will be split into words and turned into a `LongTensor` which represents the index (from the Lang indexes made earlier) of each word. 
While creating these tensors we will also append the EOS token to signal that the sentence is over.\n", "\n", "![](https://i.imgur.com/LzocpGH.png)" ] }, { "cell_type": "code", "execution_count": 10, "metadata": { "collapsed": true }, "outputs": [], "source": [ "# Return a list of indexes, one for each word in the sentence, plus EOS\n", "def indexes_from_sentence(lang, sentence):\n", " return [lang.word2index[word] for word in sentence.split(' ')] + [EOS_token]" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We can make better use of the GPU by training on batches of many sequences at once, but doing so brings up the question of how to deal with sequences of varying lengths. The simple solution is to \"pad\" the shorter sentences with some padding symbol (in this case `0`), and ignore these padded spots when calculating the loss.\n", "\n", "![](https://i.imgur.com/gGlkEEF.png)" ] }, { "cell_type": "code", "execution_count": 11, "metadata": { "collapsed": true }, "outputs": [], "source": [ "# Pad a sequence with the PAD symbol\n", "def pad_seq(seq, max_length):\n", " seq += [PAD_token for i in range(max_length - len(seq))]\n", " return seq" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "To create a Variable for a full batch of inputs (and targets) we get a random sample of sequences and pad them all to the length of the longest sequence. We'll keep track of the lengths of each batch in order to un-pad later.\n", "\n", "Initializing a `LongTensor` with an array (batches) of arrays (sequences) gives us a `(batch_size x max_len)` tensor - selecting the first dimension gives you a single batch, which is a full sequence. When training the model we'll want a single time step at once, so we'll transpose to `(max_len x batch_size)`.
Now selecting along the first dimension returns a single time step across batches.\n", "\n", "![](https://i.imgur.com/nBxTG3v.png)" ] }, { "cell_type": "code", "execution_count": 12, "metadata": { "collapsed": true }, "outputs": [], "source": [ "def random_batch(batch_size):\n", " input_seqs = []\n", " target_seqs = []\n", "\n", " # Choose random pairs\n", " for i in range(batch_size):\n", " pair = random.choice(pairs)\n", " input_seqs.append(indexes_from_sentence(input_lang, pair[0]))\n", " target_seqs.append(indexes_from_sentence(output_lang, pair[1]))\n", "\n", " # Zip into pairs, sort by length (descending), unzip\n", " seq_pairs = sorted(zip(input_seqs, target_seqs), key=lambda p: len(p[0]), reverse=True)\n", " input_seqs, target_seqs = zip(*seq_pairs)\n", " \n", " # For input and target sequences, get array of lengths and pad with 0s to max length\n", " input_lengths = [len(s) for s in input_seqs]\n", " input_padded = [pad_seq(s, max(input_lengths)) for s in input_seqs]\n", " target_lengths = [len(s) for s in target_seqs]\n", " target_padded = [pad_seq(s, max(target_lengths)) for s in target_seqs]\n", "\n", " # Turn padded arrays into (batch_size x max_len) tensors, transpose into (max_len x batch_size)\n", " input_var = Variable(torch.LongTensor(input_padded)).transpose(0, 1)\n", " target_var = Variable(torch.LongTensor(target_padded)).transpose(0, 1)\n", " \n", " if USE_CUDA:\n", " input_var = input_var.cuda()\n", " target_var = target_var.cuda()\n", " \n", " return input_var, input_lengths, target_var, target_lengths" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We can test this to see that it will return a `(max_len x batch_size)` tensor for input and target sentences, along with a corresponding list of batch lengths for each (which we will use for masking later)."
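The pad-then-transpose step can be checked with plain lists before involving torch at all. The toy sequences below are made up (already sorted longest-first, as `random_batch` requires for packing):

```python
PAD_token = 0

def pad_seq(seq, max_length):
    # Right-pad with PAD_token up to max_length (same idea as the notebook's pad_seq)
    return seq + [PAD_token] * (max_length - len(seq))

seqs = [[5, 6, 7, 2], [8, 9, 2]]                    # sorted by length, descending
lengths = [len(s) for s in seqs]                    # [4, 3] -- kept for masking later
padded = [pad_seq(s, max(lengths)) for s in seqs]   # (batch_size x max_len)
transposed = [list(step) for step in zip(*padded)]  # (max_len x batch_size)
print(transposed[0])  # first time step across the batch: [5, 8]
```

After transposing, indexing the first dimension yields one time step for every sequence in the batch, which is the access pattern the step-by-step decoder loop needs.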
] }, { "cell_type": "code", "execution_count": 13, "metadata": { "collapsed": false }, "outputs": [ { "data": { "text/plain": [ "(Variable containing:\n", " 88 92\n", " 44 208\n", " 107 297\n", " 634 14\n", " 14 2\n", " 2 0\n", " [torch.cuda.LongTensor of size 6x2 (GPU 0)], [6, 5], Variable containing:\n", " 50 50\n", " 1128 19\n", " 436 26\n", " 969 4\n", " 4 2\n", " 2 0\n", " [torch.cuda.LongTensor of size 6x2 (GPU 0)], [6, 5])" ] }, "execution_count": 13, "metadata": {}, "output_type": "execute_result" } ], "source": [ "random_batch(2)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Building the models" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## The Encoder\n", "\n", "\n", "\n", "The encoder will take a batch of word sequences, a `LongTensor` of size `(max_len x batch_size)`, and output an encoding for each word, a `FloatTensor` of size `(max_len x batch_size x hidden_size)`.\n", "\n", "The word inputs are fed through an [embedding layer `nn.Embedding`](http://pytorch.org/docs/nn.html#embedding) to create an embedding for each word, with size `seq_len x hidden_size` (as if it was a batch of words). This is resized to `seq_len x 1 x hidden_size` to fit the expected input of the [GRU layer `nn.GRU`](http://pytorch.org/docs/nn.html#gru). The GRU will return both an output sequence of size `seq_len x hidden_size` and a final hidden state."
] }, { "cell_type": "code", "execution_count": 14, "metadata": { "collapsed": false }, "outputs": [], "source": [ "class EncoderRNN(nn.Module):\n", " def __init__(self, input_size, hidden_size, n_layers=1, dropout=0.1):\n", " super(EncoderRNN, self).__init__()\n", " \n", " self.input_size = input_size\n", " self.hidden_size = hidden_size\n", " self.n_layers = n_layers\n", " self.dropout = dropout\n", " \n", " self.embedding = nn.Embedding(input_size, hidden_size)\n", " self.gru = nn.GRU(hidden_size, hidden_size, n_layers, dropout=self.dropout, bidirectional=True)\n", " \n", " def forward(self, input_seqs, input_lengths, hidden=None):\n", " # Note: we run this all at once (over multiple batches of multiple sequences)\n", " embedded = self.embedding(input_seqs)\n", " packed = torch.nn.utils.rnn.pack_padded_sequence(embedded, input_lengths)\n", " outputs, hidden = self.gru(packed, hidden)\n", " outputs, output_lengths = torch.nn.utils.rnn.pad_packed_sequence(outputs) # unpack (back to padded)\n", " outputs = outputs[:, :, :self.hidden_size] + outputs[:, : ,self.hidden_size:] # Sum bidirectional outputs\n", " return outputs, hidden" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Attention Decoder" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Interpreting the Bahdanau et al. model\n", "\n", "[Neural Machine Translation by Jointly Learning to Align and Translate](https://arxiv.org/abs/1409.0473) (Dzmitry Bahdanau, Kyunghyun Cho, Yoshua Bengio) introduced the idea of using attention for seq2seq translation.\n", "\n", "Each decoder output is conditioned on the previous outputs and some $\\mathbf x$, where $\\mathbf x$ consists of the current hidden state (which takes into account previous outputs) and the attention \"context\", which is calculated below. 
The function $g$ is a fully-connected layer with a nonlinear activation, which takes as input the values $y_{i-1}$, $s_i$, and $c_i$ concatenated.\n", "\n", "$$\n", "p(y_i \\mid \\{y_1,...,y_{i-1}\\},\\mathbf{x}) = g(y_{i-1}, s_i, c_i)\n", "$$\n", "\n", "The current hidden state $s_i$ is calculated by an RNN $f$ with the last hidden state $s_{i-1}$, last decoder output value $y_{i-1}$, and context vector $c_i$.\n", "\n", "In the code, the RNN will be an `nn.GRU` layer, the hidden state $s_i$ will be called `hidden`, the output $y_i$ called `output`, and context $c_i$ called `context`.\n", "\n", "$$\n", "s_i = f(s_{i-1}, y_{i-1}, c_i)\n", "$$\n", "\n", "The context vector $c_i$ is a weighted sum of all encoder outputs, where each weight $a_{ij}$ is the amount of \"attention\" paid to the corresponding encoder output $h_j$.\n", "\n", "$$\n", "c_i = \\sum_{j=1}^{T_x} a_{ij} h_j\n", "$$\n", "\n", "... where each weight $a_{ij}$ is a normalized (over all steps) attention \"energy\" $e_{ij}$ ...\n", "\n", "$$\n", "a_{ij} = \\dfrac{exp(e_{ij})}{\\sum_{k=1}^{T_x} exp(e_{ik})}\n", "$$\n", "\n", "... where each attention energy is calculated with some function $a$ (such as another linear layer) using the last hidden state $s_{i-1}$ and that particular encoder output $h_j$:\n", "\n", "$$\n", "e_{ij} = a(s_{i-1}, h_j)\n", "$$" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Interpreting the Luong et al. models" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "[Effective Approaches to Attention-based Neural Machine Translation](https://arxiv.org/abs/1508.04025) (Minh-Thang Luong, Hieu Pham, Christopher D. Manning) describe a few more attention models that offer improvements and simplifications.
They describe a few \"global attention\" models, the distinction between them being the way the attention scores are calculated.\n", "\n", "The general form of the attention calculation relies on the target (decoder) side hidden state and corresponding source (encoder) side state, normalized over all states to get values summing to 1:\n", "\n", "$$\n", "a_t(s) = align(h_t, \\bar h_s) = \\dfrac{exp(score(h_t, \\bar h_s))}{\\sum_{s'} exp(score(h_t, \\bar h_{s'}))}\n", "$$\n", "\n", "The specific \"score\" function that compares two states is either *dot*, a simple dot product between the states; *general*, a dot product between the decoder hidden state and a linear transform of the encoder state; or *concat*, a dot product between a new parameter $v_a$ and a linear transform of the states concatenated together.\n", "\n", "$$\n", "score(h_t, \\bar h_s) =\n", "\\begin{cases}\n", "h_t ^\\top \\bar h_s & dot \\\\\n", "h_t ^\\top \\textbf{W}_a \\bar h_s & general \\\\\n", "v_a ^\\top \\textbf{W}_a [ h_t ; \\bar h_s ] & concat\n", "\\end{cases}\n", "$$\n", "\n", "The modular definition of these scoring functions gives us an opportunity to build a single attention module that can switch between the different score methods. The input to this module is always the hidden state (of the decoder RNN) and the set of encoder outputs."
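To make the three score functions concrete, here is a hedged NumPy sketch (not the notebook's `Attn` module; `Wa` and `va` stand in for the learned `nn.Linear` weights and the `v` parameter) of scoring one decoder state against a set of encoder states and normalizing with a softmax:

```python
import numpy as np

def score(h_t, h_s, method, Wa=None, va=None):
    # dot: plain dot product; general: dot with a linear transform of h_s;
    # concat: dot between v_a and a linear transform of [h_t ; h_s]
    if method == 'dot':
        return h_t @ h_s
    if method == 'general':
        return h_t @ (Wa @ h_s)
    if method == 'concat':
        return va @ (Wa @ np.concatenate([h_t, h_s]))

def align(h_t, encoder_states, method, **kw):
    energies = np.array([score(h_t, h_s, method, **kw) for h_s in encoder_states])
    e = np.exp(energies - energies.max())  # numerically stable softmax
    return e / e.sum()
```

For `concat`, `Wa` must map the concatenated `2n`-vector back to `n`; note that the Luong paper also applies a `tanh` before the dot with $v_a$, which this sketch (like the notebook's `Attn.score`) omits.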
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Implementing an attention module" ] }, { "cell_type": "code", "execution_count": 15, "metadata": { "collapsed": false }, "outputs": [], "source": [ "class Attn(nn.Module):\n", " def __init__(self, method, hidden_size):\n", " super(Attn, self).__init__()\n", " \n", " self.method = method\n", " self.hidden_size = hidden_size\n", " \n", " if self.method == 'general':\n", " self.attn = nn.Linear(self.hidden_size, hidden_size)\n", "\n", " elif self.method == 'concat':\n", " self.attn = nn.Linear(self.hidden_size * 2, hidden_size)\n", " self.v = nn.Parameter(torch.FloatTensor(1, hidden_size))\n", "\n", " def forward(self, hidden, encoder_outputs):\n", " max_len = encoder_outputs.size(0)\n", " this_batch_size = encoder_outputs.size(1)\n", "\n", " # Create variable to store attention energies\n", " attn_energies = Variable(torch.zeros(this_batch_size, max_len)) # B x S\n", "\n", " if USE_CUDA:\n", " attn_energies = attn_energies.cuda()\n", "\n", " # For each batch of encoder outputs\n", " for b in range(this_batch_size):\n", " # Calculate energy for each encoder output\n", " for i in range(max_len):\n", " attn_energies[b, i] = self.score(hidden[:, b], encoder_outputs[i, b].unsqueeze(0))\n", "\n", " # Normalize energies to weights in range 0 to 1, resize to 1 x B x S\n", " return F.softmax(attn_energies).unsqueeze(1)\n", " \n", " def score(self, hidden, encoder_output):\n", " \n", " if self.method == 'dot':\n", " energy = hidden.dot(encoder_output)\n", " return energy\n", " \n", " elif self.method == 'general':\n", " energy = self.attn(encoder_output)\n", " energy = hidden.dot(energy)\n", " return energy\n", " \n", " elif self.method == 'concat':\n", " energy = self.attn(torch.cat((hidden, encoder_output), 1))\n", " energy = self.v.dot(energy)\n", " return energy" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Implementing the Bahdanau et al. 
model\n", "\n", "In summary our decoder should consist of four main parts - an embedding layer turning an input word into a vector; a layer to calculate the attention energy per encoder output; a RNN layer; and an output layer.\n", "\n", "The decoder's inputs are the last RNN hidden state $s_{i-1}$, last output $y_{i-1}$, and all encoder outputs $h_*$.\n", "\n", "* embedding layer with inputs $y_{i-1}$\n", " * `embedded = embedding(last_rnn_output)`\n", "* attention layer $a$ with inputs $(s_{i-1}, h_j)$ and outputs $e_{ij}$, normalized to create $a_{ij}$\n", " * `attn_energies[j] = attn_layer(last_hidden, encoder_outputs[j])`\n", " * `attn_weights = normalize(attn_energies)`\n", "* context vector $c_i$ as an attention-weighted average of encoder outputs\n", " * `context = sum(attn_weights * encoder_outputs)`\n", "* RNN layer(s) $f$ with inputs $(s_{i-1}, y_{i-1}, c_i)$ and internal hidden state, outputting $s_i$\n", " * `rnn_input = concat(embedded, context)`\n", " * `rnn_output, rnn_hidden = rnn(rnn_input, last_hidden)`\n", "* an output layer $g$ with inputs $(y_{i-1}, s_i, c_i)$, outputting $y_i$\n", " * `output = out(embedded, rnn_output, context)`" ] }, { "cell_type": "code", "execution_count": 16, "metadata": { "collapsed": true }, "outputs": [], "source": [ "class BahdanauAttnDecoderRNN(nn.Module):\n", " def __init__(self, hidden_size, output_size, n_layers=1, dropout_p=0.1):\n", " super(BahdanauAttnDecoderRNN, self).__init__()\n", " \n", " # Define parameters\n", " self.hidden_size = hidden_size\n", " self.output_size = output_size\n", " self.n_layers = n_layers\n", " self.dropout_p = dropout_p\n", " self.max_length = max_length\n", " \n", " # Define layers\n", " self.embedding = nn.Embedding(output_size, hidden_size)\n", " self.dropout = nn.Dropout(dropout_p)\n", " self.attn = Attn('concat', hidden_size)\n", " self.gru = nn.GRU(hidden_size, hidden_size, n_layers, dropout=dropout_p)\n", " self.out = nn.Linear(hidden_size, output_size)\n", " \n", " def 
forward(self, word_input, last_hidden, encoder_outputs):\n", " # Note: we run this one step at a time\n", " # TODO: FIX BATCHING\n", " \n", " # Get the embedding of the current input word (last output word)\n", " word_embedded = self.embedding(word_input).view(1, 1, -1) # S=1 x B x N\n", " word_embedded = self.dropout(word_embedded)\n", " \n", " # Calculate attention weights and apply to encoder outputs\n", " attn_weights = self.attn(last_hidden[-1], encoder_outputs)\n", " context = attn_weights.bmm(encoder_outputs.transpose(0, 1)) # B x 1 x N\n", " context = context.transpose(0, 1) # 1 x B x N\n", " \n", " # Combine embedded input word and attended context, run through RNN\n", " rnn_input = torch.cat((word_embedded, context), 2)\n", " output, hidden = self.gru(rnn_input, last_hidden)\n", " \n", " # Final output layer\n", " output = output.squeeze(0) # B x N\n", " output = F.log_softmax(self.out(torch.cat((output, context.squeeze(0)), 1)))\n", " \n", " # Return final output, hidden state, and attention weights (for visualization)\n", " return output, hidden, attn_weights" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now we can build a decoder that plugs this Attn module in after the RNN to calculate attention weights, and applies those weights to the encoder outputs to get a context vector."
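The `attn_weights.bmm(encoder_outputs.transpose(0, 1))` step above is just a batched matrix multiply. A hedged NumPy sketch of the same shape arithmetic (`B`, `S`, `N` are illustrative sizes, not values from this notebook; uniform weights stand in for learned attention):

```python
import numpy as np

B, S, N = 3, 7, 8                            # batch, source length, hidden size
attn_weights = np.full((B, 1, S), 1.0 / S)   # B x 1 x S, each row sums to 1
encoder_outputs = np.ones((S, B, N))         # S x B x N (seq-first, as in the code)

# bmm: (B x 1 x S) @ (B x S x N) -> B x 1 x N, a weighted average of encoder outputs
context = attn_weights @ encoder_outputs.transpose(1, 0, 2)
print(context.shape)  # (3, 1, 8)
```

With uniform weights over all-ones encoder outputs, every context entry is exactly 1.0, which makes the "weighted average" interpretation easy to check by hand.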
] }, { "cell_type": "code", "execution_count": 17, "metadata": { "collapsed": false }, "outputs": [], "source": [ "class LuongAttnDecoderRNN(nn.Module):\n", " def __init__(self, attn_model, hidden_size, output_size, n_layers=1, dropout=0.1):\n", " super(LuongAttnDecoderRNN, self).__init__()\n", "\n", " # Keep for reference\n", " self.attn_model = attn_model\n", " self.hidden_size = hidden_size\n", " self.output_size = output_size\n", " self.n_layers = n_layers\n", " self.dropout = dropout\n", "\n", " # Define layers\n", " self.embedding = nn.Embedding(output_size, hidden_size)\n", " self.embedding_dropout = nn.Dropout(dropout)\n", " self.gru = nn.GRU(hidden_size, hidden_size, n_layers, dropout=dropout)\n", " self.concat = nn.Linear(hidden_size * 2, hidden_size)\n", " self.out = nn.Linear(hidden_size, output_size)\n", " \n", " # Choose attention model\n", " if attn_model != 'none':\n", " self.attn = Attn(attn_model, hidden_size)\n", "\n", " def forward(self, input_seq, last_hidden, encoder_outputs):\n", " # Note: we run this one step at a time\n", "\n", " # Get the embedding of the current input word (last output word)\n", " batch_size = input_seq.size(0)\n", " embedded = self.embedding(input_seq)\n", " embedded = self.embedding_dropout(embedded)\n", " embedded = embedded.view(1, batch_size, self.hidden_size) # S=1 x B x N\n", "\n", " # Get current hidden state from input word and last hidden state\n", " rnn_output, hidden = self.gru(embedded, last_hidden)\n", "\n", " # Calculate attention from current RNN state and all encoder outputs;\n", " # apply to encoder outputs to get weighted average\n", " attn_weights = self.attn(rnn_output, encoder_outputs)\n", " context = attn_weights.bmm(encoder_outputs.transpose(0, 1)) # B x S=1 x N\n", "\n", " # Attentional vector using the RNN hidden state and context vector\n", " # concatenated together (Luong eq. 
5)\n", " rnn_output = rnn_output.squeeze(0) # S=1 x B x N -> B x N\n", " context = context.squeeze(1) # B x S=1 x N -> B x N\n", " concat_input = torch.cat((rnn_output, context), 1)\n", " concat_output = F.tanh(self.concat(concat_input))\n", "\n", " # Finally predict next token (Luong eq. 6, without softmax)\n", " output = self.out(concat_output)\n", "\n", " # Return final output, hidden state, and attention weights (for visualization)\n", " return output, hidden, attn_weights" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Testing the models\n", "\n", "To make sure the encoder and decoder modules are working (and working together) we'll do a full test with a small batch." ] }, { "cell_type": "code", "execution_count": 18, "metadata": { "collapsed": false, "scrolled": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "input_batches torch.Size([7, 3])\n", "target_batches torch.Size([8, 3])\n" ] } ], "source": [ "small_batch_size = 3\n", "input_batches, input_lengths, target_batches, target_lengths = random_batch(small_batch_size)\n", "\n", "print('input_batches', input_batches.size()) # (max_len x batch_size)\n", "print('target_batches', target_batches.size()) # (max_len x batch_size)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Create models with a small size (a good idea for eyeball inspection):" ] }, { "cell_type": "code", "execution_count": 19, "metadata": { "collapsed": true }, "outputs": [], "source": [ "small_hidden_size = 8\n", "small_n_layers = 2\n", "\n", "encoder_test = EncoderRNN(input_lang.n_words, small_hidden_size, small_n_layers)\n", "decoder_test = LuongAttnDecoderRNN('general', small_hidden_size, output_lang.n_words, small_n_layers)\n", "\n", "if USE_CUDA:\n", " encoder_test.cuda()\n", " decoder_test.cuda()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "To test the encoder, run the input batch through to get per-batch encoder outputs:" ] }, { "cell_type": "code", 
"execution_count": 20, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "encoder_outputs torch.Size([7, 3, 8])\n", "encoder_hidden torch.Size([4, 3, 8])\n" ] }, { "name": "stderr", "output_type": "stream", "text": [ "/home/sean/anaconda3/lib/python3.6/site-packages/torch/backends/cudnn/__init__.py:46: UserWarning: PyTorch was compiled without cuDNN support. To use cuDNN, rebuild PyTorch making sure the library is visible to the build system.\n", " \"PyTorch was compiled without cuDNN support. To use cuDNN, rebuild \"\n" ] } ], "source": [ "encoder_outputs, encoder_hidden = encoder_test(input_batches, input_lengths, None)\n", "\n", "print('encoder_outputs', encoder_outputs.size()) # max_len x batch_size x hidden_size\n", "print('encoder_hidden', encoder_hidden.size()) # n_layers * 2 x batch_size x hidden_size" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Then starting with a SOS token, run word tokens through the decoder to get each next word token. Instead of doing this with the whole sequence, it is done one at a time, to support using it's own predictions to make the next prediction. This will be one time step at a time, but batched per time step. In order to get this to work for short padded sequences, the batch size is going to get smaller each time." ] }, { "cell_type": "code", "execution_count": 21, "metadata": { "collapsed": false, "scrolled": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "loss 7.343282222747803\n" ] }, { "name": "stderr", "output_type": "stream", "text": [ "/home/sean/anaconda3/lib/python3.6/site-packages/torch/backends/cudnn/__init__.py:46: UserWarning: PyTorch was compiled without cuDNN support. To use cuDNN, rebuild PyTorch making sure the library is visible to the build system.\n", " \"PyTorch was compiled without cuDNN support. 
To use cuDNN, rebuild \"\n", "/home/sean/Projects/practical-pytorch/seq2seq-translation/masked_cross_entropy.py:9: UserWarning: torch.range is deprecated in favor of torch.arange and will be removed in 0.3. Note that arange generates values in [start; end), not [start; end].\n", " seq_range = torch.range(0, max_len - 1).long()\n" ] } ], "source": [ "max_target_length = max(target_lengths)\n", "\n", "# Prepare decoder input and outputs\n", "decoder_input = Variable(torch.LongTensor([SOS_token] * small_batch_size))\n", "decoder_hidden = encoder_hidden[:decoder_test.n_layers] # Use last (forward) hidden state from encoder\n", "all_decoder_outputs = Variable(torch.zeros(max_target_length, small_batch_size, decoder_test.output_size))\n", "\n", "if USE_CUDA:\n", " all_decoder_outputs = all_decoder_outputs.cuda()\n", " decoder_input = decoder_input.cuda()\n", "\n", "# Run through decoder one time step at a time\n", "for t in range(max_target_length):\n", " decoder_output, decoder_hidden, decoder_attn = decoder_test(\n", " decoder_input, decoder_hidden, encoder_outputs\n", " )\n", " all_decoder_outputs[t] = decoder_output # Store this step's outputs\n", " decoder_input = target_batches[t] # Next input is current target\n", "\n", "# Test masked cross entropy loss\n", "loss = masked_cross_entropy(\n", " all_decoder_outputs.transpose(0, 1).contiguous(),\n", " target_batches.transpose(0, 1).contiguous(),\n", " target_lengths\n", ")\n", "print('loss', loss.data[0])" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Training\n", "\n", "## Defining a training iteration\n", "\n", "To train, we first run the input batch through the encoder and keep track of every output and the latest hidden state. Next the decoder is given the last hidden state of the encoder as its first hidden state, and the `<SOS>` token as its first input. From there we iterate, predicting the next token at each step.\n", "\n", "### Teacher Forcing vs. 
Scheduled Sampling\n", "\n", "\"Teacher Forcing\", or maximum likelihood sampling, means using the real target outputs as each next input when training. The alternative is using the decoder's own guess as the next input. Using teacher forcing may cause the network to converge faster, but [when the trained network is exploited, it may exhibit instability](http://minds.jacobs-university.de/sites/default/files/uploads/papers/ESNTutorialRev.pdf).\n", "\n", "You can observe outputs of teacher-forced networks that read with coherent grammar but wander far from the correct translation - you could think of the network as having learned how to listen to the teacher's instructions, without learning how to venture out on its own.\n", "\n", "The solution to the teacher-forcing \"problem\" is known as [Scheduled Sampling](https://arxiv.org/abs/1506.03099), which simply alternates between using the target values and predicted values when training. We will randomly choose to use teacher forcing with an if statement while training - sometimes we'll feed the real target as the input (ignoring the decoder's output), sometimes we'll use the decoder's output."
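The per-step choice described above can be sketched in plain Python (a hedged sketch; `choose_next_input` is an illustrative helper, not a function from this notebook):

```python
import random

def choose_next_input(target_token, predicted_token, teacher_forcing_ratio):
    # With probability teacher_forcing_ratio, feed the ground-truth target token
    # back into the decoder; otherwise feed the decoder's own prediction.
    if random.random() < teacher_forcing_ratio:
        return target_token
    return predicted_token
```

A ratio of 1.0 is pure teacher forcing, 0.0 always feeds back the model's own predictions, and scheduled sampling gradually anneals the ratio downward over the course of training.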
] }, { "cell_type": "code", "execution_count": 22, "metadata": { "collapsed": false }, "outputs": [], "source": [ "def train(input_batches, input_lengths, target_batches, target_lengths, encoder, decoder, encoder_optimizer, decoder_optimizer, criterion, max_length=MAX_LENGTH):\n", " \n", " # Zero gradients of both optimizers\n", " encoder_optimizer.zero_grad()\n", " decoder_optimizer.zero_grad()\n", " loss = 0 # Added onto for each word\n", "\n", " # Run words through encoder\n", " encoder_outputs, encoder_hidden = encoder(input_batches, input_lengths, None)\n", " \n", " # Prepare input and output variables\n", " decoder_input = Variable(torch.LongTensor([SOS_token] * batch_size))\n", " decoder_hidden = encoder_hidden[:decoder.n_layers] # Use last (forward) hidden state from encoder\n", "\n", " max_target_length = max(target_lengths)\n", " all_decoder_outputs = Variable(torch.zeros(max_target_length, batch_size, decoder.output_size))\n", "\n", " # Move new Variables to CUDA\n", " if USE_CUDA:\n", " decoder_input = decoder_input.cuda()\n", " all_decoder_outputs = all_decoder_outputs.cuda()\n", "\n", " # Run through decoder one time step at a time\n", " for t in range(max_target_length):\n", " decoder_output, decoder_hidden, decoder_attn = decoder(\n", " decoder_input, decoder_hidden, encoder_outputs\n", " )\n", "\n", " all_decoder_outputs[t] = decoder_output\n", " decoder_input = target_batches[t] # Next input is current target\n", "\n", " # Loss calculation and backpropagation\n", " loss = masked_cross_entropy(\n", " all_decoder_outputs.transpose(0, 1).contiguous(), # -> batch x seq\n", " target_batches.transpose(0, 1).contiguous(), # -> batch x seq\n", " target_lengths\n", " )\n", " loss.backward()\n", " \n", " # Clip gradient norms\n", " ec = torch.nn.utils.clip_grad_norm(encoder.parameters(), clip)\n", " dc = torch.nn.utils.clip_grad_norm(decoder.parameters(), clip)\n", "\n", " # Update parameters with optimizers\n", " encoder_optimizer.step()\n", " 
decoder_optimizer.step()\n", " \n", " return loss.data[0], ec, dc" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Running training\n", "\n", "With everything in place we can actually initialize a network and start training.\n", "\n", "To start, we initialize models, optimizers, a loss function (criterion), and set up variables for plotting and tracking progress:" ] }, { "cell_type": "code", "execution_count": 23, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Starting job 59739ec4f8e1c2083c28a9f6 at 2017-07-22 20:11:42\n" ] } ], "source": [ "# Configure models\n", "attn_model = 'dot'\n", "hidden_size = 500\n", "n_layers = 2\n", "dropout = 0.1\n", "batch_size = 50\n", "\n", "# Configure training/optimization\n", "clip = 50.0\n", "teacher_forcing_ratio = 0.5\n", "learning_rate = 0.0001\n", "decoder_learning_ratio = 5.0\n", "n_epochs = 50000\n", "epoch = 0\n", "plot_every = 20\n", "print_every = 100\n", "evaluate_every = 1000\n", "\n", "# Initialize models\n", "encoder = EncoderRNN(input_lang.n_words, hidden_size, n_layers, dropout=dropout)\n", "decoder = LuongAttnDecoderRNN(attn_model, hidden_size, output_lang.n_words, n_layers, dropout=dropout)\n", "\n", "# Initialize optimizers and criterion\n", "encoder_optimizer = optim.Adam(encoder.parameters(), lr=learning_rate)\n", "decoder_optimizer = optim.Adam(decoder.parameters(), lr=learning_rate * decoder_learning_ratio)\n", "criterion = nn.CrossEntropyLoss()\n", "\n", "# Move models to GPU\n", "if USE_CUDA:\n", " encoder.cuda()\n", " decoder.cuda()\n", "\n", "import sconce\n", "job = sconce.Job('seq2seq-translate', {\n", " 'attn_model': attn_model,\n", " 'n_layers': n_layers,\n", " 'dropout': dropout,\n", " 'hidden_size': hidden_size,\n", " 'learning_rate': learning_rate,\n", " 'clip': clip,\n", " 'teacher_forcing_ratio': teacher_forcing_ratio,\n", " 'decoder_learning_ratio': decoder_learning_ratio,\n", "})\n", 
"job.plot_every = plot_every\n", "job.log_every = print_every\n", "\n", "# Keep track of time elapsed and running averages\n", "start = time.time()\n", "plot_losses = []\n", "print_loss_total = 0 # Reset every print_every\n", "plot_loss_total = 0 # Reset every plot_every" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Plus helper functions to print time elapsed and estimated time remaining, given the current time and progress." ] }, { "cell_type": "code", "execution_count": 24, "metadata": { "collapsed": false }, "outputs": [], "source": [ "def as_minutes(s):\n", " m = math.floor(s / 60)\n", " s -= m * 60\n", " return '%dm %ds' % (m, s)\n", "\n", "def time_since(since, percent):\n", " now = time.time()\n", " s = now - since\n", " es = s / (percent)\n", " rs = es - s\n", " return '%s (- %s)' % (as_minutes(s), as_minutes(rs))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Evaluating the network\n", "\n", "Evaluation is mostly the same as training, but there are no targets. Instead we always feed the decoder's predictions back to itself. Every time it predicts a word, we add it to the output string. If it predicts the EOS token we stop there. We also store the decoder's attention outputs for each step to display later." 
] }, { "cell_type": "code", "execution_count": 25, "metadata": { "collapsed": false }, "outputs": [], "source": [ "def evaluate(input_seq, max_length=MAX_LENGTH):\n", " input_lengths = [len(input_seq)]\n", " input_seqs = [indexes_from_sentence(input_lang, input_seq)]\n", " input_batches = Variable(torch.LongTensor(input_seqs), volatile=True).transpose(0, 1)\n", " \n", " if USE_CUDA:\n", " input_batches = input_batches.cuda()\n", " \n", " # Set to not-training mode to disable dropout\n", " encoder.train(False)\n", " decoder.train(False)\n", " \n", " # Run through encoder\n", " encoder_outputs, encoder_hidden = encoder(input_batches, input_lengths, None)\n", "\n", " # Create starting vectors for decoder\n", " decoder_input = Variable(torch.LongTensor([SOS_token]), volatile=True) # SOS\n", " decoder_hidden = encoder_hidden[:decoder.n_layers] # Use last (forward) hidden state from encoder\n", " \n", " if USE_CUDA:\n", " decoder_input = decoder_input.cuda()\n", "\n", " # Store output words and attention states\n", " decoded_words = []\n", " decoder_attentions = torch.zeros(max_length + 1, max_length + 1)\n", " \n", " # Run through decoder\n", " for di in range(max_length):\n", " decoder_output, decoder_hidden, decoder_attention = decoder(\n", " decoder_input, decoder_hidden, encoder_outputs\n", " )\n", " decoder_attentions[di,:decoder_attention.size(2)] += decoder_attention.squeeze(0).squeeze(0).cpu().data\n", "\n", " # Choose top word from output\n", " topv, topi = decoder_output.data.topk(1)\n", " ni = topi[0][0]\n", " if ni == EOS_token:\n", " decoded_words.append('')\n", " break\n", " else:\n", " decoded_words.append(output_lang.index2word[ni])\n", " \n", " # Next input is chosen word\n", " decoder_input = Variable(torch.LongTensor([ni]))\n", " if USE_CUDA: decoder_input = decoder_input.cuda()\n", "\n", " # Set back to training mode\n", " encoder.train(True)\n", " decoder.train(True)\n", " \n", " return decoded_words, decoder_attentions[:di+1, 
:len(encoder_outputs)]" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We can evaluate random sentences from the training set and print out the input, target, and output to make some subjective quality judgements:" ] }, { "cell_type": "code", "execution_count": 26, "metadata": { "collapsed": true }, "outputs": [], "source": [ "def evaluate_randomly():\n", " [input_sentence, target_sentence] = random.choice(pairs)\n", " evaluate_and_show_attention(input_sentence, target_sentence)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Visualizing attention\n", "\n", "A useful property of the attention mechanism is its highly interpretable outputs. Because it is used to weight specific encoder outputs of the input sequence, we can see where the network is focused most at each time step.\n", "\n", "You could simply run `plt.matshow(attentions)` to see attention output displayed as a matrix, with the columns being input steps and rows being output steps:" ] }, { "cell_type": "code", "execution_count": 27, "metadata": { "collapsed": true }, "outputs": [], "source": [ "import io\n", "import torchvision\n", "from PIL import Image\n", "import visdom\n", "vis = visdom.Visdom()\n", "\n", "def show_plot_visdom():\n", " buf = io.BytesIO()\n", " plt.savefig(buf)\n", " buf.seek(0)\n", " attn_win = 'attention (%s)' % hostname\n", " vis.image(torchvision.transforms.ToTensor()(Image.open(buf)), win=attn_win, opts={'title': attn_win})" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "For a better viewing experience we will do the extra work of adding axes and labels:" ] }, { "cell_type": "code", "execution_count": 28, "metadata": { "collapsed": true }, "outputs": [], "source": [ "def show_attention(input_sentence, output_words, attentions):\n", " # Set up figure with colorbar\n", " fig = plt.figure()\n", " ax = fig.add_subplot(111)\n", " cax = ax.matshow(attentions.numpy(), cmap='bone')\n", " fig.colorbar(cax)\n", "\n", " # Set up 
axes\n", " ax.set_xticklabels([''] + input_sentence.split(' ') + [''], rotation=90)\n", " ax.set_yticklabels([''] + output_words)\n", "\n", " # Show label at every tick\n", " ax.xaxis.set_major_locator(ticker.MultipleLocator(1))\n", " ax.yaxis.set_major_locator(ticker.MultipleLocator(1))\n", "\n", " show_plot_visdom()\n", " plt.show()\n", " plt.close()" ] }, { "cell_type": "code", "execution_count": 29, "metadata": { "collapsed": false }, "outputs": [], "source": [ "def evaluate_and_show_attention(input_sentence, target_sentence=None):\n", " output_words, attentions = evaluate(input_sentence)\n", " output_sentence = ' '.join(output_words)\n", " print('>', input_sentence)\n", " if target_sentence is not None:\n", " print('=', target_sentence)\n", " print('<', output_sentence)\n", " \n", " show_attention(input_sentence, output_words, attentions)\n", " \n", " # Show input, target, output text in visdom\n", " win = 'evaluted (%s)' % hostname\n", " text = '

> %s

= %s

< %s

' % (input_sentence, target_sentence, output_sentence)\n", " vis.text(text, win=win, opts={'title': win})" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Putting it all together\n", "\n", "**TODO** Run `train_epochs` for `n_epochs`" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "To actually train, we call the train function many times, printing a summary as we go.\n", "\n", "*Note:* If you're running this notebook you can **train, interrupt, evaluate, and come back to continue training**. Simply run the notebook starting from the following cell (running from the previous cell will reset the models)." ] }, { "cell_type": "code", "execution_count": 30, "metadata": { "collapsed": false, "scrolled": false }, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "/home/sean/anaconda3/lib/python3.6/site-packages/torch/backends/cudnn/__init__.py:46: UserWarning: PyTorch was compiled without cuDNN support. To use cuDNN, rebuild PyTorch making sure the library is visible to the build system.\n", " \"PyTorch was compiled without cuDNN support. To use cuDNN, rebuild \"\n", "/home/sean/Projects/practical-pytorch/seq2seq-translation/masked_cross_entropy.py:9: UserWarning: torch.range is deprecated in favor of torch.arange and will be removed in 0.3. 
Note that arange generates values in [start; end), not [start; end].\n", " seq_range = torch.range(0, max_len - 1).long()\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "[log] 1m 50s (100) 3.1331\n", "1m 50s (- 921m 56s) (100 0%) 3.8196\n", "[log] 3m 41s (200) 2.3766\n", "3m 41s (- 921m 4s) (200 0%) 2.7289\n", "[log] 5m 35s (300) 2.1629\n", "5m 35s (- 926m 34s) (300 0%) 2.2523\n", "[log] 7m 28s (400) 1.9996\n", "7m 28s (- 926m 21s) (400 0%) 1.9320\n", "[log] 9m 20s (500) 1.5955\n", "9m 20s (- 924m 47s) (500 1%) 1.6854\n", "[log] 11m 13s (600) 1.2429\n", "11m 13s (- 924m 11s) (600 1%) 1.4429\n", "[log] 13m 5s (700) 1.2304\n", "13m 5s (- 922m 26s) (700 1%) 1.2527\n", "[log] 14m 57s (800) 0.9507\n", "14m 57s (- 919m 49s) (800 1%) 1.1110\n", "[log] 16m 49s (900) 0.8307\n", "16m 49s (- 917m 34s) (900 1%) 0.9817\n", "[log] 18m 39s (1000) 0.7994\n", "18m 39s (- 914m 34s) (1000 2%) 0.8726\n", "> suis je en retard ?\n", "= am i late ?\n", "< am i late ? <EOS>\n" ] }, { "data": { "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAXEAAAEZCAYAAABhIBWTAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAGHdJREFUeJzt3X+0XWV95/H3x6ACEqAl2IUhCNr4AxSQpAFbqliFBkdl\nGF0VxFpBGpmB/hitwri67IwyMyrT1cEKxujgjy5HajuORhvFwSm1iiyTACaGH5qFIAGmrAvILxFI\nzmf+2Pvi4XLu3ucm5959npvPy3VWzv5xn/vlmHzuvs9+9vPINhERUaandV1ARETsvIR4RETBEuIR\nEQVLiEdEFCwhHhFRsIR4RETBEuIREQVLiEdEFCwhHhFRsIR4RHRGlS9LenHXtZQqIR4RXToJ+A3g\n7K4LKVVCPCK69A6qAH+9pD26LqZECfGI6ISkRcARtr8OXAn8645LKlJCPCK68vvAF+r3nyZdKjsl\nIR4RXTmLKryxvR44SNKSbksqT/qgIgaQtBmYdrJ920fOYTnzjqT9gY/ZvqNv958Bi4Dbu6mqTMqi\nEBFPJem59dtz6z//pv7zDADbF8x5UREDJMQjGki6zvbLpuy71vYxXdVUOkl/CFxl+8eSBFwGvBG4\nFfgD29d1WV9p0ice0UySfqtv4zfJv5td9SdUgQ1wOnAkcBjwLuCjHdVUrPSJRzQ7C/i0pP3q7Z/V\n+2Lnbbf9eP3+dcDnbN8DXCnpIx3WVaSEeMQ0JD0N+HXbR02GuO37Oy5rPuhJOgi4D3g18J/7ju3V\nTUnlyq+FEdOw3QPeW7+/PwE+Mu8HNlB1qay1vQVA0iuBWzqsq0i5sRnRQNKHgAngb4GHJ/fbvrez\nouaB+hH7hbbv69v3LKpMeqi7ysqTEI9oIOknA3bb9vPmvJh5RNKzqYZvHlHv2gJcavtfuquqTAnx\niJhT9Wif/wl8BthY714G/AFwhu3vdlRakRLiLST9CrDE9qaua4luSHoJcDiw5+Q+25/rrqKySboG\n+LdTx4NLOhr4hO1ju6msTBmdMoCkq4A3UH0+G4G7JX3X9rs6LSzmnKS/AE6gCvF1wMnAd4CE+M7b\nd9ADPbavl7Swi4JKltEpg+1n+wHg31CNYT0WeE3HNUU33kQ1DO7/2T4TOArYr/lLooXq33Cn7vxV\nkkkzlg9ssD3qcay/B3yt62KiU4/UQw23S9oXuBvITHu75q+Ab0p6paSF9esE4Ov1sZiBdKcM9gHg\nCuA7ttdLeh7w445rim5sqGfc+yRV19pDwPe6LalsttdIuhP4INXoFAM3ABfa/mqnxRUoNzYjhiTp\nUKr+3NzkjrGR7pQ+kt5b//nXkj469dV1fTH3JH1r8r3tW21v6t8XMyfpi33vPzzl2DfnvqKypTvl\nyW6s/9zQaRXROUl7AnsDi+qbcKoP7Qss7qyw+WFp3/sTgfP7tg+c41qKlxDvM9kfZ/uzXdcyX0k6\nEPhD4FD6/v7ZHreZAd8J/CnwHODavv0PAB/rpKL5o6kPN/27M5QQH0DSPzLgL5Pt3+mgnPnmK8A/\nU61uvqPjWqZl+2LgYkl/ZPuvu65nntlb0suounP3qt+rfmUWwxnKjc0BJC3r29yTatWR7bbf21FJ\njSQdDyy1/en6Sncf24Pm/OicpOttH911HcOqJ2X698AhtldJWgq80HaGnu6k+iJpWrZfNVe1zAcJ\n8SFJ+r7tFV3XMVX9ROFyqmB5gaTnAH9n+7davrQTki4Erra9rutahiHpb6mGFr7N9ksk7U1VfzE/\niGJ+y+iUAST9at9rkaSVjO9TeqdSTRHwMIDtO4FxfnT5T4CvSnpE0gOSHpT0QNdFNXi+7Y8AjwPY\n/jm/vMkZO0nSXpKOmrLvEEm5aTxD6RMfbCNVn7io/vHeCryjy4IaPGbbkgxP/Po/zvajWjH+MNsf\nkHQIcFDHNTV5TNJe1PdIJD0feLTbkuaF7cCXJB1pe3Ke9k8B7
[image output: plot omitted]

[log] 20m 29s (1100) 0.6578
20m 29s (- 911m 11s) (1100 2%) 0.7791
[log] 22m 19s (1200) 0.6510
22m 19s (- 907m 38s) (1200 2%) 0.6962
[log] 24m 10s (1300) 0.5559
24m 10s (- 905m 40s) (1300 2%) 0.6159
[log] 26m 0s (1400) 0.4897
26m 0s (- 903m 1s) (1400 2%) 0.5736
[log] 27m 50s (1500) 0.5131
27m 50s (- 900m 5s) (1500 3%) 0.5190
[log] 29m 38s (1600) 0.3948
29m 38s (- 896m 52s) (1600 3%) 0.4632
[log] 31m 27s (1700) 0.6653
31m 27s (- 893m 44s) (1700 3%) 0.4410
[log] 33m 15s (1800) 0.3286
33m 15s (- 890m 39s) (1800 3%) 0.3999
[log] 35m 5s (1900) 0.4149
35m 5s (- 888m 17s) (1900 3%) 0.3684
[log] 36m 54s (2000) 0.2788
36m 54s (- 885m 52s) (2000 4%) 0.3466
> honte a toi !
= shame on you .
< shame on you !

[image output: plot omitted]

[log] 38m 44s (2100) 0.3325
38m 44s (- 883m 46s) (2100 4%) 0.3377
[log] 40m 36s (2200) 0.2391
40m 36s (- 882m 12s) (2200 4%) 0.3217
[log] 42m 25s (2300) 0.2144
42m 25s (- 879m 49s) (2300 4%) 0.3013
[log] 44m 14s (2400) 0.2987
44m 15s (- 877m 38s) (2400 4%) 0.2743
[log] 46m 4s (2500) 0.2795
46m 4s (- 875m 27s) (2500 5%) 0.2610
[log] 47m 53s (2600) 0.2676
47m 53s (- 873m 0s) (2600 5%) 0.2380
[log] 49m 43s (2700) 0.1816
49m 43s (- 871m 10s) (2700 5%) 0.2199
[log] 51m 35s (2800) 0.2438
51m 35s (- 869m 43s) (2800 5%) 0.2171
[log] 53m 25s (2900) 0.2003
53m 25s (- 867m 44s) (2900 5%) 0.1989
[log] 55m 16s (3000) 0.2235
55m 16s (- 865m 59s) (3000 6%) 0.1900
> choisis quelque chose .
= choose something .
< choose something .

[image output: plot omitted]

[log] 57m 8s (3100) 0.2420
57m 8s (- 864m 34s) (3100 6%) 0.1877
[log] 58m 57s (3200) 0.1424
58m 57s (- 862m 22s) (3200 6%) 0.1783
[log] 60m 50s (3300) 0.1371
60m 50s (- 860m 59s) (3300 6%) 0.1750
[log] 62m 41s (3400) 0.1539
62m 41s (- 859m 21s) (3400 6%) 0.1679
[log] 64m 30s (3500) 0.1167
64m 30s (- 857m 8s) (3500 7%) 0.1695
[log] 66m 23s (3600) 0.1849
66m 23s (- 855m 44s) (3600 7%) 0.1630
[log] 68m 16s (3700) 0.1372
68m 16s (- 854m 25s) (3700 7%) 0.1544
[log] 70m 8s (3800) 0.1163
70m 8s (- 852m 44s) (3800 7%) 0.1434
[log] 72m 0s (3900) 0.1499
72m 0s (- 851m 7s) (3900 7%) 0.1415
[log] 73m 51s (4000) 0.1129
73m 51s (- 849m 18s) (4000 8%) 0.1405
> nous sommes amoureux .
= we re in love .
< we re in love .

[image output: plot omitted]

[log] 75m 43s (4100) 0.1106
75m 43s (- 847m 48s) (4100 8%) 0.1315
[log] 77m 34s (4200) 0.0593
77m 34s
7Y3bOhs56hly5atpNrrbtA82/PG8HFTgLUtx+vqcz8Z6Qt6KgnFxLN0afNmNUjb\n9G++cTZs2NDx91nSU7Zntb+ze5KEIvrAOA7FWQ/s3XK8F23mYqZNKKLHGdg0MNDRqwsWAO+se8le\nAzxue8SqGKQkFNEHjLu01LWk64HjgMmS1lGty74jgO25wELgFGAN1Q7CbZefSRKK6HWGgS7Vxmyf\n1ea6qXbq7ViSUEQfaPL0rCShiB5nYCBJKCJKSkkoIoqx3a2er+0iSSiiD6QkFBFFdauLfntIEoro\ncVXDdOkoRpYkFNEHUh2LiHLSMB0RJZmUhCKisCYPViw2i17SByT9Sf3+k5K+Xb9/g6QvSjpJ0l2S\n7pX0L5J2KxVrxERnu6NXCSWX8rgTOLZ+PwvYTdKO9bnlwJ8DJ9o+HFgKvH+4h0iaI2mppOatjhXR\nCO74vxJKVseWATMlvZBqx9N7qZLRsVRrkswAvlevcrcTcNdwD6mXnpwHzVwCNaI0d3EW/fZQLAnZ\nfqbecvl8YBFV6ed44JXAD4Fvtls2ICI6M9Dg3rHSKyveCVwK3FG/vwi4D1gMvFbSKwEk7SpperEo\nIyawwVn0nbxKaEISehlwl+2fUq3yf6ftn1OVkK6XtJyqKnZAsSgjJrgmN0wX7aK3/e/US0PWx9Nb\n3n8bOKJEXBE9pWAppxMZJxTRBzJYMSKKMbApSSgiSkpJKCKKShKKiGKchumIKC0loYgoKkkoIoqp\neseaO20jSSiiD2QCa0SUU3BKRieShCJ6XJZ3jYji0kUfjdG034j1onWxnTXt771VklBEj8te9BFR\nXLaBjoiimtxFX3plxYjYzgZ7x7qxsqKk2ZJWS1oj6bJhrr9I0tckPSBppaQL2j0zSSiiD3QjCUma\nBFwJnEy1G85ZkmYMue09wCrbhwDHAZ+QtNNoz011LKLXda9h+khgje2HASTdAJwGrGr9NGB3Vd2e\nuwG/ADaO9tAkoYge18XBilOAtS3H64CjhtzzD1T7Bv4XsDtwhj36xLVUxyL6wBi2/Jk8uKNx/Zoz\nxo96E3A/8HLgUOAf6g1OR5SSUEQfGEMX/Qbbs0a4th7Yu+V4r/pcqwuAv3VV9FpTb3B6AHDPSB+Y\nklBEH7A7e7WxBJgmad+6sflMqqpXqx8DJwBI+m3gVcDDoz00JaGIHje4A+s2P8feKOkS4BZgEjDf\n9kpJF9XX5wIfBa6R9CAg4EO2N4z23CShiF7XxWkbthcCC4ecm9vy/r+Ak8byzCShiB6XpTwiorgk\noYgoKusJbYV6xKXaDXSKiHbc6Fn0jeqilzS1nhx3HbACOFfSXZLulfQvknYrHWPERNNp93ypwlKj\nklBtGnAV8HrgQuBE24cDS4H3lwwsYqLaNDDQ0auEJlbHHrG9WNKpVDN1v1cvAboTcNfQm+th5WMd\nWh7RN7o1Tmh7aWISeqL+v4Bv2j5rtJttzwPmAUhq7nc6oqAm9441sTo2aDHwWkmvBJC0q6TphWOK\nmHg6XEuoVKJqbBKy/XPgfOB6ScupqmIHFA0qYqJqcMt0o6pjtn8EvLrl+NvAEcUCiugRA5uaWx1r\nVBKKiO6rCjlJQhFRUJJQRBRUrtG5E0lCEX3ADd54LEkooselTSgiinP2oo+IkhpcEEoSiuh5dtqE\nIqKstAlFRDFZYzoiiksSiohybLwpvWMRUVBKQhFRVINzUJJQRK9Lw3RElJVpGxFRlhlIw3RElJSS\nUEQUk1n0EVFeklBElOTmNgklCUX0g1THIqIcm4EsahYRpTR9sGJjd2CNiC5xtdB9J692JM2WtFrS\nGkmXjXDPcZLul7RS0u3tnpmSUEQ/6EJJSNIk4ErgjcA6YImkBbZXtdyzB3AVMNv2jyX9VrvnpiQU\n0fOqfcc6ebVxJLDG9sO2fwPcAJw25J6zgRtt/xjA9s/aPbRxSUjSotIxRPSagQF
(base64 PNG data truncated)\n", "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" }, { "name": "stdout", "output_type": "stream", "text": [ "[log] 75m 43s (4100) 0.1106\n", "75m 43s (- 847m 48s) (4100 8%) 0.1315\n", "[log] 77m 34s (4200) 0.0593\n", "77m 34s 
(- 846m 0s) (4200 8%) 0.1353\n", "[log] 79m 27s (4300) 0.1601\n", "79m 27s (- 844m 29s) (4300 8%) 0.1256\n", "[log] 81m 17s (4400) 0.1076\n", "81m 17s (- 842m 29s) (4400 8%) 0.1285\n", "[log] 83m 8s (4500) 0.1967\n", "83m 8s (- 840m 42s) (4500 9%) 0.1237\n", "[log] 84m 59s (4600) 0.1156\n", "84m 59s (- 838m 49s) (4600 9%) 0.1175\n", "[log] 86m 51s (4700) 0.0809\n", "86m 51s (- 837m 13s) (4700 9%) 0.1118\n", "[log] 88m 41s (4800) 0.0821\n", "88m 41s (- 835m 13s) (4800 9%) 0.1115\n", "[log] 90m 32s (4900) 0.1044\n", "90m 32s (- 833m 18s) (4900 9%) 0.1140\n", "[log] 92m 23s (5000) 0.0773\n", "92m 23s (- 831m 35s) (5000 10%) 0.1076\n", "> qu ai je ?\n", "= what do i have ?\n", "< what do i have ? \n" ] }, { "data": { "image/png": "(base64 PNG data truncated)\n", "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" }, { "name": "stdout", "output_type": "stream", "text": [ "[log] 94m 14s (5100) 0.0806\n", "94m 14s (- 829m 41s) (5100 10%) 0.1055\n", "[log] 96m 6s (5200) 0.0678\n", "96m 6s (- 827m 57s) (5200 10%) 0.1025\n", "[log] 97m 59s (5300) 0.1065\n", "97m 59s (- 826m 30s) (5300 10%) 0.1026\n", "[log] 99m 49s (5400) 0.1059\n", "99m 49s (- 824m 29s) (5400 10%) 0.1007\n", "[log] 101m 40s (5500) 0.1084\n", "101m 40s (- 822m 41s) (5500 11%) 0.0991\n", "[log] 103m 32s (5600) 0.1498\n", "103m 32s (- 820m 54s) (5600 11%) 0.0985\n", "[log] 105m 23s (5700) 0.0675\n", "105m 23s (- 819m 6s) (5700 11%) 0.1009\n", "[log] 107m 15s (5800) 0.1340\n", "107m 15s (- 817m 21s) (5800 11%) 0.0985\n", "[log] 109m 7s (5900) 0.0902\n", "109m 7s (- 815m 39s) (5900 11%) 0.0970\n", "[log] 110m 59s (6000) 0.0942\n", "110m 59s (- 813m 57s) (6000 12%) 0.0997\n", "> vous plaisantez ?\n", "= are you kidding ?\n", "< are you kidding ? 
\n" ] }, { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAARoAAAEnCAYAAAB2V4zJAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAGdRJREFUeJzt3Xu4XXV95/H3h4CCyk2SOjYQwWkEwSKacJsixRZioFBs\nvYCXMmCRMpUZnSoXnY6XWp8pMj4daFGMNCqtj4gINtIolI6AFGlzwj1AJA9yCaIY7oLCJOczf6x1\nYOewzzn75Oy119ornxfPfrL3Wuus/eUAX37rt37r+5VtIiKqtEXdAURE+yXRRETlkmgionJJNBFR\nuSSaiKhcEk1EVC6JJiIql0QTEZVLomkBSX8naZ9x2z5ZUzgRL5BE0w5vAb4q6biObb9fVzAR4yXR\ntMNDwMHAOySdK2lLQDXHFPGcJJp2kO3HbR8F/By4Cti+3pAinpdE0w7Lxt7Y/iRwJnBPXcFEjKc8\nvd0Okl4FzLd9paSXALNsP1l3XBGQEU0rSHo/cDHwxXLTXODb9UUUsbEkmnb4APBbwBMAtu8Cfq3W\niCI6JNG0wzO2nx37UN51yjVxNEYSTTtcLeljwDaSDgO+CXyn5pginpPJ4BaQtAXwx8AiivUzl9v+\nUr1RRTwviaYFJH3Q9tlTbYuoSy6d2uE/d9l2/KCDiJjIlnUHEJtO0ruAdwO7SVrWsWtb4JF6oop4\noSSa4XYd8CAwG/hcx/YngVtqiSiii8zRRExBkoBLgY/avqPueIZR5mhaQNIfSrpL0uOSnpD0pKQn\n6o6rRRYB+wIn1h3IsMqIpgUkrQGOyv9tqyHpIuDLwNnAnrbX1xzS0MmIph1+liRTDUmzgb1sfxe4\nEnhrzSENpUwGt8OIpG9QPEj5zNhG25fUF1Jr/BHw9fL9l4FPUzzAGtOQRNMO2wFPU8wljDGQRDNz\n7wMWA9heIemVknaxfX/NcQ2VzNFETEDSDsAxtr/Yse0wYJ3tG+uLbPgk0bSApK0pnnXaC9h6bLvt\n99UWVESHTAa3w98D/4GiG8LVwM4Ui/ZiE0l6v6T55XtJ+nK5dOAWSW+oO75hk0TTDr9h+38CT9n+\nKvB7wP41xzTsPsjzdZffBewN7Ab8GXBOTTENrSSadvh/5Z+PSXodRQeEVNibmfW2x36vRwIX2H7Y\n9pXAS2uMaygl0bTDEkk7An9O0RHhduCz9YY09EbLO0xbA79LsYZmzDY1xTS0cnu7BWyfX769Bnh1\nnbG0yMeBEWAWsMz2KgBJvw3cXWdgwygjmi4kfVbSdpK2kvQvkn4u6b11xzURSR8s45Wk8yXdIGnR\n1D8ZE7F9GfAq4LW239+xawQ4pp6ohlcSTXeLbD9BcW1+D/AbwKm1RjS595XxLgJ2oljN+lf1htQK\nLwc+JOni8vUp4GW2f1F3YMMmiaa7sUvK3wO+afvxOoPpwVif7SMoJi1Xkd7bMyLpt4AV5ccLyhfA\nv5X7YhoyR9PdZZLuBH4J/BdJc4Bf1RzTZFZKuoLi9utHJW0LjNYc07D7HPDWcSuAl0m6lKJRX5YP\nTENWBk9A0suBx21vKFvMbmf7p3XH1U3ZBWEf4G7bj0naCZhrO1X2NpGk223vOd190V1GNF1IOq7j\nfeeuC154dH0k7WH7TookA/DqcfHGppOkHW0/Om7jy8mUw7Ql0XS3b8f7sXUUN9CwREOxSvUkNq4X\nPMbA7ww2nFb5a+AKSR+h+GcPsAA4s9wX05BLpx6UT/FeaHtx3bHE4Eg6EjiN4mFVUyyEPMt2uoBO\nUxJNDyRtBdxme/e6Y5lI+ejBnmz89HbTRmCxmcqlUxeSvkPxfzAoVoa+FriovogmJ+kTwCEUiWY5\ncDhwLc271Bsaki6y/c7y/Zm2T+/Yd4XtLIichiSa7v53x/v1wL2219YVTA/eDrweuNH2CZJeAfxD\nzTENu/kd7w8DTu/
4PGfAsQy9zJ53Yftq4E6Kjo87As/WG9GUfml7FFgvaTvgIWCXmmMadpPNKWS+\nYZoyoulC0juBs4CrKFbY/o2kU203tSj1SDlh/SVgJfAL4If1hjT0XlIWuNoC2KZ8r/KVp7enKZPB\nXUi6GTjM9kPl5znAlbZfX29kU5O0K8XiwizWmwFJ359sv+03DyqWNsiIprstxpJM6WEaeJkp6Y2T\n7bN9w0T7Y3JJJP2VRNPddyVdzvP9fI6huJvTNJ0L9TqHpiIL9mZM0jbAa2zf3LFtHrDB9gP1RTZ8\ncunURbka9Gc8v7T/WtuX1hjSpMr/IP4UOIgiwfwA+ILtJj8I2njl+qk7gb1tP1VuuwL4mO2RWoMb\nMo27HGiIlwJnAPsBPwauqzecKX2VYq3POcDfUKynadwaGkkvl/QxSX9W3h1rtLJm8KXA2HqaecCc\nJJnpy4hmEpL2prhsehuw1vahNYfUVbeniZv4hHE5wfpD4MUU3R+Pst3ospiS9gCW2D5Y0p8DT9hO\nF4RpyhzN5B4CfkoxGdzkrgI3SDrA9vUAkvanKDnZNDvZ/hg8dwlytaTHgA8DJ46txG0S23eWJVJf\nAxwLvKnumIZRRjRdSPpTiuHyHOCbwEW2b683qolJugPYHbiv3DQPWE2xqtm2964rtk6S/hV4j+17\nys8Cfh14FNje9oM1hjchScdT9OB+wPa7ag5nKCXRdCHpfwHfsH1T3bH0QtKrJttv+95BxTIZSbtT\nJL4f1R3LdJSFzx4E3lb2dYppSqKJiMrlrlNEVC6JpgeSTqo7hl4NU6wwXPEOU6wzIWmppIck3TbB\nfkk6R9IaSbdMtkJ9TBJNb4bpX7BhihWGK95hinUmvkKx/GAih1OU0ZhP8Tv5wlQnTKKJiI3YvgZ4\nZJJDjqboH+ZyScUOkl452TlbtY5GUmUz21Weu9+qiHXBggX9PiUA8+bNY+HChX2Nd+XKlf083UYq\n+vdgne0ZFdNavHix161b19OxK1euXMXGfcqW2F4yja+bC9zf8XltuW3C5QmtSjRRnZGRJq7/624I\nW87MePnBunXrev5nJOlXthfO9DunI4kmoiUGuFTlATau4LhzuW1CmaOJaAEDG0ZHe3r1wTLguPLu\n0wEUHV0nXdWdEU1EKxj3qZSxpK9TdNWYLWkt8AlgKwDb51HUZjoCWAM8DZww1TmTaCLawDDapyun\nqZ7ncnGN9oHpnDOJJqIlmvw4URJNRAsYGE2iiYiqZUQTEZWy3a87SpVIooloiYxoIqJy/bq9XYUk\nmogWKCaD645iYkk0ES2RS6eIqFYmgyOiaiYjmogYgCzYi4jKZUQTERXr39PbVUiiiWgB9/Hp7So0\nMtFImmV7Q91xRAyT0Qbfdaqlwp6kb0taKWnVWK8cSb+Q9DlJNwMHSlog6eryuMunqrIesTkbe3q7\nl1cd6hrRvM/2I5K2AVZI+hbwUuDfbH9Y0lbA1cDRtn8u6RjgMxSN1jdSJqrNpd9OxIQyGfxC/03S\nH5Tvd6FoRLUB+Fa5bXfgdcA/lxXtZzFBK4eyTcQSGK6WKBF9VeNopRcDTzSSDgEOBQ60/bSkq4Ct\ngV91zMsIWGX7wEHHFzGsmjyiqWOOZnvg0TLJ7AEc0OWY1cAcSQcCSNpK0l6DDDJimBjYYPf0qkMd\nieZ7wJaS7gD+Crh+/AG2nwXeDpxZTg7fBPyngUYZMWRs9/Sqw8AvnWw/Q9EkfLyXjTvuJuDggQQV\n0QJNvnRq5DqaiJgeZzI4IgYhI5qIqFwSTURUqrjr1NxHEJJoIloiD1VGRLVqvHXdiySaiBZIKc+I\nGIjc3o6IymVEExGVSu/tiBiI1AyOiMrl9nZEVKrpd51qqRkcEf3XrzIRkhZLWi1pjaQzuuzfXtJ3\nJN1c1v0+YapzZkQT0QZ9mgyWNAs4FzgMWEtR03uZ7ds7DvsAcLvtoyTNAVZL+lpZR
6qrjGgiWmDs\n0qkPI5r9gDW27y4Tx4XA0V2+blsVBb1fBjwCrJ/spK0a0SxYsICRkZG6w+hJWXR9aAxbvJujaSzY\nmy2p8z+UJWWRf4C5wP0d+9YC+4/7+b8FlgE/AbYFjrEnf6KzVYkmYnM2jdvb62wvnMFXvYWivO7v\nAP+RolvJD2w/MdEP5NIpoiXs3l5TeICiBdKYncttnU4ALnFhDfBjYI/JTppEE9ECfexUuQKYL2k3\nSS8CjqW4TOp0H/C7AJJeQdGH7e7JTppLp4g26NNdJ9vrJZ0CXE7RuHGp7VWSTi73nwd8GviKpFsp\nerCdbnvdZOdNoologX4u2LO9HFg+btt5He9/AiyazjmTaCJaoskrg5NoIloi9WgiomLO09sRUa0e\nb13XJokmoiVS+CoiKjW2jqapkmgiWiJ3nSKiWunrFBEDkUQTEVUb3ZBEExEVKm5vJ9FERMWSaCKi\nYpkMjogBcIMbOw2s8JWkv5D0oY7Pn5H0QUlnSbpN0q2Sjin3HSLpso5j/1bS8YOKNWLYjM3R9KPd\nShUGWWFvKXAcgKQtKCp3rQX2AV4PHAqcJemVA4wpojU8OtrTqw4Du3SyfY+khyW9AXgFcCNwEPB1\n2xuAn0m6GtgXmLDI8XiSTgJOApg3b17/A48YEg2eohl4zeDzgeMpihsvneS49Wwc29YTHWh7ie2F\nthfOmTOnL0FGDB0bj/b2qsOgE82lwGKKUcvlwA+AYyTNKjveHQz8O3AvsKekF0vagbIQckRMrMlz\nNAO962T7WUnfBx6zvUHSpcCBwM0UD6CeZvunAJIuAm6jaOVw4yDjjBg2/awZXIWBJppyEvgA4B0A\nLn4zp5avjdg+DThtkPFFDLMmJ5pB3t7eE1gD/Ivtuwb1vRGbBRtvGO3pVYdB3nW6HXj1oL4vYnPT\n5BFNVgZHtESD80wSTUQbZDI4IqqXMhERUT0zWtNEby+SaCJaIiOaiKhUKuxFxGAk0URE1dzcKZok\nmoi2yKVTRFTLZjS9tyOiSk1fsDfoejQRUQXTt8JXkhZLWi1pjaQzJjjmEEk3SVpVVsacVEY0EW3R\nhxGNpFnAucBhFDW9V0haVj4UPXbMDsDngcW275P0a1OdNyOaiFborbpeD5dX+wFrbN9t+1ngQuDo\ncce8G7jE9n0Ath+a6qStGtGsXLkSSXWH0ZMmX093Myy/183ZaO/1gGdLGun4vMT2kvL9XOD+jn1r\ngf3H/fxrgK0kXQVsC5xt+4LJvrBViSZic2VPq4HcOtsLZ/B1WwILKGp5bwP8UNL1tn802Q9ERAv0\naZT8ALBLx+edy22d1gIP234KeErSNRS92SZMNJmjiWiJPs3RrADmS9pN0osoGj0uG3fMPwIHSdpS\n0ksoLq3umOykGdFEtEJ/WqnYXi/pFIp2SLOApbZXSTq53H+e7TskfQ+4BRgFzrd922TnTaKJaIM+\nPr1tezmwfNy288Z9Pgs4q9dzJtFEtIABb2juncwkmoiWaPKSiSSaiDaosd1tL5JoIlpiGutoBi6J\nJqIlMqKJiEo1vUxEEk1EG9g4ha8iomqpGRwRlculU0RUK32dIqJqmQyOiAFI7+2IqFrDL52mrEcj\naVdJt43btlDSORMcf4+k2V22f1LSR8r3fyHp0E0NOiK6KBpwT/2qwSaNaGyPACNTHjjxz398U382\nIrpr8IBmehX2JL1a0o2STpV0WbltJ0lXlP1dzgfUcfz/kPQjSdcCu3ds/4qkt5fv75H0KUk3SLpV\n0h7l9jmS/nnsvJLu7TZSiojnJ4P7UGGvEj0nGkm7A98Cjqco9zfmE8C1tvcCLgXmlccvoCgDuA9w\nBLDvJKdfZ/uNwBeAj3Sc9/+W57147Lxd4jpJ0si4qu4Rm5c+NpCrQq+JZg5FndD32L553L6DgX8A\nsP1PwKPl9jcBl9p+2vYTvLDuaKdLyj9XAruW7
w+i6CmD7e91nHcjtpfYXjjDqu4RQ67ovd3Lqw69\nJprHgfso/uOvwjPlnxvInbCITdKGS6dngT8AjpP07nH7rqHoXIekw4EdO7a/VdI2krYFjppmbP8K\nvLM876KO80ZENw2+69TzHE3Zw+VI4L8D23Xs+hRwsKRVwB9SjHywfQPwDeBm4LtsPK/Ti08Bi8pb\n6+8Afgo8Oc1zRGwW3PA5mikvU2zfA7yufP8Yz0/qLiu3PQwsmuBnPwN8psv24zve79rxfgQ4pPz4\nOPCWsv3DgcC+tp8hIrpq8u3tJs+HzAMukrQFxaXb+2uOJ6LBUjN4k9i+C3hD3XFEDAVT2x2lXjQ2\n0URE70yKk0fEAOTSKSIqVt+t614k0US0QcPLRCTRRLTEaHpvR0SVUsozIqqXS6eIqF4W7EXEACTR\nRETlsmAvIio19vR2U02rZnBENFe/Cl9JWixptaQ1ks6Y5Lh9Ja0fq/89mSSaiFboLclMlWgkzQLO\nBQ4H9gTeJWnPCY47E7iil+iSaCLaoH+Fr/YD1ti+2/azFHW7j+5y3H+laFbwUC/hZY6mJqMNvkPQ\nnaY+pDGG7XfbH9O46zR7XNeQJbaXlO/nAvd37FsL7N/5w5LmUpT2fTOTdzd5ThJNRAtMc2Xwuhl2\nDfk/wOm2R6Xe/geURBPRCsb9KXz1ALBLx+edy22dFgIXlklmNnCEpPW2vz3RSZNoItrA4P4U2FsB\nzJe0G0WCOZayy8lzX2XvNvZe0leAyyZLMpBEE9Ea/VgZXDYDOAW4HJgFLLW9StLJ5f7zNuW8STQR\nLdGvRxBsLweWj9vWNcF0djSZTBJNRAukTEREVM9mdEO6IERE1TKiiYiqucELFZNoIlrAqbAXEdUz\n7tNCmiok0US0REY0EVG59N6OiEoVtWaSaCKiarl0ioiq5fZ2RFQuk8EzIGkPYCmwLfAI8Dbb6+qN\nKqJpzOjohrqDmNCw1Ax+r+3fBK4DTq47mIimGVuw148uCFVo/IjG9p0dH18MPFxXLBFNlkunPpD0\nFooWEAfWHUtEEyXRzJCkLYC/A95s+7Fx+04CTqolsIjGcG5v98GvA4/bvmv8jrJNxBIASc39TUdU\nzGTB3kw9Cny47iAimspu9iMIw3LXaXvgxLqDiGiu/rTErcpQjGhs/wSYspF4xOYszzpFROVy1yki\nKpdEExHVcm5vR0TFDIy6uc86JdFEtEJ9d5R6kUQT0RJJNBFRuSSaiKhUMRecdTQRUSnjBj+CkEQT\n0RKpGRwRlcscTURULH2dIqJiYzWDm2pYykRExBT6VSZC0mJJqyWtkXRGl/3vkXSLpFslXSfp9VOd\nMyOaiJboR+ErSbOAc4HDgLXACknLbN/ecdiPgd+2/aikwykqXO4/2XmTaCJawdCfOZr9gDW27waQ\ndCFwNPBcorF9Xcfx1wM7T3XSJJqazNoiV61VafJcRTeS+nKeadzeni1ppOPzkrL2NsBc4P6OfWuZ\nfLTyx8B3p/rCJJqIFpjmZPA62wtn+p2S3kyRaA6a6tgkmoiW6NNI7gFgl47PO5fbNiJpb+B84HDb\nUzZ1TKKJaIW+raNZAcyXtBtFgjkWeHfnAZLmAZcAf2T7R72cNIkmoiX6cdfJ9npJpwCXA7OApbZX\nSTq53H8e8HFgJ+Dz5fzS+qkuxZJoIlqgnwv2bC8Hlo/bdl7H+xOZZvujJJqIVkjN4IgYgLTEjYjK\nNXn9UBJNRCu40b23k2giWiClPCNiIHLpFBGVS6KJiIrl9nZEDECKk0dEpWwYHU3v7YioVHpvR8QA\nJNFEROWanGhmXE9S0lVlxfSbytfFHftOknRn+fp3SQd17DtS0o2SbpZ0u6Q/mWksEZsze7SnVx02\naUQj6UXAVrafKje9x/bIuGOOBP4EOMj2OklvBL4taT/gYYrK6fvZXivpxcCu5c/taPvRTfvbidhM\nudm3t6c1o
pH0WkmfA1YDr5ni8NOBU22vA7B9A/BV4APAthRJ7uFy3zO2V5c/d4yk2yR9WNKc6cQX\nsbkyMOrRnl51mDLRSHqppBMkXQt8iaLtwt62b+w47Gsdl05nldv2AlaOO90IsJftR4BlwL2Svl42\npNoCniuwczjwEuAaSReXDa26xlpeno2Mq+oesdkZ9kunB4FbgBNt3znBMS+4dJqK7RMl/SZwKPAR\nioZVx5f77gc+LekvKZLOUook9ftdzrOE4jIMSc0dO0ZUqtm3t3u5dHo7RZHiSyR9XNKrejz37cCC\ncdsWAKvGPti+1fZfUySZt3UeWM7lfB44B7gI+GiP3xuxWepXS9wqTJlobF9h+xjgTcDjwD9KulLS\nrlP86GeBMyXtBCBpH4oRy+clvUzSIR3H7gPcWx63SNItwF8C3wf2tP0h26uIiK7GagY3NdH0fNep\n7N1yNnB2OdroXO/8NUm/LN+vs32o7WWS5gLXlZc0TwLvtf2gpG2B0yR9Efgl8BTlZRPFBPFRtu+d\n0d9ZxGbFuMGPIKjJ13XTlTmagGYvXOtG0sqZdo7ccsutvN12O/V07KOP/mzG3zddWRkc0RJNTrBJ\nNBEtkUQTEZUqJnpTMzgiKpYRTURULu1WIqJ6GdFERLWclrgRUa2xlcFNlUQT0RJJNBFRuSSaiKiY\n024lIqqVOZqIGIwGJ5oZd0GIiCZwz39NpSydu1rSGklndNkvSeeU+28pGw9MKiOaiJbox7NOkmYB\n51JUvVwLrJC0zPbtHYcdDswvX/sDXyj/nFBGNBEtMTo62tNrCvsBa2zfbftZ4ELg6HHHHA1c4ML1\nwA6SXjnZSds2ollHWRK0z2aX5x4GwxQrVBCvpH6erlNVv9te63BP5nKK+Hqx9biuIUvKIv8Ac4H7\nO/at5YWjlW7HzKVoZNBVqxKN7Ur6QEkaGXRFsk01TLHCcMXb5FhtL647hsnk0ikiOj0A7NLxeedy\n23SP2UgSTUR0WgHMl7Rb2fr6WIpmj52WAceVd58OAB63PeFlE7Ts0qlCS6Y+pDGGKVYYrniHKdZN\nYnu9pFMo5nxmAUttr5J0crn/PGA5cASwBngaOGGq87aqC0JENFMunSKickk0EVG5JJqIqFwSTURU\nLokmIiqXRBMRlUuiiYjK/X/OJWPHO3C0cQAAAABJRU5ErkJggg==\n", "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" }, { "name": "stdout", "output_type": "stream", "text": [ "[log] 112m 52s (6100) 0.0712\n", "112m 52s (- 812m 17s) (6100 12%) 0.0981\n" ] }, { "ename": "KeyboardInterrupt", "evalue": "", "output_type": "error", "traceback": [ "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", "\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)", "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 15\u001b[0m \u001b[0minput_batches\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0minput_lengths\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mtarget_batches\u001b[0m\u001b[0;34m,\u001b[0m 
\u001b[0mtarget_lengths\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 16\u001b[0m \u001b[0mencoder\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mdecoder\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m---> 17\u001b[0;31m \u001b[0mencoder_optimizer\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mdecoder_optimizer\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mcriterion\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 18\u001b[0m )\n\u001b[1;32m 19\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n", "\u001b[0;32m\u001b[0m in \u001b[0;36mtrain\u001b[0;34m(input_batches, input_lengths, target_batches, target_lengths, encoder, decoder, encoder_optimizer, decoder_optimizer, criterion, max_length)\u001b[0m\n\u001b[1;32m 36\u001b[0m \u001b[0mtarget_lengths\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 37\u001b[0m )\n\u001b[0;32m---> 38\u001b[0;31m \u001b[0mloss\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mbackward\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 39\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 40\u001b[0m \u001b[0;31m# Clip gradient norms\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", "\u001b[0;32m/home/sean/anaconda3/lib/python3.6/site-packages/torch/autograd/variable.py\u001b[0m in \u001b[0;36mbackward\u001b[0;34m(self, gradient, retain_graph, create_graph, retain_variables)\u001b[0m\n\u001b[1;32m 149\u001b[0m \u001b[0mDefaults\u001b[0m \u001b[0mto\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0munless\u001b[0m\u001b[0;31m \u001b[0m\u001b[0;31m`\u001b[0m\u001b[0;31m`\u001b[0m\u001b[0mgradient\u001b[0m\u001b[0;31m`\u001b[0m\u001b[0;31m`\u001b[0m \u001b[0;32mis\u001b[0m \u001b[0ma\u001b[0m \u001b[0mvolatile\u001b[0m \u001b[0mVariable\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 150\u001b[0m \"\"\"\n\u001b[0;32m--> 151\u001b[0;31m 
\u001b[0mtorch\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mautograd\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mbackward\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mgradient\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mretain_graph\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mcreate_graph\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mretain_variables\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 152\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 153\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mregister_hook\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mhook\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", "\u001b[0;32m/home/sean/anaconda3/lib/python3.6/site-packages/torch/autograd/__init__.py\u001b[0m in \u001b[0;36mbackward\u001b[0;34m(variables, grad_variables, retain_graph, create_graph, retain_variables)\u001b[0m\n\u001b[1;32m 96\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 97\u001b[0m Variable._execution_engine.run_backward(\n\u001b[0;32m---> 98\u001b[0;31m variables, grad_variables, retain_graph)\n\u001b[0m\u001b[1;32m 99\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 100\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n", "\u001b[0;31mKeyboardInterrupt\u001b[0m: " ] } ], "source": [ "# Begin!\n", "ecs = []\n", "dcs = []\n", "eca = 0\n", "dca = 0\n", "\n", "while epoch < n_epochs:\n", " epoch += 1\n", " \n", " # Get training data for this cycle\n", " input_batches, input_lengths, target_batches, target_lengths = random_batch(batch_size)\n", "\n", " # Run the train function\n", " loss, ec, dc = train(\n", " input_batches, input_lengths, target_batches, target_lengths,\n", " encoder, decoder,\n", " encoder_optimizer, decoder_optimizer, criterion\n", " )\n", "\n", " # Keep track of loss\n", " print_loss_total += loss\n", " plot_loss_total += loss\n", " eca += ec\n", " dca += dc\n", " \n", " job.record(epoch, loss)\n", "\n", 
" if epoch % print_every == 0:\n", " print_loss_avg = print_loss_total / print_every\n", " print_loss_total = 0\n", " print_summary = '%s (%d %d%%) %.4f' % (time_since(start, epoch / n_epochs), epoch, epoch / n_epochs * 100, print_loss_avg)\n", " print(print_summary)\n", " \n", " if epoch % evaluate_every == 0:\n", " evaluate_randomly()\n", "\n", " if epoch % plot_every == 0:\n", " plot_loss_avg = plot_loss_total / plot_every\n", " plot_losses.append(plot_loss_avg)\n", " plot_loss_total = 0\n", " \n", " # TODO: Running average helper\n", " ecs.append(eca / plot_every)\n", " dcs.append(dca / plot_every)\n", " ecs_win = 'encoder grad (%s)' % hostname\n", " dcs_win = 'decoder grad (%s)' % hostname\n", " vis.line(np.array(ecs), win=ecs_win, opts={'title': ecs_win})\n", " vis.line(np.array(dcs), win=dcs_win, opts={'title': dcs_win})\n", " eca = 0\n", " dca = 0" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Plotting training loss\n", "\n", "Plotting is done with matplotlib, using the array `plot_losses` that was created while training." 
] }, { "cell_type": "code", "execution_count": 31, "metadata": { "collapsed": false }, "outputs": [ { "data": { "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "image/png": "(base64 PNG data truncated: training loss plot)
3ENxAOycz\n7v0ZW1rATRecwovv7uPdvY3ZLkdEJG0nfELVzKqBK4F70hg77KOQya78SDUAz63fk+VKRETSF0Za\n5k6CZmGDRktyIQqZbFxZAR+ZUsGz6/dmuxQRkbSFMbkvAB4xs23AVcDdZvbZELY7bFx6xnjW7mxg\nzY6D2S5FRCQtJzy5u/t0d5/m7tOAfwe+7u6PnXBlw8jVC6cwriyf//rLt/mPd3NjOUlETm7p5Ny3\nAluBM1Ll3M3sS2a2JtHPfQkwNbMlD73ywjg/uOps9h5q48v/+jqrttdnuyQRkQGlc+R+PXAOsD5V\nP3eC2+pd4O5nAV8GPp+hWrPqglljWHHbJVQUxbnrhc20dXZnuyQRkX6dcBTS3V9x995D2RXApJBq\nG3aK82N85bzpLNu0j4V//zvW72rIdkkiIimFcUK1rxuAp0Pe5rDyjYtO5d7/PJ/i/Bhfe3AltY1t\n2S5JROQooU3uZnYRweR+6wBjci7nniwSMS47czw/vm4B9S2d3PSzVXSqwZiIDDOhTO5mNhe4H7jC\n3fu9R10u5tz7c2Z1OT+4ai5v1hzk/pfez3Y5IiJHCOMK1SnAUuBad3/3xEvKHf/p7IlcesY4bn/m\nHc77ny+wWS0KRGSYOOGWv8D3gNEEFy+tNrOVGax32PnB587m25fPob2rh+sfeIPWDqVoRCT7wmj5\n+zWghSDj3gLcGF55w195UZybLjiFMyaWce1PXufpdbv5o4+O2MCQiOSIMFr+Xg7MTPy5kTQaiI1E\ni0+tYtroIn7+Wg076luyXY6InOTCaPl7BfBTD6wAKsxsQlgF5goz4/PnTGbV9noW376Mv/rVWnp6\nPNtlichJKp1lmcFUAx/0ebwj8dxJd4fpry6ewdzqCp5Zv5uHVtRw4eyxfPL0cdkuS0ROQmFM7mkz\nsxtJrMlPmTJlKN96SOTFIiyeWcWiGaNY9s4+7nhuE1tqm7hi3kQmVhRmuzwROYmEkXPfCUzu83hS\n4rmjjKSc+0Bi0Qg3nj+Dd/Y0cvsz73Dh/1rOo6ty/tayIpJDwpjcnwCus8AioMHdT7olmWTXfWwq\nq/76D3jpLy9i/pRK/uL/vc3G3YeyXZaInCTCyLk/BbwHbAF+DHw9Y9XmEDNjdEk+k0cVBb1o8qLc\n+x9bWfrmDuZ//7fUN3dku0QRGcHSSctcTdD29z2gDRjTt+WvuzvwHWAD0AM8aGbXZ67k3FNeFOdL\ni6by67d38cNnN7G/uYNH39QyjYhkTjpH7lHgXwjy7KcDV5vZ6UnDbgY2uPvZwIXAHWaWF3KtOe3r\nF57C6JJ8dje0URCP8M/Pb+Zz97xCS0dXtksTkREonTX3hcAWd3/P3TuARwiy7X05UGpmBpQQ5OI1\na/VRUZTHnV+Yx6fPmsAPrjqbwrwoq7bX88TqXdkuTURGIAtWVQYYYHYVcJm7fzXx+FrgXHf/Rp8x\npQQnVucApcAX3P03KbbVNwo5f/v27WHtR85xdy678yV2HWylpCDG7Z+by/mzRm6CSETCYWar3H3B\nYOPC6ud+KbAamAjMA+4ys7LkQSdLFDIdZsZNF84AC/LxX3ngDbbvb852WSIyQqQzuaeTY78eWJpo\nQbCF4L6qc8IpceS68iOTWPs3l/L//vRjRCLGj57fzL7G9myXJSIjQDpXqL4BzDSz6QST+heBa5LG\n1ACXAC+Z2ThgNkG6RtIwtqyALyyYzM9WbGfpmzs5q7qcSZWFfP+zZ1JVkp/t8kQkB6UThewC/i+w\nCWgG9rj7+qSs+/eBT5tZK7AdOOTudZkqeiT6y8tm86MvzuPP/2AmpQUxlm2q5Y/vfZVV2+sH/2YR\nkSSDHrknopB/QrDMsgN4w8xO7825J7QAxcBsd68xs7GZKHYkKy2Ic8W86sOPV247wDcffosv3vcq\nD35lIXsPtfHZedUEgSQRkYGFFYW8hmDNvQbA3Wv
DLfPks2DaKB77xnnEIhGu+fFr/Jd/e5uHXqvJ\ndlkikiPSmdz7a+nb1yyg0syWm9kqM7surAJPZmNLC7jpglPIi0WYO6mc//H4Om544A1uW7qW+196\nT/3iRaRfYbX8jQHzCU6qFgKvmtmK5Btmj/SWv5nwzUtO5frF0zDg/7ywhd9u2MvbOw7ycFMHG3Yd\nYkJFAW2dPfzp+TMYW1aQ7XJFZJhIZ3JPJwq5A9jv7s1As5m9CJwNHDG5u/t9wH0ACxYs0GFnGsyM\nsoI4AN9ZchrfWXIa7s7f/noDD7yyjWjEiBj88o0P+M6nT+PqhfqhKSLpXaEaI5ikLyGY1N8ArnH3\n9X3GnAbcRXAxUx7wOvBFd1/X33YXLFjgK1euPOEdOJkdbOmgMC/KroNt/PVja/n9lv186vRxNLV3\n8YOr5jKurIB4NKzr1ERkOAjtCtVEFPIbwLPARuCXyVFId98IPAOsIZjY7x9oYpdwVBTlkR+LMr2q\nmAevX8glc8by3Ia9vFkT3Mf13H94ntpDbfzqrR380d2/p62zO9sli8gQSXfNvYegOZgD3QBJUUjc\n/Ydmtpyg97v62Q6xWDTCvdfOZ/fBNtq6unnsrZ3c/9L7/PVj63jrg4Psa2zn56/VcMPi6bR3dfPB\ngVZOHVuS7bJFJEPSzbn/C/BJPsy5P+HuG1KMux14LhOFyuDi0QhTRhcB8JeXzaE4P8YPn90EwNTR\nRfzod++SFzXeqjnI0rd2csmcsXz81Cp2HWzlv3xyFiX5Q3pLXRHJoHT+NR/OuQOYWW/OfUPSuFuA\nR4FzQq1QjtvXLzyFhdNHsbuhjTMnlvHtR9fy3ceDUyUfP2U0r287wPPvBJckvLR5Hw9+ZSHdPU5F\nUZ4mepEcl86/4FQ593P7DjCzauBK4CI0uQ8bZsY500Ydfvxvf7qIf1m2hd9v2c/9X15AR1cPuxva\n2N/UwY0/W8nH/vEFACqL4vzmm59gYkVhyu26O+1dPRTEo0OyHyJy7MI6PLsTuNXdewa6PF459+wy\nM75x8Uy+cfFMAIrygpOyAA9/bRFPr9tDdUUBf//URv743leZMaaY+VMr+e2GvbR2djN1VBF3fH4e\n3350DWt2NLD8v19Ic3sXr79/gNKCOItnVmVz90Skj3SikB8D/sbdL008vg3A3f+xz5j3gd5ZvYqg\n18yN7v5Yf9tVFHL4emjFdn7y8vvUt3RwsKWT+VMrGVeWz3Pr9xKLGm2dPQBcMmcsL22uo6O7h4jB\n0q+fR3VFIXnRCF09PXS7s62uhflTK4lG1BNHJAzpRiFDybknjX8AeNLd/32g7WpyH/72NLSxaW8j\n58+swsx4fPVOHntrJ1cvnMLfPbmBHfWtLD61ilsuPpWbf/EWdU0f9qKPRoyCWITmjm6mjS7i7644\nkwXTKrnjuXfZXNvE7Z87iwnlqZd9RKR/oU3uiY0tIVh6iQL/6u5/3yfjfm/S2AfQ5D7iPfx6DT9/\nbTsP3XDJIHWYAAAMTElEQVQuFUV5LN9Uy+Ord3FmdTndPT3saWhnX1M758+s4p7lW3mvrpkxpfns\na2ynIB6hq9s579Qq/vnqj/Dy5joOtHTw+QWTyItGcIeIjvRFUgp7cr8M+BHB5H6/u//PpNe/BNxK\nsDTTCPyZu7890DY1uZ882jq7+dtfb+Ctmnr+7oozGV2Sx7+98QE/efl9Igad3cH/g3nRCJ09PRTF\no5w/awx/vGASTe3dtHV280cfqSaWuNq29lAbtY3tnFldns3dEsmKMJdlogTLModz7sDVfXPuZvZx\nYKO715vZ5QRr9Oem3GCCJnd5cs0ufrdhL0vOmkB+PMrLm/dRGI+yv7mD36zdzcGWzsNjZ4wp5vyZ\nY/jo1Er+17Ob2NPQxpPfXMzo4jzuWb6Vls5u/uKTs3iz5iBnVpdpyUdGrDAn90FPqCaNrwTWuXty\nW+AjaHKXgbR
2dPPWB/VEzdjX1M7Dr9fw5vaDtHZ2kxeNUJgXxd1p6+yhozs4wVteGKehtZO8WISr\nz5lMe1cPebEI1y6aytqdDWyra+bGC04hFrF+Y5xN7V00t3cxTh02ZZhKd3IPJeee5Abg6TS2K9Kv\nwrwoHz/lw2jlH86dSGd3D+t2NhCPRmjr7OYXr9UwqjiPz58zmW89sprNexv50Rfn8Zs1u/nF6zWU\nFsRpbu/ip69uP7ydf35hCwD5sQiVRXlMrypmdEkeZsakykIeenU7je1dnDahjNMnlDG2LJ/Z40rZ\nXNvImJJ8lpw1gRc317F5byPVlYV84ZzJ5MeitHR04Q41B1pobOvinGmV/G5jLfub2vnMvIkU5emi\nMBla6Ry5XwVc5u5fTTy+FjjX3b+RYuxFwN3AYnffn+L1vjn3+du3b08eInJcag+1sbuhjbMnVxzx\nfF1TO0+s3kU8FuGUqmLe2FZPLGo0tHZS39zBmzX1NLR20d7ZTVNHF5efOZ6zqitYtqmWDw60sK+x\nna4eJxoxuvvcHCUvFqGjq4dYxKgqyae+pQMz6Op2unqcyaMK+eBAKwAVRXHGluaTF4uQF43w9o4G\nxpTkM3lUISX5Mc6ZPoppo4s52NLJroOtTB5VyKIZo9lZ38rs8aWM7nOT9O4eZ2d9K/nxCAWxKE5w\nRfGW2kYATh1betR/m5c31xEx+Pipug5hJBjyZRkzmwv8Crg8+SYdqWhZRoaThpZOmju6jroqd39T\nO3VNHUyrKmLZO/vYuq+JC2aN4YyJZbyydT8vb6mjrrGd0oI4bV3dxCNGWWGcd/YEEdLZ48v415ff\np6G1kz2H2mjr7OYz8yays76VuqZ26ps72bS38fD7mUHyP8nKojgTygtp6+rmvX3NR9V+ZnUZ7+5t\nAg96CJUXxjl7cgUtHd1MHlXI//7tu7jDOdNGUVkc5/yZY+jqcSqL8thS20RZYYyzqstZub2e/U3t\nTKosojAepbWzm7mTyunqcX6zZjf5sQgXzRnL/KmVh3972tPQxsHWTqaMKmJUcXBBXGd38EMv1QWN\nnYkltI6uHuLRCHkxtaQ+VkOaczezKcALwHXu/ko6BWpyFwnsqG/hYEsnY0rzKcmP8fDrNdQ1dXDe\nqaNZu7OBXQdbeXdvE1EzFk4fxfjyAg61dh6eKJ9dv5dxZfnkx6LUNrbxfl0zh1q7KIhHONTWxdTR\nRRTlxahraqeprYvWPq2fk3+Y5McitHf1HFVjQTxCTw90dPdQWhBjUmURW/c10ZEYWxCPcOkZ41m3\ns4Gt+5qpKsljTOmH5y3aOrsZVZzH5r2NdHY7Hd09lBfGmTe5gl0HW5lUWcSscSVs2tNIfuK9tuxr\nYm51OfOmVFDX1AEEP+j2N3WQF4swtjSfQ22d5Mei7GsMrrGoriykqzv4wTGuvICttU1EI0Y8GqG0\nIEZeNEJjexdTRhUxf2olW/c18eK7+2hu76YwL0pBLMIH9a1UVxQye3wpdU3tjC8roLI4j+37W9i6\nr4mqknwunjOWzXsb2VHfSnePM7okj0jEeHlzHaUFMf7gtHHsaWhj58FWWjq6qa4sxIC5k8qJRoxo\nxI77pP+Q5tzN7H7gc0DvOkvXYG+uyV0kM9q7uunucQrjUXbUt1JWEKcoP4oBB1o6Dk/8ze3dTKgo\nYPfBNnY3tDJzXCkTywuobWyno6uH/HiEJ9/eTTxqXDV/Mt3uvLy5jmXv1LK/uZ2po4s5fUIZpQUx\nnt9Yy/Pv1DKhvIBLThvL9v3BuYdeBfEIuxvaGF9ewJiSfArzomzf38z2/S2MLslnw64G6ls6mV5V\nTHtXN/FI0OH091vqDkdle0UjRo/7ET+Ueq+A7j6G+woXxCOHr7buq3fJbSARg1RvlReL0Nndc9Rv\nX716f5j+6QUzuO3y09Ku9chtDG3O3RKvLyFoPfAn7v7mQNvU5C4ivXpv9p588
dqehjY6u3uYPKoI\nd2d/cwfFeTEc52BLJyUFMdo6uynJD47K9xxqIx4NJue9h9qYUFFIPGq4w8GWTnrcKc6LsXHPIV7Z\nUsfkUUV8Zt5EqorzaevqprWjm/LCOB/Ut7L3UBuVRXls299Me1cPkyoLOWVMCR8caOGJt3cxdXQR\ni0+tImJGbWM7Pe7MGV9KZ7fz7Po9jC8vYG51OYV5Ud7b14w7LNtUS0l+jIvnjGVaVfFx/bca6pz7\nEoKWv0sIkjQ/Us5dRCR8od1mjz793N29A+jt597XFcBPPbACqDCzCcdctYiIhCKdyT1Vzj35AqV0\nxmBmN5rZSjNbuW/fvmOtVURE0jSkOSR3v8/dF7j7gjFjxgzlW4uInFTSmdx3ApP7PJ6UeO5Yx4iI\nyBBJZ3J/A5hpZtPNLA/4IvBE0pgngOsssAhocPfdIdcqIiJpGrThhbt3mdk3gGf5MOe+Pqmf+1ME\nSZktBFHI6zNXsoiIDCatbkbu/hTBBN73uXv7fO3AzeGWJiIix0uNHURERqC0rlDNyBub7ePDdgXH\nqgqoC7GcbNF+DC/aj+FF+5HaVHcfNG6Ytcn9RJjZynSu0BrutB/Di/ZjeNF+nBgty4iIjECa3EVE\nRqBcndzvy3YBIdF+DC/aj+FF+3ECcnLNXUREBparR+4iIjKAnJvczewyM9tkZlvM7NvZrudYmNk2\nM1trZqvNbGXiuVFm9lsz25z4uzLbdSYzs381s1ozW9fnuX7rNrPbEp/PJjO7NDtVH62f/fgbM9uZ\n+ExWJ+5N0PvasNsPM5tsZsvMbIOZrTezbyWez6nPY4D9yLXPo8DMXjeztxP78beJ57P/ebh7zvwh\naH+wFZgB5AFvA6dnu65jqH8bUJX03A+Abye+/jZwe7brTFH3+cBHgXWD1Q2cnvhc8oHpic8rmu19\nGGA//gb4bynGDsv9ACYAH018XUpwI53Tc+3zGGA/cu3zMKAk8XUceA1YNBw+j1w7ck/nxiG55grg\nwcTXDwKfzWItKbn7i8CBpKf7q/sK4BF3b3f39wn6DS0ckkIH0c9+9GdY7oe77/bELSzdvRHYSHDv\nhJz6PAbYj/4M1/1wd29KPIwn/jjD4PPItck9rZuCDGMO/M7MVpnZjYnnxvmHHTT3AOOyU9ox66/u\nXPyMbjGzNYllm95fn4f9fpjZNOAjBEeLOft5JO0H5NjnYWZRM1sN1AK/dfdh8Xnk2uSe6xa7+zzg\ncuBmMzu/74se/N6Wc/GlXK074R6CZb55wG7gjuyWkx4zKwEeBf7c3Q/1fS2XPo8U+5Fzn4e7dyf+\nXU8CFprZmUmvZ+XzyLXJPadvCuLuOxN/1wK/Ivh1bG/v/WYTf9dmr8Jj0l/dOfUZufvexD/OHuDH\nfPgr8rDdDzOLE0yIP3f3pYmnc+7zSLUfufh59HL3g8Ay4DKGweeRa5N7OjcOGZbMrNjMSnu/Bj4F\nrCOo/8uJYV8GHs9Ohcesv7qfAL5oZvlmNh2YCbyehfrSYkfeyP1Kgs8Ehul+mJkBPwE2uvs/9Xkp\npz6P/vYjBz+PMWZWkfi6EPgk8A7D4fPI9tnm4zg7vYTgzPpW4K+yXc8x1D2D4Cz528D63tqB0cDz\nwGbgd8CobNeaovaHCX5F7iRYI7xhoLqBv0p8PpuAy7Nd/yD78TNgLbCG4B/ehOG8H8Bigl/x1wCr\nE3+W5NrnMcB+5NrnMRd4K1HvOuB7ieez/nnoClURkREo15ZlREQkDZrcRURGIE3uIiIjkCZ3EZER\nSJO7iMgIpMldRGQE0uQuIjICaXIXERmB/j+fxUoXUnht0gAAAABJRU5ErkJggg==\n", "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "def show_plot(points):\n", " 
plt.figure()\n", " fig, ax = plt.subplots()\n", " loc = ticker.MultipleLocator(base=0.2) # put ticks at regular intervals\n", " ax.yaxis.set_major_locator(loc)\n", " plt.plot(points)\n", "\n", "show_plot(plot_losses)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "scrolled": true }, "outputs": [], "source": [ "output_words, attentions = evaluate(\"je suis trop froid .\")\n", "plt.matshow(attentions.numpy())\n", "show_plot_visdom()" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "evaluate_and_show_attention(\"elle a cinq ans de moins que moi .\")" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "evaluate_and_show_attention(\"elle est trop petit .\")" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "evaluate_and_show_attention(\"je ne crains pas de mourir .\")" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "scrolled": true }, "outputs": [], "source": [ "evaluate_and_show_attention(\"c est un jeune directeur plein de talent .\")" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "evaluate_and_show_attention(\"est le chien vert aujourd hui ?\")" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "evaluate_and_show_attention(\"le chat me parle .\")" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "evaluate_and_show_attention(\"des centaines de personnes furent arretees ici .\")" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "evaluate_and_show_attention(\"des centaines de chiens furent arretees ici .\")" ] }, { "cell_type": 
"code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "evaluate_and_show_attention(\"ce fromage est prepare a partir de lait de chevre .\")" ] }, { "cell_type": "markdown", "metadata": { "collapsed": true }, "source": [ "# Exercises\n", "\n", "* Try with a different dataset\n", " * Another language pair\n", " * Human → Machine (e.g. IOT commands)\n", " * Chat → Response\n", " * Question → Answer\n", "* Replace the embeddings with pre-trained word embeddings such as word2vec or GloVe\n", "* Try with more layers, more hidden units, and more sentences. Compare the training time and results.\n", "* If you use a translation file where pairs have two of the same phrase (`I am test \\t I am test`), you can use this as an autoencoder. Try this:\n", " * Train as an autoencoder\n", " * Save only the Encoder network\n", " * Train a new Decoder for translation from there" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [] } ], "metadata": { "anaconda-cloud": {}, "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.6.0" } }, "nbformat": 4, "nbformat_minor": 1 } ================================================ FILE: seq2seq-translation/seq2seq-translation-batched.py ================================================ # coding: utf-8 import unicodedata import string import re import random import time import datetime import math import socket hostname = socket.gethostname() import torch import torch.nn as nn from torch.autograd import Variable from torch import optim import torch.nn.functional as F from torch.nn.utils.rnn import pad_packed_sequence, pack_padded_sequence import matplotlib.pyplot as plt import matplotlib.ticker as 
ticker import matplotlib matplotlib.use('Agg') import numpy as np import io import torchvision from PIL import Image import visdom vis = visdom.Visdom() USE_CUDA = True SOS_token = 0 EOS_token = 1 # Configure models attn_model = 'dot' hidden_size = 500 n_layers = 2 dropout = 0.1 batch_size = 50 # Configure training/optimization clip = 50.0 teacher_forcing_ratio = 0.5 learning_rate = 0.0001 decoder_learning_ratio = 5.0 n_epochs = 50000 epoch = 0 plot_every = 20 print_every = 100 evaluate_every = 1000 # Initialize models encoder = EncoderRNN(input_lang.n_words, hidden_size, n_layers, dropout=dropout) decoder = LuongAttnDecoderRNN(attn_model, hidden_size, output_lang.n_words, n_layers, dropout=dropout) # Initialize optimizers and criterion encoder_optimizer = optim.Adam(encoder.parameters(), lr=learning_rate) decoder_optimizer = optim.Adam(decoder.parameters(), lr=learning_rate * decoder_learning_ratio) # Move models to GPU if USE_CUDA: encoder.cuda() decoder.cuda() # Keep track of time elapsed and running averages start = time.time() plot_losses = [] print_loss_total = 0 # Reset every print_every plot_loss_total = 0 # Reset every plot_every # ---------------------------------------------------------------------------------------- class Lang: def __init__(self, name): self.name = name self.word2index = {} self.word2count = {} self.index2word = {0: "SOS", 1: "EOS"} self.n_words = 2 # Count SOS and EOS def index_words(self, sentence): for word in sentence.split(' '): self.index_word(word) def index_word(self, word): if word not in self.word2index: self.word2index[word] = self.n_words self.word2count[word] = 1 self.index2word[self.n_words] = word self.n_words += 1 else: self.word2count[word] += 1 def trim(self, min_count=3): keep = [] for k, v in self.word2count.items(): if v >= min_count: keep.append(k) print('total', len(self.word2index)) print('keep', len(keep)) print('keep %', len(keep) / len(self.word2index)) self.word2index = {} self.word2count = {} self.index2word 
= {0: "SOS", 1: "EOS"} self.n_words = 2 # Count SOS and EOS for word in keep: self.index_word(word) # Turn a Unicode string to plain ASCII, thanks to http://stackoverflow.com/a/518232/2809427 def unicode_to_ascii(s): return ''.join( c for c in unicodedata.normalize('NFD', s) if unicodedata.category(c) != 'Mn' ) # Lowercase, trim, and remove non-letter characters def normalize_string(s): s = unicode_to_ascii(s.lower().strip()) s = re.sub(r"([.!?])", r" \1", s) s = re.sub(r"[^a-zA-Z.!?]+", r" ", s) return s def read_langs(lang1, lang2, reverse=False): print("Reading lines...") # Read the file and split into lines # filename = '../data/%s-%s.txt' % (lang1, lang2) filename = '../%s-%s.txt' % (lang1, lang2) lines = open(filename).read().strip().split('\n') # Split every line into pairs and normalize pairs = [[normalize_string(s) for s in l.split('\t')] for l in lines] # Reverse pairs, make Lang instances if reverse: pairs = [list(reversed(p)) for p in pairs] input_lang = Lang(lang2) output_lang = Lang(lang1) else: input_lang = Lang(lang1) output_lang = Lang(lang2) return input_lang, output_lang, pairs MIN_LENGTH = 5 MAX_LENGTH = 20 good_prefixes = ( "i ", "he ", "she ", "you ", "they ", "we " ) def filter_pair(p): return len(p[0].split(' ')) <= MAX_LENGTH and len(p[1].split(' ')) <= MAX_LENGTH and len(p[0].split(' ')) >= MIN_LENGTH and len(p[1].split(' ')) >= MIN_LENGTH # and \ # p[1].startswith(good_prefixes) def filter_pairs(pairs): return [pair for pair in pairs if filter_pair(pair)] def prepare_data(lang1_name, lang2_name, reverse=False): input_lang, output_lang, pairs = read_langs(lang1_name, lang2_name, reverse) print("Read %s sentence pairs" % len(pairs)) pairs = filter_pairs(pairs) print("Trimmed to %s sentence pairs" % len(pairs)) print("Indexing words...") for pair in pairs: input_lang.index_words(pair[0]) output_lang.index_words(pair[1]) return input_lang, output_lang, pairs input_lang, output_lang, pairs = prepare_data('eng', 'fra', True) input_lang.trim() 
output_lang.trim() keep_pairs = [] for pair in pairs: keep_input = True keep_output = True for word in pair[0].split(' '): if word not in input_lang.word2index: keep_input = False break for word in pair[1].split(' '): if word not in output_lang.word2index: keep_output = False break if keep_input and keep_output: keep_pairs.append(pair) print(len(pairs)) print(len(keep_pairs)) print(len(keep_pairs) / len(pairs)) pairs = keep_pairs # Return a list of indexes, one for each word in the sentence def indexes_from_sentence(lang, sentence): return [lang.word2index[word] for word in sentence.split(' ')] + [EOS_token] def pad_seq(seq, max_length): seq += [0 for i in range(max_length - len(seq))] return seq def random_batch(batch_size=3): input_seqs = [] target_seqs = [] # Choose random pairs for i in range(batch_size): pair = random.choice(pairs) input_seqs.append(indexes_from_sentence(input_lang, pair[0])) target_seqs.append(indexes_from_sentence(output_lang, pair[1])) # Zip into pairs, sort by length (descending), unzip seq_pairs = sorted(zip(input_seqs, target_seqs), key=lambda p: len(p[0]), reverse=True) input_seqs, target_seqs = zip(*seq_pairs) # For input and target sequences, get array of lengths and pad with 0s to max length input_lengths = [len(s) for s in input_seqs] input_padded = [pad_seq(s, max(input_lengths)) for s in input_seqs] target_lengths = [len(s) for s in target_seqs] target_padded = [pad_seq(s, max(target_lengths)) for s in target_seqs] # Turn padded arrays into (batch x seq) tensors, transpose into (seq x batch) input_var = Variable(torch.LongTensor(input_padded)).transpose(0, 1) target_var = Variable(torch.LongTensor(target_padded)).transpose(0, 1) if USE_CUDA: input_var = input_var.cuda() target_var = target_var.cuda() return input_var, input_lengths, target_var, target_lengths random_batch() class EncoderRNN(nn.Module): def __init__(self, input_size, hidden_size, n_layers=1, dropout=0.1): super(EncoderRNN, self).__init__() self.input_size = 
input_size self.hidden_size = hidden_size self.n_layers = n_layers self.dropout = dropout self.embedding = nn.Embedding(input_size, hidden_size) self.gru = nn.GRU(hidden_size, hidden_size, n_layers, dropout=self.dropout) def forward(self, input_seqs, input_lengths, hidden=None): embedded = self.embedding(input_seqs) packed = torch.nn.utils.rnn.pack_padded_sequence(embedded, input_lengths) output, hidden = self.gru(packed, hidden) output, _ = torch.nn.utils.rnn.pad_packed_sequence(output) # unpack (back to padded) return output, hidden class Attn(nn.Module): def __init__(self, method, hidden_size): super(Attn, self).__init__() self.method = method self.hidden_size = hidden_size if self.method == 'general': self.attn = nn.Linear(self.hidden_size, hidden_size) elif self.method == 'concat': self.attn = nn.Linear(self.hidden_size * 2, hidden_size) self.v = nn.Parameter(torch.FloatTensor(1, hidden_size)) def forward(self, hidden, encoder_outputs): seq_len = encoder_outputs.size(0) batch_size = encoder_outputs.size(1) # print('[attn] seq len', seq_len) # print('[attn] encoder_outputs', encoder_outputs.size()) # S x B x N # print('[attn] hidden', hidden.size()) # S=1 x B x N # Create variable to store attention energies attn_energies = Variable(torch.zeros(batch_size, seq_len)) # B x S # print('[attn] attn_energies', attn_energies.size()) if USE_CUDA: attn_energies = attn_energies.cuda() # For each batch of encoder outputs for b in range(batch_size): # Calculate energy for each encoder output for i in range(seq_len): attn_energies[b, i] = self.score(hidden[:, b], encoder_outputs[i, b].unsqueeze(0)) # Normalize energies to weights in range 0 to 1, resize to 1 x B x S # print('[attn] attn_energies', attn_energies.size()) return F.softmax(attn_energies).unsqueeze(1) def score(self, hidden, encoder_output): if self.method == 'dot': energy = hidden.dot(encoder_output) return energy elif self.method == 'general': energy = self.attn(encoder_output) energy = hidden.dot(energy) 
return energy elif self.method == 'concat': energy = self.attn(torch.cat((hidden, encoder_output), 1)) energy = self.v.dot(energy) return energy rnn_output = Variable(torch.zeros(1, 2, 10)) encoder_outputs = Variable(torch.zeros(3, 2, 10)) attn = Attn('concat', 10) attn(rnn_output, encoder_outputs) attn_weights = torch.zeros(2, 1, 3) print('attn_weights', attn_weights.size()) encoder_outputs = torch.zeros(3, 2, 10) print('encoder_outputs', encoder_outputs.size()) # B x N x M # , B x M x P # -> B x N x P context = attn_weights.bmm(encoder_outputs.transpose(0, 1)) context = context.transpose(0, 1) print('context', context.size()) class LuongAttnDecoderRNN(nn.Module): def __init__(self, attn_model, hidden_size, output_size, n_layers=1, dropout=0.1): super(LuongAttnDecoderRNN, self).__init__() # Keep for reference self.attn_model = attn_model self.hidden_size = hidden_size self.output_size = output_size self.n_layers = n_layers self.dropout = dropout # Define layers self.embedding = nn.Embedding(output_size, hidden_size) self.embedding_dropout = nn.Dropout(dropout) self.gru = nn.GRU(hidden_size, hidden_size, n_layers, dropout=dropout) self.concat = nn.Linear(hidden_size * 2, hidden_size) self.out = nn.Linear(hidden_size, output_size) # Choose attention model if attn_model != 'none': self.attn = Attn(attn_model, hidden_size) def forward(self, input_seq, last_context, last_hidden, encoder_outputs): # Note: we run this one step at a time (in order to do teacher forcing) # Get the embedding of the current input word (last output word) batch_size = input_seq.size(0) # print('[decoder] input_seq', input_seq.size()) # batch_size x 1 embedded = self.embedding(input_seq) embedded = self.embedding_dropout(embedded) embedded = embedded.view(1, batch_size, self.hidden_size) # S=1 x B x N # print('[decoder] word_embedded', embedded.size()) # Get current hidden state from input word and last hidden state # print('[decoder] last_hidden', last_hidden.size()) rnn_output, hidden = 
self.gru(embedded, last_hidden) # print('[decoder] rnn_output', rnn_output.size()) # Calculate attention from current RNN state and all encoder outputs; # apply to encoder outputs to get weighted average attn_weights = self.attn(rnn_output, encoder_outputs) # print('[decoder] attn_weights', attn_weights.size()) # print('[decoder] encoder_outputs', encoder_outputs.size()) context = attn_weights.bmm(encoder_outputs.transpose(0, 1)) # B x S=1 x N # print('[decoder] context', context.size()) # Attentional vector using the RNN hidden state and context vector # concatenated together (Luong eq. 5) rnn_output = rnn_output.squeeze(0) # S=1 x B x N -> B x N context = context.squeeze(1) # B x S=1 x N -> B x N # print('[decoder] rnn_output', rnn_output.size()) # print('[decoder] context', context.size()) concat_input = torch.cat((rnn_output, context), 1) concat_output = F.tanh(self.concat(concat_input)) # Finally predict next token (Luong eq. 6) # output = F.log_softmax(self.out(concat_output)) output = self.out(concat_output) # Return final output, hidden state, and attention weights (for visualization) return output, context, hidden, attn_weights # ## Testing the models # # To make sure the encoder and decoder are working (and working together) we'll do a quick test. 
# # First by creating and padding a batch of sequences: # In[394]: # Input as batch of sequences of word indexes batch_size = 2 input_batches, input_lengths, target_batches, target_lengths = random_batch(batch_size) print('input_batches', input_batches.size()) print('target_batches', target_batches.size()) # Create models with a small size (in case you need to manually inspect): # In[395]: # Create models hidden_size = 8 n_layers = 2 encoder_test = EncoderRNN(input_lang.n_words, hidden_size, n_layers) decoder_test = LuongAttnDecoderRNN('general', hidden_size, output_lang.n_words, n_layers) if USE_CUDA: encoder_test.cuda() decoder_test.cuda() # Then running the entire batch of input sequences through the encoder to get per-batch encoder outputs: # In[396]: # Test encoder encoder_outputs, encoder_hidden = encoder_test(input_batches, input_lengths, None) print('encoder_outputs', encoder_outputs.size()) # max_len x B x hidden_size print('encoder_hidden', encoder_hidden.size()) # n_layers x B x hidden_size # Then, starting with a SOS token, run word tokens through the decoder to get each next word token. Instead of doing this with the whole sequence, it is done one token at a time, so the decoder can use its own predictions to make the next prediction. Decoding still happens one time step at a time, but each step is batched. To make this work for shorter padded sequences, the effective batch size gets smaller at each time step. 
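The shrinking-batch idea above can be sketched without any PyTorch: given target lengths sorted in descending order (as `random_batch` produces them), the sequences that still have a real token at step `t` form a prefix of the batch. The helper name below is invented for illustration and is not part of the script.

```python
def active_batch_sizes(sorted_lengths):
    """How many sequences are still 'active' at each decoding time step.

    `sorted_lengths` must be sorted descending, matching random_batch().
    """
    max_len = sorted_lengths[0]
    # At step t, every sequence with length > t still has a real token.
    return [sum(1 for length in sorted_lengths if length > t)
            for t in range(max_len)]

print(active_batch_sizes([5, 3, 2]))  # [3, 3, 2, 1, 1]
```

Padding to the longest sequence and masking the loss (as `masked_cross_entropy` does) is what lets the code keep a fixed tensor shape even as this effective batch size shrinks.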
# In[397]: decoder_attns = torch.zeros(batch_size, MAX_LENGTH, MAX_LENGTH) decoder_hidden = encoder_hidden decoder_context = Variable(torch.zeros(1, decoder_test.hidden_size)) criterion = nn.NLLLoss() max_length = max(target_lengths) all_decoder_outputs = Variable(torch.zeros(max_length, batch_size, decoder_test.output_size)) if USE_CUDA: decoder_context = decoder_context.cuda() all_decoder_outputs = all_decoder_outputs.cuda() loss = 0 import masked_cross_entropy import importlib importlib.reload(masked_cross_entropy) # Run through decoder one time step at a time for t in range(max_length - 1): decoder_input = target_batches[t] target_batch = target_batches[t + 1] decoder_output, decoder_context, decoder_hidden, decoder_attn = decoder_test( decoder_input, decoder_context, decoder_hidden, encoder_outputs ) print('decoder output = %s, decoder_hidden = %s, decoder_attn = %s'% ( decoder_output.size(), decoder_hidden.size(), decoder_attn.size() )) all_decoder_outputs[t] = decoder_output # print('all decoder outputs', all_decoder_outputs.size()) # print('target batches', target_batches.size()) # print('all_decoder_outputs', all_decoder_outputs.transpose(0, 1).contiguous().view(-1, decoder_test.output_size)) print('target lengths', target_lengths) loss = masked_cross_entropy.compute_loss( all_decoder_outputs.transpose(0, 1).contiguous(), target_batches.transpose(0, 1).contiguous(), target_lengths ) # loss = criterion(all_decoder_outputs, target_batches) # print('loss', loss.size()) print('loss', loss.data[0]) # # Training # # ## Defining a training iteration # # To train we first run the input sentence through the encoder word by word, and keep track of every output and the latest hidden state. Next the decoder is given the last hidden state of the encoder as its first hidden state, and the `<SOS>` token as its first input. From there we iterate to predict a next token from the decoder. # # ### Teacher Forcing vs. 
Scheduled Sampling # # "Teacher Forcing", or maximum likelihood sampling, means using the real target outputs as each next input when training. The alternative is using the decoder's own guess as the next input. Using teacher forcing may cause the network to converge faster, but [when the trained network is exploited, it may exhibit instability](http://minds.jacobs-university.de/sites/default/files/uploads/papers/ESNTutorialRev.pdf). # # You can observe outputs of teacher-forced networks that read with coherent grammar but wander far from the correct translation - you could think of it as having learned how to listen to the teacher's instructions, without learning how to venture out on its own. # # The solution to the teacher-forcing "problem" is known as [Scheduled Sampling](https://arxiv.org/abs/1506.03099), which simply alternates between using the target values and predicted values when training. We will randomly choose whether to use teacher forcing with an if statement while training - sometimes we'll feed the real target as the input (ignoring the decoder's output), sometimes we'll use the decoder's output. 
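The per-iteration choice described above boils down to a coin flip. A minimal sketch, where `next_decoder_input` is a hypothetical helper (not part of the script) and `teacher_forcing_ratio` mirrors the value configured at the top of the file:

```python
import random

teacher_forcing_ratio = 0.5  # same value as the training configuration above

def next_decoder_input(target_token, predicted_token, ratio=teacher_forcing_ratio):
    # Teacher forcing with probability `ratio`: feed the ground-truth target
    # token back into the decoder; otherwise feed its own last prediction.
    return target_token if random.random() < ratio else predicted_token
```

Scheduled Sampling proper anneals `ratio` over the course of training rather than keeping it fixed.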
# In[398]: [SOS_token] * 5 # In[399]: def train(input_batches, input_lengths, target_batches, target_lengths, encoder, decoder, encoder_optimizer, decoder_optimizer, criterion, max_length=MAX_LENGTH): # Zero gradients of both optimizers encoder_optimizer.zero_grad() decoder_optimizer.zero_grad() loss = 0 # Added onto for each word # Run words through encoder encoder_outputs, encoder_hidden = encoder(input_batches, input_lengths, None) # Prepare input and output variables decoder_input = Variable(torch.LongTensor([[SOS_token] * batch_size])).transpose(0, 1) # print('decoder_input', decoder_input.size()) decoder_context = encoder_outputs[-1] decoder_hidden = encoder_hidden # Use last hidden state from encoder to start decoder max_length = max(target_lengths) all_decoder_outputs = Variable(torch.zeros(max_length, batch_size, decoder.output_size)) # Move new Variables to CUDA if USE_CUDA: decoder_input = decoder_input.cuda() decoder_context = decoder_context.cuda() all_decoder_outputs = all_decoder_outputs.cuda() # Choose whether to use teacher forcing use_teacher_forcing = random.random() < teacher_forcing_ratio # TODO: Get targets working if True: # Run through decoder one time step at a time for t in range(max_length): # target_batch = target_batches[t] # Trim down batches of other inputs # decoder_hidden = decoder_hidden[:, :len(target_batch)] # encoder_outputs = encoder_outputs[:, :len(target_batch)] decoder_output, decoder_context, decoder_hidden, decoder_attn = decoder( decoder_input, decoder_context, decoder_hidden, encoder_outputs ) # print(decoder_output.size(), decoder_hidden.size(), decoder_attn.size()) # loss += criterion(decoder_output, target_batch) all_decoder_outputs[t] = decoder_output decoder_input = target_batches[t] # TODO decoder_input = target_variable[di] # Next target is next input # Teacher forcing: Use the ground-truth target as the next input elif use_teacher_forcing: for di in range(target_length): decoder_output, decoder_context, 
decoder_hidden, decoder_attention = decoder(decoder_input, decoder_context, decoder_hidden, encoder_outputs) loss += criterion(decoder_output[0], target_variable[di]) decoder_input = target_variable[di] # Next target is next input # Without teacher forcing: use network's own prediction as the next input else: for di in range(target_length): decoder_output, decoder_context, decoder_hidden, decoder_attention = decoder(decoder_input, decoder_context, decoder_hidden, encoder_outputs) loss += criterion(decoder_output[0], target_variable[di]) # Get most likely word index (highest value) from output topv, topi = decoder_output.data.topk(1) ni = topi[0][0] decoder_input = Variable(torch.LongTensor([[ni]])) # Chosen word is next input if USE_CUDA: decoder_input = decoder_input.cuda() # Stop at end of sentence (not necessary when using known targets) if ni == EOS_token: break # Loss calculation and backpropagation # print('all_decoder_outputs', all_decoder_outputs.size()) # print('target_batches', target_batches.size()) loss = masked_cross_entropy.compute_loss( all_decoder_outputs.transpose(0, 1).contiguous(), # seq x batch -> batch x seq target_batches.transpose(0, 1).contiguous(), # seq x batch -> batch x seq target_lengths ) loss.backward() # Clip gradient norm ec = torch.nn.utils.clip_grad_norm(encoder.parameters(), clip) dc = torch.nn.utils.clip_grad_norm(decoder.parameters(), clip) # Update parameters with optimizers encoder_optimizer.step() decoder_optimizer.step() return loss.data[0], ec, dc # ## Running training # Plus helper functions to print time elapsed and estimated time remaining, given the current time and progress. 
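The estimated-time-remaining logic in `time_since` below is plain proportional extrapolation: divide elapsed time by the fraction complete to project a total, then subtract what has already passed. A standalone restatement of that arithmetic (the function name is invented for illustration):

```python
def estimate_remaining(elapsed_s, progress):
    # Project the total run time from the completed fraction, then
    # subtract the time already spent to get the remaining estimate.
    projected_total = elapsed_s / progress
    return projected_total - elapsed_s

print(estimate_remaining(60.0, 0.25))  # 180.0: at 25% after 1 minute, ~3 minutes left
```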
# In[404]:

def as_minutes(s):
    m = math.floor(s / 60)
    s -= m * 60
    return '%dm %ds' % (m, s)

def time_since(since, percent):
    now = time.time()
    s = now - since
    es = s / percent # estimated total time
    rs = es - s      # estimated time remaining
    return '%s (- %s)' % (as_minutes(s), as_minutes(rs))

def evaluate(input_seq, max_length=MAX_LENGTH):
    input_lengths = [len(input_seq)]
    input_seqs = [indexes_from_sentence(input_lang, input_seq)]
    input_batches = Variable(torch.LongTensor(input_seqs)).transpose(0, 1)

    if USE_CUDA:
        input_batches = input_batches.cuda()

    # Run through encoder
    encoder_outputs, encoder_hidden = encoder(input_batches, input_lengths, None)

    # Create starting vectors for decoder
    decoder_input = Variable(torch.LongTensor([[SOS_token]])) # SOS
    decoder_context = Variable(torch.zeros(1, decoder.hidden_size))
    decoder_hidden = encoder_hidden

    if USE_CUDA:
        decoder_input = decoder_input.cuda()
        decoder_context = decoder_context.cuda()

    # Store output words and attention states
    decoded_words = []
    decoder_attentions = torch.zeros(max_length + 1, max_length + 1)

    # Run through decoder
    for di in range(max_length):
        decoder_output, decoder_context, decoder_hidden, decoder_attention = decoder(
            decoder_input, decoder_context, decoder_hidden, encoder_outputs
        )
        decoder_attentions[di, :decoder_attention.size(2)] += decoder_attention.squeeze(0).squeeze(0).cpu().data

        # Choose top word from output
        topv, topi = decoder_output.data.topk(1)
        ni = topi[0][0]
        if ni == EOS_token:
            decoded_words.append('<EOS>')
            break
        else:
            decoded_words.append(output_lang.index2word[ni])

        # Next input is chosen word
        decoder_input = Variable(torch.LongTensor([[ni]]))
        if USE_CUDA: decoder_input = decoder_input.cuda()

    return decoded_words, decoder_attentions[:di + 1, :len(encoder_outputs)]

def evaluate_randomly():
    pair = random.choice(pairs)

    output_words, attentions = evaluate(pair[0])
    output_sentence = ' '.join(output_words)

    show_attention(pair[0], output_words, attentions)

    print('>', pair[0])
    print('=', pair[1])
    print('<', output_sentence)
    print('')

def show_plot_visdom():
    buf = io.BytesIO()
    plt.savefig(buf)
    buf.seek(0)
    attn_win = 'attention (%s)' % hostname
    vis.image(torchvision.transforms.ToTensor()(Image.open(buf)), win=attn_win, opts={'title': attn_win})

def show_attention(input_sentence, output_words, attentions):
    # Set up figure with colorbar
    fig = plt.figure()
    ax = fig.add_subplot(111)
    cax = ax.matshow(attentions.numpy(), cmap='bone')
    fig.colorbar(cax)

    # Set up axes
    ax.set_xticklabels([''] + input_sentence.split(' ') + ['<EOS>'], rotation=90)
    ax.set_yticklabels([''] + output_words)

    # Show label at every tick
    ax.xaxis.set_major_locator(ticker.MultipleLocator(1))
    ax.yaxis.set_major_locator(ticker.MultipleLocator(1))

    show_plot_visdom()
    plt.show()
    plt.close()

def evaluate_and_show_attention(input_sentence, target_sentence=None):
    output_words, attentions = evaluate(input_sentence)
    output_sentence = ' '.join(output_words)
    print('input =', input_sentence)
    print('output =', output_sentence)
    show_attention(input_sentence, output_words, attentions)

    # Show input, target, output as HTML in a visdom window
    win = 'evaluated (%s)' % hostname
    text = '<p>&gt; %s</p><p>= %s</p><p>&lt; %s</p>\
' % (input_sentence, target_sentence, output_sentence)
    vis.text(text, win=win, opts={'title': win})

# Begin!
ecs = []
dcs = []
eca = 0
dca = 0

while epoch < n_epochs:
    epoch += 1

    # Get training data for this cycle
    input_batches, input_lengths, target_batches, target_lengths = random_batch(batch_size)

    # Run the train function
    loss, ec, dc = train(
        input_batches, input_lengths, target_batches, target_lengths,
        encoder, decoder,
        encoder_optimizer, decoder_optimizer, criterion
    )

    # Keep track of loss
    print_loss_total += loss
    plot_loss_total += loss
    eca += ec
    dca += dc

    if epoch == 1:
        evaluate_randomly()
        continue

    if epoch % print_every == 0:
        print_loss_avg = print_loss_total / print_every
        print_loss_total = 0
        print_summary = '%s (%d %d%%) %.4f' % (time_since(start, epoch / n_epochs), epoch, epoch / n_epochs * 100, print_loss_avg)
        print(print_summary)
        evaluate_randomly()

    if epoch % plot_every == 0:
        plot_loss_avg = plot_loss_total / plot_every
        plot_losses.append(plot_loss_avg)
        plot_loss_total = 0

        # Plot running averages of the clipped gradient norms in visdom
        ecs.append(eca / plot_every)
        dcs.append(dca / plot_every)
        ecs_win = 'encoder grad (%s)' % hostname
        dcs_win = 'decoder grad (%s)' % hostname
        vis.line(np.array(ecs), win=ecs_win, opts={'title': ecs_win})
        vis.line(np.array(dcs), win=dcs_win, opts={'title': dcs_win})
        eca = 0
        dca = 0

================================================ FILE: seq2seq-translation/seq2seq-translation.ipynb ================================================ { "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "![](https://i.imgur.com/eBRPvWB.png)\n", "\n", "# Practical PyTorch: Translation with a Sequence to Sequence Network and Attention\n", "\n", "In this project we will be teaching a neural network to translate from French to English.\n", "\n", "```\n", "[KEY: > input, = target, < output]\n", "\n", "> il est en train de peindre un tableau .\n", "= he is painting a picture .\n", "< he is painting a picture .\n", "\n", "> pourquoi ne pas essayer ce vin delicieux ?\n",
"= why not try that delicious wine ?\n", "< why not try that delicious wine ?\n", "\n", "> elle n est pas poete mais romanciere .\n", "= she is not a poet but a novelist .\n", "< she not not a poet but a novelist .\n", "\n", "> vous etes trop maigre .\n", "= you re too skinny .\n", "< you re all alone .\n", "```\n", "\n", "... to varying degrees of success.\n", "\n", "This is made possible by the simple but powerful idea of the [sequence to sequence network](http://arxiv.org/abs/1409.3215), in which two recurrent neural networks work together to transform one sequence to another. An encoder network condenses an input sequence into a single vector, and a decoder network unfolds that vector into a new sequence.\n", "\n", "To improve upon this model we'll use an [attention mechanism](https://arxiv.org/abs/1409.0473), which lets the decoder learn to focus over a specific range of the input sequence." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# The Sequence to Sequence model\n", "\n", "A [Sequence to Sequence network](http://arxiv.org/abs/1409.3215), or seq2seq network, or [Encoder Decoder network](https://arxiv.org/pdf/1406.1078v3.pdf), is a model consisting of two separate RNNs called the **encoder** and **decoder**. The encoder reads an input sequence one item at a time, and outputs a vector at each step. The final output of the encoder is kept as the **context** vector. The decoder uses this context vector to produce a sequence of outputs one step at a time.\n", "\n", "![](https://i.imgur.com/tVtHhNp.png)\n", "\n", "When using a single RNN, there is a one-to-one relationship between inputs and outputs. We would quickly run into problems with different sequence orders and lengths that are common during translation. Consider the simple sentence \"Je ne suis pas le chat noir\" → \"I am not the black cat\". Many of the words have a pretty direct translation, like \"chat\" → \"cat\". 
However, the differing grammars cause words to be in different orders, e.g. \"chat noir\" and \"black cat\". There is also the \"ne ... pas\" → \"not\" construction that makes the two sentences have different lengths.\n", "\n", "With the seq2seq model, by encoding many inputs into one vector, and decoding from one vector into many outputs, we are freed from the constraints of sequence order and length. The encoded sequence is represented by a single vector, a single point in some N-dimensional space of sequences. In an ideal case, this point can be considered the \"meaning\" of the sequence.\n", "\n", "This idea can be extended beyond sequences. Image captioning tasks take an [image as input, and output a description](https://arxiv.org/abs/1411.4555) of the image (img2seq). Some image generation tasks take a [description as input and output a generated image](https://arxiv.org/abs/1511.02793) (seq2img). These models can be referred to more generally as \"encoder decoder\" networks." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## The Attention Mechanism\n", "\n", "The fixed-length vector carries the burden of encoding the entire \"meaning\" of the input sequence, no matter how long that may be. With all the variance in language, this is a very hard problem. Imagine two nearly identical sentences, twenty words long, with only one word different. Both the encoder and decoder must be nuanced enough to represent that change as a very slightly different point in space.\n", "\n", "The **attention mechanism** [introduced by Bahdanau et al.](https://arxiv.org/abs/1409.0473) addresses this by giving the decoder a way to \"pay attention\" to parts of the input, rather than relying on a single vector. For every step the decoder can select a different part of the input sentence to consider.\n", "\n", "![](https://i.imgur.com/5y6SCvU.png)\n", "\n", "Attention is calculated with another feedforward layer in the decoder.
This layer will use the current input and hidden state to create a new vector, which is the same size as the input sequence (in practice, a fixed maximum length). This vector is processed through softmax to create *attention weights*, which are multiplied by the encoders' outputs to create a new context vector, which is then used to predict the next output.\n", "\n", "![](https://i.imgur.com/K1qMPxs.png)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Requirements\n", "\n", "You will need [PyTorch](http://pytorch.org/) to build and train the models, and [matplotlib](https://matplotlib.org/) for plotting training and visualizing attention outputs later." ] }, { "cell_type": "code", "execution_count": 1, "metadata": { "collapsed": true }, "outputs": [], "source": [ "import unicodedata\n", "import string\n", "import re\n", "import random\n", "import time\n", "import math\n", "\n", "import torch\n", "import torch.nn as nn\n", "from torch.autograd import Variable\n", "from torch import optim\n", "import torch.nn.functional as F" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Here we will also define a constant to decide whether to use the GPU (with CUDA specifically) or the CPU. **If you don't have a GPU, set this to `False`**. Later when we create tensors, this variable will be used to decide whether we keep them on CPU or move them to GPU." 
] }, { "cell_type": "code", "execution_count": 2, "metadata": { "collapsed": true }, "outputs": [], "source": [ "USE_CUDA = True" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Loading data files\n", "\n", "The data for this project is a set of many thousands of English to French translation pairs.\n", "\n", "[This question on Open Data Stack Exchange](http://opendata.stackexchange.com/questions/3888/dataset-of-sentences-translated-into-many-languages) pointed me to the open translation site http://tatoeba.org/ which has downloads available at http://tatoeba.org/eng/downloads - and better yet, someone did the extra work of splitting language pairs into individual text files here: http://www.manythings.org/anki/\n", "\n", "The English to French pairs are too big to include in the repo, so download `fra-eng.zip`, extract the text file in there, and rename it to `data/eng-fra.txt` before continuing (for some reason the zipfile is named backwards). The file is a tab separated list of translation pairs:\n", "\n", "```\n", "I am cold. J'ai froid.\n", "```" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Similar to the character encoding used in the character-level RNN tutorials, we will be representing each word in a language as a one-hot vector, or giant vector of zeros except for a single one (at the index of the word). Compared to the dozens of characters that might exist in a language, there are many many more words, so the encoding vector is much larger. We will however cheat a bit and trim the data to only use a few thousand words per language." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Indexing words\n", "\n", "We'll need a unique index per word to use as the inputs and targets of the networks later. 
To keep track of all this, we will use a helper class called `Lang` which has word → index (`word2index`) and index → word (`index2word`) dictionaries, as well as a count of each word (`word2count`), used later to replace rare words." ] }, { "cell_type": "code", "execution_count": 3, "metadata": { "collapsed": true }, "outputs": [], "source": [ "SOS_token = 0\n", "EOS_token = 1\n", "\n", "class Lang:\n", " def __init__(self, name):\n", " self.name = name\n", " self.word2index = {}\n", " self.word2count = {}\n", " self.index2word = {0: \"SOS\", 1: \"EOS\"}\n", " self.n_words = 2 # Count SOS and EOS\n", " \n", " def index_words(self, sentence):\n", " for word in sentence.split(' '):\n", " self.index_word(word)\n", "\n", " def index_word(self, word):\n", " if word not in self.word2index:\n", " self.word2index[word] = self.n_words\n", " self.word2count[word] = 1\n", " self.index2word[self.n_words] = word\n", " self.n_words += 1\n", " else:\n", " self.word2count[word] += 1" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Reading and decoding files\n", "\n", "The files are all in Unicode; to simplify, we will turn Unicode characters into ASCII, make everything lowercase, and trim most punctuation." ] }, { "cell_type": "code", "execution_count": 4, "metadata": { "collapsed": false }, "outputs": [], "source": [ "# Turn a Unicode string to plain ASCII, thanks to http://stackoverflow.com/a/518232/2809427\n", "def unicode_to_ascii(s):\n", " return ''.join(\n", " c for c in unicodedata.normalize('NFD', s)\n", " if unicodedata.category(c) != 'Mn'\n", " )\n", "\n", "# Lowercase, trim, and remove non-letter characters\n", "def normalize_string(s):\n", " s = unicode_to_ascii(s.lower().strip())\n", " s = re.sub(r\"([.!?])\", r\" \\1\", s)\n", " s = re.sub(r\"[^a-zA-Z.!?]+\", r\" \", s)\n", " return s" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "To read the data file, we will split the file into lines, and then split lines into pairs.
The files are all English → Other Language, so if we want to translate from Other Language → English I added the `reverse` flag to reverse the pairs." ] }, { "cell_type": "code", "execution_count": 5, "metadata": { "collapsed": true }, "outputs": [], "source": [ "def read_langs(lang1, lang2, reverse=False):\n", " print(\"Reading lines...\")\n", "\n", " # Read the file and split into lines\n", " lines = open('../data/%s-%s.txt' % (lang1, lang2)).read().strip().split('\\n')\n", " \n", " # Split every line into pairs and normalize\n", " pairs = [[normalize_string(s) for s in l.split('\\t')] for l in lines]\n", " \n", " # Reverse pairs, make Lang instances\n", " if reverse:\n", " pairs = [list(reversed(p)) for p in pairs]\n", " input_lang = Lang(lang2)\n", " output_lang = Lang(lang1)\n", " else:\n", " input_lang = Lang(lang1)\n", " output_lang = Lang(lang2)\n", " \n", " return input_lang, output_lang, pairs" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Filtering sentences\n", "\n", "Since there are a *lot* of example sentences and we want to train something quickly, we'll trim the data set to only relatively short and simple sentences. Here the maximum length is 10 words (that includes punctuation) and we're filtering to sentences that translate to the form \"I am\" or \"He is\" etc. (accounting for apostrophes being removed)." 
] }, { "cell_type": "code", "execution_count": 6, "metadata": { "collapsed": false }, "outputs": [], "source": [ "MAX_LENGTH = 10\n", "\n", "good_prefixes = (\n", " \"i am \", \"i m \",\n", " \"he is\", \"he s \",\n", " \"she is\", \"she s\",\n", " \"you are\", \"you re \"\n", ")\n", "\n", "def filter_pair(p):\n", " return len(p[0].split(' ')) < MAX_LENGTH and len(p[1].split(' ')) < MAX_LENGTH and \\\n", " p[1].startswith(good_prefixes)\n", "\n", "def filter_pairs(pairs):\n", " return [pair for pair in pairs if filter_pair(pair)]" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The full process for preparing the data is:\n", "\n", "* Read text file and split into lines, split lines into pairs\n", "* Normalize text, filter by length and content\n", "* Make word lists from sentences in pairs" ] }, { "cell_type": "code", "execution_count": 7, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Reading lines...\n", "Read 135842 sentence pairs\n", "Trimmed to 9129 sentence pairs\n", "Indexing words...\n", "['il est paresseux .', 'he s lazy .']\n" ] } ], "source": [ "def prepare_data(lang1_name, lang2_name, reverse=False):\n", " input_lang, output_lang, pairs = read_langs(lang1_name, lang2_name, reverse)\n", " print(\"Read %s sentence pairs\" % len(pairs))\n", " \n", " pairs = filter_pairs(pairs)\n", " print(\"Trimmed to %s sentence pairs\" % len(pairs))\n", " \n", " print(\"Indexing words...\")\n", " for pair in pairs:\n", " input_lang.index_words(pair[0])\n", " output_lang.index_words(pair[1])\n", "\n", " return input_lang, output_lang, pairs\n", "\n", "input_lang, output_lang, pairs = prepare_data('eng', 'fra', True)\n", "\n", "# Print an example pair\n", "print(random.choice(pairs))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Turning training data into Tensors/Variables\n", "\n", "To train we need to turn the sentences into something the neural network can understand, which of 
course means numbers. Each sentence will be split into words and turned into a Tensor, where each word is replaced with the index (from the Lang indexes made earlier). While creating these tensors we will also append the EOS token to signal that the sentence is over.\n", "\n", "![](https://i.imgur.com/LzocpGH.png)\n", "\n", "A Tensor is a multi-dimensional array of numbers, defined with some type e.g. FloatTensor or LongTensor. In this case we'll be using LongTensor to represent an array of integer indexes.\n", "\n", "Trainable PyTorch modules take Variables as input, rather than plain Tensors. A Variable is basically a Tensor that is able to keep track of the graph state, which is what makes autograd (automatic calculation of backwards gradients) possible." ] }, { "cell_type": "code", "execution_count": 8, "metadata": { "collapsed": false }, "outputs": [], "source": [ "# Return a list of indexes, one for each word in the sentence\n", "def indexes_from_sentence(lang, sentence):\n", " return [lang.word2index[word] for word in sentence.split(' ')]\n", "\n", "def variable_from_sentence(lang, sentence):\n", " indexes = indexes_from_sentence(lang, sentence)\n", " indexes.append(EOS_token)\n", " var = Variable(torch.LongTensor(indexes).view(-1, 1))\n", "# print('var =', var)\n", " if USE_CUDA: var = var.cuda()\n", " return var\n", "\n", "def variables_from_pair(pair):\n", " input_variable = variable_from_sentence(input_lang, pair[0])\n", " target_variable = variable_from_sentence(output_lang, pair[1])\n", " return (input_variable, target_variable)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Building the models" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## The Encoder\n", "\n", "\n", "\n", "The encoder of a seq2seq network is a RNN that outputs some value for every word from the input sentence. For every input word the encoder outputs a vector and a hidden state, and uses the hidden state for the next input word." 
] }, { "cell_type": "code", "execution_count": 9, "metadata": { "collapsed": false }, "outputs": [], "source": [ "class EncoderRNN(nn.Module):\n", " def __init__(self, input_size, hidden_size, n_layers=1):\n", " super(EncoderRNN, self).__init__()\n", " \n", " self.input_size = input_size\n", " self.hidden_size = hidden_size\n", " self.n_layers = n_layers\n", " \n", " self.embedding = nn.Embedding(input_size, hidden_size)\n", " self.gru = nn.GRU(hidden_size, hidden_size, n_layers)\n", " \n", " def forward(self, word_inputs, hidden):\n", " # Note: we run this all at once (over the whole input sequence)\n", " seq_len = len(word_inputs)\n", " embedded = self.embedding(word_inputs).view(seq_len, 1, -1)\n", " output, hidden = self.gru(embedded, hidden)\n", " return output, hidden\n", "\n", " def init_hidden(self):\n", " hidden = Variable(torch.zeros(self.n_layers, 1, self.hidden_size))\n", " if USE_CUDA: hidden = hidden.cuda()\n", " return hidden" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Attention Decoder" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Interpreting the Bahdanau et al. model\n", "\n", "The attention model in [Neural Machine Translation by Jointly Learning to Align and Translate](https://arxiv.org/abs/1409.0473) is described as the following series of equations.\n", "\n", "Each decoder output is conditioned on the previous outputs and some $\\mathbf x$, where $\\mathbf x$ consists of the current hidden state (which takes into account previous outputs) and the attention \"context\", which is calculated below. 
The function $g$ is a fully-connected layer with a nonlinear activation, which takes as input the values $y_{i-1}$, $s_i$, and $c_i$ concatenated.\n", "\n", "$$\n", "p(y_i \\mid \\{y_1,...,y_{i-1}\\},\\mathbf{x}) = g(y_{i-1}, s_i, c_i)\n", "$$\n", "\n", "The current hidden state $s_i$ is calculated by an RNN $f$ with the last hidden state $s_{i-1}$, last decoder output value $y_{i-1}$, and context vector $c_i$.\n", "\n", "In the code, the RNN will be a `nn.GRU` layer, the hidden state $s_i$ will be called `hidden`, the output $y_i$ called `output`, and context $c_i$ called `context`.\n", "\n", "$$\n", "s_i = f(s_{i-1}, y_{i-1}, c_i)\n", "$$\n", "\n", "The context vector $c_i$ is a weighted sum of all encoder outputs, where each weight $a_{ij}$ is the amount of \"attention\" paid to the corresponding encoder output $h_j$.\n", "\n", "$$\n", "c_i = \\sum_{j=1}^{T_x} a_{ij} h_j\n", "$$\n", "\n", "... where each weight $a_{ij}$ is a normalized (over all steps) attention \"energy\" $e_{ij}$ ...\n", "\n", "$$\n", "a_{ij} = \\dfrac{exp(e_{ij})}{\\sum_{k=1}^{T_x} exp(e_{ik})}\n", "$$\n", "\n", "... where each attention energy is calculated with some function $a$ (such as another linear layer) using the last hidden state $s_{i-1}$ and that particular encoder output $h_j$:\n", "\n", "$$\n", "e_{ij} = a(s_{i-1}, h_j)\n", "$$" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Implementing the Bahdanau et al.
model\n", "\n", "In summary, our decoder should consist of four main parts - an embedding layer turning an input word into a vector; a layer to calculate the attention energy per encoder output; an RNN layer; and an output layer.\n", "\n", "The decoder's inputs are the last RNN hidden state $s_{i-1}$, last output $y_{i-1}$, and all encoder outputs $h_*$.\n", "\n", "* embedding layer with inputs $y_{i-1}$\n", " * `embedded = embedding(last_rnn_output)`\n", "* attention layer $a$ with inputs $(s_{i-1}, h_j)$ and outputs $e_{ij}$, normalized to create $a_{ij}$\n", " * `attn_energies[j] = attn_layer(last_hidden, encoder_outputs[j])`\n", " * `attn_weights = normalize(attn_energies)`\n", "* context vector $c_i$ as an attention-weighted average of encoder outputs\n", " * `context = sum(attn_weights * encoder_outputs)`\n", "* RNN layer(s) $f$ with inputs $(s_{i-1}, y_{i-1}, c_i)$ and internal hidden state, outputting $s_i$\n", " * `rnn_input = concat(embedded, context)`\n", " * `rnn_output, rnn_hidden = rnn(rnn_input, last_hidden)`\n", "* an output layer $g$ with inputs $(y_{i-1}, s_i, c_i)$, outputting $y_i$\n", " * `output = out(embedded, rnn_output, context)`" ] }, { "cell_type": "code", "execution_count": 10, "metadata": { "collapsed": true }, "outputs": [], "source": [ "class BahdanauAttnDecoderRNN(nn.Module):\n", " def __init__(self, hidden_size, output_size, n_layers=1, dropout_p=0.1, max_length=MAX_LENGTH):\n", " super(BahdanauAttnDecoderRNN, self).__init__()\n", " \n", " # Define parameters\n", " self.hidden_size = hidden_size\n", " self.output_size = output_size\n", " self.n_layers = n_layers\n", " self.dropout_p = dropout_p\n", " self.max_length = max_length\n", " \n", " # Define layers\n", " self.embedding = nn.Embedding(output_size, hidden_size)\n", " self.dropout = nn.Dropout(dropout_p)\n", " self.attn = GeneralAttn(hidden_size) # NOTE: assumes an attention module like the Attn class defined below\n", " self.gru = nn.GRU(hidden_size * 2, hidden_size, n_layers, dropout=dropout_p)\n", " self.out = nn.Linear(hidden_size * 2, output_size)\n", " \n", " def forward(self, word_input, last_hidden, encoder_outputs):\n", " # Note that we will only be running forward for a single decoder time step, but will use all encoder outputs\n", " \n", " # Get the embedding of the current input word (last output word)\n", " word_embedded = self.embedding(word_input).view(1, 1, -1) # S=1 x B x N\n", " word_embedded = self.dropout(word_embedded)\n", " \n", " # Calculate attention weights and apply to encoder outputs\n", " attn_weights = self.attn(last_hidden[-1], encoder_outputs)\n", " context = attn_weights.bmm(encoder_outputs.transpose(0, 1)) # B x 1 x N\n", " \n", " # Combine embedded input word and attended context, run through RNN\n", " rnn_input = torch.cat((word_embedded, context), 2)\n", " output, hidden = self.gru(rnn_input, last_hidden)\n", " \n", " # Final output layer\n", " output = output.squeeze(0) # B x N\n", " context = context.squeeze(1) # B x 1 x N -> B x N\n", " output = F.log_softmax(self.out(torch.cat((output, context), 1)))\n", " \n", " # Return final output, hidden state, and attention weights (for visualization)\n", " return output, hidden, attn_weights" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Interpreting the Luong et al. model(s)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "[Effective Approaches to Attention-based Neural Machine Translation](https://arxiv.org/abs/1508.04025) by Luong et al. describes a few more attention models that offer improvements and simplifications.
They describe a few \"global attention\" models, the distinction between them being the way the attention scores are calculated.\n", "\n", "The general form of the attention calculation relies on the target (decoder) side hidden state and corresponding source (encoder) side state, normalized over all states to get values summing to 1:\n", "\n", "$$\n", "a_t(s) = align(h_t, \\bar h_s) = \\dfrac{exp(score(h_t, \\bar h_s))}{\\sum_{s'} exp(score(h_t, \\bar h_{s'}))}\n", "$$\n", "\n", "The specific \"score\" function that compares two states is either *dot*, a simple dot product between the states; *general*, a dot product between the decoder hidden state and a linear transform of the encoder state; or *concat*, a dot product between a new parameter $v_a$ and a linear transform of the states concatenated together.\n", "\n", "$$\n", "score(h_t, \\bar h_s) =\n", "\\begin{cases}\n", "h_t ^\\top \\bar h_s & dot \\\\\n", "h_t ^\\top \\textbf{W}_a \\bar h_s & general \\\\\n", "v_a ^\\top \\textbf{W}_a [ h_t ; \\bar h_s ] & concat\n", "\\end{cases}\n", "$$\n", "\n", "The modular definition of these scoring functions gives us an opportunity to build a specific attention module that can switch between the different score methods. The input to this module is always the hidden state (of the decoder RNN) and a set of encoder outputs.
] }, { "cell_type": "code", "execution_count": 11, "metadata": { "collapsed": true }, "outputs": [], "source": [ "class Attn(nn.Module):\n", " def __init__(self, method, hidden_size, max_length=MAX_LENGTH):\n", " super(Attn, self).__init__()\n", " \n", " self.method = method\n", " self.hidden_size = hidden_size\n", " \n", " if self.method == 'general':\n", " self.attn = nn.Linear(self.hidden_size, hidden_size)\n", "\n", " elif self.method == 'concat':\n", " self.attn = nn.Linear(self.hidden_size * 2, hidden_size)\n", " self.other = nn.Parameter(torch.FloatTensor(1, hidden_size))\n", "\n", " def forward(self, hidden, encoder_outputs):\n", " seq_len = len(encoder_outputs)\n", "\n", " # Create variable to store attention energies\n", " attn_energies = Variable(torch.zeros(seq_len)) # B x 1 x S\n", " if USE_CUDA: attn_energies = attn_energies.cuda()\n", "\n", " # Calculate energies for each encoder output\n", " for i in range(seq_len):\n", " attn_energies[i] = self.score(hidden, encoder_outputs[i])\n", "\n", " # Normalize energies to weights in range 0 to 1, resize to 1 x 1 x seq_len\n", " return F.softmax(attn_energies).unsqueeze(0).unsqueeze(0)\n", " \n", " def score(self, hidden, encoder_output):\n", " \n", " if self.method == 'dot':\n", " energy = hidden.dot(encoder_output)\n", " return energy\n", " \n", " elif self.method == 'general':\n", " energy = self.attn(encoder_output)\n", " energy = hidden.dot(energy)\n", " return energy\n", " \n", " elif self.method == 'concat':\n", " energy = self.attn(torch.cat((hidden, encoder_output), 1))\n", " energy = self.other.dot(energy)\n", " return energy" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now we can build a decoder that plugs this Attn module in after the RNN to calculate attention weights, and apply those weights to the encoder outputs to get a context vector." 
] }, { "cell_type": "code", "execution_count": 12, "metadata": { "collapsed": false }, "outputs": [], "source": [ "class AttnDecoderRNN(nn.Module):\n", " def __init__(self, attn_model, hidden_size, output_size, n_layers=1, dropout_p=0.1):\n", " super(AttnDecoderRNN, self).__init__()\n", " \n", " # Keep parameters for reference\n", " self.attn_model = attn_model\n", " self.hidden_size = hidden_size\n", " self.output_size = output_size\n", " self.n_layers = n_layers\n", " self.dropout_p = dropout_p\n", " \n", " # Define layers\n", " self.embedding = nn.Embedding(output_size, hidden_size)\n", " self.gru = nn.GRU(hidden_size * 2, hidden_size, n_layers, dropout=dropout_p)\n", " self.out = nn.Linear(hidden_size * 2, output_size)\n", " \n", " # Choose attention model\n", " if attn_model != 'none':\n", " self.attn = Attn(attn_model, hidden_size)\n", " \n", " def forward(self, word_input, last_context, last_hidden, encoder_outputs):\n", " # Note: we run this one step at a time\n", " \n", " # Get the embedding of the current input word (last output word)\n", " word_embedded = self.embedding(word_input).view(1, 1, -1) # S=1 x B x N\n", " \n", " # Combine embedded input word and last context, run through RNN\n", " rnn_input = torch.cat((word_embedded, last_context.unsqueeze(0)), 2)\n", " rnn_output, hidden = self.gru(rnn_input, last_hidden)\n", "\n", " # Calculate attention from current RNN state and all encoder outputs; apply to encoder outputs\n", " attn_weights = self.attn(rnn_output.squeeze(0), encoder_outputs)\n", " context = attn_weights.bmm(encoder_outputs.transpose(0, 1)) # B x 1 x N\n", " \n", " # Final output layer (next word prediction) using the RNN hidden state and context vector\n", " rnn_output = rnn_output.squeeze(0) # S=1 x B x N -> B x N\n", " context = context.squeeze(1) # B x S=1 x N -> B x N\n", " output = F.log_softmax(self.out(torch.cat((rnn_output, context), 1)))\n", " \n", " # Return final output, hidden state, and attention weights (for 
visualization)\n", " return output, context, hidden, attn_weights" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Testing the models\n", "\n", "To make sure the Encoder and Decoder model are working (and working together) we'll do a quick test with fake word inputs:" ] }, { "cell_type": "code", "execution_count": 13, "metadata": { "collapsed": false, "scrolled": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "EncoderRNN (\n", " (embedding): Embedding(10, 10)\n", " (gru): GRU(10, 10, num_layers=2)\n", ")\n", "AttnDecoderRNN (\n", " (embedding): Embedding(10, 10)\n", " (gru): GRU(20, 10, num_layers=2, dropout=0.1)\n", " (out): Linear (20 -> 10)\n", " (attn): Attn (\n", " (attn): Linear (10 -> 10)\n", " )\n", ")\n", "torch.Size([1, 10]) torch.Size([2, 1, 10]) torch.Size([1, 1, 3])\n", "torch.Size([1, 10]) torch.Size([2, 1, 10]) torch.Size([1, 1, 3])\n", "torch.Size([1, 10]) torch.Size([2, 1, 10]) torch.Size([1, 1, 3])\n" ] } ], "source": [ "encoder_test = EncoderRNN(10, 10, 2)\n", "decoder_test = AttnDecoderRNN('general', 10, 10, 2)\n", "print(encoder_test)\n", "print(decoder_test)\n", "\n", "encoder_hidden = encoder_test.init_hidden()\n", "word_input = Variable(torch.LongTensor([1, 2, 3]))\n", "if USE_CUDA:\n", " encoder_test.cuda()\n", " word_input = word_input.cuda()\n", "encoder_outputs, encoder_hidden = encoder_test(word_input, encoder_hidden)\n", "\n", "word_inputs = Variable(torch.LongTensor([1, 2, 3]))\n", "decoder_attns = torch.zeros(1, 3, 3)\n", "decoder_hidden = encoder_hidden\n", "decoder_context = Variable(torch.zeros(1, decoder_test.hidden_size))\n", "\n", "if USE_CUDA:\n", " decoder_test.cuda()\n", " word_inputs = word_inputs.cuda()\n", " decoder_context = decoder_context.cuda()\n", "\n", "for i in range(3):\n", " decoder_output, decoder_context, decoder_hidden, decoder_attn = decoder_test(word_inputs[i], decoder_context, decoder_hidden, encoder_outputs)\n", " print(decoder_output.size(), 
decoder_hidden.size(), decoder_attn.size())\n", " decoder_attns[0, i] = decoder_attn.squeeze(0).cpu().data" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Training\n", "\n", "## Defining a training iteration\n", "\n", "To train we first run the input sentence through the encoder word by word, and keep track of every output and the latest hidden state. Next the decoder is given the last hidden state of the encoder as its first hidden state, and the `<SOS>` token as its first input. From there we iterate to predict the next token from the decoder.\n", "\n", "### Teacher Forcing and Scheduled Sampling\n", "\n", "\"Teacher Forcing\", or maximum likelihood sampling, means using the real target outputs as each next input when training. The alternative is using the decoder's own guess as the next input. Using teacher forcing may cause the network to converge faster, but [when the trained network is exploited, it may exhibit instability](http://minds.jacobs-university.de/sites/default/files/uploads/papers/ESNTutorialRev.pdf).\n", "\n", "You can observe outputs of teacher-forced networks that read with coherent grammar but wander far from the correct translation - you could think of it as having learned how to listen to the teacher's instructions, without learning how to venture out on its own.\n", "\n", "The solution to the teacher-forcing \"problem\" is known as [Scheduled Sampling](https://arxiv.org/abs/1506.03099), which simply alternates between using the target values and predicted values when training. We will randomly choose to use teacher forcing with an if statement while training - sometimes we'll feed the real target as the input (ignoring the decoder's output), sometimes we'll use the decoder's output."
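] }, { "cell_type": "markdown", "metadata": {}, "source": [ "As a rough pseudocode sketch of that per-step choice (with a hypothetical `decode_step` standing in for the full decoder call, and `argmax` for picking the top-scoring word):\n", "\n", "```python\n", "use_teacher_forcing = random.random() < teacher_forcing_ratio\n", "output = decode_step(decoder_input)\n", "# Next input: the ground-truth target when forcing, otherwise the decoder's own guess\n", "decoder_input = target[di] if use_teacher_forcing else argmax(output)\n", "```"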
] }, { "cell_type": "code", "execution_count": 14, "metadata": { "collapsed": false }, "outputs": [], "source": [ "teacher_forcing_ratio = 0.5\n", "clip = 5.0\n", "\n", "def train(input_variable, target_variable, encoder, decoder, encoder_optimizer, decoder_optimizer, criterion, max_length=MAX_LENGTH):\n", "\n", " # Zero gradients of both optimizers\n", " encoder_optimizer.zero_grad()\n", " decoder_optimizer.zero_grad()\n", " loss = 0 # Added onto for each word\n", "\n", " # Get size of input and target sentences\n", " input_length = input_variable.size()[0]\n", " target_length = target_variable.size()[0]\n", "\n", " # Run words through encoder\n", " encoder_hidden = encoder.init_hidden()\n", " encoder_outputs, encoder_hidden = encoder(input_variable, encoder_hidden)\n", " \n", " # Prepare input and output variables\n", " decoder_input = Variable(torch.LongTensor([[SOS_token]]))\n", " decoder_context = Variable(torch.zeros(1, decoder.hidden_size))\n", " decoder_hidden = encoder_hidden # Use last hidden state from encoder to start decoder\n", " if USE_CUDA:\n", " decoder_input = decoder_input.cuda()\n", " decoder_context = decoder_context.cuda()\n", "\n", " # Choose whether to use teacher forcing\n", " use_teacher_forcing = random.random() < teacher_forcing_ratio\n", " if use_teacher_forcing:\n", " \n", " # Teacher forcing: Use the ground-truth target as the next input\n", " for di in range(target_length):\n", " decoder_output, decoder_context, decoder_hidden, decoder_attention = decoder(decoder_input, decoder_context, decoder_hidden, encoder_outputs)\n", " loss += criterion(decoder_output, target_variable[di])\n", " decoder_input = target_variable[di] # Next target is next input\n", "\n", " else:\n", " # Without teacher forcing: use network's own prediction as the next input\n", " for di in range(target_length):\n", " decoder_output, decoder_context, decoder_hidden, decoder_attention = decoder(decoder_input, decoder_context, decoder_hidden, encoder_outputs)\n", " 
loss += criterion(decoder_output, target_variable[di])\n", " \n", " # Get most likely word index (highest value) from output\n", " topv, topi = decoder_output.data.topk(1)\n", " ni = topi[0][0]\n", " \n", " decoder_input = Variable(torch.LongTensor([[ni]])) # Chosen word is next input\n", " if USE_CUDA: decoder_input = decoder_input.cuda()\n", "\n", " # Stop at end of sentence (not necessary when using known targets)\n", " if ni == EOS_token: break\n", "\n", " # Backpropagation\n", " loss.backward()\n", " torch.nn.utils.clip_grad_norm(encoder.parameters(), clip)\n", " torch.nn.utils.clip_grad_norm(decoder.parameters(), clip)\n", " encoder_optimizer.step()\n", " decoder_optimizer.step()\n", " \n", " return loss.data[0] / target_length" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Finally helper functions to print time elapsed and estimated time remaining, given the current time and progress." ] }, { "cell_type": "code", "execution_count": 15, "metadata": { "collapsed": false }, "outputs": [], "source": [ "def as_minutes(s):\n", " m = math.floor(s / 60)\n", " s -= m * 60\n", " return '%dm %ds' % (m, s)\n", "\n", "def time_since(since, percent):\n", " now = time.time()\n", " s = now - since\n", " es = s / (percent)\n", " rs = es - s\n", " return '%s (- %s)' % (as_minutes(s), as_minutes(rs))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Running training\n", "\n", "With everything in place we can actually initialize a network and start training.\n", "\n", "To start, we initialize models, optimizers, and a loss function (criterion)." 
] }, { "cell_type": "code", "execution_count": 16, "metadata": { "collapsed": false }, "outputs": [], "source": [ "attn_model = 'general'\n", "hidden_size = 500\n", "n_layers = 2\n", "dropout_p = 0.05\n", "\n", "# Initialize models\n", "encoder = EncoderRNN(input_lang.n_words, hidden_size, n_layers)\n", "decoder = AttnDecoderRNN(attn_model, hidden_size, output_lang.n_words, n_layers, dropout_p=dropout_p)\n", "\n", "# Move models to GPU\n", "if USE_CUDA:\n", " encoder.cuda()\n", " decoder.cuda()\n", "\n", "# Initialize optimizers and criterion\n", "learning_rate = 0.0001\n", "encoder_optimizer = optim.Adam(encoder.parameters(), lr=learning_rate)\n", "decoder_optimizer = optim.Adam(decoder.parameters(), lr=learning_rate)\n", "criterion = nn.NLLLoss()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Then set up variables for plotting and tracking progress:" ] }, { "cell_type": "code", "execution_count": 17, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Starting job 591f9f701438c4613b4c4dc7 at 2017-05-20 03:02:21\n" ] } ], "source": [ "# Configuring training\n", "n_epochs = 50000\n", "plot_every = 200\n", "print_every = 1000\n", "\n", "# Keep track of time elapsed and running averages\n", "start = time.time()\n", "plot_losses = []\n", "print_loss_total = 0 # Reset every print_every\n", "plot_loss_total = 0 # Reset every plot_every" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "To actually train, we call the train function many times, printing a summary as we go.\n", "\n", "*Note:* If you run this notebook you can train, interrupt the kernel, evaluate, and continue training later. You can comment out the lines above where the encoder and decoder are initialized (so they aren't reset) or simply run the notebook starting from the following cell." 
] }, { "cell_type": "code", "execution_count": 18, "metadata": { "collapsed": false, "scrolled": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "[log] 0m 42s (1000) 1.7562\n", "0m 42s (- 35m 0s) (1000 2%) 3.2168\n", "[log] 1m 28s (2000) 3.4178\n", "1m 28s (- 35m 14s) (2000 4%) 2.8085\n", "[log] 2m 13s (3000) 1.9268\n", "2m 13s (- 34m 50s) (3000 6%) 2.6295\n", "[log] 2m 59s (4000) 3.5481\n", "2m 59s (- 34m 24s) (4000 8%) 2.5226\n", "[log] 3m 45s (5000) 2.1306\n", "3m 45s (- 33m 51s) (5000 10%) 2.3431\n", "[log] 4m 31s (6000) 2.4112\n", "4m 31s (- 33m 9s) (6000 12%) 2.2012\n", "[log] 5m 16s (7000) 2.0306\n", "5m 16s (- 32m 26s) (7000 14%) 2.1778\n", "[log] 6m 3s (8000) 0.7172\n", "6m 3s (- 31m 46s) (8000 16%) 2.0516\n", "[log] 6m 49s (9000) 0.9867\n", "6m 49s (- 31m 3s) (9000 18%) 1.9482\n", "[log] 7m 35s (10000) 0.7058\n", "7m 35s (- 30m 22s) (10000 20%) 1.8463\n", "[log] 8m 22s (11000) 4.0532\n", "8m 22s (- 29m 40s) (11000 22%) 1.8389\n", "[log] 9m 8s (12000) 0.7909\n", "9m 8s (- 28m 56s) (12000 24%) 1.7710\n", "[log] 9m 53s (13000) 3.5531\n", "9m 53s (- 28m 10s) (13000 26%) 1.6996\n", "[log] 10m 40s (14000) 0.3723\n", "10m 40s (- 27m 26s) (14000 28%) 1.6392\n", "[log] 11m 27s (15000) 1.0735\n", "11m 27s (- 26m 43s) (15000 30%) 1.5887\n", "[log] 12m 13s (16000) 0.0578\n", "12m 13s (- 25m 58s) (16000 32%) 1.5159\n", "[log] 12m 59s (17000) 1.4627\n", "12m 59s (- 25m 13s) (17000 34%) 1.4669\n", "[log] 13m 46s (18000) 2.7408\n", "13m 46s (- 24m 28s) (18000 36%) 1.3980\n", "[log] 14m 33s (19000) 0.3085\n", "14m 33s (- 23m 44s) (19000 38%) 1.3614\n", "[log] 15m 19s (20000) 0.8777\n", "15m 19s (- 22m 59s) (20000 40%) 1.3163\n", "[log] 16m 6s (21000) 1.8394\n", "16m 6s (- 22m 15s) (21000 42%) 1.2799\n", "[log] 16m 53s (22000) 0.8656\n", "16m 53s (- 21m 29s) (22000 44%) 1.2038\n", "[log] 17m 39s (23000) 3.5788\n", "17m 39s (- 20m 43s) (23000 46%) 1.1853\n", "[log] 18m 26s (24000) 1.3385\n", "18m 26s (- 19m 58s) (24000 48%) 1.1643\n", "[log] 19m 
12s (25000) 0.0158\n", "19m 12s (- 19m 12s) (25000 50%) 1.1351\n", "[log] 19m 59s (26000) 0.7937\n", "19m 59s (- 18m 27s) (26000 52%) 1.1285\n", "[log] 20m 46s (27000) 1.3123\n", "20m 46s (- 17m 41s) (27000 54%) 1.0553\n", "[log] 21m 32s (28000) 1.6989\n", "21m 32s (- 16m 55s) (28000 56%) 1.0265\n", "[log] 22m 18s (29000) 2.2208\n", "22m 18s (- 16m 9s) (29000 57%) 0.9440\n", "[log] 23m 4s (30000) 0.1320\n", "23m 4s (- 15m 23s) (30000 60%) 0.9769\n", "[log] 23m 51s (31000) 0.0043\n", "23m 51s (- 14m 37s) (31000 62%) 0.9395\n", "[log] 24m 37s (32000) 0.0119\n", "24m 37s (- 13m 51s) (32000 64%) 0.8899\n", "[log] 25m 23s (33000) 0.2071\n", "25m 23s (- 13m 5s) (33000 66%) 0.9135\n", "[log] 26m 10s (34000) 0.0169\n", "26m 10s (- 12m 19s) (34000 68%) 0.8698\n", "[log] 26m 57s (35000) 0.7662\n", "26m 57s (- 11m 33s) (35000 70%) 0.8209\n", "[log] 27m 43s (36000) 0.1208\n", "27m 43s (- 10m 46s) (36000 72%) 0.7931\n", "[log] 28m 29s (37000) 0.3535\n", "28m 29s (- 10m 0s) (37000 74%) 0.7899\n", "[log] 29m 15s (38000) 1.3398\n", "29m 15s (- 9m 14s) (38000 76%) 0.7603\n", "[log] 30m 2s (39000) 0.0115\n", "30m 2s (- 8m 28s) (39000 78%) 0.7454\n", "[log] 30m 48s (40000) 0.2135\n", "30m 48s (- 7m 42s) (40000 80%) 0.6740\n", "[log] 31m 34s (41000) 1.1087\n", "31m 34s (- 6m 55s) (41000 82%) 0.6738\n", "[log] 32m 20s (42000) 0.0262\n", "32m 20s (- 6m 9s) (42000 84%) 0.6659\n", "[log] 33m 7s (43000) 1.2855\n", "33m 7s (- 5m 23s) (43000 86%) 0.7443\n", "[log] 33m 54s (44000) 0.0022\n", "33m 54s (- 4m 37s) (44000 88%) 0.6427\n", "[log] 34m 40s (45000) 0.5267\n", "34m 40s (- 3m 51s) (45000 90%) 0.6092\n", "[log] 35m 27s (46000) 0.0068\n", "35m 27s (- 3m 4s) (46000 92%) 0.6172\n", "[log] 36m 12s (47000) 0.5520\n", "36m 12s (- 2m 18s) (47000 94%) 0.6145\n", "[log] 36m 59s (48000) 0.0185\n", "36m 59s (- 1m 32s) (48000 96%) 0.5903\n", "[log] 37m 46s (49000) 0.0026\n", "37m 46s (- 0m 46s) (49000 98%) 0.6131\n", "[log] 38m 32s (50000) 0.0138\n", "38m 32s (- 0m 0s) (50000 100%) 0.5403\n" ] } ], 
"source": [ "# Begin!\n", "for epoch in range(1, n_epochs + 1):\n", " \n", " # Get training data for this cycle\n", " training_pair = variables_from_pair(random.choice(pairs))\n", " input_variable = training_pair[0]\n", " target_variable = training_pair[1]\n", "\n", " # Run the train function\n", " loss = train(input_variable, target_variable, encoder, decoder, encoder_optimizer, decoder_optimizer, criterion)\n", "\n", " # Keep track of loss\n", " print_loss_total += loss\n", " plot_loss_total += loss\n", "\n", " if epoch == 0: continue\n", "\n", " if epoch % print_every == 0:\n", " print_loss_avg = print_loss_total / print_every\n", " print_loss_total = 0\n", " print_summary = '%s (%d %d%%) %.4f' % (time_since(start, epoch / n_epochs), epoch, epoch / n_epochs * 100, print_loss_avg)\n", " print(print_summary)\n", "\n", " if epoch % plot_every == 0:\n", " plot_loss_avg = plot_loss_total / plot_every\n", " plot_losses.append(plot_loss_avg)\n", " plot_loss_total = 0" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Plotting training loss\n", "\n", "Plotting is done with matplotlib, using the array `plot_losses` that was created while training." 
] }, { "cell_type": "code", "execution_count": 19, "metadata": { "collapsed": false }, "outputs": [ { "data": { "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXcAAAD8CAYAAACMwORRAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzt3Xd4XNW59/3vrW713iVLtmUbN7nbGAM2pkNogdAJhHJ4\nQggkJ43kJCfnTc6TcPImgYSQBAgBDoSS4FCcAMGAMca9yHK35aZqFat3zcx6/thbsmyrjK2tfn+u\ny5c0M0sz99bA0p611/otMcaglFJqZPEZ7AKUUko5Tzt3pZQagbRzV0qpEUg7d6WUGoG0c1dKqRFI\nO3ellBqBtHNXSqkRSDt3pZQagbRzV0qpEchvsF44NjbWZGRkDNbLK6XUsLRly5YKY0xcb+0GrXPP\nyMhg8+bNg/XySik1LInIUW/aeT0sIyK+IrJNRFZ08djtIpIrIjtEZK2IZJ9JsUoppZx1JmfujwB7\ngPAuHjsMXGiMqRKRK4BngAUO1KeUUuoseHXmLiKpwFXAc109boxZa4ypsm+uB1KdKU8ppdTZ8HZY\n5gngO4DHi7b3Au919YCIPCAim0Vkc3l5uZcvrZRS6kz12rmLyNVAmTFmixdtl2J17t/t6nFjzDPG\nmLnGmLlxcb1e7FVKKXWWvBlzPw+4RkSuBIKAcBF52RhzR+dGIjIDa9jmCmPMcedLVUop5a1ez9yN\nMY8ZY1KNMRnALcDHXXTs6cBy4E5jzP5+qVQppZTXznqFqog8KCIP2jd/BMQAT4tIjoj02wT2fcfq\n+OW/9nG8vqW/XkIppYa9M1rEZIxZBayyv/9Dp/vvA+5zsrDuHCyv57cf53H1jGRiQgMH4iWVUmrY\nGXbZMv6+Vsltbm8m7iil1Og07Dp3P18BoFU7d6WU6pZT8QMiIr8RkTw7hmC2s2WeEGCfubvcpr9e\nQimlhr0zOXNvjx/oyhVAlv3vAeD3fayrWzoso5RSvXMkfgC4FnjJWNYDkSKS5FCNJ9FhGaWU6p1T\n8QMpQEGn24X2fSdxIn5Ah2WUUqp3jsYP9MaJ+AEdllFKqd55c+beHj9wBHgNuEhEXj6lTRGQ1ul2\nqn2f49qHZbRzV0qp7jkSPwC8A9xlz5pZCNQYY0qcL/fEsEybDssopVS3znqbvfboAXul6j+BK4E8\noBG4x5HquqDDMkop1Tun4gcM8JCThXVHh2WUUqp3w26Fqr8OyyilVK+8mS0TJCIbRWS7iOwSkf/q\nok2EiLzbqU0/DsvombtSSvXGm2GZFuAiY0y9iPgDa0TkPXuxUruHgN3GmC+ISBywT0ReMca0Ol1w\nx5m7Szt3pZTqTq+duz2eXm/f9Lf/nTomYoAwEREgFKgEXA7W2cHPxz5z9+iwjFJKdcfb+AFfEckB\nyoAPjTEbTmnyFHAOUAzsAB4xxvTLqbWI4O8rOiyjlFI98KpzN8a4jTEzsRYnzReRaac0uQzIAZKB\nmcBTIhJ+6vM4ET8A1tCMSzt3pZTq1hnNljHGVAOfAJef8tA9wHI7OCwPOAxM7uLn+xw/AFbnrrNl\nlFKqe97MlokTkUj7+zHAJcDeU5rlA8vsNgnAJOCQs6We4O8rmgqplFI98Ga2TBLwooj4Yv0xeMMY\ns+KUFao/AV4QkR2AAN81xlT0V9E6LKOUUj3zZrZMLjCri/s7r1AtBi51trTu6bCMUkr1bNitUAUr\ngkCHZZRSqnvDsnMP0GEZpZTqkSPxA3a7JSKSY7f51PlST9BhGaWU6pkj8QP2bJqngcuNMfkiE
t9P\n9QLWsIwuYlJKqe45FT9wG9Y893z7Z8qcLPJU1pm7du5KKdUdp+IHJgJRIrJKRLaIyF1OF9pZgA7L\nKKVUj5yKH/AD5gBXYUUR/FBEJp76PE7FD+iwjFJK9cyp+IFC4ANjTIO9eGk1kN3Fz2v8gFJKDQCn\n4gfeBhaLiJ+IBAMLgD1OF9suQMfclVKqR47EDxhj9ojI+0Au4AGeM8bs7LeidVhGKaV65Ej8gH37\nF8AvnCute1a2jA7LKKVUd4blClV/Xx+NH1BKqR4M085dh2WUUqonjsUP2G3niYhLRG50tsyT6bCM\nUkr1zJH4AbAWOgGPA//qhzpPosMySinVs17P3O2t83qLHwB4GHgTaxVrv/L3FU2FVEqpHjgSPyAi\nKcD1wO97eR7HNsj2GHB7dGhGKaW64lT8wBNYW+v1eDrt5ApVQC+qKqVUN7wZc+9gjKkWkfb4gc6L\nlOYCr4kIQCxwpYi4jDFvOVZpJ/6+Alide5C/b3+8hFJKDWu9du4iEge02R17e/zA453bGGMyO7V/\nAVjRXx07dD5z12EZpZTqiiPxA/1ZYFd0WEYppXrmWPxAp/vv7ntZPfPrNCyjlFLqdMNyhWqADsso\npVSPhmXnrsMySinVM0fiB0TkdhHJFZEdIrJWRE7bqMNJOiyjlFI9cyp+4DBwoTGmSkSuAJ7B2rCj\nX+iwjFJK9cybC6oG6DF+wBizttPN9ViLnfqNDssopVTPHIkfOMW9wHvdPI9jG2SDdu5KKdUdp+IH\nABCRpVid+3e7eR6H4wd0WEYppbpyRrNljDHVQHv8wElEZAbwHHCtMea4M+V1rWPM3aVn7kop1RVv\nZsvEiUik/X17/MDeU9qkA8uBO40x+/uj0M7ah2VcHu3clVKqK07FD/wIiAGetsPDXMaYuf1Uc8ew\nTKsOyyilVJcciR8wxtwH3Odsad3TYRmllOrZsFyhGhRglV3X3DbIlSil1NDk1ApVEZHfiEievVJ1\ndv+Ua4kLDSQuLJDthTX9+TJKKTVsObVC9Qogy/63AGu7vX5boSoizB0bxeajlf31EkopNaw5tUH2\ntcBLdtv1QKSIJDlb6snmjI2ioLKJstpm1hyo4KY/rKW5zd2fL6mUUsOGUytUU4CCTrcL7fv6zdyM\naAA2H61i5Z5SNh2pYsvRqv58SaWUGjYcXaHaG6fiBwCmJocT5O/D5iNV7C+tA2BNXkWfnlMppUYK\np1aoFgFpnW6n2ved+vOOxA+ANdc9OzWSzUcr2V9qjRqtOaCdu1JKgUMrVIF3gLvsWTMLgRpjTInj\n1Z5ibkYUO4tqqKhvISYkgJ3FNVQ1tPb3yyql1JDnzZl7EvCJiOQCm7DG3FeIyIPtq1SBfwKHgDzg\nWeCr/VLtKeaOjcZjX9q9dX46xsDGIzqDRimlnFqhaoCHnC2td7PTozq+v25WCk+vymNXcS2XTU0c\n6FKUUmpIGZYrVNtFBPszMSGU0EA/xseFMD4ulF1FurBJKaW8WcQ0pN25cCwFVU2ICFOTw1l/SIdl\nlFLKmwuqaSLyiYjstuMHHumiTYSIvNspouCe/in3dHeem8H3rzwHgGkpERyrbaaivmWgXl4ppYYk\nb4ZlXMC/G2OmAAuBh0RkyiltHgJ2G2OygSXAL0UkwNFKvTA1OQKAXcW1A/3SSik1pHgTP1BijNlq\nf18H7OH01acGCBMrzD0UqMT6ozCgpiSHA7C9oHqgX1oppYaUM7qgKiIZWDNnTo0feAo4BygGdgCP\nGGMGPGw9Yow/2WmRfLi7lBaXm6LqpoEuQSmlhgSvO3cRCQXeBB41xpw67nEZkAMkAzOBp0QkvIvn\ncCx+oDtXTktkR1ENX/rjepb9chUFlY398jpKKTWUeRsc5o/Vsb9ijFneRZN7gOV2KmQecBiYfGoj\nJ+MHunPldCuMcntBNc1tHn76j9398jpKKTWUeTNbRoA/A
XuMMb/qplk+sMxunwBMwlqxOuDSooOZ\nOzaKGakRfPOSiXywq5QduqmHUmqU8Wae+3nAncAOO/YX4PtAOnSsVP0J8IKI7AAE+K4xZtBSvF78\nynxEoNXl4bcfH+Cd7UVMT7Vm0hhjMAZ8fGSwylNKqX7nTfzAGqwOu6c2xcClThXVVyGB1mEFB8AF\nWXGsyC3hsSvOwcdHeDunmP96dxfrHltGkL/vIFeqlFL9Y1jHD3jjmpnJlNQ089BftrLpSCXv7zxG\nVWMbpbXNg12aUkr1m2EfP9CbS6YkcO64GD7dX05xdRP59uyZivoWxsaEDHJ1SinVPxyJH7DbLRGR\nHLvNp86XenaCA/x49YGFPHpxFtsLa6hqbAOgvE5z35VSI5cj8QP2Zh5PA9cYY6YCNzleaR9dk52C\ndLpyoPkzSqmRzKn4gduw5rnn2+3KnC60rxIjgjhvfCwpkWMA7dyVUiObU/EDE4EoEVklIltE5C5n\nynPWr2+eyav3LyQq2P+kzr25TaMKlFIji9cXVHuJH/AD5mAtZBoDrBOR9caY/ac8xwPAAwDp6el9\nqfusxIUFAhAbGkhFXSvHaprxEbj/pc0cqmhg0w8u1umRSqkRwavO3Yv4gULguDGmAWgQkdVANnBS\n526MeQZ4BmDu3LmmL4X3RWxoIEXVTVz8q0+pbzkRXrnmQAUXT0kYrLKUUsoxTsUPvA0sFhE/EQkG\nFmCNzQ9JsWGB7Cyuob7FxRXTEvnlTdmEBfrx4e7SwS5NKaUc4Uj8gDFmj4i8D+QCHuA5Y8zO/ijY\nCbGhARj7c8OPr5lKQngQn+wr46O9pXg8RqMJlFLDniPxA3a7XwC/cKKo/hYbao29J0UEkRAeBFiL\nnVbklrCruLYjh0YppYarER8/0JU4u3PPTo3suG/O2CgAcgp1Fyel1PA3Kjv32DBre9fstBOde0rk\nGGJDAzq26NtdXMvq/f2zoYhSSvU3x+IH7LbzRMQlIjc6W6azJiWGExMSwJJJJzYMERGyUyPJsTv3\nn723h/tf2kyZBowppYYhR+IHAETEF3gc+JezJTovJXIMW354CecknbwT4My0SA6W11Pb3Maeklpa\nXB5+/+nBQapSKaXOnlPxAwAPY82FH3LRA97KTovEGPhkbxkV9a2EBfnxyoZ8Gltdvf+wUkoNIY7E\nD4hICnA98Ptefr7fN8jui+y0SPx8hGdWWzsEXpOdTKvL0xETrJRSw4XXnXsv8QNPYG2t5+npOQZi\ng+y+iBjjz6IJsewqtg7vsqmJAOQf185dKTW8eNW5exE/MBd4TUSOADcCT4vIdY5VOYCunGZ16MkR\nQUxPsea765m7Umq4cSR+wBiTaYzJMMZkAH8DvmqMecvRSgfIpVMT8fURzkkKJzLYn7BAPwq0c1dK\nDTOOxA/0U22DIjokgP++bhrj40MREdKig/XMXSk17DgWP9Cp/d19KWgouGX+iTji9Ohg8srraW5z\n89ArW0mICOL/Xj99EKtTSqnejcoVqmciPcY6c//6q9v4aG8Zf9mQz67imsEuSymleuTIClURuV1E\nckVkh4isFZHs/il34KVFB9Pq8vCv3aU8enEWYUF+PLnywGCXpZRSPfJmzL19hepWEQkDtojIh8aY\n3Z3aHAYuNMZUicgVWBtyLOiHegdcenQwABdOjOORZVkAPLHyADuLapiWoumRSqmhyZEVqsaYtcaY\nKvvmeiDV6UIHy/yMaO4/P5Nf3DQDEeGe8zIJD/LjyY9OPns3xlDb3AbAlqOVVDa0Dka5SikFOLdB\ndmf3Au+dfUlDy5gAX35w1RTiw6zc94gx/ty7eBwf7i5l37G6jnYvb8hn4f/9iJ1FNdz0h3Xc/twG\nmlrdg1W2UmqUc2qFanubpVid+3e7eXxIxw94685zxxLg68OrG/M77nsnp4jGVjf//sZ2PAb2lNTy\n/63YNYhVKqVGM6dWq
CIiM4DngGuNMce7ajPU4we8FR0SwOXTElm+tZCmVjfldS1sPmqNSu0rrWNi\nQij3Ls7k9U0F5JXVD3K1SqnRyJEVqiKSDiwH7jTG7He2xKHp1vnp1Da7+OeOEj7aU4oxcMMs61LE\nldOT+OqS8QT5+/LEylHx61BKDTFOrVD9ERCDlSkD4DLGzHW+3KFj4bhoMmND+MvGfIwxpEcH8x9X\nT6Gpzc3N89KICQ3krnMzeGb1QYqqm0iJHDPYJSulRhFHVqgaY+4D7nOqqOFARLhlXho/e28vAD+7\nYTrRIQH8/o45HW3uWJjOH1cf5PVNBXzzkomDVapSahTSFap98MU5qfj7CuNiQ7hpzumzP1Ojgrlw\nYhyvb8rH5e4xDVkppRylnXsfxIYG8ttbZ/ObW2fh59v1r/KG2amU1rawu6TLCUZKKdUvnIofEBH5\njYjk2TEEs/un3KHn8mmJPa5UnZQQBsCR4408+to2pv7ofe7+80bAWvj02PIdrD/U5eQipZQ6a07F\nD1wBZNn/FmBttzci4gf6qj2+IK+0jhW5JYzx92XVvnLKaptpanN3zJVfOC5mMMtUSo0wTm2QfS3w\nkrGsByJFJMnxaoehMQG+JIQHsmp/OS6P4baFVpzwqv3lbLHnxh8s17nwSilnORU/kAIUdLpdyOl/\nAEatsTEh5BZaMcHXZCeTGB7EJ3vL2Jpvd+660Ekp5TBH4we8eI4RET9wpjJirKEZXx9hfFwoSyfH\n8dmBCtbmWWPtxxtaqdKgMaWUg5yKHygC0jrdTrXvO8lIiR84U2NjQuyvwQT5+3Lr/HRaXR4OVTR0\nbMJ9qMI6e3d7DP/z/l6KqpsGrV6l1PDnSPwA8A5wlz1rZiFQY4wpcbDOYW2sfebePnNmRmokP7th\nOiJw17ljAThY1gBYgWNPrzrIW9tO+9uolFJecyp+4J/AlUAe0Ajc43ypw1eGfeaeZXfuYC2AunhK\nAqGBfvzgrZ3k2RdV2+fDd44TVkqpM+VU/IABHnKqqJFmQnwoyybHc9nUhJPujxjjD8C42BD22J36\n7mLr6/5S7dyVUmdPV6gOgCB/X/509zymJne92OmCiXGsO3ic4/UtHZ38wfJ62jSyQCl1lrRzHwK+\nODsVl8fwdk4xe0pqiRjjT5vb8PL6o7ydo2PvSqkz580F1edFpExEdnbzeISIvCsi2+14Ah1vP0OT\nEsOYmhzOs58dorbZxVUzrPVf//Xubv7znV14PIb3d5bwiw/2sru4lv9dd4Qbnv6cl9Yd4d/f2M4h\nXQSllDqFNxdUXwCeAl7q5vGHgN3GmC+ISBywT0ReMcboxO0z8PBFWTz6+jbAWuj0+qYC3B5DdWMb\ny7cV8a2/bgfglQ351De78PERtuZXA5AQHsh3Lp88aLUrpYYeb+IHVgOVPTUBwuwpk6F2W5cz5Y0e\nl09LZNW3lvLbW2exIDOay6cmcukU6wLsc58dwkfgna+dhwDxYYGs/d5FrHh4MTNSI9h8pOqk56pv\nsX79NY1t1Da3DfShKKWGAG/O3HvzFNY892IgDLjZGKNXAs9CYkQQX8hOBuB3t8+muc3N1P/8gL3H\n6pieEsGM1Ejee+QCfHysuOHY0EDmZ0Tz0vqjtLjcBPr5sqeklqt/u4Y3/u1cfvXhPgJ8ffjzPfMH\n+ciUUgPNiQuqlwE5QDIwE3hKRMK7ajha4wfOVpC/LxPiQgFYkBkNWH8A4sOCOtrMzYim1eVhZ5E1\ny+ajPaW4PYYtRyvZXlDDxsOVuD1m4ItXSg0qJzr3e4DldiJkHnAY6HIAeLTGD/TF1BTr7+SCbiKB\n52ZEAfCP3BKa29x8dqACgE/3l1Pf4qKh1c2BMp0zr9Ro40Tnng8sAxCRBGAScMiB51XAueNiCA30\nY57diZ8qNjSQheOief7zw1z1m886kibXHTyxAcg2+8KrUmr0EGtxaQ8NRF4FlgCxQCn
wn4A/WNED\nIpKMNaMmCWsl68+NMS/39sJz5841mzdv7kvto4IxhoZWN6GB3V8eaXN7+GDXMb7xeg5tbkNWfCgH\n7Bjh4ABfvjAjmcdvnEF5XQs7iqrZU1JHZmwIV07XyH2lhhsR2WKMmdtbO2/iB27t5fFi4NIzqE2d\nARHpsWMH8Pf14eoZyTS3eXhx7RFumJ3Cf727m9jQQKYmh7OtoIqy2mYW/88ntLqsa91j/H1ZMimO\n4AAnrqkrpYYaXaE6gtw4J5V37emRABMTQpmfGc3+0nr+sjGfVpeHp2+fzfN3z6Wpzc2Hu0tP+vm9\nx2p5Y3NBV0+tlBpmtHMfgdrTJycmhHHFtEQAnl51kMTwIK6YlsiSifEkRQTxTk5xx88YY/j2X3P5\n3pu5NLe5B6VupZRz+hw/YLdZIiI5dvzAp86WqM5UeJA/T94yk3sXZzIuLpTJiWG0ujwsmRSHiODj\nI1yTncyn+8vZeNhan/bx3jJ2FNXgMXCgVOMMlBruvDlzfwG4vLsHRSQSeBq4xhgzFbjJmdJUX1w7\nM4W0aGuTkCumWRdOl0yK73j8gQvGkR4TzFde2ERBZSN//vwIYUHW+PveY2e1i6JSaghxIn7gNqx5\n7vl2+zKHalMOuWNhOg8tHc/SySfWFsSEBvLHO+ZQ3+LiswMV7Ciq4cppSQT6+ehGIUqNAE6MuU8E\nokRklYhsEZG7umuoK1QHR0xoIN++bDKBfr4n3T8+LpTgAF8+P1hBTVMbk5PCyEoIZW+nzv3Z1Yf4\n/t93DHTJSqk+cqJz9wPmAFdhRRH8UEQmdtVQV6gOLT4+QlZCGB/vsT5sZcWHMTkxnNzCaq773eds\nPFzJC2uP8Pa2Imqa2njktW2U1OjG3UoNB0507oXAB8aYBmNMBbAayHbgedUAmJQQSpM9OyYrwbr4\nWtvsIqegmh/8fQdF1U00tLp5O6eIt3OKeXb14UGuWCnlDSc697eBxSLiJyLBwAJgjwPPqwbARHva\nZHiQH/FhgVw4MY55GVFcMS2xY5UrWNk1AH/dUkBjqyY6KzXUeTMV8lVgHTBJRApF5F4ReVBEHgQw\nxuwB3gdygY3Ac8aYbqdNqqFlcqIVTJaVEIaINUzz1wcX8c1LrJG12NAAADYeqSTQz4e6Zhfv5BSz\n91gtr2/KH7S6lVI963P8gN3mF8AvHKlIDaiJiVakcFZ86En3ZyWEccu8NLLTIvnJit00trpZOime\nI8cbeGndUf66pZAtR6uYnR7FhPhQHn51Gwsyo7nz3IxBOAql1Kl0heooFxcayN2LMrhhduppj/38\nizO4dX46mbEhAExOCuPOc8eyu6SWLUet9Mk/rz3CitwSVuSW8MoGPZNXaqhwZIWq3W6eiLhE5Ebn\nylP9TUT48TVTmW9vBtKVcfaGIZMTw7luZgqhgX6EB/lx9Ywk3txSyH+9uwuAvcfqqKhvoaqhlbue\n38j+Up0vr9RgcWKDbETEF3gc+JczZamhZFz7mXtiGCGBfvzPjTPws6dRVja0Utfs4sELx/PTf+xh\n3cHjBPn7snp/Od9qbGX5/1mEn69+QFRqoHkz5r5aRDJ6afYw8CYwz4Ga1BBz87w0wsf4MzbGijPo\nnAP/l/sXAuD2GJ786ACf51UwLs76Y5BbWMO9L27mO5dPYmpyxMAXrtQo1udTKhFJAa4Hft/3ctRQ\nlBw5hnsXZyIi3bbx9REWjoth3aHjHK5oIDokgO9cPoncwmpue3YDH+8t5YmV+zW3RqkB4sTn5SeA\n7xpjPL011PiBkW3O2CiOHm9ky9EqxsWG8NUlE/j7V8/DYwxfeWEzT6w8wOVPfMbHe60ceWMMx+tb\nBrlqpUYmJzr3ucBrInIEuBF4WkSu66qhxg+MbLPSIgHYX1rfMcMmIzaEP945h/vPz+STby0hLiyQ\nN7cWAfCbj/JY9POPKahsZEVuMUXVTbg9RhdJKeW
APu+xZozJbP9eRF4AVhhj3urr86rhZ3pqBL4+\ngttjyLTH3QEWjY9l0fhYAJZOiuO9ncc4XNHA71bl0ery8I3Xc9h8tIovnzuW2NBAXlx3lE+/vYTt\nhdW4PYYFmTEE+OlFWaXORK+de+cNskWkkFM2yO7X6tSwEhzgx8SEMPaU1HbMsDnV0knxvLG5kLue\n34CvCOdnxfLZgQoAdhXXEuDnQ0V9C4+8to2VdqDZ1TOSeOq22QN2HEqNBI6sUO3U9u4+VaOGvVnp\nkewpqSUzNrTLxxdnxeLvKxRVNfHkLbOYEB/Kl/6wjvSYYPaU1OLjY120XbmnjEkJYczLjOIvG/Ip\nqGzs2HxEKdU7/ayrHHX1jCTmZ0aTEdt1RxwW5M9/XzedP909jy9kJ3NOUji5P76UL5+bQUOrm7pm\nF1dOTyQq2J+ffXE6Dy2dgIjw0roj3b5mm7vXa/lKjTrauStHLRofyxv/du5pG4N09qV5aSzttOWf\niDAlObzj9kNLJ7D5Py5hdnoUSRFjuGp6Ei+vz6eo+vQs+Q2HjpP1g/e46JerWH/o+EmPeTyGm/+4\njne3F5/2c/e9uJnffZJ3Noeo1LDQ5/gBEbldRHJFZIeIrBURzXJXZywrIRQ/HyHAz4eJCWH4+pyY\nU//tyyZhMPzk3d2n/dyavAp8fQSPx/B/Xt5CQWVjx2OHKhrYcLiSD3eXnvQzbW4Pq/aV8a/dpbS6\nPOwp0bn3auTp8wbZwGHgQmPMdOAnwDMO1KVGmUA/XyYnhTE1ORz/U+IK0qKDeWjJBN7fdYxdxTUn\nPZZTUM3EhDBeuGc+bW7Dr1fuP+kx4LSMm8KqJlwew96SWv605jBX/3YNx2qa++nIlBocfd4g2xiz\n1hhTZd9cD5weL6iUF379pZn8/zd1/cHvrnMzCPL34eX1J5InjTHsKKohOzWCjNgQFo6LZkfhic4/\np8D6z/Jgef1J4/KHyq1NSFpcHv533RHcHsPGIz3tAa/U8OP0mPu9wHsOP6caJbISwhgf1/Usm4hg\nf67JTuatbUW8v7OExlYX+ZWNVDe2kW0vnpqSHMHB8nqaWq1tA3MKqvERaHMbDlc0dDxX5++L7TP2\njYdPHq9XarhzrHMXkaVYnft3e2ij8QPqrN29KBMRePDlrTy58kDHsMuMVCuUbEpSOB4De4/V0tzm\nZm9JHUvsC7d7j50YmjlU0UB4kB+B9sKo0EA/Nh2uOum1thdU89MVu2m295ftyq8/3M9vPzrg6DEq\n5RRHOncRmQE8B1xrjOn2FEjjB1RfTEkOZ91jy1iQGc3He8vYcLiSMf6+HfvATrVn3OwuqWXj4Upc\nHsONc1Lx9RH2lNTi9hjAGpaZEG9tBu7rI9y+MJ19pXVUN7YCsHJ3KV/8/VqeW3OYdYe6P6N/d3sx\n7+861s9HrdTZcSIVMh1YDtxpjNnfW3ul+iJijD+XTEngQFk9b20r4tKpCR0XYFOjxhAe5Meu4lqW\nby0kPMgm6CTwAAAY6ElEQVSPiybHMy42hN+vOsiin3/EvmN1HK5oYFxcKDfOTeOOBeksm5wAwKp9\n1qfJd3OLCR/jjwjsKKzB1cU8eo/HUFjVRFmdBp+pocmJ+IEfATFYgWEALmPM3P4qWKkLJ8bx03/s\nobHVzRc7bQ/YPl9+zYEKyuqauXFOKkH+vvzgqnPYeLiSN7cWcsPTn9PQ6iYzNoQ7F44FrI46MzaE\n5z8/zLUzk8krq2dqcjjF1U3kFlZz/dNriQkN4A93zKHV7eFHb+3k3sXjaHV7OF7fgttjTpq6qdRQ\n0Of4AWPMfcB9jlWkVC8mxIeSEjkGt8dw3oTYkx67b/E4vv7aNprbPNw4Jw2AJZPiWTIpnpvmpvHU\nx3l4jOHqGSc2HPHxEe5dnMl/vLWT9YcqOVhez4JMK8RsRW4xbW5rOOf7y3ewdHI8b+UU4+tjfVrw\nGDje0EJ8WNA
AHb1S3ulzKqRSA01EePyLM/ARTjtjvnhKAh88egE77SmSnWXGhvDLL3U91fKLs1N5\n/P29/PbjAzS3eZgQH0pzm5u/bysi0M+HL2Qn83ZOUccQ0Ed7TyyMKqvVzl0NPdq5q2FpcVZst4+l\nRQefccjYmABfLsiK4x87SgDr00H7WqqLpyRw/awU/ralkL9vs7LoqxvbOn62vNOGI8frWxgT4EtL\nm4fP8iq4Jjv5jOpQyinejLk/D1wNlBljpnXxuABPAlcCjcDdxpitTheqVH+7cNLJnXtIoC8Xn5PA\nA+ePY1JiGIF+PrS4PPiINRzT/rW81urcjTFc9/TnzEyLIj4skD+tOczU5PBu5+4r1Z+ciB+4Asiy\n/z2A7qWqhqklE63pudEhAUSHBBDo58tzX55LdlokQf6+zM+MBmDZOdbsmkmJ1tTLD/eUcu7PPuKD\nXaUUVDbxwc5jvJ1jhZVtPNz7ytfqxlZW5J4ebqZUX/Q5fgC4FnjJWNYDkSKS1EN7pYak+PAgslMj\nmJIU3uXjl0xJINDPp2OWzfi4EMKD/PhwdyklNc08tjwXgFa3hwp7qGZTp87d7TE0t7lpbnNz/dOf\ns9IONHth7RG+9pdt5JXV9+fhqVHGiTH3FKCg0+1C+74SB55bqQH17F1zoZtZjXcsGMulUxKJCvEn\nOMCXCfGh7D1WR22z1SlXNbbZq2QNR443sCAzhg125/7ZgXJ++NZO2tyG5++ex7b8ar7zZi7/Sr+A\nLUet1bHrDlYwIT6UdQePU9XYyhXTErGnFyt1xgb0gqqIPIA1dEN6evpAvrRSXokP737Wi4+PkBhh\nPb7i4cUkhAex4VAleWX1zBkbxZajVSyZFMfFUxIoq22mpKaZT/eXs7Oohq++spXmNjdtbsPuEivc\nrLKhlcff20tOvhWjsPbgce5YOJZv/207hVVNXD41kT/cOcfr2pvb3Dz48hYeviiLOWOj+vBbUCOB\nE/EDRUBap9up9n2n0fgBNVKMiwslJNCP+PBAAP790on89LppfGVxJrPTo7h8WhLnZ8XhI3DLM+tp\naHHx8EVZAGw6Yp2pL50Ux9+2FlLX4iI8yI91h45zqKKBwqomMmKCeX/XMQoqG7njuQ1sza+ivsXF\nv7+xnUU/+4i65rbTatpVXMuqfeX8ftXBgftFqCHLic79HeAusSwEaowxOiSjRoWxMSGEBfkxZ2wU\ndyy0Fj61mxAfym9vnU2Ly83N89K4aLIVYrb5SCU+Ao9ePBFjrY/invMyqW5s46mPrd2hvmb/IfjT\nmsOsyavgX7tK+emK3by5tZDimuaOoRwAl9tDZUMru+2s+0/2lbGjsOakjUvU6ONE/MA/saZB5mFN\nhbynv4pVaqj5PxeO50tzU7vdVvCqGUnMzYgiJiSAumYXAPtL60mKCCI7LZIZqREUVTXx5UUZ/O/6\no/x9WxHp0cFcOT2R7/xtO29sti5n7TtWS35lI+dNiGH9oUo2H6nqSLz84du7eH9nCUsmxRPk70Nz\nm4cvPLWGhPBA1n5vWY/RCJuOVBIXGkhGbIjDvxk12JyIHzDAQ45VpNQwMibAl9SAnhdMJdjj+JHB\n1oXYxlZ3x9j9EzfPpKqxleiQAH75pWzu+fMmLpwYR3CAHxMTwjqiinMLa6hsbOWa7BTqm11ssjcX\n2VlUw2ub8jHGSqlcOC7G/rla1h48zqYjlSwcFwPAUx8fwMdH+OqSCYA1L/+BlzazaHwsv7t9dr/8\nftTg0Q2ylRogIkJK5BgAkiOsr+PiQpkz1po/v3RSPK8/sJBHL7aGZNpz6oMDfDne0IoxMC0lnLkZ\n0eQUVNPicvPTf+wmcow/kcH+uDyGqcnh/OgLU3j2rrkE+vnwT3tRVmFVI79eeYAnPjxAuZ1kmV/Z\nSFVj20mbl3gjr6yOn723B48doayGJu3clRpAKVFWp54U0fWsnAXjYoixx+1np
Fo7TF0/K6Xj8Wkp\nEczLiKbF5eGHdtDZNy+ZyBXTEgEr8x4gJNCPpZPieW/nMdwew5/WHAasOfh/2WBtVbjd3pIwv7IR\nY7zrqI0xfH/5Tv746SH2l9X12n5bfhX3vrCJhhaXV8+vnONV5y4il4vIPhHJE5HvdfF4hIi8KyLb\nRWSXiOi4u1JdaD9zT7K/9uQLM5J5ZFkW950/DoDY0EDiwwK5YGIsM1IjeGNzIVnxodw6P52b56WT\nHBHEgsyYEz+fnUx5XQvvbi/mtY0FXJudzJJJcby84Sgut4cdhdYUzPoWF5UNrV7V/8m+so79ZnML\nanppDZ/sLeOjvWW8vP6oV8+vnNNr5y4ivsDvsGIGpgC3isiUU5o9BOw2xmRjXXz9pYgEOFyrUsNe\napQ1Pp/czZl7ZxHB/nzjkolkxAQTFujH1ORwRITgAD/e+Ldz+eYlE3nilpn4+fowMy2StY8t6xjL\nB2tFbUxIAN9bnktTm5t/u3A8t85Pp7yuhc/yKtheWEP7tdYjx72bWfPy+nxSIscQFuhHblF1r+3z\n7Rk7z6w+RGOrc2fv1Y2tvJ3T5YxrZfPmzH0+kGeMOWSMaQVew4oc6MwAYXaIWChWXIF+DlPqFGNj\nrM79TFIrRYT/vmE6j9hj8QBB/r58fVkWU5Mjuv25AD8fbpyTSnObh4smxzMpMYylk+KJDPbntY35\n7Cqq6cjDP1zRwK7iGg6U1mGMYU9JLTsKT5yZrz1YQUOLi02HK7lwUhzTUiJOerw7+ZWNhAX5cbyh\nldX7K7w+5lOdOmz09KqDPPJajk737IE3K1S7ihdYcEqbp7DmuxcDYcDNxpjT9ibTFapqtLtkSgJ/\nvHNOx36v3jrb6OA7Fo7lw92lPLLM+sMQ4OfD1TOSeHm9Ne5+17kZrMmr4PH393ZcaF04LpqtR6tp\ndXv48rljuf+Ccdz27AbOz4qlrsXFgsxowgL9+PPnR2h1edhw+DiCMD8zmk/3l3PxOfEdsQn5lU0s\nGh/DB7tKKaw6u444p6CaW55Zx4ffuJC06GCMMfwj17pQfLC8/ozjnUcLpy6oXgbkAMnATOApETnt\nv15doapGO39fHy6bOnCZMWnRwXz8rSVkp0V23PflczOYlxHFi1+ZzyVTEkiOGEN5XQsLx0XzrUsn\nsuVoFdlpEVw7M5kX1x3l0/3W3rKfHbDOvBdkxjA9NYJWt4fcwmoefS2Hh1/dyu8+yeP+lzZ3JGE2\ntLioqG9hRmokIQG+FFY10ery0Oo6fU/anmw+Uklzm4ecAmsYaHthDUXVTQAcLO9+ps+K3GK25ld1\n+/hI503n7k28wD3AcjsZMg84DEx2pkSllJOyEsL464OLuNCOOG4fKvrWpZP42kVZrH9sGa/ev5Db\n5lufrl+xz/Lb2ybaF26D/H145LUcjje0UtXYxm8+PgDQMQe/wD5TT48OJjUqmKLqJr72l608+vq2\nM6q3farmATs18587SvD3FUID/ThUXk9BZSP1nWbjlNRYHf+P39nNM58eOrNfzgjizbDMJiBLRDKx\nOvVbgNtOaZMPLAM+E5EEYBIwen+rSg0jV81IIj06mLkZ1nz79qmY01Mj8PURdpfUMjEhlAA/HxaN\nt8bo48ICeeCC8fzmowOkRY8hPMifXcW1hAT4dmTn5B8/0bmnRI2hsKqJkpomQgLOLK+wvXPPs6de\nbj1aRXZqJB5j2F5YzeVPrCY0yI8nb5lFm9vDXc9v5L1HzqeivuWkXbJGG29WqLpE5GvAB4Av8Lwx\nZpeIPGg//gfgJ8ALIrIDKzD1u8aYs796opQaMLcvGHv6VTToWCW7p6SWaSkR/PKm7JOGkx68cBwf\n7Snl9gVjyUoIZdW+Mqoa23g3pxi3x3TMlEmPDiYlcgyr95fj8hhqmtpobnPT0ubhpj+uZX5mNDfO\nScNjDNHBAadFIXScuZfW4/FYF3u/OCeVx
lY3f9tSCIC/nw8/WbGbS6ckYgyssYeQ2q8j9CavrI7w\nMf4jai9cr/6EGmP+iZUh0/m+P3T6vhi41NnSlFKDbVZ6JHtKapmaHHHadYLgAD/+8fXzO27Py4jm\n79sK+cuGfPYdq+PI8QbCAv2IDPYnJWoMLntFqzHWLJpNRyrZX1rPgbL6jgu8fj7CuseWERdmfXpo\nbHVRUtNMoJ8PhysaOFTRQEOrmylJ4VQ2WnPzE8ODuH52Cs+uPtQR9dA+7l9e14IxptdrHF9+fhPJ\nkUH89cFFDvzWhgZdoaqU6tYs+0LsNC9n98yzh3Z+9eE+3txSxIJxMYgIqVEnL9o6XNHA8q1FTEwI\n5cNvXMDzd8/lP78wBZfHsK3TRdAjFdbZ//lZcbg8hvfsOIVzkk7sTXv5tESyUyNweQyr7Yu/m+3U\nzKY2Nw2t7h5rLq1tpqi6iU1Hqthy9MTOWb/+cD9bjlZRWNXIB7uO9Xrsa/MqhtQFXO3clVLdumZm\nMr/6UnZHp92b1KhgvnHxRFbuKcPfV/jJdVOBEytzo4L9AVi1r5wtR6u4YXYqE+LDuGhyArfOT8fP\nR9hWUM3K3aUUVzd1DMlcOtXat/atnCJ8BCYlhjE7PYrJiWHcMj+NaSnWfP/2TwedV9yeOjTzH2/t\n4LHlOzput8/X9xH4o30BtqK+hSc/OsDzaw7z5MoDPPjyli4z9Nv9+fPD3PbcBh57c0eXj2/Nr+Lu\nP2+kxdXzHxonORI/YLdZIiI5dvzAp86WqZQaDIF+vtwwOxWfHmKDT/X1ZRP4+Q3TefauuSTZAWnt\nmTpzxkYRHRLAG5sL8PURrpt5IjcnyN+XKcnhvLejhPte2syP3t7F3mO1AFw2JZHkiCAOljcwLi6U\nIH9f4sICef/RC5icGE5K5JiOPxwBfid3a+V1LRyvb+HLz2/kcEUDb+cU87ctBdQ0Wp31jiJrpe6t\n89P5ZF8Z9S0ucu1ohg2Hj7P24HGMsZI52206UslvPjrAgdI6Ciob+cmK3YQG+nGgrI4m+5NCeV1L\nR7jaqxvyWbWvnAOlA7dPriPxAyISCTwNXGOMmQrc1A+1KqWGARHhlvnpLBh3IucmLjSQhPBA5mVE\nkxETjNtjWDY5/qS4BICZaZEdUQgf7S3lhc+PcH5WLBHB/vz5nvmEBfkxs9Oc/c6v2X72vthedetn\n/0Eqr2vh79uK+HR/Of/9jz3UNbtocxv+tdsaatlRVMOE+FCunpFMm9vweV4FOXZuTkV9a8ec+s7D\nRf/59i5+9eF+Ln/yM37w1k4Avn3ZJDwGdpfUUlbbzOLHP+ZvWwsxxnSsFThYPoQ6d7yLH7gNa557\nPoAxpszZMpVSw5mI8Mm3lnDf+eM6ZsPcvnDsae1mpVsd9/lZsfj7+NDQ6uL7V54DWEMxK795IT++\nZmqXr3HehFgSw4NYNN76ozI5KQyA8rpm3t1eDMDKPaUARIzx5x87SjDGsKOohmkpEczNiCI00I9V\n+8rYXlDd8UkAICzQr2MR1f7SOnaX1PL1iyaQGmXNArpkSkLH0NHOohpWH6igxeVhW36V1dnbQ0MH\nywauc3cqfmAi4C8iq7DiB540xrx06hNp/IBSo1ewPb/98qmJNLW6Od8+w+5s8YQ4pqdE8NgV57D6\nQDkeYzgn6cTF3IQeNjC///xx3L0oo+MseXJiOHtL6th0tIrthTWkRweTX9lIYngQ181K4dnPDvHJ\nvjLK61qYmRaJv68PiyfE8snecppdbi6dksCn+8sxBhZnxbJ6fznGGN7aVoSvj3DnuRksOyeBR1/P\n4cELx5MYHkRsaAA7impoc1urcPceq2PVPque6JAA8gbwzP3MVhP0/DxzsBYyjQHWich6Y8z+zo2M\nMc8AzwDMnTtXk/6VGoUunZrIpVMTu3wsLiyQdx9eDJzIpveWr4/g6+PbMTMnOXIMsaGBvL/zGCLw\n8xumc
<base64 PNG data (training-loss plot) omitted>\n", "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "import matplotlib.pyplot as plt\n", "import matplotlib.ticker as ticker\n", "import numpy as np\n", "%matplotlib inline\n", "\n", "def show_plot(points):\n", " fig, ax = plt.subplots()\n", " loc = ticker.MultipleLocator(base=0.2) # put ticks at regular intervals\n", " ax.yaxis.set_major_locator(loc)\n", " plt.plot(points)\n", "\n", "show_plot(plot_losses)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Evaluating the network\n", "\n", "Evaluation is mostly the same as training, but there are no targets. Instead we always feed the decoder's predictions back to itself. Every time it predicts a word, we add it to the output string. If it predicts the EOS token we stop there. We also store the decoder's attention outputs for each step to display later."
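The greedy decoding loop just described can be sketched independently of PyTorch. This is a minimal illustration of the control flow only, where the hypothetical `step` function stands in for one call to the real decoder and returns a probability list over the output vocabulary:

```python
EOS_token = 0  # index of the EOS symbol, mirroring the tutorial's convention

def greedy_decode(step, first_input, max_length=10):
    """Feed each prediction back in as the next input; stop at EOS."""
    decoded, current = [], first_input
    for _ in range(max_length):
        probs = step(current)  # hypothetical decoder step: token -> distribution
        ni = max(range(len(probs)), key=probs.__getitem__)  # choose top word
        if ni == EOS_token:
            decoded.append('<EOS>')
            break
        decoded.append('word%d' % ni)  # placeholder for output_lang.index2word[ni]
        current = ni  # next input is the chosen word
    return decoded
```

In the notebook the same loop drives the real attention decoder and additionally records `decoder_attention` at every step.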
] }, { "cell_type": "code", "execution_count": 20, "metadata": { "collapsed": false }, "outputs": [], "source": [ "def evaluate(sentence, max_length=MAX_LENGTH):\n", " input_variable = variable_from_sentence(input_lang, sentence)\n", " input_length = input_variable.size()[0]\n", " \n", " # Run through encoder\n", " encoder_hidden = encoder.init_hidden()\n", " encoder_outputs, encoder_hidden = encoder(input_variable, encoder_hidden)\n", "\n", " # Create starting vectors for decoder\n", " decoder_input = Variable(torch.LongTensor([[SOS_token]])) # SOS\n", " decoder_context = Variable(torch.zeros(1, decoder.hidden_size))\n", " if USE_CUDA:\n", " decoder_input = decoder_input.cuda()\n", " decoder_context = decoder_context.cuda()\n", "\n", " decoder_hidden = encoder_hidden\n", " \n", " decoded_words = []\n", " decoder_attentions = torch.zeros(max_length, max_length)\n", " \n", " # Run through decoder\n", " for di in range(max_length):\n", " decoder_output, decoder_context, decoder_hidden, decoder_attention = decoder(decoder_input, decoder_context, decoder_hidden, encoder_outputs)\n", " decoder_attentions[di,:decoder_attention.size(2)] += decoder_attention.squeeze(0).squeeze(0).cpu().data\n", "\n", " # Choose top word from output\n", " topv, topi = decoder_output.data.topk(1)\n", " ni = topi[0][0]\n", " if ni == EOS_token:\n", " decoded_words.append('<EOS>')\n", " break\n", " else:\n", " decoded_words.append(output_lang.index2word[ni])\n", " \n", " # Next input is chosen word\n", " decoder_input = Variable(torch.LongTensor([[ni]]))\n", " if USE_CUDA: decoder_input = decoder_input.cuda()\n", " \n", " return decoded_words, decoder_attentions[:di+1, :len(encoder_outputs)]" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We can evaluate random sentences from the training set and print out the input, target, and output to make some subjective quality judgements:" ] }, { "cell_type": "code", "execution_count": 21, "metadata": { "collapsed": true }, "outputs": [],
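Note how `evaluate` trims the pre-allocated `max_length x max_length` attention buffer down to one row per decoded word and one column per input step before returning it. A shapes-only NumPy sketch of that trimming (no model involved; the Dirichlet rows just imitate attention weights summing to 1):

```python
import numpy as np

MAX_LENGTH = 10
attentions = np.zeros((MAX_LENGTH, MAX_LENGTH))  # like decoder_attentions

di, input_len = 4, 6  # pretend: 5 decoder steps over a 6-word input
for step in range(di + 1):
    # each decoder step writes one row of attention weights
    attentions[step, :input_len] = np.random.dirichlet(np.ones(input_len))

trimmed = attentions[:di + 1, :input_len]  # rows = output steps, cols = input steps
print(trimmed.shape)  # -> (5, 6)
```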
"source": [ "def evaluate_randomly():\n", " pair = random.choice(pairs)\n", " \n", " output_words, decoder_attn = evaluate(pair[0])\n", " output_sentence = ' '.join(output_words)\n", " \n", " print('>', pair[0])\n", " print('=', pair[1])\n", " print('<', output_sentence)\n", " print('')" ] }, { "cell_type": "code", "execution_count": 22, "metadata": { "collapsed": false, "scrolled": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "> je suis ambitieux .\n", "= i m ambitious .\n", "< i m ambitious . <EOS>\n", "\n" ] } ], "source": [ "evaluate_randomly()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Visualizing attention\n", "\n", "A useful property of the attention mechanism is its highly interpretable outputs. Because it is used to weight specific encoder outputs of the input sequence, we can see where the network is focused most at each time step.\n", "\n", "You could simply run `plt.matshow(attentions)` to see the attention output displayed as a matrix, with the columns being input steps and the rows being output steps:" ] }, { "cell_type": "code", "execution_count": 24, "metadata": { "collapsed": false }, "outputs": [ { "data": { "image/png": "<base64 PNG data (attention matrix for 'je suis trop froid .') omitted>\n", "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "output_words, attentions = evaluate(\"je suis trop froid .\")\n", "plt.matshow(attentions.numpy())" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "For a better viewing experience we will do the extra work of adding axes and
labels:" ] }, { "cell_type": "code", "execution_count": 25, "metadata": { "collapsed": false }, "outputs": [], "source": [ "def show_attention(input_sentence, output_words, attentions):\n", " # Set up figure with colorbar\n", " fig = plt.figure()\n", " ax = fig.add_subplot(111)\n", " cax = ax.matshow(attentions.numpy(), cmap='bone')\n", " fig.colorbar(cax)\n", "\n", " # Set up axes\n", " ax.set_xticklabels([''] + input_sentence.split(' ') + ['<EOS>'], rotation=90)\n", " ax.set_yticklabels([''] + output_words)\n", "\n", " # Show label at every tick\n", " ax.xaxis.set_major_locator(ticker.MultipleLocator(1))\n", " ax.yaxis.set_major_locator(ticker.MultipleLocator(1))\n", "\n", " plt.show()\n", " plt.close()\n", "\n", "def evaluate_and_show_attention(input_sentence):\n", " output_words, attentions = evaluate(input_sentence)\n", " print('input =', input_sentence)\n", " print('output =', ' '.join(output_words))\n", " show_attention(input_sentence, output_words, attentions)" ] }, { "cell_type": "code", "execution_count": 26, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "input = elle a cinq ans de moins que moi .\n", "output = she s five years younger than me . <EOS>
\n" ] }, { "data": { "image/png": "<base64 PNG data (labeled attention matrix for 'elle a cinq ans de moins que moi .') omitted>\n", "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "evaluate_and_show_attention(\"elle a cinq ans de moins que moi .\")" ] }, { "cell_type": "code", "execution_count": 27, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "input = elle est trop petit .\n", "output = she s too short . <EOS>
\n" ] }, { "data": { "image/png": "<base64 PNG data (labeled attention matrix for 'elle est trop petit .') omitted>\n", "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "evaluate_and_show_attention(\"elle est trop petit .\")" ] }, { "cell_type": "code", "execution_count": 28, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "input = je ne crains pas de mourir .\n", "output = i m not scared to die . <EOS>
\n" ] }, { "data": { "image/png": "<base64 PNG data (labeled attention matrix for 'je ne crains pas de mourir .') omitted>
27Usl7UYx/u6eHia5re3FxRyoG9Ra3ZQ0AzjJ9ruAX9PMP+TnlkspLLf9N5LO\nA77So7R6/juseCqwN0XBYWuKabr+gMGepegnwOirof9V2R89joppFxglfRCYA+xP0VmxDUUHxfN6\nmOxaSU+lrLJIOpHiL2ptbK8vA36THi1/PiJpD+B+YPcepdXz32HFpykWibqD303fNtBsv6jtPAyS\naRcYgT+kmM79ZgDbP5Y0a+KPbLG3U/RqPl3SfcA9wKk9SOcWSQuBz1HMIwiA7S/0IC2Aa8qV5/6B\nYl0UgE/0KK2mfocAPx/GsX2SZgJPs31b5dxTgPUdKwdOe9OuV1rSYttHjvbMlfPRfa+Xg5Il/R5w\nIrAvxdi4BykmPq21k0LSpeOctu0315lOJb2ZwJ8Az6coyX0buNj2b2pM450dp2ZSVHEfht7MHCTp\nWOB1wDcYO0tRr/4H04hyzOmdwMG2Hy7PXQ/8pe2BXuyrbtOxxPhZSR8DdpT0x8Bb6F0pZ9SXgF9R\nlFJ7+c7tVsCfleu9IGknillVeuWTwEPA+eXxKRRjJ/+oxjRGS/P7A0dQ/C4FvAFYXGM6VacDT6do\nZqnOhD7QgbEcgH81xZ/PpWVpcbcExY1NuxIjgKTjKFZIA7iuXBCol+ndYfugXqZRprPReLRejlEb\nb1aWXs3UIukG4OW2HyqPZwHX2n5BD9JaZXv/up/bDyQ9nWJp3RdIej/woO3zJ/vcdDNthutI+k75\n8yGKIRhnlNvVkh6QdI+kXs25d6OkJqas2qosJQIbZk7pZa3gZknPrqR3FL1bf/nJwOOV48fLc71w\no6ShnIbL9p0UM+08jWK4079M8pFpadpUpW0fXf4ct6NF0i7AjcBFPUj+aOA0SfdQtFn1anGl84Dv\nSfpcefxa4EM1p1F1OEUQGV2D+SnAKkm3U//3uxxYXFYFoXg/+7Ian1/1bODWBv68NknSf7fdq2E0\n/5ei+ej2zmnIojAtq9KbIml327UPAWlycaWypPMH5eG/2V5ZdxqVtCYczF339ytn0h5dGP6Gjtmo\n60yn9cWwJF1r++U9eva2FEOdXtPrZqRBlcAYEdFh2rQxRkR0K4ERkDQvaSWtttNqOr2mv1svSLpE\n0s8kjTtXpgrnS1otaXnZHDOpBMZCk39BklbS6pf0Bj4wUnTAzZ3g+vHA7HKbRzFD06QSGCNiYNm+\nAfjFBLecAFzuwk0UL3ZM+j7/UA3X2ZJp9Zuckn+qae2x976blc6TdtqFPZ+y35S/128eeWzymzps\nt90O7LLrHlNO6xf3b94ggH7+8xqk9DYnLdua/K5Nmzt3rteuXdvVvcuWLVsBVF8xXWB7KrOp7wnc\nWzleU56b8C/eUAXGYXXGe2uf93VCdy29q7G0PnVZL+aZjX62du1ali7t7j0ASb+xPafHWdpIAmNE\nNK7BYYL3UcytOWqv8tyE0sYYEY0ysH5kpKutBguBN5a9088GHujmJY6UGCOiYcY1LTMj6QrgGGBX\nSWuAD1LMioTt+cAi4GXAauARupzZPoExIpplGKmpJm37dZNcN8Ukx1OSwBgRjev3V5ETGCOiUQZG\nEhgjIsZKiTEiosJ2XT3OPZPAGBGN6/cS48CMY5R0Y9t5iIh6uMv/2jIwJUbbz207DxGx5YrOl7Zz\nMbGBCYySfm17+7bzERFbrt+r0gMTGDelnGxzGOaVi5ge0vnSe+UURAug+emhImLqTEqMEREbyQDv\niIgOKTFGRIzR7lCcbgxMYEyPdMRwcI2z6/TKwATGiBgeI+mVjoj4ncyuExExjnS+RERU2SkxRkR0\nSokxIqLCwPoExoiIsfq9xKh+z+BU5F3pejT5d0JSY2lFPWxv0R/aQYcc4s8uWtTVvQfutdcy23O2\nJL3NkRJjRDTK6XyJiNhYv9dUExgjonEJjBERFUWvdF4JjIgYI5NIRERU2alKR0RUZ
WmDiIhxZLhO\nRESHlBgjIiqc5VMjIjaWNV8iIjr0+3CdrdrOQJWkfSXdKekySXdJ+rSkF0v6rqT/lHRk23mMiC0z\n2ivdzdYNSXMlrZK0WtJZ41x/kqQvS7pN0gpJp0/2zL4KjKXfB84Dnl5upwBHA+8C/rLzZknzJC2V\ntLTRXEbEZqsrMEqaAVwIHA8cALxO0gEdt70dWGn7EOAY4DxJT5jouf1Ylb7H9u0AklYA37BtSbcD\n+3bebHsBsKC8v88L6BFBvZ0vRwKrbd8NIOlK4ARgZTVFYJaKOe62B34BrJvoof0YGB+r7I9Ujkfo\nz/xGxBTUPMB7T+DeyvEa4KiOey4AFgI/BmYBJ9kTv6zdj1XpiBhyI+WcjJNtwK6jTWXlNm8zknsp\ncCuwB3AocIGkHSb6QEpgEdG4KQzXWTvJDN73AXtXjvcqz1WdDnzYRTF1taR7KPovFm/qoX0VGG3/\nADiocnzapq5FxOCq8cWXJcBsSftRBMSTKTpsq34EHAt8W9KTgf2Buyd6aF8FxogYfqa+d6Vtr5N0\nJnAdMAO4xPYKSWeU1+cDfwdcVnbgCniv7bUTPTeBMSKaVfMrgbYXAYs6zs2v7P8YeMlUnpnAGBGN\nyrRjERHjSGCMiOiQ+RgjIsZwZteJiKiyax2u0xMJjBHRuExUGxFRUec4xl5JYIyIxqVXOiKiKutK\nR0SMI4ExImKskfUJjBERGxTDdRIYIyLGSGCMiBgjnS8RERtxny8sncAYEY0ahDbGgVgMS9JpkvZo\nOx8RUQ+PjHS1tWUgAiNwGsUKXxExBEYnkphsa0srgVHSvpK+L+njklZIul7STEmHSrpJ0nJJV0va\nSdKJwBzg05JulTSzjTxHRE1sPNLd1pY2S4yzgQttHwj8CngNcDnFQjUHA7cDH7T9eWApcKrtQ20/\nWn2IpHmja842nP+I2EwuXwucbGtLm50v99i+tdxfBjwV2NH2t8pznwQ+N9lDbC8AFgBI6u8W3YjI\nmi+TeKyyvx7Ysa2MRESz+j0w9lPnywPALyU9vzx+AzBaenwImNVKriKiXjZeP9LV1pZ+G8f4JmC+\npG2Bu4HTy/OXlecfBZ7T2c4YEYOl30uMrQRG2z8ADqoc/2Pl8rPHuf8q4Kre5ywimtDncbHvSowR\nMeTS+RIR0WkAXglMYIyIhpmRFjtWupHAGBGNS4kxIqJiEGbXSWCMiOYlMEZEjOX+bmJMYIyI5qUq\nHQNHUmNpNfkPpMnvFROwGWlxEtpuJDBGRKMGYYB3P00iERHTgal1olpJcyWtkrRa0lmbuOeYcqLr\nFZK+Nd49VSkxRkTzaioxSpoBXAgcB6wBlkhaaHtl5Z4dgYuAubZ/JOm/TfbclBgjomHdzd7dZXX7\nSGC17bttPw5cCZzQcc8pwBds/wjA9s8me2gCY0Q0bmTEXW3ArqNLl5TbvI5H7QncWzleU56rehqw\nk6RvSlom6Y2T5S9V6YholMs2xi6ttT1nC5PcGjgcOBaYCXxP0k2275roAxERjaqxV/o+YO/K8V7l\nuao1wP22HwYelnQDcAiwycCYqnRENK7GNsYlwGxJ+0l6AnAysLDjni8BR0vaulwd4Cjg+xM9NCXG\niGhYfUuj2l4n6UzgOmAGcIntFZLOKK/Pt/19SV8FlgMjwCds3zHRcwciMEo6DZhj+8y28xIRW6jm\n2XVsLwIWdZyb33F8LnBut89sLTBK2tr2urbSj4h2GPD6IXrzRdJ2kq6VdJukOySdJOkISTeW5xZL\nmiVpX0nflnRzuT23/Pwx5fmFwMry3OvLz90q6WPlgE0knS7pLkmLgefV/cUjoj01tjH2xFRLjHOB\nH9t+OYCkJwG3ACfZXiJpB+BR4GfAcbZ/I2k2cAUw2uV+GHCQ7XskPQM4CXie7d9Kugg4VdLXgL+h\n6GJ/APj3Mp2NlOOaOsc2RUS/ajnodWOqgfF24
DxJ5wDXAL8CfmJ7CYDtB6EoWQIXSDoUWE8xwHLU\nYtv3lPvHUgS/JeXMJzMpgupRwDdt/7x83mc6nrGB7QXAgvK+/v5tRwQwpXGMrZhSYLR9l6TDgJcB\nZwP/tolb/xfwU4qxQlsBv6lce7iyL+CTtt9X/bCkV08lXxExWPq9xDjVNsY9gEdsf4qih+coYHdJ\nR5TXZ0naGngSRUlyBHgDRTf6eL4BnDj6UreknSXtA/wH8EJJu0jaBnjtZny3iOhDo9OODVMb4zOB\ncyWNAL8F/oSi1PfPkmZStC++mGImi6vKdxK/ythS4ga2V0p6P3C9pK3KZ77d9k2S/hr4HkV1/dYp\nf7OI6E827vOJatXvRdqpSBvj4MkM3oPH9hb9Inffax+ffuZfdXXv37/vbctqeFd6ygZigHdEDJd+\nL5AlMEZEs7KudETEWIOw5ksCY0Q0zIys7+/OlwTGiGhWqtIREeNIYIyIGKvP42ICY0Q0K50vEZMY\n1kHXTf/DH6jf49QWw2pFAmNENMyM9PkrgQmMEdG4VKUjIjolMEZE/I7TxhgRsbE+LzAmMEZE04Zv\nzZeIiC1j0isdEVFl0sYYEbGRfq9KT2kxrF6StKOkP207HxHRay67prvYWtI3gRHYEUhgjBh2Hr5V\nAnvpw8BTJd0KfK08dzxFk8TZtj/TWs4iolYj61OV7tZZwP+zfShwE3AocAjFcqznStq9zcxFRD0G\nYV3pfgqMVUcDV9heb/unwLeAI8a7UdI8SUslLW00hxGxeVKV7j3bC4AFkHWlIwZD/w/w7qcS40PA\nrHL/28BJkmZI2g14AbC4tZxFRK1SYuyS7fslfVfSHcBXgOXAbRRNEu+x/V+tZjAiapMB3lNg+5SO\nU+9uJSMR0TN1z64jaS7wUWAG8AnbH97EfUcA3wNOtv35iZ7ZT1XpiJgm6qpKS5oBXEgxtO8A4HWS\nDtjEfecA13eTvwTGiGhYd0GxyzbGI4HVtu+2/ThwJXDCOPe9A7gK+Fk3D01gjIhmlVXpbrYu7Anc\nWzleU57bQNKewB8CF3ebxb5qY4yI6WEKPc67doxRXlAO0ZuKjwDvtT3S7WqKCYwR0agpriu91vac\nCa7fB+xdOd6rPFc1B7iyDIq7Ai+TtM72Fzf10ATGiGiYcX0T1S4BZkvajyIgngyMGd1ie7/RfUmX\nAddMFBQhgTEimmZwTXHR9jpJZwLXUQzXucT2CklnlNfnb85zExgjonF1vtViexGwqOPcuAHR9mnd\nPDOBMaIHum3kr0tTr8/NmTNRc1/3+v1d6QTGiGjUFDtfWpHAGBHNshlZn1UCIyLGSokxImIsk8AY\nEbGBnTbGiIgOxnUNZOyRBMaIaFxKjBERHUbqeyWwJxIYI6JRxVyLCYyTkvTXwK+BHYAbbH+93RxF\nRE+lKt092x9oOw8R0Xv9PlyntRm8Jf2VpLskfQfYvzx3maQTy/3DJX1L0jJJ10nava28RkS9snzq\nOCQdTjFv2qFlHm4GllWubwP8M3CC7Z9LOgn4EPDmFrIbEbUyIyPr287EhNqqSj8fuNr2IwCSFnZc\n3x84CPhaOUvJDOAn4z1I0jxgXu+yGhF1ygDvzSdghe3nTHZjuf7DAgBJ/f3bjgig/wNjW22MNwCv\nljRT0izglR3XVwG7SXoOFFVrSQc2ncmI6I20MY7D9s2SPgPcRrHO65KO64+XnTDnS3oSRT4/Aqxo\nPLMRUTNnuM6m2P4QRYfKpq7fCryguRxFRFNMBnhHRGxg55XAiIgO7bYfdiOBMSIal3elIyI6pMQY\nEdEhgTEiosoZrhMRMYaBEedd6YiIivRKR0RsJIExIqJDAmNEREXR95JxjBERFcZ5JTAiYqx+X/Ml\ngTEiGpc2xoiIMbKudETEGIOw5ktry6dGxPRV59IGkuZKWiVptaSzxrl+qqTlkm6XdKOkQyZ7ZkqM\nEdG4uiaql
TQDuBA4DlgDLJG00PbKym33AC+0/UtJx1MsnnfURM9NYIyIhhnqa2M8Elht+24ASVcC\nJwAbAqPtGyv33wTsNdlDU5WOiMa5y/+AXSUtrWyda8jvCdxbOV5TntuUtwBfmSx/KTFGRKOm2Pmy\n1vacOtKV9CKKwHj0ZPcmMEZE42rslb4P2LtyvFd5bgxJBwOfAI63ff9kDx34wFgWrTuL1xHRt2od\nx7gEmC1pP4qAeDJwSvUGSU8BvgC8wfZd3Tx04AOj7QUUvUxI6u/BUREB1NcrbXudpDOB64AZwCW2\nV0g6o7w+H/gAsAtwkSSAdZNVzwc+MEbEYKl7gLftRcCijnPzK/tvBd46lWcmMEZEw/p/zZeBGa4j\naZGkPdrOR0RsOTPS1daWgSkx2n5Z23mIiHr0+7vSAxMYI2JYuLbOl15JYIyIRmVpg4iIcaQqHRHR\nIYExImKM/h+uk8AYEY3LYlgRERU2jIysbzsbE0pgjIiGdb9sQVsSGCOicQmMEREdEhgjIjpkgHdE\nRJUzXCciYgwDIykxRkSMlap0RMQYGa4TEbGRBMaIiIq613zphQTGiGiYcZ+/ErjFa75I+qakVZJu\nLbfPV67Nk3RnuS2WdHTl2isk3SLpNkkrJb1tS/MSEYPBXf7Xls0qMUp6ArCN7YfLU6faXtpxzyuA\ntwFH214r6TDgi5KOBO6nWAv6SNtrJP0esG/5uZ1s/3Lzvk5EDIJ+r0pPqcQo6RmSzgNWAU+b5Pb3\nAu+2vRbA9s3AJ4G3A7MogvL95bXHbK8qP3eSpDsk/YWk3aaSv4gYDLa72toyaWCUtJ2k0yV9B/g4\nsBI42PYtlds+XalKn1ueOxBY1vG4pcCBtn8BLAR+KOkKSadK2go2LJR9PLAtcIOkz0uaO3p9nPzN\nk7RU0tLxrkdEfymC3khXW1u6qUr/BFgOvNX2nZu4Z6Oq9GRsv1XSM4EXA+8CjgNOK6/dC/ydpLMp\nguQlFEH1VeM8ZwFFtRxJ/V0+jwhgOKrSJwL3AV+Q9AFJ+3T57JXA4R3nDgdWjB7Yvt32P1EExddU\nbyzbIi8Czgc+C7yvy3Qjos+NjIx0tbVl0sBo+3rbJwHPBx4AviTp65L2neSj/wCcI2kXAEmHUpQI\nL5K0vaRjKvceCvywvO8lkpYDZwP/Dhxg+89tryAihsPoRBKTbS3pulfa9v3AR4GPlqW56kCkT0t6\ntNxfa/vFthdK2hO4saziPgS83vZPJM0C3iPpY8CjwMOU1WiKDplX2v7hFn2ziOhTxgzhu9K2F1f2\nj5ngvouBi8c5/xDwsk18prPDJiKGSN58iYgYRwJjRESHBMaIiDGc5VMjIqoGoY1xiyeRiIiYshqH\n65Rvxq2StFrSWeNcl6Tzy+vLy3kbJpTAGBEN63ZunckDo6QZwIUUb8gdALxO0gEdtx0PzC63eYwz\nUqZTAmNENK7Gd6WPBFbbvtv248CVwAkd95wAXO7CTcCOknaf6KFpY4yIxtX4ut+ewL2V4zXAUV3c\nsyfFPBDjGrbAuJby1cIp2rX8bBOSVtKqPT1JTaXV7VwJE7muTLsbT+yYOWtBOXFMTw1VYLS9WfM3\nSlpqe07d+UlaSauf02v6u42yPbfGx90H7F053qs8N9V7xkgbY0QMsiXAbEn7lSsLnEwx12vVQuCN\nZe/0s4EHbG+yGg1DVmKMiOnF9jpJZ1JUz2cAl9heIemM8vp8YBHF3AyrgUeA0yd7bgJjoedtFkkr\nafVhek1/t56wvYgi+FXPza/sm2JJla6p30egR0Q0LW2MEREdEhgjIjokMEZEdEhgjIjokMAYEdEh\ngTEiokMCY0REh/8P6GBHIpoo0s8AAAAASUVORK5CYII=\n", "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ 
"evaluate_and_show_attention(\"je ne crains pas de mourir .\")" ] }, { "cell_type": "code", "execution_count": 29, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "input = c est un jeune directeur plein de talent .\n", "output = he s a very young young . \n" ] }, { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXgAAAEgCAYAAAC+QGg8AAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzt3Xu4XVV97vHvm6hcFIUKKhIQygkiKLcE1DZaFINBUWq9\n4KX24C3FSqvtUaR9+mCt+lRLPVZOQYwUL0cLogVLaSoKotIiSgJySRDNAyKJtxPuYgsm+z1/zLll\nZbP3XjtrzbnmXDPvh2c+e83LmmOskPz2WGOO8RuyTUREdM+8pisQERH1SICPiOioBPiIiI5KgI+I\n6KgE+IiIjkqAj4joqAT4iIiOSoCPiOioBPhoLUnzJf1p0/WIGFcJ8NFatjcDr2m6HhHjSklVEG0m\n6SPAI4HPA/dPHrd9TWOVihgTCfDRapIun+awbT9/5JWJGDMJ8BERHfWIpisQMRtJp0533PZfj7ou\nEeMmAT7a7v6e19sDxwI3NVSXiLGSLpoYK5K2Ay6xfWTTdYlouwyTjHGzI7Cg6UpEjIN00USrSboB\nmPyaOR/YDUj/e8QcpIsmWk3SU3p2NwE/s72pqfpEe0gScCHw57bzXGYa6aKJVrN9G7An8HzbG4Cd\nJe3TcLWiHY4GDgfe3HRF2ioBPlpN0nuAdwN/Xh56FPDZ5moULfImiuD+Eknpbp5GAny03cuAl1IO\nl7T9Y2CnRmsUjZO0K3Cg7X8HLgV+t+EqtVICfLTdgy4eFBlA0qMbrk+0w+uBc8vXnyTdNNNKgI+2\nO1/Sxyn63t9C0Vo7u+E6RfPeSBHYsX01sLukPZutUvtkFE20nqSlFA/URDHJ6asNV2lsSNrO9gP9\njo0TSTsDx9v+eM+xpcBG29c2V7P2SYCPgZTDFxfavlTSDsAjbN9XQzkfsv3ufsdiepKusX1Yv2PR\nTemiia1WdpV8EZhsQS0AvlRTcUunOXZMTWV1hqQnSVoE7CDpUEmHlduRFLOBx5Kkt0haWL6WpE9K\nulfS9ZIObbp+bZOhRTGItwFHAN8GsP0DSU+osgBJbwX+CNhX0vU9p3YCrqyyrI56IXACxS/f/91z\n/D7gL5qoUEXeDnyqfP0a4CBgH+BQ4HTgOc1Uq50S4GMQD9h+sJhICOUY5Kr7+v4J+Hfgb4BTeo7f\nZ/vOisvqHNufBj4t6eW2/7np+lRok+1fla+PBT5j+w7gUkl/22C9WikBPgbxDUl/QfH1fylFS/tf\nqyzA9j3APZI+Ctw52b8v6bGSnmn721WW12EXS3otsDc9/97HOJ/+hKTdgbuAo4AP9JzboZkqtVcC\nfAziFIpZhDcAfwispL6hix8Deh8I/mKaY5WQtBvwFh4eDN9YdVkj9C/APcBqYGxHzvQ4FVhFkXju\nIttrACT9DnBLkxVro4yiiVaT9F3bh0w5dr3tg2oo60rgCopguHny+Dh3cUi60fbTm65HlcouwZ1s\n39Vz7NEU8ewXzdWsfdKCj60m6beBvwKeQvF3SBQLYf9mDcXdIulPKFrtUHQH1dVS27GDwy+vlPQM\n2zc0XZEK/QbwNkkHlvtrgDNt/6zBOrVSWvCx1SR9D/hTHt7SvaOGsp5AMTri+RQPci8D3mH75zWU\n9X7gStsrq753UySt
Bf4HcCtFF83kL+PKvwGNQtm4+CeKkTSry8OLgP8JvM72fzZUtVZKgI+tJunb\ntp/ZdD2qJuk+4NEUgfBXPBQMH9toxYYwJZ/+r5VpmMeOpKuAt06dsSrpEODjXfx7OYxMdIpBXC7p\nNEnP7plAU8vMSEn7SbpM0o3l/kGS/rKOsmzvZHue7R1sP7bcH9vgDg/Lp38b8EvG+9/9Y6dLR2D7\nuyTL6MOkBR9bTdLl0xy27efXUNY3gHdRtM4OLY9V+uBQ0v62vzfTLynb11RV1qiV+fQXA0+1vZ+k\nJwNfsP3bDVdtIJJuAn6r9wFrefw3KLrX9m+mZu2Uh6yx1Ww/b4TF7Wj7O5OTqkpVL9n3Z8By4MPT\nnDNF//+4ehnFLM9roMinL2mcW7ofAb4i6Z2Un4miD/5D5bnokQBfs45m8zt1uuM1TZ7ZKGlfHsoH\n/wrgJ1UWYHt5+XOUv7hG5UHbltSJfPq2V0j6MfA+4ECKvxdrgffbrnSyXRckwNfvWzx8Us50x8bJ\n/T2vt6eYMl7XosdvA1YA+0vaQDEa5HV1FCRpR4rW/F62l5dJrZ5q++I6yhuRqfn03wh8ouE6DaX8\n/zHO/09GJgG+JpKeBOxBmc2PYkQGwGMZ42x+ALa36MqQ9HfAJVWXI2kesNj2C8qW57w6UhL3+CTF\n0LvfKvc3AF9gjIOJ7b8r00ncCzwVOHWc8+lLOt/2q8rXW6SNlvQV20c3V7v2SYCvT282vw/zUIAf\n92x+09mR4nNWyvaEpJOB823f3/cNw9vX9vGSXlOW/0tN6fwfR2VAH9ugPsXCntdLKRZkn7TbiOvS\negnwNWkim9+oZphKuoGHskfOp/iHVVfyqkvLB2qfp6drqKaMkg+Wi5dM9lfvS035WyTtRzE794m2\nny7pIOCltt9f0f3vY/oMn+M+tn+2YX8ZEjhFAnz9Fkh6LEXL/RMUfe+n2P5KDWX9I9PMMK3BsT2v\nNwE/s131yJZJx5c/39ZzzEAdaRHeA3wZ2FPS54DfpvgWVodPUA7/BLB9vaR/AioJ8LbHeaTMbHYs\nuzznsWX3p0g2yYfJOPiaSbrO9sGSXgicCPwl8H/rWDJtlDNMJS2hWLLvk5J2pUj+dOsoyq6TpMcD\nz6IIGFfZ3lhTOVfbPlzStT3j+x+WWK3C8p5A8UAcANs/qqOcus0wB+PXOjoSamBpwddvsg/3xRSL\nE6ypsV/3ckmnARfQ07VQ9USd3skzFA8mHwV8lqLFW1UZz7f9NUm/N9152xdUWNbUX7aTwzD3krRX\nTROdah/+Wd73pRTPgJ4M/Jyi++4miiGGYycBfOskwNdvtaRLKLoUTiknmUzUVNZk631R+VPUM1Fn\nFJNnngt8DXgJxWfQlJ+VBXi2nODU+5W2rj8/GN3wz/dRfCO51Pahkp4H/H4N5YxM+ZxkP9vX9Rzb\nC9hse0NzNWufBPj6vYmiW2ZtOSpjL+AdNZX19WmO1dEHN4rJM/dJ+jPgRh4K7FDD55lsFZaB44+A\nJWU5V/BQmuJKlJ9p0krgcor+5PuBl7Pl+qlV+JXtOyTNkzTP9uWS/r7iMkZtE3CBpIN6RledTTE6\nLQG+xzgnHRqYpE9L2rlnfxdJ59RU3BnAE4Fl5f59VP+PeNIverZNZZl711DO1Mkzl1H9ik6PoUge\ntQh4K7A7RTfDidQ3SezTwNMo0hP/H+AA4DMVl7FTuS2m+Fy7ADtT3+e6W9JjgG8Cn1OxBOJYL4pR\nrsl6ITA5Hn4vYDfbqxqtWAttkw9Zex9szXasorKusX3YlIdp19k+uOqypil7O+AS20fWcO+lwOSk\nkktsX1p1GWU53wRe7IfWZN0J+Dfbz62hrLW2D+h3rKKyRvK5JH2YYrTOPIouoMcBB9t+U5XljJqk\n/YEVtp9bZhe91/bpTderbbbVLpp5knaZzEhXZqKr68/iV5Lm89DDtN2orw9+qkonIE
n6D9tLesZY\nT3abnChpArgTOM32mVWVSfHt58Ge/QfLY3W4RtKzbF8FIOmZFOt/1mFUn+t5tico/s59GoolD2so\nZ6TK7J8q5xO8GnhO03Vqo201wH8Y+JakL5T7r2TL1dmrdDrF18knSPoA8AqKPvnK1T0ByfaS8ue0\nD1TLIYZXAlUG+M8A35F0Ybn/uxSr+dRhEcUSd5NDCPcCbp78c614FaRaP5ekt1I8T9h3SkDfCRjp\nqkeSnmT7pzXc+h8pugZvmJo+OArbZBcNgKQDeGh0xNdsr62xrP2BoyhavJfZriUxl7ZcvafuCUgz\n1WF325UO9yuHMU620L453YIPFZUz7epHk1zxKkh1fi5Jj6Po3/8b4JSeU/fVNAt4trr8m+0X13Df\nHSmGlr68ri7CcbfNBviIiDYpB3ocC/x8ugVtyvkzHwVeRLEy1wn95mhsk6NoIiJa6FM8NNpuOsdQ\nJFtbSLFATd8hvNt8gJe0PGWNR1ld/Ewpa3zKqZvtb1IMVJjJcRSz4V0OBNhZ0u6z3XOb76KRtMr2\n4pTV/rK6+JlS1viUM51ly5Z548b+6YpWr169BvjvnkMrbK+Yep2kvYGLZ+iiuRj4oO3/KPcvA949\n2/j/bXUUTUTE0DZu3MiqVf1H0kr67yZ+CXUqwE9OnR/V+7pW1qJFi/pfNI299tqLxYsXb1VZq1ev\nHqisNv/5pazmyhqwnI22h14kZIS9IBuAPXv2F9AnNUOnAnwMZy4tkarUl1AzYs6GHvZqYPPEqOYt\nchFwkqTzKBIL3tNvSHICfETEwIwryn8n6VzgSGBXSespFqB5JIDtsyiS070IWEcxTPIN/e6ZAB8R\nMSjDREU9NLZf0+e82XJls74S4CMihtDmkYgJ8BERAzIwkQAfEdFNacFHRHSQ7VGOotlqCfAREUNI\nCz4ioqOqGiZZh8aTjUnaW9KNTdcjImJrFQ9Z+29NSQs+ImIIbe6iabwFX5ov6ROS1kj6iqQdJO0r\n6cuSVku6olwVKSKiPcqHrP22prQlwC8EzrB9IHA38HJgBfDHthcB72SGdT4lLZe0StLoEqlERFB0\n0djuuzWlLV00t9r+bvl6NbA38FvAF3qSUm033RvLnMorYLRZ8iIiIBOd5uKBntebgScCd9s+pKH6\nRETMSfrgt969wK2SXgnFYrOSDm64ThERU3hO/zWlrQEe4HXAmyRdB6yhWI8wIqI1PIchktv0MEnb\nPwSe3rP/dz2nZ1thPCKicRNJVRAR0T3JJhkR0WFtfsiaAB8RMSg7LfiIiK5KCz4iooMMbE6Aj4jo\nprTgIyI6KgE+xoLU5nlvMdWoshTOn5e/FzNxHrJGRHRXWvARER2VAB8R0UHFKJqkKoiI6KQmk4n1\nkwAfETGohlds6icBPiJiQJNL9rVVAnxExBAyTDIioqPSgo+I6CDbI5twNogE+IiIITS55mo/CfAR\nEUPIMMmIiA5q+yiaVmcRkvRoSf8m6TpJN0o6vuk6RUT0cjkWfrZtLiQtk3SzpHWSTpnm/OMk/WsZ\nD9dIekO/e7a9Bb8M+LHtF0PxAadeIGk5sHzUFYuIoKKHrJLmA2cAS4H1wNWSLrK9tueytwFrbb9E\n0m7AzZI+Z/vBme7b6hY8cAOwVNKHJD3H9j1TL7C9wvZi24sbqF9EbMMmu2gqaMEfAayzfUsZsM8D\njpumuJ0kCXgMcCewababtjrA2/4+cBhFoH+/pFMbrlJExBYmypzws23ArpJW9WxTex32AG7v2V9f\nHuv1D8DTgB9TxMS327NnOmt1F42kJwN32v6spLuBNzddp4iIXnMcJrmxgl6GFwLfBZ4P7At8VdIV\ntu+d6Q2tDvDAM4DTJE0AvwLe2nB9IiK2UNEgmg3Anj37C8pjvd4AfNBFn886SbcC+wPfmemmrQ7w\nti8BLmm6HhER0zGV5aK5GlgoaR+KwP5q4LVTrv
kRcBRwhaQnAk8Fbpntpq0O8BERrVbRKBrbmySd\nRNGgnQ+cY3uNpBPL82cB7wM+JekGQMC7bW+c7b4J8BERA6pyopPtlcDKKcfO6nn9Y+DorblnAnxE\nxBDaPJM1AT4iYgjJBx8R0UlONsmIiC6yKxsmWYsE+IiIIWTBjxgLfWY9V6pIpxHDmD+v1ZlGtgkV\njoOvRQJ8RMQQMoomIqKLtiLfexMS4CMihpEAHxHRTRObE+AjIjqnGCaZAB8R0UkJ8BERnZSHrBER\nneWJBPiIiM5JH3xERIc5qQoiIrqpxQ34BPiIiIHZre6Db3W2IklfkrRa0hpJy5uuT0TEVC7TFcy2\nNaXtLfg32r5T0g7A1ZL+2fYdvReUgT/BPyJGrso1WevQ9gD/J5JeVr7eE1gIbBHgba8AVgBIau+f\ndER0UgL8ACQdCbwAeLbtX0r6OrB9o5WKiOhl480ZRTOIxwF3lcF9f+BZTVcoImKqNrfg2/yQ9cvA\nIyTdBHwQuKrh+kREPMzkuqyzbU1pbQve9gPAMU3XIyJiJnnIGhHRVUlVEBHRVWYiD1kjIropLfiI\niA5KNsmIiC5LgI+I6Ca3tws+AT4iYhjpoomxIKnpKkQLjTKAjd3fQZuJLPgREdE9bZ/o1OZUBRER\n7eZi0e1+21xIWibpZknrJJ0ywzVHSvpuuUbGN/rdMy34iIhhVNCClzQfOANYCqynWP/iIttre67Z\nGTgTWGb7R5Ke0O++acFHRAys/2pOc+zCOQJYZ/sW2w8C5wHHTbnmtcAFtn8EYPvn/W6aAB8RMYSJ\nCffdgF0lrerZpq5Ctwdwe8/++vJYr/2AXSR9vVzK9A/61S1dNBERA3LZBz8HG20vHrK4RwCLgKOA\nHYBvSbrK9vdne0NERAyoolE0GyiWJZ20oDzWaz1wh+37gfslfRM4GJgxwKeLJiJiCBX1wV8NLJS0\nj6RHAa8GLppyzb8ASyQ9QtKOwDOBm2a7aVrwEREDm3MAn/0u9iZJJwGXAPOBc2yvkXRief4s2zdJ\n+jJwPTABnG37xtnumwAfETGoCrNJ2l4JrJxy7Kwp+6cBp831nq0L8CrmKstucwqfiIhyJuvm9s5k\nrS3AS/ogcLvtM8r9vwJ+AQh4FbAdcKHt90jam+KrybcpnhKfL2kX2+8o3/sW4ADbf1pXfSMiBrGt\npir4PEUgn/Qq4P8BCykG9R8CLJL03PL8QuBM2wcCHwZeIumR5bk3AOfUWNeIiK03hwesTf4CqK0F\nb/taSU+Q9GRgN+Au4BnA0cC15WWPoQjsPwJus31V+d5fSPoacKykm4BH2r5hunLKCQNTJw1ERIzE\nXHPNNKHuPvgvAK8AnkTRon8K8De2P957UdlFc/+U954N/AXwPeCTMxVgewWworxPe/+kI6KT2txF\nU3eA/zzwCWBX4HcoWvDvk/S5spW+B/Cr6d5o+9uS9gQOAw6quZ4REVut7emCaw3w5TjOnYANtn8C\n/ETS0yim2ELx0PX3gc0z3OJ84BDbd9VZz4iIgdh4W17ww/Yzpux/FPjoNJc+fZpjS4CP1FGviIgq\ntHlAdytTFUjaWdL3gf+yfVnT9YmImMk2OYpmGLbvpkiNGRHRXhXOZK1DKwN8RMQ42KYfskZEdJuZ\n2NzeTvgE+IiIQaWLJiKiwxLgIyK6qcXxPQE+ImJQecgaEdFVc190uxEJ8BERAzMT23KqgoiILksX\nTUREVyXAR0R0j9MHHxHRXS1uwCfAR0QMrtlskf0kwEdEDMpkFE1ERBeZ9MFHRHRWumgiIjrJrX7K\nmgAfETGolqcLHmhNVkl/LekdPfsfkPR2SadJulHSDZKOL88dKeninmv/QdIJ5esfSnqvpGvK9+xf\nHt9N0lclrZF0tqTbJO061CeNiKjBxGb33Zoy6KLb5wB/ACBpHvBqYD1wCHAw8ALgNEm7z+FeG20f\nBnwMeGd57D
3A12wfCHwR2GumN0taLmmVpFUDfpaIiIFMZpPs1KLbtn8o6Q5JhwJPBK4FlgDn2t4M\n/EzSN4DDgXv73O6C8udq4PfK10uAl5VlfVnSXbPUZQWwAkBSe78rRUT3tLyLZpg++LOBE4AnUbTo\nl85w3Sa2/Kaw/ZTzD5Q/Nw9Zn4iIEWv3RKdBu2gALgSWUbTSLwGuAI6XNF/SbsBzge8AtwEHSNpO\n0s7AUXO4938CrwKQdDSwyxD1jIioTee6aABsPyjpcuBu25slXQg8G7iOomvqZNs/BZB0PnAjcCtF\nd04/7wXOlfR64FvAT4H7Bq1rRERdOjnRqXy4+izglQAufk29q9y2YPtk4ORpju/d83oVcGS5ew/w\nQtubJD0bONz2A1PfHxHRpLZnkxx0mOQBwDrgMts/qLZKQDFq5mpJ1wGnA2+poYyIiKFV1UUjaZmk\nmyWtk3TKLNcdLmmTpFf0u+ego2jWAr85yHvneP8fAIfWdf+IiGpU08cuaT5wBsVglfUUDdyLylg7\n9boPAV+Zy32HecgaEbFtK7to+m1zcASwzvYtth8EzgOOm+a6Pwb+Gfj5XG6aAB8RMYQ5dtHsOjkh\ns9yWT7nNHsDtPfvry2O/JmkPivlBH5tr3TLuPCJiQJMzWedgo+3FQxb398C7bU9ImtMbEuAjIgZm\nXM2CHxuAPXv2F5THei0GziuD+67AiyRtsv2lmW6aAB8RMSiDq1nQ6WpgoaR9KAL7q4HXblGUvc/k\na0mfAi6eLbhDAnxE9LFgwX4jK+vG22/vf1FFnr7nnv0vmoMqRtGUc35OosgKMB84x/YaSSeW588a\n5L4J8BERQ6gqFYHtlcDKKcemDey2T5jLPRPgIyIGtBUPWRuRAB8RMSibic3VdMLXIQE+ImIYacFH\nRHSTSYCPiOgcd3hFp4iIbZxxRQPh65AAHxExhLTgIyI6aqKaVAW1SICPiBhQkS0yAT4iopvSRRMR\n0U1tHiY56Jqsfy3pHT37H5D0dkmnSbpR0g2Sji/PHSnp4p5r/0HSCeXrH0p6r6RryvfsXx7fTdJX\nJa2RdLak2yTtOtQnjYioQVVrstZh0BWdzgH+AEDSPIrUluuBQ4CDgRcAp0nafQ732mj7MIpVSt5Z\nHnsP8DXbBwJfpFiEe1qSlk+ukjLgZ4mIGJCZmNjcd2vKoItu/1DSHZIOBZ4IXAssAc61vRn4maRv\nAIcD9/a53QXlz9XA75Wvl1AsTYXtL0u6a5a6rABWAEhq73eliOicLk90Ohs4AXgSRYt+6QzXbWLL\nbwrbTzn/QPlz85D1iYgYuTYH+GEW3b4QWEbRSr8EuAI4XtJ8SbsBzwW+A9wGHCBpO0k7A0fN4d7/\nCbwKQNLRwC5D1DMiojZt7oMfuMVs+0FJlwN3294s6ULg2cB1FGmST7b9UwBJ5wM3ArdSdOf0817g\nXEmvB74F/BS4b9C6RkTUw90cJlk+XH0W8EoAF7+m3lVuW7B9MnDyNMf37nm9Cjiy3L0HeGG5jNWz\ngcNtPzD1/RERTTMdm+gk6QDgYuBC2z+otkpAMWrm/PKXyIPAW2ooIyJiKHYHUxXYXgv8ZsV16b3/\nD4BD67p/REQ1mu1j7yejViIihpBcNBERHZUWfERERyXAR0R0kTs6TDIiYltnYMLN5ZrpJwE+Ima1\nYUMdI6Gnd+CCBSMrqxoZRRMR0VkJ8BERHZUAHxHRQcUz1oyDj4joIOOupSqIiIhCm9dkTYCPiBhC\n+uAjIjrJ6YOPiOiitq/JOsySfRER27yqluyTtEzSzZLWSTplmvOvk3S9pBskXSnp4H73TAs+ImII\nVSz4IWk+cAawFFgPXC3ponLtjUm3Ar9j+y5JxwArgGfOdt8E+IiIgRmq6YM/Alhn+xYASecBxwG/\nDvC2r+y5/iqgb16HdNFERAzBc/gP2FXSqp5t+ZTb7AHc3rO/vjw2kzcB/96v
bmnBR0QMaCsesm60\nvbiKMiU9jyLAL+l37dgH+PI34dTfhhERI1HRKJoNwJ49+wvKY1uQdBBwNnCM7Tv63XTsA7ztFRQP\nG5DU3vFKEdFBlY2DvxpYKGkfisD+auC1vRdI2gu4AHi97e/P5aZjH+AjIppUxSga25sknQRcAswH\nzrG9RtKJ5fmzgFOBxwNnSgLY1K/bJwE+ImJAVU50sr0SWDnl2Fk9r98MvHlr7jk2o2gkrZT05Kbr\nERHxED+0LutsW0PGpgVv+0VN1yEiYiqTXDQREZ3U5lw0CfAREQNzJQ9Z65IAHxExoCzZFxHRYemi\niYjoqAT4iIhOanYYZD8J8BERQ8ii2zGUUX0FLKc/RzRm3P4O2jAxsbnpaswoAT4iYmBzX5KvCQnw\nERFDSICPiOioBPiIiI7KRKeIiC5qOFtkPwnwEREDMjCRFnxERDeliyYiopMyTDIiorMS4CMiOqjK\nNVnrMPSarJK+LulmSd8tty/2nFsu6Xvl9h1JS3rOHSvpWknXSVor6Q+HrUtExGgZT2zuuzVloBa8\npEcBj7R9f3nodbZXTbnmWOAPgSW2N0o6DPiSpCOAO4AVwBG210vaDti7fN8utu8a7ONERIxWm5ON\nbVULXtLTJH0YuBnYr8/l7wbeZXsjgO1rgE8DbwN2ovjlckd57gHbN5fvO17SjZL+l6TdtqZ+ERGj\nZrvv1pS+AV7SoyW9QdJ/AJ8A1gIH2b6257LP9XTRnFYeOxBYPeV2q4ADbd8JXATcJulcSa+TNA/A\n9lnAMcCOwDclfVHSssnz09RvuaRVklZNdz4iok5tDvBz6aL5CXA98Gbb35vhmod10fRj+82SngG8\nAHgnsBQ4oTx3O/A+Se+nCPbnUPxyeOk091lB0d2DpPZ+V4qIzikCeHvHwc+li+YVwAbgAkmnSnrK\nHO+9Flg05dgiYM3kju0bbH+EIri/vPfCsq/+TOB04Hzgz+dYbkTEyLS5Bd83wNv+iu3jgecA9wD/\nIulSSXv3eevfAh+S9HgASYdQtNDPlPQYSUf2XHsIcFt53dGSrgfeD1wOHGD7HbbXEBHRMhMTE323\npsx5FI3tO4CPAh8tW9e9Y38+J+m/ytcbbb/A9kWS9gCuLLtO7gN+3/ZPJO0EnCzp48B/AfdTds9Q\nPHh9ie3bhvpkERGj0OJx8GrzIP2t1dU++CzZF1GL1bYXD3OD+fPne/vtH933ul/+8r6hyxpEZrJG\nRAyo7TNZE+AjIoaQAB8R0VEJ8BERnWQmGsw1008CfETEgNIHHxHRZS0O8EOnC46I2HZ5Tv/NRZlz\n62ZJ6ySdMs15STq9PH99maF3Vl1rwW+knBG7FXYt3zcKA5U14Pj01n+uFpeTssarrEHLmWvalVlV\nkYtG0nzgDIq0LeuBqyVdZHttz2XHAAvL7ZnAx8qfM+pUgLe91emFJa0a1QSElDUe5aSs8SprlJ9p\nOhWlIjgCWGf7FgBJ5wHHUeT0mnQc8BkXnf5XSdpZ0u62fzLTTTsV4CMiRuwSim8Q/Ww/JaX5ijIT\n7qQ9gNt79tfz8Nb5dNfsQZHxd1oJ8BERA7K9rOk6zCYPWctc8ilrLMrq4mdKWeNTTp02AHv27C8o\nj23tNVskp3IgAAAAfklEQVToVLKxiIhxJOkRwPeBoyiC9tXAa3vTpEt6MXAS8CKK7pvTbR8x233T\nRRMR0TDbmySdRNGnPx84x/YaSSeW588CVlIE93XAL4E39LtvWvARER2VPviIiI5KgI+I6KgE+IiI\njkqAj4joqAT4iIiOSoCPiOioBPiIiI76/wGfn0dB87K5AAAAAElFTkSuQmCC\n", "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "evaluate_and_show_attention(\"c 
est un jeune directeur plein de talent .\")" ] }, { "cell_type": "markdown", "metadata": { "collapsed": true }, "source": [ "# Exercises\n", "\n", "* Try with a different dataset\n", " * Another language pair\n", " * Human → Machine (e.g. IoT commands)\n", " * Chat → Response\n", " * Question → Answer\n", "* Replace the embeddings with pre-trained word embeddings such as word2vec or GloVe\n", "* Try with more layers, more hidden units, and more sentences. Compare the training time and results.\n", "* If you use a translation file where pairs have two of the same phrase (`I am test \\t I am test`), you can use this as an autoencoder. Try this:\n", " * Train as an autoencoder\n", " * Save only the Encoder network\n", " * Train a new Decoder for translation from there" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [] } ], "metadata": { "anaconda-cloud": {}, "kernelspec": { "display_name": "Python [default]", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.5.2" } }, "nbformat": 4, "nbformat_minor": 1 }