Repository: spro/practical-pytorch
Branch: master
Commit: c520c52e68e9
Files: 50
Total size: 991.7 KB
Directory structure:
gitextract_lif5hpq3/
├── .gitignore
├── LICENSE
├── README.md
├── char-rnn-classification/
│   ├── .gitignore
│   ├── char-rnn-classification.ipynb
│   ├── data.py
│   ├── model.py
│   ├── predict.py
│   ├── server.py
│   └── train.py
├── char-rnn-generation/
│   ├── README.md
│   ├── char-rnn-generation.ipynb
│   ├── generate.py
│   ├── helpers.py
│   ├── model.py
│   └── train.py
├── conditional-char-rnn/
│   ├── conditional-char-rnn.ipynb
│   ├── data.py
│   ├── generate.py
│   ├── model.py
│   └── train.py
├── data/
│   └── names/
│       ├── Arabic.txt
│       ├── Chinese.txt
│       ├── Czech.txt
│       ├── Dutch.txt
│       ├── English.txt
│       ├── French.txt
│       ├── German.txt
│       ├── Greek.txt
│       ├── Irish.txt
│       ├── Italian.txt
│       ├── Japanese.txt
│       ├── Korean.txt
│       ├── Polish.txt
│       ├── Portuguese.txt
│       ├── Russian.txt
│       ├── Scottish.txt
│       ├── Spanish.txt
│       └── Vietnamese.txt
├── glove-word-vectors/
│   └── glove-word-vectors.ipynb
├── reinforce-gridworld/
│   ├── helpers.py
│   ├── reinforce-gridworld.ipynb
│   └── reinforce-gridworld.py
└── seq2seq-translation/
    ├── images/
    │   ├── attention-decoder-network.dot
    │   ├── decoder-network.dot
    │   └── encoder-network.dot
    ├── masked_cross_entropy.py
    ├── seq2seq-translation-batched.ipynb
    ├── seq2seq-translation-batched.py
    └── seq2seq-translation.ipynb
================================================
FILE CONTENTS
================================================
================================================
FILE: .gitignore
================================================
*.swp
*.swo
*.pt
.ipynb_checkpoints
__pycache__
data/eng-*.txt
*.csv
================================================
FILE: LICENSE
================================================
The MIT License (MIT)
Copyright (c) 2017 Sean Robertson
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
================================================
FILE: README.md
================================================
**These tutorials have been merged into [the official PyTorch tutorials](https://github.com/pytorch/tutorials). Please go there for better maintained versions of these tutorials compatible with newer versions of PyTorch.**
---

Learn PyTorch with project-based tutorials. These tutorials demonstrate modern techniques with readable code and use real data from the internet.
## Tutorials
#### Series 1: RNNs for NLP
Applying recurrent neural networks to natural language tasks, from classification to generation.
* [Classifying Names with a Character-Level RNN](https://github.com/spro/practical-pytorch/blob/master/char-rnn-classification/char-rnn-classification.ipynb)
* [Generating Shakespeare with a Character-Level RNN](https://github.com/spro/practical-pytorch/blob/master/char-rnn-generation/char-rnn-generation.ipynb)
* [Generating Names with a Conditional Character-Level RNN](https://github.com/spro/practical-pytorch/blob/master/conditional-char-rnn/conditional-char-rnn.ipynb)
* [Translation with a Sequence to Sequence Network and Attention](https://github.com/spro/practical-pytorch/blob/master/seq2seq-translation/seq2seq-translation.ipynb)
* [Exploring Word Vectors with GloVe](https://github.com/spro/practical-pytorch/blob/master/glove-word-vectors/glove-word-vectors.ipynb)
* *WIP* Sentiment Analysis with a Word-Level RNN and GloVe Embeddings
#### Series 2: RNNs for timeseries data
* *WIP* Predicting discrete events with an RNN
## Get Started
The quickest way to run these on a fresh Linux or Mac machine is to install [Anaconda](https://www.continuum.io/anaconda-overview):
```
curl -LO https://repo.continuum.io/archive/Anaconda3-4.3.0-Linux-x86_64.sh
bash Anaconda3-4.3.0-Linux-x86_64.sh
```
Then install PyTorch:
```
conda install pytorch -c soumith
```
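To confirm the install worked, a quick sanity check from a Python prompt (the version string and Tensor values will vary):

```python
# Quick sanity check that PyTorch imports and Tensors work
import torch

print(torch.__version__)  # version string varies by install

x = torch.rand(3, 4)      # a random 3x4 Tensor
print(x.size())
```

If this prints a version and `torch.Size([3, 4])`, PyTorch is ready.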
Then clone this repo and start Jupyter Notebook:
```
git clone http://github.com/spro/practical-pytorch
cd practical-pytorch
jupyter notebook
```
## Recommended Reading
### PyTorch basics
* http://pytorch.org/ for installation instructions
* [Official PyTorch tutorials](http://pytorch.org/tutorials/) for more tutorials (some of these tutorials are included there)
* [Deep Learning with PyTorch: A 60-minute Blitz](http://pytorch.org/tutorials/beginner/deep_learning_60min_blitz.html) to get started with PyTorch in general
* [Introduction to PyTorch for former Torchies](https://github.com/pytorch/tutorials/blob/master/Introduction%20to%20PyTorch%20for%20former%20Torchies.ipynb) if you are a former Lua Torch user
* [jcjohnson's PyTorch examples](https://github.com/jcjohnson/pytorch-examples) for a more in-depth overview (including custom modules and autograd functions)
### Recurrent Neural Networks
* [The Unreasonable Effectiveness of Recurrent Neural Networks](http://karpathy.github.io/2015/05/21/rnn-effectiveness/) shows a bunch of real-life examples
* [Deep Learning, NLP, and Representations](http://colah.github.io/posts/2014-07-NLP-RNNs-Representations/) for an overview on word embeddings and RNNs for NLP
* [Understanding LSTM Networks](http://colah.github.io/posts/2015-08-Understanding-LSTMs/) is about how LSTMs work specifically, but is also informative about RNNs in general
### Machine translation
* [Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation](http://arxiv.org/abs/1406.1078)
* [Sequence to Sequence Learning with Neural Networks](http://arxiv.org/abs/1409.3215)
### Attention models
* [Neural Machine Translation by Jointly Learning to Align and Translate](https://arxiv.org/abs/1409.0473)
* [Effective Approaches to Attention-based Neural Machine Translation](https://arxiv.org/abs/1508.04025)
### Other RNN uses
* [A Neural Conversational Model](http://arxiv.org/abs/1506.05869)
### Other PyTorch tutorials
* [Deep Learning For NLP In PyTorch](https://github.com/rguthrie3/DeepLearningForNLPInPytorch)
## Feedback
If you have ideas or find mistakes [please leave a note](https://github.com/spro/practical-pytorch/issues/new).
================================================
FILE: char-rnn-classification/.gitignore
================================================
*.pt
*.swp
*.swo
__pycache__
.ipynb_checkpoints
================================================
FILE: char-rnn-classification/char-rnn-classification.ipynb
================================================
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"\n",
"\n",
"# Practical PyTorch: Classifying Names with a Character-Level RNN\n",
"\n",
"We will be building and training a basic character-level RNN to classify words. A character-level RNN reads words as a series of characters - outputting a prediction and \"hidden state\" at each step, feeding its previous hidden state into each next step. We take the final prediction to be the output, i.e. which class the word belongs to.\n",
"\n",
"Specifically, we'll train on a few thousand surnames from 18 languages of origin, and predict which language a name is from based on the spelling:\n",
"\n",
"```\n",
"$ python predict.py Hinton\n",
"(-0.47) Scottish\n",
"(-1.52) English\n",
"(-3.57) Irish\n",
"\n",
"$ python predict.py Schmidhuber\n",
"(-0.19) German\n",
"(-2.48) Czech\n",
"(-2.68) Dutch\n",
"```"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Recommended Reading\n",
"\n",
"I assume you have at least installed PyTorch, know Python, and understand Tensors:\n",
"\n",
"* http://pytorch.org/ For installation instructions\n",
"* [Deep Learning with PyTorch: A 60-minute Blitz](http://pytorch.org/tutorials/beginner/deep_learning_60min_blitz.html) to get started with PyTorch in general\n",
"* [jcjohnson's PyTorch examples](https://github.com/jcjohnson/pytorch-examples) for an in-depth overview\n",
"* [Introduction to PyTorch for former Torchies](https://github.com/pytorch/tutorials/blob/master/Introduction%20to%20PyTorch%20for%20former%20Torchies.ipynb) if you are a former Lua Torch user\n",
"\n",
"It would also be useful to know about RNNs and how they work:\n",
"\n",
"* [The Unreasonable Effectiveness of Recurrent Neural Networks](http://karpathy.github.io/2015/05/21/rnn-effectiveness/) shows a bunch of real-life examples\n",
"* [Understanding LSTM Networks](http://colah.github.io/posts/2015-08-Understanding-LSTMs/) is about LSTMs specifically, but is also informative about RNNs in general"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Preparing the Data\n",
"\n",
"Included in the `data/names` directory are 18 text files named as \"[Language].txt\". Each file contains a bunch of names, one name per line, mostly romanized (but we still need to convert from Unicode to ASCII).\n",
"\n",
"We'll end up with a dictionary of lists of names per language, `{language: [names ...]}`. The generic variables \"category\" and \"line\" (for language and name in our case) are used for later extensibility."
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {
"collapsed": false,
"scrolled": true
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"['../data/names/Arabic.txt', '../data/names/Chinese.txt', '../data/names/Czech.txt', '../data/names/Dutch.txt', '../data/names/English.txt', '../data/names/French.txt', '../data/names/German.txt', '../data/names/Greek.txt', '../data/names/Irish.txt', '../data/names/Italian.txt', '../data/names/Japanese.txt', '../data/names/Korean.txt', '../data/names/Polish.txt', '../data/names/Portuguese.txt', '../data/names/Russian.txt', '../data/names/Scottish.txt', '../data/names/Spanish.txt', '../data/names/Vietnamese.txt']\n"
]
}
],
"source": [
"import glob\n",
"\n",
"all_filenames = glob.glob('../data/names/*.txt')\n",
"print(all_filenames)"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {
"collapsed": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Slusarski\n"
]
}
],
"source": [
"import unicodedata\n",
"import string\n",
"\n",
"all_letters = string.ascii_letters + \" .,;'\"\n",
"n_letters = len(all_letters)\n",
"\n",
"# Turn a Unicode string to plain ASCII, thanks to http://stackoverflow.com/a/518232/2809427\n",
"def unicode_to_ascii(s):\n",
" return ''.join(\n",
" c for c in unicodedata.normalize('NFD', s)\n",
" if unicodedata.category(c) != 'Mn'\n",
" and c in all_letters\n",
" )\n",
"\n",
"print(unicode_to_ascii('Ślusàrski'))"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {
"collapsed": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"n_categories = 18\n"
]
}
],
"source": [
"# Build the category_lines dictionary, a list of names per language\n",
"category_lines = {}\n",
"all_categories = []\n",
"\n",
"# Read a file and split into lines\n",
"def readLines(filename):\n",
" lines = open(filename).read().strip().split('\\n')\n",
" return [unicode_to_ascii(line) for line in lines]\n",
"\n",
"for filename in all_filenames:\n",
" category = filename.split('/')[-1].split('.')[0]\n",
" all_categories.append(category)\n",
" lines = readLines(filename)\n",
" category_lines[category] = lines\n",
"\n",
"n_categories = len(all_categories)\n",
"print('n_categories =', n_categories)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now we have `category_lines`, a dictionary mapping each category (language) to a list of lines (names). We also kept track of `all_categories` (just a list of languages) and `n_categories` for later reference."
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {
"collapsed": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"['Abandonato', 'Abatangelo', 'Abatantuono', 'Abate', 'Abategiovanni']\n"
]
}
],
"source": [
"print(category_lines['Italian'][:5])"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Turning Names into Tensors\n",
"\n",
"Now that we have all the names organized, we need to turn them into Tensors to make any use of them.\n",
"\n",
"To represent a single letter, we use a \"one-hot vector\" of size `<1 x n_letters>`. A one-hot vector is filled with 0s except for a 1 at index of the current letter, e.g. `\"b\" = <0 1 0 0 0 ...>`.\n",
"\n",
"To make a word we join a bunch of those into a 2D matrix `<line_length x 1 x n_letters>`.\n",
"\n",
"That extra 1 dimension is because PyTorch assumes everything is in batches - we're just using a batch size of 1 here."
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"import torch\n",
"\n",
"# Just for demonstration, turn a letter into a <1 x n_letters> Tensor\n",
"def letter_to_tensor(letter):\n",
" tensor = torch.zeros(1, n_letters)\n",
" letter_index = all_letters.find(letter)\n",
" tensor[0][letter_index] = 1\n",
" return tensor\n",
"\n",
"# Turn a line into a <line_length x 1 x n_letters>,\n",
"# or an array of one-hot letter vectors\n",
"def line_to_tensor(line):\n",
" tensor = torch.zeros(len(line), 1, n_letters)\n",
" for li, letter in enumerate(line):\n",
" letter_index = all_letters.find(letter)\n",
" tensor[li][0][letter_index] = 1\n",
" return tensor"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {
"collapsed": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"\n",
"Columns 0 to 12 \n",
" 0 0 0 0 0 0 0 0 0 0 0 0 0\n",
"\n",
"Columns 13 to 25 \n",
" 0 0 0 0 0 0 0 0 0 0 0 0 0\n",
"\n",
"Columns 26 to 38 \n",
" 0 0 0 0 0 0 0 0 0 1 0 0 0\n",
"\n",
"Columns 39 to 51 \n",
" 0 0 0 0 0 0 0 0 0 0 0 0 0\n",
"\n",
"Columns 52 to 56 \n",
" 0 0 0 0 0\n",
"[torch.FloatTensor of size 1x57]\n",
"\n"
]
}
],
"source": [
"print(letter_to_tensor('J'))"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {
"collapsed": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"torch.Size([5, 1, 57])\n"
]
}
],
"source": [
"print(line_to_tensor('Jones').size())"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Creating the Network\n",
"\n",
"Before autograd, creating a recurrent neural network in Torch involved cloning the parameters of a layer over several timesteps. The layers held hidden state and gradients, which are now entirely handled by the graph itself. This means you can implement an RNN in a very \"pure\" way, as regular feed-forward layers.\n",
"\n",
"This RNN module (mostly copied from [the PyTorch for Torch users tutorial](https://github.com/pytorch/tutorials/blob/master/Introduction%20to%20PyTorch%20for%20former%20Torchies.ipynb)) is just 2 linear layers which operate on an input and hidden state, with a LogSoftmax layer after the output.\n",
"\n",
""
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"import torch.nn as nn\n",
"from torch.autograd import Variable\n",
"\n",
"class RNN(nn.Module):\n",
" def __init__(self, input_size, hidden_size, output_size):\n",
" super(RNN, self).__init__()\n",
" \n",
" self.input_size = input_size\n",
" self.hidden_size = hidden_size\n",
" self.output_size = output_size\n",
" \n",
" self.i2h = nn.Linear(input_size + hidden_size, hidden_size)\n",
" self.i2o = nn.Linear(input_size + hidden_size, output_size)\n",
" self.softmax = nn.LogSoftmax()\n",
" \n",
" def forward(self, input, hidden):\n",
" combined = torch.cat((input, hidden), 1)\n",
" hidden = self.i2h(combined)\n",
" output = self.i2o(combined)\n",
" output = self.softmax(output)\n",
" return output, hidden\n",
"\n",
" def init_hidden(self):\n",
" return Variable(torch.zeros(1, self.hidden_size))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Manually testing the network\n",
"\n",
"With our custom `RNN` class defined, we can create a new instance:"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {
"collapsed": true,
"scrolled": true
},
"outputs": [],
"source": [
"n_hidden = 128\n",
"rnn = RNN(n_letters, n_hidden, n_categories)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"To run a step of this network we need to pass an input (in our case, the Tensor for the current letter) and a previous hidden state (which we initialize as zeros at first). We'll get back the output (probability of each language) and a next hidden state (which we keep for the next step).\n",
"\n",
"Remember that PyTorch modules operate on Variables rather than straight up Tensors."
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {
"collapsed": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"output.size = torch.Size([1, 18])\n"
]
}
],
"source": [
"input = Variable(letter_to_tensor('A'))\n",
"hidden = rnn.init_hidden()\n",
"\n",
"output, next_hidden = rnn(input, hidden)\n",
"print('output.size =', output.size())"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"For the sake of efficiency we don't want to be creating a new Tensor for every step, so we will use `line_to_tensor` instead of `letter_to_tensor` and use slices. This could be further optimized by pre-computing batches of Tensors."
]
},
{
"cell_type": "code",
"execution_count": 11,
"metadata": {
"collapsed": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Variable containing:\n",
"\n",
"Columns 0 to 9 \n",
"-2.8658 -2.8801 -2.7945 -2.9082 -2.8309 -2.9718 -2.9366 -2.9416 -2.7900 -2.8467\n",
"\n",
"Columns 10 to 17 \n",
"-2.9495 -2.9496 -2.8707 -2.8984 -2.8147 -2.9442 -2.9257 -2.9363\n",
"[torch.FloatTensor of size 1x18]\n",
"\n"
]
}
],
"source": [
"input = Variable(line_to_tensor('Albert'))\n",
"hidden = Variable(torch.zeros(1, n_hidden))\n",
"\n",
"output, next_hidden = rnn(input[0], hidden)\n",
"print(output)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"As you can see, the output is a `<1 x n_categories>` Tensor, where every item is the log-probability of that category (higher is more likely)."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Preparing for Training\n",
"\n",
"Before going into training we should make a few helper functions. The first is to interpret the output of the network, which we know to be a likelihood of each category. We can use `Tensor.topk` to get the index of the greatest value:"
]
},
{
"cell_type": "code",
"execution_count": 12,
"metadata": {
"collapsed": false,
"scrolled": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"('Irish', 8)\n"
]
}
],
"source": [
"def category_from_output(output):\n",
" top_n, top_i = output.data.topk(1) # Tensor out of Variable with .data\n",
" category_i = top_i[0][0]\n",
" return all_categories[category_i], category_i\n",
"\n",
"print(category_from_output(output))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We will also want a quick way to get a training example (a name and its language):"
]
},
{
"cell_type": "code",
"execution_count": 13,
"metadata": {
"collapsed": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"category = Italian / line = Campana\n",
"category = Korean / line = Koo\n",
"category = Irish / line = Mochan\n",
"category = Japanese / line = Kitabatake\n",
"category = Vietnamese / line = an\n",
"category = Korean / line = Kwak\n",
"category = Portuguese / line = Campos\n",
"category = Vietnamese / line = Chung\n",
"category = Japanese / line = Ise\n",
"category = Dutch / line = Romijn\n"
]
}
],
"source": [
"import random\n",
"\n",
"def random_training_pair(): \n",
" category = random.choice(all_categories)\n",
" line = random.choice(category_lines[category])\n",
" category_tensor = Variable(torch.LongTensor([all_categories.index(category)]))\n",
" line_tensor = Variable(line_to_tensor(line))\n",
" return category, line, category_tensor, line_tensor\n",
"\n",
"for i in range(10):\n",
" category, line, category_tensor, line_tensor = random_training_pair()\n",
" print('category =', category, '/ line =', line)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Training the Network\n",
"\n",
"Now all it takes to train this network is show it a bunch of examples, have it make guesses, and tell it if it's wrong.\n",
"\n",
"For the loss function, [`nn.NLLLoss`](http://pytorch.org/docs/nn.html#nllloss) is appropriate, since the last layer of the RNN is `nn.LogSoftmax`."
]
},
{
"cell_type": "code",
"execution_count": 14,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"criterion = nn.NLLLoss()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We will also create an \"optimizer\" which updates the parameters of our model according to its gradients. We will use the vanilla SGD algorithm with a low learning rate."
]
},
{
"cell_type": "code",
"execution_count": 15,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"learning_rate = 0.005 # If you set this too high, it might explode. If too low, it might not learn\n",
"optimizer = torch.optim.SGD(rnn.parameters(), lr=learning_rate)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Each loop of training will:\n",
"\n",
"* Create input and target tensors\n",
"* Create a zeroed initial hidden state\n",
"* Read each letter in and\n",
" * Keep hidden state for next letter\n",
"* Compare final output to target\n",
"* Back-propagate\n",
"* Return the output and loss"
]
},
{
"cell_type": "code",
"execution_count": 16,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"def train(category_tensor, line_tensor):\n",
" rnn.zero_grad()\n",
" hidden = rnn.init_hidden()\n",
" \n",
" for i in range(line_tensor.size()[0]):\n",
" output, hidden = rnn(line_tensor[i], hidden)\n",
"\n",
" loss = criterion(output, category_tensor)\n",
" loss.backward()\n",
"\n",
" optimizer.step()\n",
"\n",
" return output, loss.data[0]"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now we just have to run that with a bunch of examples. Since the `train` function returns both the output and loss, we can print its guesses and also keep track of loss for plotting. Since there are thousands of examples, we print only every `print_every` steps and take an average of the loss."
]
},
{
"cell_type": "code",
"execution_count": 17,
"metadata": {
"collapsed": false,
"scrolled": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"5000 5% (0m 7s) 2.7940 Neil / Chinese ✗ (Irish)\n",
"10000 10% (0m 14s) 2.7166 O'Kelly / English ✗ (Irish)\n",
"15000 15% (0m 23s) 1.1694 Vescovi / Italian ✓\n",
"20000 20% (0m 31s) 2.1433 Mikhailjants / Greek ✗ (Russian)\n",
"25000 25% (0m 40s) 2.0299 Planick / Russian ✗ (Czech)\n",
"30000 30% (0m 48s) 1.9862 Cabral / French ✗ (Portuguese)\n",
"35000 35% (0m 55s) 1.5634 Espina / Spanish ✓\n",
"40000 40% (1m 5s) 3.8602 MaxaB / Arabic ✗ (Czech)\n",
"45000 45% (1m 13s) 3.5599 Sandoval / Dutch ✗ (Spanish)\n",
"50000 50% (1m 20s) 1.3855 Brown / Scottish ✓\n",
"55000 55% (1m 27s) 1.6269 Reid / French ✗ (Scottish)\n",
"60000 60% (1m 35s) 0.4495 Kijek / Polish ✓\n",
"65000 65% (1m 43s) 1.0269 Young / Scottish ✓\n",
"70000 70% (1m 50s) 1.9761 Fischer / English ✗ (German)\n",
"75000 75% (1m 57s) 0.7915 Rudaski / Polish ✓\n",
"80000 80% (2m 5s) 1.7026 Farina / Portuguese ✗ (Italian)\n",
"85000 85% (2m 12s) 0.1878 Bakkarevich / Russian ✓\n",
"90000 90% (2m 19s) 0.1211 Pasternack / Polish ✓\n",
"95000 95% (2m 25s) 0.6084 Otani / Japanese ✓\n",
"100000 100% (2m 33s) 0.2713 Alesini / Italian ✓\n"
]
}
],
"source": [
"import time\n",
"import math\n",
"\n",
"n_epochs = 100000\n",
"print_every = 5000\n",
"plot_every = 1000\n",
"\n",
"# Keep track of losses for plotting\n",
"current_loss = 0\n",
"all_losses = []\n",
"\n",
"def time_since(since):\n",
" now = time.time()\n",
" s = now - since\n",
" m = math.floor(s / 60)\n",
" s -= m * 60\n",
" return '%dm %ds' % (m, s)\n",
"\n",
"start = time.time()\n",
"\n",
"for epoch in range(1, n_epochs + 1):\n",
" # Get a random training input and target\n",
" category, line, category_tensor, line_tensor = random_training_pair()\n",
" output, loss = train(category_tensor, line_tensor)\n",
" current_loss += loss\n",
" \n",
" # Print epoch number, loss, name and guess\n",
" if epoch % print_every == 0:\n",
" guess, guess_i = category_from_output(output)\n",
" correct = '✓' if guess == category else '✗ (%s)' % category\n",
" print('%d %d%% (%s) %.4f %s / %s %s' % (epoch, epoch / n_epochs * 100, time_since(start), loss, line, guess, correct))\n",
"\n",
" # Add current loss avg to list of losses\n",
" if epoch % plot_every == 0:\n",
" all_losses.append(current_loss / plot_every)\n",
" current_loss = 0"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Plotting the Results\n",
"\n",
"Plotting the historical loss from `all_losses` shows the network learning:"
]
},
{
"cell_type": "code",
"execution_count": 18,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"[<matplotlib.lines.Line2D at 0x1103a9358>]"
]
},
"execution_count": 18,
"metadata": {},
"output_type": "execute_result"
},
{
"data": {
"image/png": "<base64 PNG of the training-loss plot elided from this preview>"
mXlwd33w0vvxxLaYuISOOkpEEahb32gkMOifoNixdnOxoREamOkgZpNK67LgpA3XBD4sc8/DCc\nc076YhIRkZWUNEijsdVWcOaZcNVVMHt23e0nTYKBAyPJeOml9McnItLcKWmQRuWvf4XVV4cLL6y9\n3TvvxMyLvn1hxx3h8sszE5+ISHOmpEEalbw8uOyyKPxUXFx9m1mz4MADYZttoKgIRoyA//wHXnst\no6GKiDQ7Shqk0TnxRNh+ezjrrFWnYP78c/QutG0ba1i0bw+HHgrbbafeBhGRdEt6lUuRdGvVKsYp\n7LsvHHNMLHS12mrxevZZmDMHXn89ltmGKEk9YgQcdRRMmQI9e2Y3fhGRXKWkQRqlffaJ6ZfPPBPj\nF5Yti1ebNrG0dteuldsPGADdukVvw1NPZSdmEZFcp6RBGq1rrolXIlq2jEGUxxwDJSWQn5/e2ERE\nmiONaZCcceSR8JvfaGyDiEi6KGmQnNGqFQwfDo8/DtOmZTsaEZHco6RBcsrAgdHbMGAAfPVVtqMR\nEcktShokp7RuHZUily+HPn203LaISCopaZCcs8UWUVa6RQvYbTf49NOV+9zh1Vehf384+eTsxSgi\n0hQpaZCc1LlzJA6rrx6JwwcfxOJWPXtC795RQfLee2HhwprP4R4zMUREJChpkJy1ySaROKy3Huyw\nQxR/WnNNePppePNN+PXX6HWoyeOPQ0EBvP125mIWEWnMkk4azKy3mT1pZt+YWamZHZzAMauZ2RVm\n9oWZLTGzz83s+HpFLJKEDh3ghRfg4oujSNR//hNlqLt2jaTi+edrPvbpp+PPsWMzE6uISGNXn56G\n9sC7wOmA19G23KPAHsAgYGugEPioHtcWSdr668Mll8BOO63cZgZ77llz0uAeAyrbtYtFsZYvz0io\nIiKNWtJJg7tPcveL3P0JwOpqb2b7A72Bvu7+grvPcvcp7v5GPeIVSZk994zehx9/XHXfBx/AN9/E\nipvffRdrXoiINHeZGNNwEPA2cJ6ZfW1mH5nZdWbWNgPXFqnRHntEj8KLL666b9KkGEQ5dGisuDlu\nXMbDExFpdDKRNGxB9DRsBxwCnAkMAEZm4NoiNercGbbcsvpHFP/+dyQVbdvGehaPP17zTIsnnoB5\n89Ibq4hIY5CJpKEFUAoc7e5vu/sk4GzgODNrk4Hri9SounENCxfGrIr994/3Rx8NS5bAP/+56vFP\nPw2HHAInnpj+WEVEsi0Tq1zOAb5x9/9V2DadGA+xKfBZTQcOGzaMvLy8StsKCwspLCxMR5zSDO25\nJ9x5J8ykKh3OAAAcXUlEQVSZAxttFNteeCEGPpYnDZ06we67xyyK445beezPP0eBqC5dYOLEWMZ7\nv/0y/hFEpBkrKiqiqKio0rb58+en7XrmnugEiGoONisFDnH3J2tpcxJwI7Chu/9Stq0fMB5Yw92X\nVnNMPlBcXFxMvtY4ljSaOxc6doQHHogeBYDTToPnnoNPPlnZ7p57ojfhq69iqibE+0cegf/+Nx5h\nfPddLJTVunXmP4eISLmSkhIKCgoACtw9pSXq6lOnob2Z7Whm5RPYtih736ls/1VmNqbCIQ8CPwD3\nmtk2ZrYbcC1wd3UJg0gmdegA22238hFF+VTL8l6Gcv37Q5s28OCD8X7yZLj7brj++uiJuPlmmDED\nbrsts/GLiGRSfcY07Ay8AxQTdRquB0qAS8v2dwQ6lTd290XAPsDawFRgLPAEMSBSJOsqjmv46KNY\n5OqAAyq3ycuDgw+OWRQLF8JJJ8Fee60cy7DTTrHt4ovh++8zGr6ISMbUp07DS+7ewt1bVnkNLts/\nyN33rHLMx+6+n7uv4e6d3f1c9TJIY7HXXjBzZrwmTYoehd13X7XdMcfAe+/BYYfBDz/AXXdFkahy\nf/tb9FRcfHHGQhcRySitPSHNXp8+sSLmCy9E0tCnT9RoqGr//WMd
i+eeg2uugc03r7x/gw3gootg\n9Gh4//2MhJ5yDz0U63KIiFRHSYM0e2uvDfn58NRTUeip6niGcq1bw5//DAMGxGDJ6gwdClttBWed\nFb0ODfXOOzBlCpSWNvxcdfn5Zxg8GE44ITPXE5GmR0mDCDGuYcIEWLp01fEMFZ1/Pjz6aPRMVGe1\n1eDGG2OMxIQJNZ9n8eIYXHnWWfDaa5W/pEtLYwrnbrtFMtOrV/RqnHNOJBCpSEaq8+CDUY/iww/j\n+iIiVTVoymW6aMqlZNozz0QPQ+fOMbbB6lxVpXZ/+EOsXzF9eix6VdX550dysd56USNik02iB2PL\nLWHUqJiJ8bvfRc/GuutGojJ+fEwR7dQpkoktt1z52mmnmAnSEPn5sNlmsRbHsmXwxhsNvw8iknmN\nasqlSC76/e/j8cP++6fmi/LGG2H2bLjuulX3vfsu/P3vMGIEfP01vPJKDK585BE480zYZpvofXjt\nNTj00BhjceutsYDWCy9E2yVL4Mkno/3++8cjkY8/rn+8JSXxKOTEE+GCC6JHo7o1OUSkeVNPg0iZ\nCRPit+3OnVNzvvPPX1m/ofycv/4Ku+wSX/rFxfE4o1xpKcyfD+usk/g1fv01poj27RtjM157rX7F\npU4/PdbQ+PJLaNky7sMGG0Q9ChFpWtTTIJIBhx6auoQB4MILIwH4859XbvvHPyJZuPPOygkDxDiJ\nZBIGgFatopdh3LjoLbj88uTj/OWXqIg5aFCczywSnmefjVhFRMopaRBJkzXXjMcT48fHwMiZM+OR\nxNChMbgxlXr0iPoQV1wBr7+e3LGPPgoLFsTMiXIDBkQyctVVqY1TRJo2PZ4QSSP3GC8xf34Mdpw+\nPdaqWHPN1F/r119jxsW338YaGIleo3fvWAL82Wcrb7/zTjjllJhN0a1b6uMVkfTQ4wmRJsoMbrkl\nvngnT46ZEelIGCAeLYwdGwtnnZlgkfYZM2IZ8OqW9v7jH2Plz2uuSW2cItJ0KWkQSbP8fLjkEhg2\nLKZiptOWW8a4iXvvhX/+s+72d98dUzoPOWTVfW3awNlnx3iJa6+NFT5FpHlT0iCSARddBDfckJlr\nHX88HHgg/OUv8ciiJsuWwZgx0aPQpk31bU49FY48MsZLdO4ca3LccQf89FPtMXz1VVSWnDWrvp9C\nRBojJQ0iOcYsFs/6/POVS3lXZ8KEeJRxwgk1t2nfPnoa5s6N3os2baKE9nbb1dzzsHw5HHUU3HMP\n7LMPzJvXsM8jIo2HkgaRHLTTTrGU99/+BitWrLp/8eIo4rTvvrD99nWfb6214LjjonLmzJlRC6Jf\nP1i0aNW2I0bAW29FsrFgAey3X6xr0VQtXgxHHBGDWEWaOyUNIjlqxAj45BN4+OFV9111VVSYvOWW\n5M+72WZRjfLjj+NRSMV1M/797xg4eeWVMHBgzMj48st4XFJdgtEUPPxwTEv9+9+zHYlI9ilpEMlR\nO+8clSL/9rfKX+wffxxf7OeeC1tvXb9z77hjzNQYP35lQamvv47xEX37xuJaEL0YkybBe+9F+eul\nSxv2mbJh1KjoWXnooeg5EWnOlDSI5LARI6Jb/bHH4r07DBkSNSOGD2/YuQ89NBKSSy6JL9TCwhjz\nMGZM5VVAe/SInomXXoKTT27YNTNt6tR4/eMfkfDUNkZEpDlQ0iCSw3r1inELl18evQ2PPALPPRcL\nYFW3+mayhg+PQY+FhbEq5kMPwfrrr9pujz3gppuid+Kzzxp+3VRZujTirsnIkTFr5KSTYrrs7ben\nb2lykaZASYNIjrvoInj//fjCHjYsegj69k3Nuc1ilsRBB8XiXL//fc1tjzsu1tYYNSo1106FCy+M\nJcgnTlx13w8/RBJ06qmxiNdJJ8UKpVqPQ5ozJQ0iOW7XXeM3/cGD45n8zTen9vzt2sXjhyFD6m53\n4omRZDSGQZFz5kRPQl5exPXd
d5X333NP9CqUT0ndf3/YdNOoUyHSXClpEGkGLrooHk9cfDF06pS9\nOE4/PRKXcePSf63S0soDQKu66qpYc+PNN2Na6imnrHz0sGIF3HZbTLXcYIPY1rJlJBcPPggLF6Y/\nfpHGSEmDSDOw++6xzkTFZbqzoXPnqB9x663pHxtw9NHxuGTx4lX3ffVVjE/4859jMa7bb49iV+XJ\nzKRJUY+iau/J4MFxvoceSm/sIo2VkgaRZqJr1xiDkG1Dh8IHH8RsinR5+eWorzBlSoxJqJqgXHFF\nLBz2pz/F+/794dhjI7ZZs2LcRX4+9OxZ+bhOnWI8SNVHFC+9FNNLTzklfZ9JpDFQ0iAiGbXnnrDN\nNvUrLFVuzpzqK11CJAjnnQcFBTH48/77Y8pkuZkzY6Gu886rvOLoP/4RlS8POyyKVA0ZUn2SdfLJ\n8PbbUFISgyUHD46enCVLIpn4z3+S+yyLFkXpbZGmIOmkwcx6m9mTZvaNmZWa2cFJHLurmS03s5Su\n7y0iTYdZ/Eb/+OP1W9Dq449hiy3gmGOqf8Txz3/GOIVrr41HFH/+cxSbeuGF2H/55bDeeqs+elh7\nbbjvvpgdsfbaMZW0OgccABtvHL0U3brFY4077oCPPoLevWNtjiVLEvssy5dHctOtWwwm1XROaezq\n09PQHngXOB1I+K+4meUBY4Dn6nFNEckhf/wjrLEGjB6d3HHu8bihffsYV3DttZX3L18ea2rsv3/0\naEAMeNxjjxjU+Nxz0fNwwQWw+uqrnn+vvaIH5MYbq98P0KpVTL987TXYe+8onnXSSTFQcvRo+OIL\nuPrqxD7PuHGRbGyySazlceCBUfq73Lx5kZDst18kK+qRkKxz93q/gFLg4ATbFgGXAhcDJXW0zQe8\nuLjYRSQ3nXmm+/rruy9enPgx997rDu6TJ7sPH+5u5v700yv333ZbbHv33crHff+9e5cu7i1auG+y\nSXLXrM7Spe7vvVf9vuHD3VdbzX3GjNrPsWyZ+xZbuB92mHtpqfuECe6dO8exp5zi3qdPxNuihftu\nu8WfV1zRsLileSguLnbil/p8b8B3fHWvjCQNwCDgTaJnQ0mDiPhHH8W/QEce6f7KK/HFWZt589zX\nXdd94MB4v2KF+x/+4J6XF1/QCxe6d+jgfuyx1R8/bZr72mu733NPaj9HVb/8EsnAnnvW/pnuuSc+\nf8UE55df3C++2H2DDdz339/9zjvjc7u7n3deJBQffpjW8CUHpDNpMG/AQzQzKwUOcfcna2nzG+Bl\n4Pfu/pmZXQz0c/f8Wo7JB4qLi4vJz6+xmYg0cbfcEqtHzpoFXbrEOIVjj4Xf/GbVtn/8Izz9dDwO\n2HDD2DZ/fpTKdo8yz7fcEmMeOneu/nrLl8fiU+k2eXI8Urj//vg81cXRrVssYV6+LkhdFi+O9uuu\nC6++Go9DRKpTUlJCQUEBQIG7p3QMYVqTBjNrQfQw3OXud5Rtu4Tonagzadhtt93Iy8urtK+wsJDC\nwsJ6xywijUtpKbzySsx0ePTRKP502GGx2NZOO0Wb556DffaJWQ+DB1c+/uOPY1Gs+fNjwGNjWcK6\nsDDinjYtBk5WdM89UWly2jT47W8TP+err8Juu8ENN8BZZ9XdfsWKeK22WnKxS9NRVFREUVFRpW3z\n58/n5ZdfhjQkDWl9PAHklbVZBiwve62osG33Go7T4wmRZuiXX9zvvtt9yy2j6/7gg+PRxVZbxXP9\nmrr7n3kmxgB8/31Gw63VnDkxfqJjR/dXX125fdmyGF/Rv3/9zjt0qHu7du6fflp329NPdy8oqN91\npOlK5+OJdNdpWABsD+wE7Fj2Gg3MKPvvKWm+vog0Ie3aRU/CjBnRtT9jRkxjnDUrqjbWVJxq333h\nxRdjKmVj0bFj1HP4zW+ijkN5FcyxY6NWxEUX1e+8V10Vj2dOOqn2KZqffBL3rLgY/vvf+l1LpK
pW\nyR5gZu2BrYDy/323MLMdgR/d/SszuwrY2N2Pc3cHPqxy/DxgibtPb2DsIpKjWrWKsQBHHw3jx8ca\nEd26ZTuq5HXsGMWezj0XzjgD3norHjH075/cY4mK1lgD7rwzEqXbb48pqNW5+OK4/sKF8dhnu+3q\n/zkSVVoaj0MyMW5EsqM+PQ07A+8AxUT3x/VACTGdEqAjkMUlcUQkV7RsCUceGTUMmqrWraPuw4MP\nRgI0c2Z8oTfEPvtEZcpzzqlc16Hce+9FHYuLLop79+ijDbteIoqLYdttY8yF6knkrqSTBnd/yd1b\nuHvLKq/BZfsHufuetRx/qdcyCFJEJBcVFsLUqfDII7DDDg0/3/XXxwDLY45Z9Ut6xIiomjloUBS1\n+vDD9D2iWLEiiln16hUDLqdOhb/9LT3Xam5mzoRffsl2FJVp7QkRkQzZbjs4/PDUnGuNNaKiZHFx\n5S/pKVOiJPUll0Qvxz77QF5eJCupNmtWVN4cPjzKdb/9dvRuXHFFlPKuryVLohz3rbfG+h6JKi2N\nKaw//lj/azcW330Xf1+6doWiosZTYlxJg4hIE9Wz58ov6TfeiG0XXhhfNuUz09u0WfmIIpVfPOPH\nx7iMmTNjXY+rroqehuHDYz2NY4+Nxbjq46qrYtzGsGGw0UYxBmTixLofezzyCAwYEJ//iSfqd+1k\npevL/N57IwkqKIixPb17R4KYbUoaRESasOHDoXv3+JKeODEGXl5+eeXiT0ccEUWxUvGIYsmSWHDs\n8MOjF2PaNOjTZ+X+Vq1ihsjs2dH7kKwZMyJpGD4cvvkm1hf57DM4+OAYM/Hzz9Uft2xZJEx77QU7\n7wyHHAIDBybXU5GMFSvg0kthnXViKfZUn3v06BjP8/jjUe9j/vz4OZ92WiQTWZPqOZypeKE6DSIi\nCfv0U/f27d1btXLfeedV61ksXRrlti+6qGHX+eQT9//7vyhnPWpU7WWyb7stam1UXBukLqWlUY9j\nq61WXR9k6lT3Ndd0P/HE6o+99dZYd+T99+M8Y8e6r7OO+4Ybuk+cmHgMifj665Vrg3Tp4r7ppu4/\n/JC68z/9dNy7N99cuW35cvfrrovtL7xQ+/GNdu2JdL2UNIiIJOeee+JLc/Lk6vcfd5x7t26rftE/\n9lh88V12mfv8+dUfu3x5fAmvuWYU3krkn+bSUvcDDoj1QGbPTvwzgPtzz1W/f/To6vcvWBDJwfHH\nV94+e3bE0KaN+48/JhZDXSZOdF9vPfeNN3Z/8UX3WbMiOTn00LrXT0nUgQe65+ever7S0vhZ1ZQ4\nlVPSICIidZo7t+Z9Tz0V/+K///7KbU88Eb0TBQXxxbreeu7XXuu+aFHsnzEjFsraaKM49ogjak4s\nqjN7dlTE3HDDuFZtyhckq2nBMfdYpKxPn/ji/N//Vm6/5JKI/8svVz1mzhz3li3dR45MPO6aXHhh\n3IcDD3T/7ruV2//5z9g+enTDrzFzZiR/d91Vcwx5ebWv1KqkQUREGqT8EcWIEfH+6afdW7eOctbL\nl8dvzKecEklEhw7uu+wS3xDrrOM+ZIj722/X77rffut+0EFxrsGDa046/vjHSBrKV/WsySefuLdt\n6z5s2Mrzr7GG+znn1HzMQQfFY5uG+Ne/4jNcfnn1PQqnnRZxffBBw65z/vnxc6qYFFX04YcRx2OP\n1XwOJQ0iItJgxx3n3rVrrNXRpo17v36xFkZFn33mfsIJ8dv0Qw/V/httokpLY02RNdZw33xz98cf\nd3/jjViT46WX4rdqiDaJuO66+G38jTdiLY68vNrXHSnvCajYy5KMBQvcO3Vy32efmh9B/PKL+3bb\nuW+/ffx3fSxZ4r7++u5nnll7u/x898MOq3m/kgYREWmw8gF2rVu79+0bX1KZ9Pnn7r17RwxVX3vt\nlfiYgOXLo+dgyy3js1x1Ve3tly6NL+Ozz65f3EOGxEDTmT
Nrb/f++9Hb0K+f+/PPr5qQ1WXcuLgX\nM2bU3u7662Mw6k8/Vb+/KS9YJSIijcTee8d6FLvvHkWQ2rTJ7PW7dImaDh98AO+/H5UqP/oIPv0U\nJk2qeUGyqlq1imXSv/wSNtgA/vSn2tuvtlpMvxw3LvkS16+8AiNHwpVXwuab1952++1j2fOpU6Po\n1frrx7TJceMSm/p5221xXNeutbc76qj4HI89lvDHSBlzbyRlpiows3yguLi4mPx8VZwWEUmVn36K\nCpEtcuBXxieeiKThd7+ru+20abDTTnHMwQcndv4lS2DHHWP11FdeqVz7ojbu8M47UTfjqaeiUmaL\nFhHnQQfFq1u3yklSeXzjx0cxq7rsvXfUa3j++VX3lZSUUFBQAFDg7iWJRZ0YJQ0iItIs5OdD584w\nYUJi7YcPjzU+3nknCkvV1+zZ8K9/RRLx7LOweHFUumzZMhKTJUtijYkOHaL3JJFVQu+9F044Ab76\nCjbZpPK+dCYNSS+NLSIi0hQNGgRnnw3z5sGGG9betqQkqlFecknDEgaIhcVOPDFeixdH78Drr0fS\n0K5dLP3erl2UBU90WfHDDovqkEVF9au8WV/qaRARkWbhhx/iC/zqq2Ndi5p89x306BEloqdMSfyL\nPNMOPzyWRn/33crb09nTkANPtUREROq23noxnuHee2teaGrpUjj00HhcMGFC400YIAZ3TpuWvmXP\nq6OkQUREmo1Bg2LmRkk1v3+7w8knx8DFxx+P8Q+N2QEHRG/IAw9k7ppKGkREpNnYd98YOHjEEbH8\n9tKlK/dddx3cf39M59xll+zFmKg2bWJK55dfZu6aShpERKTZaNUKnnkmZlKcckrUXrj22vht/fzz\nY3ntgQOzHWXibr01sz0Nmj0hIiLNynbbwaOPwscfR+/CiBGwbFnUR7jssmxHl5xEa0ekinoaRESk\nWdp663hEMXMmjB4djyZyoehVOqmnQUREmrWNN45HFVI35VQiIiKSECUN8v8VFRVlO4RmR/c883TP\nM0/3PHcknTSYWW8ze9LMvjGzUjOrdekPMzvUzCab2Twzm29mr5vZvvUPWdJF/2Nnnu555umeZ57u\nee6oT09De+Bd4HRive667AZMBg4A8oEXgIlmtmM9ri0iIiJZkvRASHefBEwCMKt79XN3r1rh+0Iz\n6wccBExL9voiIiKSHRkf01CWaKwJ/Jjpa4uIiEj9ZWPK5V+IRxyP1NKmLcD06dMzEpCE+fPnU1Jd\nQXZJG93zzNM9zzzd88yq8N3ZNtXnbtDS2GZWChzi7k8m2P5o4HbgYHd/oY52GSyMKSIiknMGuvuD\nqTxhxnoazOwo4A5gQG0JQ5lngIHAF8CSNIcmIiKSS9oCmxPfpSmVkaTBzAqBu4AjywZS1srdfwBS\nmh2JiIg0I6+n46RJJw1m1h7YCiifObFF2fTJH939KzO7CtjY3Y8ra380cB/wJ2CqmXUoO26xuy9o\n6AcQERGRzEh6TIOZ9SFqLVQ9cIy7Dzaze4HO7r5nWfsXiFoNVY1x98H1iFlERESyoEEDIUVERKT5\n0NoTIiIikhAlDSIiIpKQRpc0mNkQM5tpZovN7E0z657tmHKFmV1gZm+Z2QIzm2tmE8xs62raXWZm\ns83sFzN71sy2yka8ucbMzi9b5O2GKtt1v1PMzDY2s7Fm9n3ZfZ1mZvlV2ui+p4iZtTCzy83s87L7\n+amZ/bWadrrn9ZTIYpF13V8za2NmI8v+v1hoZuPNbMNk4mhUSYOZHQlcD1wM/B+xNsUzZrZ+VgPL\nHb2BW4CewN5Aa2CymbUrb2Bm5wFDgZOBHsAi4mewWubDzR1lye/JVFlvRfc79cxsbeA1YCmwH7AN\ncA7wU4U2uu+pdT5wCrGQYTfgXOBcMxta3kD3vMFqXSwywft7E3Ag0J+YoLAx8FhSUbh7o3kBbwI3\nV3hvwNfAudmOLRdfwP
pAKfD7CttmA8MqvF8LWAwcke14m+oLWAP4CNiTmHl0g+53Wu/31cBLdbTR\nfU/tPZ8I3Fll23jgft3ztNzvUqKycsVttd7fsvdLgUMrtOladq4eiV670fQ0mFlroAD4T/k2j0/1\nHLBLtuLKcWsTGeuPAGbWBehI5Z/BAmAK+hk0xEhgors/X3Gj7nfaHAS8bWaPlD2GKzGzE8t36r6n\nxevAXmb2G4Cy2j27Av8qe697nkYJ3t+didpMFdt8BMwiiZ9BNhasqsn6QEtgbpXtc4lsSFKobLXR\nm4BX3f3Dss0diSSiup9BxwyGlzPKyqfvRPwPW5Xud3psAZxGPOq8guiq/YeZLXX3sei+p8PVxG+y\nM8xsBfHo+0J3f6hsv+55eiVyfzsAy3zVoopJ/QwaU9IgmTUK2Jb4bUDSwMw2JRKzvd19ebbjaUZa\nAG+5+4iy99PMbHvgVGBs9sLKaUcCRwNHAR8SifLNZja7LFGTHNFoHk8A3wMriGyoog7At5kPJ3eZ\n2a1AX2B3d59TYde3xDgS/QxSowDYACgxs+VmthzoA5xpZsuIDF/3O/XmANOrbJsObFb23/p7nnrX\nAle7+6Pu/l93fwC4EbigbL/ueXolcn+/BVYzs7VqaVOnRpM0lP0mVgzsVb6trAt9L9K08EZzVJYw\n9AP2cPdZFfe5+0ziL0/Fn8FaxGwL/QyS9xywA/Fb145lr7eBccCO7v45ut/p8BqrPtLsCnwJ+nue\nJqsTv/RVVErZd4zueXoleH+LgV+rtOlKJNNvJHqtxvZ44gbgPjMrBt4ChhF/Ge/LZlC5wsxGAYXA\nwcCiCouHzXf38iXIbwL+amafEkuTX07MYHkiw+E2ee6+iOiq/f/MbBHwg7uX/yas+516NwKvmdkF\nwCPEP5wnAidVaKP7nloTifv5NfBfIJ/49/uuCm10zxvA6lgskjrur7svMLO7gRvM7CdgIfAP4DV3\nfyvhQLI9daSaqSSnl33gxUT2s3O2Y8qVF5H5r6jm9ccq7S4hpu/8QqzHvlW2Y8+VF/A8FaZc6n6n\n7T73Bd4ru6f/BQZX00b3PXX3uz3xS99Moj7AJ8ClQCvd85Td4z41/Bt+T6L3F2hD1Or5vixpeBTY\nMJk4tGCViIiIJKTRjGkQERGRxk1Jg4iIiCRESYOIiIgkREmDiIiIJERJg4iIiCRESYOIiIgkREmD\niIiIJERJg4iIiCRESYOIiIgkREmDiIiIJERJg4iIiCTk/wHMztUCX24OVgAAAABJRU5ErkJggg==\n",
"text/plain": [
"<matplotlib.figure.Figure at 0x106d23ac8>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"import matplotlib.pyplot as plt\n",
"import matplotlib.ticker as ticker\n",
"%matplotlib inline\n",
"\n",
"# Plot the average losses recorded during training; a downward trend shows the network learning\n",
"plt.figure()\n",
"plt.plot(all_losses)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Evaluating the Results\n",
"\n",
"To see how well the network performs on different categories, we will create a confusion matrix, indicating for every actual language (rows) which language the network guesses (columns). To calculate the confusion matrix, a bunch of samples are run through the network with `evaluate()`, which is the same as `train()` minus the backprop."
]
},
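{
"cell_type": "markdown",
"metadata": {},
"source": [
"Minus the backprop, `evaluate()` reduces to a plain forward pass over the characters of a line — a sketch, assuming the `rnn` module with its `init_hidden()` method defined earlier:\n",
"\n",
"```python\n",
"def evaluate(line_tensor):\n",
"    hidden = rnn.init_hidden()\n",
"    # feed each character tensor through the RNN, carrying the hidden state\n",
"    for i in range(line_tensor.size()[0]):\n",
"        output, hidden = rnn(line_tensor[i], hidden)\n",
"    # the output after the last character is the category prediction\n",
"    return output\n",
"```"
]
},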
{
"cell_type": "code",
"execution_count": 19,
"metadata": {
"collapsed": false,
"scrolled": false
},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAeQAAAGoCAYAAACXNJbuAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAAPYQAAD2EBqD+naQAAIABJREFUeJzsnXeYJFXV/z9fdkFYkCxBJWckSBRBQckoGBDh5UUyP0VB\ndMGXpLCSJCfBAJKRjAgoIFElieQkYQm7sMDCguwSll3C7vn9cW4zNTUdqrtrZnp6z+d56pnuW3VP\n3a6u6VP33BNkZgRBEARBMLjMNNgDCIIgCIIgFHIQBEEQdAShkIMgCIKgAwiFHARBEAQdQCjkIAiC\nIOgAQiEHQRAEQQcQCjkIgiAIOoBQyEEQBEHQAYRCDoIgCIIOIBRyEARBEHQAoZCDIAiCoAMIhRwE\nQRA0RNJSko6UdImkBVLbFpI+N9hj6xZCIQdBEAR1kbQB8BjwBWBrYI60a1XgsMEaV7cRCjkIgiBo\nxDHAL8xsE+CDTPttwDqDM6TuIxRyEARB0IiVgT9XaZ8AzD/AY+laQiEHQRAEjZgELFylfTXg5QEe\nS9cSCjkIgqAGknaUdJekVyQtltp+Kumbgz22AeZS4FhJCwEGzCRpPeAE4IJBHVkXEQo5CIKgCpJ+\nCJwEXA/MDQxLuyYBPx2scQ0SBwNPAeNwh64ngNuBu4EjB3FcXYXMbLDHEARB0HFIegI42MyulvQO\nsKqZPS9pJeAfZjbDrZ1KWgRfT54DeMjMnhnkIXUVwwd7AEEQBB3KEsBDVdrfB2Yf4LF0BGY2Dhgn\naRiwsqR5zGziYI+rWwiTdRAEQXXGAJ+v0r458OQAj2VQkXSKpN3T62HAP4EHceX8lcEcWzcRM+Qg\nCILqnAT8RtKsgIC1JW0PHATsMagjG3i2Af6YXm8FLAksD+wIHAWsN0jj6ipiDTkIgqAGknYAfgks\nlZpeAUaZ2dmDNqhBQNJUYGkze0nSmcB7ZvZTSUsAj5jZnIM8xK4gZshBEAQ1MLOLgIskjQDmMLMJ\ngz2mQeI1YEVJ43GT/Q9T+whg2qCNqsuINeQgCIIqSJotKWLM7D1gthSDvOkgD20wOBe4HHgcj0O+\nJbV/AQ+HCkogTNZBEARVkHQTcJWZ/V7S3MDTeB7n+YF9zex3gzrAAUbSNsAiwBVm9lJq2xmYZGbX\nDOrguoRQyEEQBFWQ9AawgZn9R9IewI/xVJHfAQ43sxUGdYCDhKRZzWzqYI+jGwmTdRAEQXVGAO+k\n15vis+XpwD3AYoM2qkFA0jBJh0h6GXhX0pKp/YhKOFTQPqGQgyAIqvMs8K2UnWoz4KbUvgDw9qCN\nanD4ObALsD+9yy8+zowXAtZvhEIOgqAUJC0taTNJs6X3GuwxtcnhePGEscC9Zvav1L4p1TN4dTM7\nAd9PXudZr+pH8HjkoAQi7CkIgraQNB9wGbAh7oG7DPA8cLakiWa232COr1XM7EpJd+JlBx/J7LqV\n6rWBu5nP4BaDPDMBMw/wWLqWmCEHQdAuJwMfAYsC72XaL8NjVocsZvYqvo68SWXmD9xnZjNaqM8T\nwJertG/DjGct6DdihhwEQbtsCmyWsjhl259hCDs/pZn/5cBX6aKZf4scDpwv6TP4RG5rScvhpuwt\nB3VkXUTMkIMgaJfZ6T0zrjAvXhlpqHIy8CFdOPNvlhRnvBWwMTAZV9ArAFuZ2c2DObZuImbIQRC0\nyx34TOmQ9N4kzYR75P590EbVPl05828VM7sD2GSwx9HNhEIOgqBd9gdulbQmMAtwHPA5fIY8lKsA\ndevMvy0kzUHOumpmM1oYWL8QJusgCNrCzB4HlgXuBK7BFdlVwGpm9txgjq1NKjP/Ct0y828aSUtI\nuk7SZOAtYGLaJqW/QQlE6swgCIIqSFoJD3F6EA/pupbMzH+IP2w0haS78JrQp+KVn3opDjP752CM\nq9sIhTwDkBLjb4PXdD3ezN6UtDrwmpm9PLij
G5pIGoZnLtoIz9yUN+FtOAjDGhQkbQ68a2Z3pvd7\nAf8PD5XZy8yG7AxK0lzA3sCqwBy4cv6NmY0f4HEsg3t7V7vXDh+A878LrGFmT/f3uWZkQiF3OZJW\nwUulvQUsDixnZs9LOhJY1Mx2qtc/qI6k03GFfB0wnr4zhpGDMKxBQdJjwAFmdr2klYH7gRNxBfKU\nme06qAMc4kj6f8DvgDeAV+l9r5mZrV5QzkbUfoDcrUHfvwNHmdkt9Y4L2iMUcpcj6RbgQTPbX9I7\nwKpJIa8LXGxmiw/uCIcmqRLQTmZ2/WCPZbBJs6eVzGyspF+m19skK8z1ZrbQ4I6wNSStX2+/md0+\nQON4AfitmR3bhoxRwKH4w1K1B8hvN+i/FPB74I94/uoPc/0fbXVsQQ/hZd39rAX8oEr7y8CQ/KHs\nED6geirBGZEP8MpI4HGqF6TXbwJzDsqIyuEfVdqyimxYIwElLW3MA1xR4Lh67AnsYmYXttj/U/iS\n17mZNsPXlY0C1yJoTCjk7ud9qv8oLgu8PsBj6SZOBH4iaW8LM9OdwEnJ8WdtYLvUvizw0qCNqn3m\nyb2fGa+HfARe/agIp9KztPE4uZlpQa7AY6J/30LfCrMAd7fR/xw8Reb2VHHqCsohTNZdjqSzgPmA\nbfEZyyp4tZargdvN7KeDOLwhi6Q/42ukbwL/oa8Jb+vBGNdgIGlR4LfAIsCvzezs1H4yMMzM9hnM\n8ZWNpA2Ak8xsjQLHtrS0ISl7zWYH9sWV+mP0vdd+XUDesbjj3RHNjCPTfzK+3BVWoX4kFHKXk7xE\nrwTWBD4JvIKbqv8FfM3MJg/i8IYsks6ttz8cmboXScsD95vZHAWOfQX4ipmNbvIcYwoeama2ZA0Z\nJ2XezgTsDDyatrxS37fBeP4CnGdmfyo4rqAFQiHPIEhaj0zoxlD1lpQ0O3Agtdfkqv44Bf1LcvrZ\nFV9n/ImZTZC0BfCimf1ncEfXGilCoVcTXorxQGC4mX2pgIz9gCWBAV/aSJ7RRbBGa9mSvg/8Ajdd\nV5ulX9vSIINehEKeAZE0t5lNGuxxtIKkS4ANgAup7i166mCMa0YmmXBvAO4C1gdWSJ78BwJrmtk2\nTchaEDiBngeuXkmkzWzAnIckTafHcSnLPcBuRUow9sfSRnIUWxl4YaBivNO1qIUN5PfSzYRTV5cj\n6QBgrJldlt5fDnxH0qu4yfqRugI6jy2Ar5vZXe0IkVcLWILqs+xCzi+StsHX5hfFnWayMgrFhnYJ\nxwC/MLOTUmhdhdvwpBrNcB5+PY+gygPXALNE7v104HUzm9qEjEnAn9sZhKRTgMfM7OykjG8Hvgi8\nJ2lLM/tHCzLnxLOPPVXkwcLMIs3yABAKufvZE9gBQNImeLWWLXBFcjzuvTmUmIjPNlpG0trARbgp\nMT/7KRTCkZxujsIVyDfxcJCl8DCz37QzviHIysD/VmmfAMzfpKwvAV82s4fbHlWbmNkLJcgow5dg\nGzz+F7wE4uLA8sCO+D3YsIBHehC/3cxOlzQbHo+8uO/S/8TacGcQCrlDkbQWMJOZ/TvX/gVgmpnd\nX1DUQsC49HpL4HIzu0nSWODfNXt1LocAh0va2cyqVeIpwhm4Y8vWtD4L+xHwfTO7RNIuwHHJTHs4\nnut4RmISvraad0RaDY93b4Zx9H1IGhRyns5ZDJiKx6HfbmbT+nko8+MZugC+BlxhZqMlnQP8pKCM\n9XHlDfBt/BrPjTt6/QJoqJCT/8YGVLcINfT0DhoTCrlz+Q1wdJX2zwAHAF8oKGciHo4yDi+q/ovU\nLoZIML+kh+itNJcGXksPFfk1uSKm4mWB77YZwrEoPXGdU3APdvC17Xto3lQ7lLkUOFbSd/Hvaabk\nRHgCPUlCivJT4BhJPzCzseUOs2lG4gkxRtBT0WgevCTju/hyx/OSvmpm46qLKGVp4zVgRUnj8f/h\nH6b2EXgI
YxHmoseytDnwJzN7T9J1uKWsLpJWA65P55w9yZofvxYTgFDIJRAKuXNZEahmtnso7SvK\nVcDFkp7B45FvSO2rMXQyTV1dsrz7cHN1O5//VXwm/ALwIrAO8Ai+7lhohidpzlp1ZCUtPRgxn5Jm\nofq6+ot1uh2MP0COwx/ynkh/LwaObHIIl+E/+s9Jeo++D1wDaX04AFd+e1QqO0laGrew/AFPiHIp\ncDJuVu5DSUsb5wKX02PNqURIfAFouP6bGAd8UdKbuEL+n9Q+Dz7bb8TJwF/wJbC38Pv9Q9yUHo6U\nJRFe1h2KpP8CW5rZv3Lt6wLXmVk+i1AtOTPjZq1F8DjCh1L7SOAdMzurgIwhH2okKfsQswyuKI6l\negjHEwXknQWMM7PDUnWj43Ev4zWBq8xs9wIy7gA2NrP3c+3LAbea2WcbySiLVE3oHGDd/C4KetGm\nBCEr4aF1D5nZMy2MY+d6+83s/GZltoqkZ4Ft8uvZabb4JzNbMv0//snMFq4h4yngsLS0kc0lfzgw\nr5kVsqSkWfYiuLn6pdS2MzDJzK4p0P9HuOJ8F3+IXN3Mpkv6MbC1mX21Qf9JwBfM7On0+otm9mRa\nQjvfzJYv8jmC+oRC7lBSeM/CwDfN7K3UNjc+W5xgZtsO8FhKCTWSF3hfmuqKvWGyfkmL+KEf/yit\njTsUPWFmZ9bplw1hqRbK8vG+gspnJnyN/6P0/n9wZfYMcIaZfVBAxg3pvN/IyFkB906+3MyKrg+2\nTUp7+RHuMV3tOx5q3vhtk2bo6+f9NZJ/xz/NbISkxYHHayUJSTJWMLMXJE0ANjGzR9ID0D1mNl//\nfopeY1kTV+o3m9m7qe3ruFKvG7Ug6XVgXTN7RtJo4MdmdmNKkvKAmc3e3+OfEQiTdefyMzy84YW0\nhgrweXw9acdmBEnaES8wsST+ZPuCpJ8CY4o8XVNeqNE6uBlzMVr0bk79zwQulLQQbr57HNhB0kJW\nuzbsMq2NujpmNh0Pg6m8vxQ3XzbD1vj4L0oK/XPArcBFjTIn9QOfx+vdFjWBfoz6qTa0pFnpu+Za\n1cRfo//ceG7tamMqsrb9d+AMSXtkLEur4aUQb0vHrExfZ7YsLS1tJFP3mWY2tY5zWeWzFFq/TQ8W\n9+farivSF18qWwt/4Pwn7lg5P/5b9HhBGUEDYobcwSRT8Q54hq0puGfwJWb2Yd2OvWX8EDgcOAVP\niL9SMpntAuzcyFSVZIzBY5afbP5T9JLzMDAaGEX1WdhbBWRMBNZJprN9gO3MbD1JmwK/H0jzeVIY\nq1D9B79Q5qKkNP6B/9CtD1xgZv9X7kgLjeM+YKSZ3dlC39JqQ6d7/ljcCarP7LFoAgpJW+GhbXMA\nb+fGZEXWotMD34X4g0blf244/tC0o5m9JumrwMxmdlMNGS0tbaT/uTXN7L+qn0bTat3z8tSZh5jZ\nZPVOo1lNSKPUmWsCnzSzv0taAHfWq1iEdpsRLSj9QSjkLkfSE8DBZnZ1bg1rJeAfZtYwTlTS93CH\nlHZCjUpJUK/etXevBe4ys2PT+uXTZjZbARn742b/83LtuwDzm9kJBWRsjv8oVbt+Nc3e8oQMeRYG\nbgb+iq/VV4QUng22i6QN8XX1g6m+rl5zLCqxNrSk3+CZrQ7BleFeeGTBD4ADzeyignJG417BB7dz\nzyZZy+Oe+eD32NNN9G17aaNV5Kkzv21mk1Q/jaa1asUIyiUUcgch6RvADWb2YXpdkyZmYFOA5ZOZ\nOquQlwEeraXAVD3USMBYWgs1QtJteLzu34ocX0PGv3FT4nXATfhs+ZFkDr+yiCNUmnFsb2b35NrX\nAS4uMsuWe63fBBxuZq81Mf7KWnafXelvU2vZZaGe1Ij5sTUci1osoFBD1ou4cv+HpLdx56Nn07LL\n9mb2tYJyJgMrm9nz7Y5psJF0KHBC/sFCnuDj/+os0wRDjFhD7iyuxhN5TK
B+qE8zBcHH4OuD+axD\nmwP1TNClhBqpd4L+04ATkymw2izs0QIiD8BTEf4f7t1ZMZV9A7i34LAWxtfi87wGfLqgjAXxEnyF\nlXGi4RLBINHOuMqsDT0vUFGib9OTZOVOfO22KDfiZuGWFbI88UZNzGy3Gv1WwR29pqtvgYq8jCL3\n/Ci8FnJ+pj8i7et3hawOyjHezYRC7iAsky/WyssdexLwm7TeKWBtSdsDBwF71BnLYSWd/2H6ejVn\nf+iyns8N/6nTzGl+YE7rnVj/TPr+YNXiJTwXcH5tbl18DbQIVwJfAZ4reDwAZvbPZo4fKNoc15dw\nhb6FpHYLKDyPOzy9iMfYbos/aG2FZwSrSc6qdB1wvDzcrdXqRPnQwpnxsK656XHqqsbD9DxYV7v/\nPx4GxR6sK/8feValYBpZtR+6eB6dk2O8awmF3OWY2VnJbH0k/kR9MV4T+SfJM7ghai+NZz5Bf9uY\npyqcmGsb24SIs4FTk3dw5Yd1I3ymVzSEa2/gCklfpsWi8QCpf8UD/rtm9nIyz45pxcGqXSSNoHpG\nqXozubYLKGQ4F1c0/8RDsP4iaW9cGTbyPK9m1Tm0SlvRh79v59vSmvDvqP8gtgTweuZ1SyQHRkvb\naElZJTgMd1j7fUFxZ1EndLEAHZNjvJuJNeQORtJGePq+FVLTk8Ap1mIt4/RjO4eZTWiy373A0Wb2\n51z71sABZlY0jWdLSHoQ2MjMJlZZ2+5FkfVsScLNb3vT81D6Ae4BO6qI2VXS7viP4VTgv7kx1fR8\nzcn4Dv4DeREePrJiWt/fG/dqL7ReWgaSPoUrwy2q7R8sk6SkxYA1gGcLmnf7HXniln9YjWQgJZ5n\nZ3x2fA6eUjQbhfABXsXtX9X6VpE1iTZCF5Nz6A6Wwr+C/iFmyB2KejLrXEnPrG0d4HpJI82s6YpC\nySmkFY/TUtJ4SjoIeNXMzs217wZ8ysyOrdH1GqCSzartte2kcPeTdBge+zsFGN2kN+5R+PrdMeYx\nya3wC2BPM7sged9WuIuenOMDxSm4KfYLeBjWt/F18l8A+w3wWAAPKzOvuNR21aUkr6w64EtR8Lcz\nKdU3LMX7SjoO+D6eXnR7q1NRylJWsuSEeFfFU7tF2q2S1kk5xrsXM4utAzd8nXPvKu17AS83IWdB\nfBb2Cp6JaVp2Kyjjv3hCkXz7usDEJsYyFk+/l2//Am6iHfTr3sRneRNYqk0Z7wGLp9fvAEum10sC\nUwf484wH1k6v3waWTa+/AdxZoP82eL7le4AHs1uT4xiGhzy9nO7XyjU5Ati9CTkH4DHqlfdX4Ilc\nXsYjDYrIOCm3nYwnf3kHOL2gjKeBDdPrL6bv/PvAtXgcchEZ04AFqrTP18T/8PfSNRjR4v0xEX8o\nnpY+/5vZbSDv1W7eYobcucwNVAsPuglPnFCU82jfGeMm4GhJ+TSev8LjZ4tScXTJ8zru+TwgJNP9\n/1HbwWXZav1ynA9sh1+DVnkVDycbm2v/Em14B7fI7PR8NxPxKkej8fXxussAKrc29M/xkoD74wUc\nKjyOz9LOLignXwd8YzyyoJk64Kvl3k/H79X96O2YWI9F6Cli8i08NO9MearSfxSUUSuj1ydw03UR\n9sO/k1arpI0kHLn6nVDIncu1uNkwXxrtm3gCiaKU4YxRVhrPcXgx9bx383r4DL4qGeeWhlixakBn\n4j/QF9H6Q8owYH9Jm+EZ1PI/cEVSX/4Bdy7bLY3h05K+iK9vH9HMYNR+msingeXwh4NHgB+kH+49\naex5XmZt6J2SrFslZR2WHgGaKWDQdh1wK5DFrgDv4jPZF/GHgErGrKlA3SQ26kmZacAeKSlOhWF4\nZreiqU7bWuqxXBKdoH8IhdxBqHfO2ieAn0v6ClBx3FgHV14nNiG27YLv5p6/q9A7jee5NJnGE1dA\np8grUGW9m4+j/mf6afOjrsuWwFZmdk
cbMlbG19DBQ2GyFFXwx+DK81bcA/523Cx4gpmdVnQgapAm\nkmI1iU+lx0pxGG6d2QGfge3SoG+ZtaE/Q/WymDPhntZFabsOeEpks7Xl1p1TtrWrrVh2q5uBs9KD\n7LJ49jBw34WxDfpWUo4KfzDK1j7+IPXfs8AYsDbDGCVNAxa2nEOopPnwrHcRh1wGg20zj61nw2eO\nRbbnm5C5KZ4kYfEO+HzCze1T6FnHnoyHpmgAxzEW92ge9O88jWcW3DlubdwLvtn+o3GnrJbWB2vI\nHIGbqucvcOzzwGrp9f3AD9LrTWlyfRF4APheep1dVz8UuKMJOaen7/lm4I3KdcXrABda18ZN1NXW\nbhcAPiwoY+40lmuAzTPthwE/Lyjj78A8g3yP1roWnwamDObYummLGXIHYWalx+xSUsF3Va8YNRJ/\nOChSMQrz/+ADJB2Bh3JNAZ6xXD3gBuOolgsafCb4vhXLDXwocKikXcysSHH2/Bhmxsf+eTMro9LN\novhs7nYzmyJJ6VoV5TPAr63NnM1ZkqwHCx5+G+789RBuOTlZXr93TeCqJk99OHC+pM/gs+KtU5jR\nTrhloygjcYW8CLC/pXKDuBXgt/U65rJrrZgyy1UYhs+4Xy4yCPPZdR8LgZmNKtI/Hfux6TyF7NHk\n/VGpyDUSX0OvFmde9XegZLN50ICIQ+5yVELBd/WuGPUL4HPWZMWoJOccPCHJO7n22YHTrEYqwtyx\ntXJBV3gJdy46zGqEI8krGy2X5DxP34eUtQuM43k8cX/LVW6Sue9yPMuVAcuk63oO7r1eKNxI0lXA\npWZ2eZPnL6UakEouoJCSpRyKL4/MgT8YHG41KiqVTe4eq7bcMwWvB9zQsUtehORdS0le5BWf/h++\nJLWX9c42V0/OTrgjYqWM6GjgeDO7sGD/w/HMfCfiSYKOAhbHHc0OtxqJbNRTaWox/H+rmtn8UMsl\nDQpaIxRyByPps/jMo9oT7YDVy1UJFaOSnFrrUPPj8ckNLTZppv4rXOlWclevjXvmHoVXX/oZ/mNV\n1QM6zdBrYmaHFBjH7ng94x3NrKX4TkkX4ObPPfCkL5XruhmeJ/tzdfpm00R+Cldg59JEmkh1WDWg\nNItbDy960nS8sEoqzpKSkQh/WFubnqxb4Epognm2uCJjegxPnnO9pJWB+3DHrq8CT5nZrgVk7Is7\n+Z2Ox6iDO2vuBfzCzE4uIOM5YB8zuy79D3/ezJ5LM+B1zOx/G/T/O76eXugBImiRwbaZx1Z9w52d\nJtPzA/sQ7qgyCbitQd85s6/rbQXHMgVYLL3OrustQ4H1o3SuufB1qKVyY5gHN0e+UnAsNwHbVmnf\nFrg1vd4R/7Hrz+/noXQtpuIeyk3H3uJhT6tWua5L4rOqen2nF9wKxamWcD3mwR+Ezk7bfsC8LciZ\nCizR4hg+Xuds95rgDmTntjqWjJx36Yk1/yUe9gS+Pv9qQRlj8ApY+fadKRi/n35LFk2vx+NVtCr3\n2ltNfJ5ZcOvS8IG4r2a0LdaQO5ejcW/bUemJ9jt4nOhFVI9PzjJRUmUmOona5f6MYh6nY2itYlSF\nyhgMN7XlMTzrVREqM4M8D+GJF8ArAy1aT0hai94af0A4yTwt56r47KdIgYkyqmHNTvXMafPSk5ms\nKlZe8ZG2kbQ+Hqb3Nu7UBbAPvk6/lZnd3oS4x3ElkQ+Na4iVWJzFfJb9bdqvpPQB7sMBHmpX8Xh/\nE38gLcLC9HixZ7mb4vH7L6VjX8TzcG+KPzyuRYN7DT4u9Xg6/hAA7jH+vKTT8ERFxxQcR1CHUMid\nywrA9un1R8BsZvauvDbqNdQvRbchPWnyyoilbKliVIavpn634Q8WWRPvB8ALZlYzDjnHS8DueOWa\nLLvTE3c6H7niE1mSqf0WXBkugs+EJuKJPj5Dz49OTaycalh34NaBionc0nrs/rhnbcsUSROZ1p4L\nYf
UrNv0GXwv/oSVTbjI//zbtW7noeXAfhRMkHYJ7XE/OjePtJmS1yzX4GmtDk3Ad7gROSolA1sbv\nMXCF9lJBGc/iFqD8Esx2+Dp9Ef6MW93+jZdB/WNadlmUYp/vGHxN/yv0nhDcgs/8QyGXQCjkzmUy\nPevG4/GZ3H/S+7prtpYppWcllPuzNitGVcYgaQngRUu2rxb5GV5laQt8PQ7cm3d5PH0j+FP/ZXVk\nnIx/hv3wWV2F64A/Fh1ISsaxDf7dHG9mb0paHXjNzIp44e4P3CppTfy7Pg6PT50XX0stOo4D8EID\nl6X3VwDfkTQeL1JRy/HsrRrtzbI0sI1l1lXNbFpyFNupSVmVON1r6W3ZaWjRycXx18WKVeN6Bp/l\nr0f1h4MiMvbGH0y2wR9YKvfFFjS2dFUYBVyWLBGVNeT1cAW7bREBZnZg5vVlkl7ELUrPmNlfCoj4\nFp6K9B71rjr1H/z+D0ognLo6FElXA9eZ2R8knYBn6DoPN7NONLONm5DVbhanrKyWKkalvuvX21/U\ntJkU+w/wWQb4Gu4ZVjDpvbzyzZpm9mzOSW1xfO151gIyVsFnB2/h3qrLJRlH4mt1hRSRpLnwH+2s\nR/FvCprNKzLG4JV47panibwcnz1tm8ZSJE1ky6TZ3/FmdnWu/VvAgWa2ThOyNqi3v94DZsYjuBFm\nxapx1ZNXSEZZSFqDvpXfTrQBqr6UQiZXSvd49n9mVTxcb66BGEe3Ewq5Q5G0JK74Hk1hQSfSE0qy\nr9WpEpOTUzeLkxWIQ05m8jvN7LZc++zAfmZWaJ0thZPk+Xg8NkDZfiS9DmxiZg/nflw2Bs4zs88W\nkHEL7ry1f07GusDFZrZ4g/7DgYOBc8ysqOmylqwpeDGIcZJOBWY1sx9IWhb4t5nN0478AuffDp/d\nn4Zn5gLPKrcXvrTwsZ+BdUgJxYFCUl1fBjN7cQDHshzwY3or9dPM7OkCfW8HrjCz09L9voqZjUlr\nyMuY2eb9NvAZiFDIHUi74R85WaNxM+DB1mLiiKRIPwQOMrOTMu0L4t7RRVMR5p+iZ8YT+B+BZy26\ntaCctmb8Kc53LnwWORFYBV/Lvga428wamj4lvYV7qj6XU8iLAU8XnGW/i886xjY6toGcV3CT8d2S\nnsZDYa5IP8D3mVkh5yF5Mo9aiSNqFh+o8aDVqzvJ5FzkXknf7+70KI7/4A8uLZnYpdaSaZQhQw3i\n5gtej6/hnuE35to3w+O/bygg4zt4par76Z2Kdy3gf8zsTw36fwm4AV/S2QU4A88uty6wgZk90GgM\nQQGadcu1lJlUAAAgAElEQVSObWA22gj/yMmZTAqnaUPGdFx5vYE7QM2S2hekhLAaYAPggYLHboXP\n9Kfj3tsTM1uhNI14iM7f0+f5CPfofR93wCmUuhL3eK+ki8yGLG0CjCso4xo8sUq716+MNJH7pM9x\nWroWv0/yJgFHNei7WNGtwDjWxMt9voRn+boKd9Z7gxSq08R12QkPG5yatkfxuPEBk4EvRWS3NfHE\nIE/icb1FZDwKbFalfXPgkYIynsMTgOTbDwOeKyhjKTwf/b14YpM/Aiu3e//GlrnGgz2A2Gp8Mf4k\nu1EJcq6iStxukzKm47PRpdI/4t3pfVkKeXkaxN1mji0tbzP+ILAPbjrenCbyaQNn4Z6rMydFtgQ+\ns3wQOKWgjEolpRNwj/pvZLcmxjIz7ux2KukhIbWPBPYoKOMpYPv0OvuAcTgFa/+WseGe5+eSiXPF\nnU/Pw9cqi8rZF38YPTZzTY9LbSMHSkYd2V/Hk+oUOXYKVXLR474LkwvKeA9Yukr7MsB7A/X9xlZ/\nC5N1hyJPuXc0HhLTVPhHGVmccvI+zrCV4ncvx72B9wSuteIm61XyTXhs5IH4D/CXCsiYjD+Vt1Qv\nWJ6H+q/A3mZWNGSkmpy5gCvxGc8nca/zhfA11C3MbHKd7hUZ9Uy9
VvS6lkFy2lnBPEf5BHyN/RFJ\nywD3mNl8BWSsSHVzd8N7LCNjCv5Q8VSufUXgfjMbUb1nHzljgFGWW8KQp5L9pRXIG1+GjDqyl8Zn\nt7MXOPZV4H+trw/Hxri/wgIFZFyPrwGfm2vfFTdZb1alz5yV3xnVziEPDHg4WtcSYU+dS8vhH1RP\nWnFolbaiiUE+zudrZm+nNa1TapynHg/Ts56Y5R6gYR7rxI24EmxJIZsnfFiDNoutm69nbpJCYj72\nkDazW5qQ0XLyCpWUJjLDq3i41Qt48oh18BrES1A9n3N2LEvi1oKV6f39Vq5xMw8Wb+NKPV+wYBF8\n5l6UMpJptC2jiiKrPIT+kuIxxNfgZUu/bWbPJblL446eRR92rgWOTfd+1vHuu8Co7D2UuV/KTjAU\nNCAUcudSL6FH3UQL7fzQ12BXMjGr5kUb9pH0IG72LUp+RjEdeN2aq7h0HXB8mjG1NOPHvc53BX7e\nxHmBjzMWbWRmf01NWwKfSK+/JmlTPNl+zc+UlyHp6IwM8HXtujLwh6GF8LXseg9GRX8s26nYdCq+\nDr9R+rs2npzlRNyU3gyXAWdL+hk9ynA94HjgkibklJFMowwZ1RSZ8HXx7fseXpX98ZjlpyRVPPIX\nwetnF72+lQpXP0pbtX3Q+37ZEF+amkA5CYaCBoTJeogg6ZP4P/AewBqNzJmSNsSdfdbJm5OSufVu\nPHzqxmr9m5Cxn5k1THAgz0C1Cx5HvTj+jz8GN/teaAVvxDLMvJJOwRXyU/hafX45YP86ffcEvm5m\nW6X37+BewFPSIcsDx1mdhP8FZRxvGY/2/kZtVGyS9AawoXmI3lvA2mb2dLp/TjSz1ZoYxyy48t2T\nngnDh3hmukOsp4xiIznfwZX7LVRJpmFmfx4gGV+ht0KejhereBaY2cymVOtXRY5wh8FV8fvkETO7\no0jfdkj/b/fhPhOXWq5SW1Ayg72IHVv9Da83ej6epH40nqJurQL9rqWO4wnuzPTX/paRjhO+bjsd\nn4FdgodgPJLarh7ga3pHna2u41A6ZqvM+48doNL77wH/6m8ZmWNnBm7FY0HbuSaLUsWpLX13izbo\nO5EUEYB78341vV6Kgg5D+fsMzwi3ctpG4Ov0dzX5mVbHPYEfSNsfyTi9DZSMKjI/gTuM1S0ugWfS\n2jLXtjPuUT8BOBP4RAsydsIfhhvKAL4MnIMvJbyLO9d9uZ3PH1ud72uwBxBblS/FTZEH4rOT1/BQ\nlA+BFZuQ8QLupFNr//J4Gst+lZGO2zX9Q3+1yr4N074+1Wxyx10PzJV5fyAwd+b9fMATDWQsWU3p\nNPndjCfj8YrPdrLvl6VB9ZwyZOTkvU77CnkaqVJSrn0+GnjS4w8Y30qvL8bjVdfDHyQfL3j+KbXu\nAbwIx50UqOCFx6Xvj89o78M9pGdr8lqUIeMTuFPm/bglqXJ9dsUdAMfhZRnrybghewz+cPIBHnq0\nb7qPftmCjA+bkZH5DnYF/ok/RI8GDgAWaue+iy13nQd7ALHlvhD4C75eezEeGjEstTerkKdSJcwh\ns39pGpROLENGOu4mPIVirf0HAzc2kNFLYeBKPDurbBiCVUXGZcCCTX4/U/A0mbX2Lw9M7W8ZueNP\nBo5p876bDnyqSvtiNAitATYjxdTiYTRP0WOaLRS6h+d6nkIu3IseZTwa9/RvJOcQfA3+b/ja+hQ8\nqUgz16IMGcfi68dX4Ar4Q3w2+igeHz6sgIzxeIrXyvuj8Ix5lfffpfFDaNsyqshcOsl5EX9AuLad\ney+2ni2cujqPLYBfA7+zNsJygJeBlfC1qmqsgv+z9reMynE112Xxp/hG2bHynr51PX8LyvgaXrGq\nGV7Cr0mtdIOr0LiKTxkysgwHdkthMNVC5Pat1VFeAAJ8nfOIFP5UYRjwBdw7viaW8UNI9+zykubF\nc64X8g0wsytThq5LJH3dzP6R
UrP+DX/Y2sCK5ffeCfiRmZ2ZPt/GwHWS9jB3RixCGTK+i8/4r5VX\nF3sU/55WLXpN8AQ2r2Xeb4D/r1S4D3fu6m8ZvTDPAf8r3IJ2ND5xCEogFHLn8SU8beADkp4ELsTX\nW5vlevwH9m+W89ZNXr6H4eu6/S0DPJzmtTr7X8N/OIYC1wOHS7quxjUZhXuC97eMLCvhCUmgp+BG\nUSoOV6LHJFrhA3yd/4RqHVWgfKOkj/CQqputQVUh86pi8wLXSPomnpTk07gyLlqec1EyCsfMbpFX\nJ/o0xR9yypDxWfzhCDN7XNL7wMlNKGPw/4slgHHJ2W11etcN/yS5KIN+kvEx8gIxu+FlVKfjOQnO\nLto/qE8o5A7DzO4B7pH0Uzy8Yje8HvFMeNzrOCvm6Xgk7tE8WtLp9MzGlseT/g/DzU79LYN03Ed1\n9k+j8b1o9A0faTZEoAwZv8JDYZ5O12R0al8Or9o0nL5hMv0h42PMrOWQlEpfSecC+xS8tyoUyS09\nE27G3kPSCWZWLR4+O57jklK+FXde+oo1V3xjOL7UkuVD3PltIGUMo/fDzUe4U1QzXA8cIy+v+S08\n21bWs3oV3ImuX2VI+jQeIbELbq6+G7doXW4FEuAExYmwpyGAvEjA7sCOwNz4bKNuMojUbzE8XGQz\neidruBHYy8zGDJCM6fiM4/0ah3wC2NzqhCxVkbEVHjtb+UEoQwYAZrZ1g8+zBH5NNqH3NbkZN3U2\nTFpSkoyGM1Q8FOw7bcpoeE0aIWlL4LdmVrX6UZVxfA2fnfeqK13gu6l2r/X5nuvJ6ScZTd9rkubH\nY8C/hCvznS0TaiXpVjyLWs14+nZlSLoB2BjPJX4BvpbesDpU0BqhkIcQ8ipQWwG7FVHImX7z4E+2\nwguST2zh3C3LSLOvhpjZrp0uIydvXvyaADxrZm8W6VeWjE68JnXOMzf+Y15VAZU1jk65JmVe1xTz\n/66ZTcu1z5vaa8aItytD0rW4Sfqv+b5B+YRCDoIgCIIOoOwUi0EQBEEQtEAo5CAIgiDoAMLLugOQ\nNB/uNDWWvt6dQRAEQ5FZ8bz1N5rZf8sWLmlRYP42RLxhZi+WNZ4yCIXcGWyGVyAKgiDoNnbAMw+W\nhqRFZ4YXCgdQV+c9SSt0klIOhdwZjAUP+K33uPc3YPM6+8/k+wVO1UgKeLbCelwDfLPAudqV0SjE\nschnKUIZcoaSjCL/9tfjkUf12KTB/l/hWVHr0Shq7mw84q8e4xrsL3q/1rvfBvJea1Rq+RIaV25s\nlNRsIO61N0iVO8e2eaJqzP8hjX8za5FGNiJ1D4Uc9GIq+J1R719x1gb7i9VMbywF8jXVq8n4bIFz\ntSvj7Qb7i3yWomNpV85QklEkv8WseGKqenyuwf5PFjimkRvLCLxoVD0aRYoUvV/r3W8Dea8t3mD/\niALHlDGO0mT02zLcQjS+S6vRqYovnLoaIGmMpLp5liVNl1Q4LjgIgiBon+H442WzWyjkAUDSOpI+\nklQ3Z24/sBC9E7YHQRAE/cwwXLk2u9VM5zfIdJVCxhebfg2sL2mhegemrFelYGYTzKxN/4IgCIKg\nGWKG3KGkUm3b4fmBr8MToVf2bZDMyptLul/SVGA9SUtKulrSq5LekXSvpI2qiJ9T0sWS3pX0kqQf\n5c7dy2Qt6TOSLpH039TnXklrtfsZV2pXQGlSVmt8yIDIKOeKlCOnm2SAF35qly1LkLF+CTK67V77\nQgkyOule60wk7ZWWLKdIuqfRb7ikHSQ9LGmypFcknZ1SkxamaxQyroyfTPVYL6K6a+bRwAHACnh9\n0jlw5f1V4PO42flaSXkPkJ8BD6VjjgFOraG4Kw8Gt+PeDlviv2xHU8K1LuMnshwpq3eIjHKuSDly\nukkGwKolyOgUhdxt99o6JcjopHutdfrLZC1pO+BEvFTlanihkxtTsY5qx68HnA/8AVgR2AZYGz
iz\nmc/TqTP3VtgNrx0M7o8/p6T1zez2zDGHmNmtmfeTcMVcYZSkrYFvAL/NtN9lZsen16eniz8SLxGX\nZwdgPmB1M6uUp2tYESkIgiBojorJupV+DRgJnGFmFwBI2hP4Oq5njqty/DrAGDP7TXr/gqQzgP1L\nHlfnk8oTro3X+8TMpkm6HJ8lVxSykQqGZ/rNDhyGB1wujF+PWfEC5Vn+VeX9T2oMZ1XgoYwyLszf\n0smzrEQnPIcGQRDU4zHg8Vxb/ycdrMyQW+lXC0kzA2uQqUluZibpFuCLNbr9CzhK0hZmdoOkBYHv\n4hbYwnSFQsYV7zBgvKRs+/uS9s68z0f+nwhsBOyHF+meAvwJmKWNsUxptePmlBPpGARBMLCsTN+p\nw3iatNg2TT/NkOfH9clrufbXgOWqdTCzuyV9D7hM0qzpFNcCe1c7vsVxdT7JW3pHYF+8uHuWq/GU\nNrUKaq8LnGdm1yZZc1A94j6/aLMO8GQNmY8Cu0ua28wmNfwAQRAEQUtU1oTrcRdwd67tvZLHIWlF\n4FTgl8BN+NzqBOAMYI+icoa8Qga2AirFz9/J7pB0FX4x/g9Qlb7PAFtL+mt6f3iN49aT9DM8B9+m\n+IJ9rbyCl+D5Aq+WdDD+mLga8LKZ/buZDxYEQRC0x3ppyzIGOKh2lzeAacCCufYFgVdr9DkQ9zU6\nKb1/PEXj3CHp52aWn21XpRu8rHcDbs4r48Sf8LWAlameX29fYCL+EHUNvoz7YO4Yw03ba+Ke1gcD\nI83sltwx/sLjkTcBJuDrB4/int3Tmv1gQRAEQW36Iw45/YY/gC9nAiBfC92IvpPtCiOAj3Jt03Hd\nUG2SV/PzDGnMrGbKSjO7j571+9Or7H8B2DjX/LvcMUsWGMOw3PtxwLaN+gVBEASt049e1icB50l6\nALgX97oeAZwHIOlo4NNmtnM6/i/Amckb+0Y8xfbJwL/NrNasupVxBUEQBEHn0R9e1gBmdnmKOT4c\nN1U/DGxmZq+nQxYCFskcf37yQdoLXzuehIfFHtjMuGTWqFJK0N9IWh14AL5PO37Wtshh5Yxn3KhS\n5ATdzmzti9j4gPZl3HJU+zJKo1My6Daq2FaUwv5IVXgY2ABgDTPLLwW2ReU387fAMi30fwZI6RZL\nH1s7xAw5CIIgGJL01wx5sOgGp64gCIIgGPLEDDkIgiAYkvSjU9egMORnyPlKS1X2byBpmqSyFlWC\nIAiCDiDqIQ8wkhaUdJqk5yRNlfSCpGslbVhQxF3Awmb2dn+OMwiCIBhYuq0ecqeOCwBJi+GB2G/i\n+aYfx6/n5nhc8YqNZJjZR3iSjiAIgqCLKJI6s1a/TqTTZ8i/wzNcrWVmV5vZs2b2pJmdTO/80p+S\ndFUqDD1a0laVHclkPb1ispa0s6SJkjaV9ISkdyRVqnOQ6bdH2j8l/f1hZt/Mkk5PRainpCLWB2T2\nzyXpLEkTJL0l6RZJq/TXRQqCIAiGPh2rkCXNA2wGnG5mfep45UzQhwKX4ikyrwcukjR39vBc9xH4\njHsH4Mt4ucUTMufeAU8SfhCwPJ4u83BJO6ZDfoJXX98GWDbJGZuRfyVeE3kzvDr6g8AtuTEFQRAE\nbRAm64FjaTwHaK1KTVnONbPLAVJBh33w+sg31Th+OPADMxub+pwOHJLZ/0tgPzO7Jr1/QdLngB8A\nF+IZWp4xs0pe03GVjpLWw/NeL5ByogLsL+nbuAI/q8DnCYIgCBrQbXHInayQCyfkxqtjA2Bm70l6\nG1igzvHvVZRxYnzleEkjgKWAsyVllecwPB0aeD7TmyU9jRek+KuZVUo/rgp8EngzV5t51iS3Dn9L\nh2VZib51RoMgCDqJK9OW5a1+P2u3hT116rjAs5sZbjK+psGx+Xx1Rn1zfLXjK9pzjvR3DzypeJZp\nAGb2kKTFgS3w4hSXS7rZzLZN/V/Bc8blHyoa1EfenHZSZw
ZBEAwO26Qty8epM/uNUMgDhJlNlHQj\nsJekX5vZlOx+SXOZWemPYGY2QdIrwFJmdmmd494FrgCukPQn4Ia0Rvwgnnh8mpm9WPb4giAIAidM\n1gPLXsCdwL2SRuG1hYcDm+LruZ8rKKcZ8zfAKODUZPr+G/AJfF14bjM7RdJI3Mz9ED673hZ41cwm\n4c5b/wKuTp7Xo4HPAF8DruqkROZBEARB59DRCtnMxqSqHj/HvaAXBl7HFfO+lcOqdW3wvtF5z5Y0\nGdgfOA6YjK9Tn5IOeSftWxo3Y9+HK9wKXwOOAs4BPgW8CtwOvNbMOIIgCILaDB8GMzc73QKGG2kB\nsrPoaIUMYGav4V7T+9TY38f6YGbzZl7/k4yFwszOB87PHX8NOStGMldXNVmb2VnU8ZY2s8nAT9MW\nBEEQ9APDhsHwFoJ3h00nFHIQBEEQlMXwmWDmFhaEO1Xxdeq4giAIgqAuw4e72brpfi2YuQeCUMgd\nxbdoJ+ZY41oJAOjLQrZ92zJe1eMljGRsCTJWKEEGuF9eu/ylBBllfMf5qL9WWbp9Ebcc1b6M3/+8\nfRkAe95QgpBPliBjSuNDGvJY40MKcVIbfceXNIbaDB8GM7egxTpV8XVs6swgCIIgmJHo1AeFIAiC\nIKjPTLQWVDy97IGUQ8yQS0bSKEkPDfY4giAIup5KZpBmtw7NDNKVClnSgpJOk/ScpKmSXpB0raQN\nB2gITcU9B0EQBC3QijJutYjyANChw2odSYsBdwNv4iUWH8c9YTYHTgdWHLzRBUEQBKXRau7MDoxB\nhu6cIf8Ov9xrmdnVZvasmT1pZicD60jaWdJ0SdPS38p2aEWApD0kPSFpSvr7w+wJJH1G0iWS/ivp\nXUn3Slord8z3JI2RNCkdO/uAfPogCIIZhcoacrNbh2q+rpohS5oH2Aw4yMym5veb2duSLgWy8Q1f\nBS7Ac2YjaQe8HvJeeLmS1YA/SHrXzC5MivV2vAbylnhazM/T+yteGvgmnkJzXrwIxYH0rrkcBEEQ\nBB/TVQoZV4QCnq51gJm9D0wAkLQU8Btcgd+WDvklsF9KpwnwgqTP4cUsLgR2AOYDVs9UmxqTO42A\nnc3svXSeC4GNCIUcBEFQHl1W7qnbFHLh/CuS5sQzNfzFzE5KbSOApYCzJWVzVQ8HJqbXqwIPNSj9\nOLaijBPjgQUaj+ow+iYW+CaeMCQIgqBTeQx318nSx0hZPq06aBXoI2kv4Gd4Od1HgB+b2X01jj0X\n2Bl36M3qof+YWeFsT92mkJ/BL8jywDW1DpI0E3A5MAmf+VaYI/3dA7g3163iBlAkjU4+FZJRaNVi\nFO1k6gqCIBgcVqbvb9d44Mz+PW2rccgNfo0lbQecCHwf1wUjgRslLWtmb1Tpsg9wQOb9cLwq4eUl\nDmtoYWYTgRuBvSTNlt8vaa708hS8lvK3zOyDTP8JwCvAUmb2fG57IR32KPB5SXP364cJgiAI6tN/\nccgjgTPM7AIzewrYE3gP2K3awWb2jplNqGzA2sDcwHnNfJyuUsiJvfDLfa+krSUtLWl5SfsAd0va\nBfghfoGVYpYXzHhBjwIOkvRjSctIWknSLpJGpv2X4HWNr5a0rqQl0nm+MLAfMwiCICgbSTMDawC3\nVtrMzIBbgC8WFLMbcIuZjWvm3F2nkM1sDLA68HfgBHxx4yZgUzwueQP8c1+Lz4Yr236p/9m4yXpX\nfDb8D3xt4Pm0/0NgE9wx7Lp0zAF0bGRbEARBl9I/iUHmxyd1r+XaX8PXk+siaWFgC+APhT9HotvW\nkAEws9dwm/4+VXb/DVe29fpfClxaZ/84YNsa+w7DvbOybacCp9YfdRAEQdAUBdaQL3ndtyxv9e/0\naRfcCbimH1MtulIhB0EQBDMABcKetl/YtywPvgNrPFizyxu4xXPBXPuCeN6JRuwKXGBmHxU4thdd\nZ7IOgiAIZhD6wakrLU
s+gOeOANzZKL2/u95wJH2FFDrbyseJGXJH8T7tFSfv41jeEq/OtmTbMr5k\nz7Ut4041/YBZhRVKkAGeDr1dVilBRu3H+uLko/Ja5c0SZOTj7ltgz7JquTxZgowyflK3LkFGGZ8F\nPNFgq0wuaQx16L845JOA8yQ9QE/Y0wiS17Sko4FPm9nOuX67A/82s5a+gFDIQRAEQZDBzC6XND9w\nOG6qfhjYzMwqq9ELAYtk+6RkU9+muu9SIUIhB0EQBEOTfkoMAmBmvwV+W2NfH8dgM3ubnuRSLRFr\nyCUiabFUOaoM22QQBEFQj/5LDDIodJ1ClnRuprziB5JelXSTpF3TwnxRORskOXM2OYSyFrSCIAiC\neoRCHhLcgNv4FwM2B27D44D/kvJYF0H0TRRetF8QBEHQ37RSC7mydSDdqpDfN7PXzWy8mT1sZsfQ\nU594l2qmZUlzpbb1JS2GK3GAiWm2fU46TpL2l/SMpKmSxko6KHf+pSTdJmmypIclrTMQHzoIgmCG\nImbIQxMz+zteQqsSU1DPtPwi8J30ehlgYeAn6f0xwP54Nq4VgO3oGyx+JHAcXqpxNHBxEzPzIAiC\nYAZkRvOyfoqeGmE1TctmZpIqQZavJ+85JM2Bu7T/yMz+mPaPAf6dE3G8mf0t9RmFFwpdGlfOQRAE\nQRkUyNRVs18HMqMp5Mq6cKusAMxCjzm7Fo9lXo9P512Ahgr5V/RNlLBl2oIgCDqVB+ibtKadJEcF\naXU9OBRyR7ACPqOdnt5nZ8lFUjEVvcOyqZAqDwAFTNYH42WagyAIhhJrpC3LOLzgXj/SZTPkGWZd\nU9KGuLn6SqCSbSWbcnw1es+eP0h/s1/dM8BUMjlOqxBhT0EQBANBlzl1desM+ROSFsQv+4J4bcoD\n8RrIF6Y14nuAAyWNTccckZPxAq5ct5J0PTDFzCZLOhY4TtKHwF3Ap4DPmdk5qV+EPQVBEAwEMUMe\nEmwOvIKbp28ANgD2NrNvmVllBrsb/lXejycS/3lWgJm9AozCvapfBU5Lu44ATsS9rJ/A6yZ/Ktu1\nynhi1hwEQRDUpetmyCnHaJ88o1WOewr4Uq55WO6Yo4Cjcm0GHJ22vMwXqsh4K98WBEEQlEA4dQVB\nEARBB9BlJutQyEEQBMHQJBRy0H8siicGa5WSvs6pb7ct4k4tXcJASpAx/2fblwHwxjMlCCnj+1m9\nBBmPliAD4J0SZMxWgowyxgEeFdkBbFPCPXtlWdeknXv2/ZLGUIcuM1l3q1NXEARBEAwpYoYcBEEQ\nDE26zGQdM+QqSNpZ0sTM+1GS8nnhavUdJemh/htdEARBAHRdYpAhp5AlnZvKJE5Lfyuvry/5VNnY\n4eOpn52rXt8gCIKgP+gyhTxUTdY3ALvQOytWv3kQmNl7wHv9JT8IgiBogXDq6gjeN7PXzWxCZnsL\nIM2Yd5d0laTJkkZL2irbWdI3Uvt7km6StGPqN2e1k+XN0JK+Iunfkt6VNFHSHZIWyfX5nqQxkiZJ\nukTS7P1xIYIgCGZYumyGPFQVciMOxVNargxcD1wkaW4ASUsAVwBXAasCZ+F1DxuZmS31Hwb8Gfg7\nsBKwDnBmrv/SwDeBrwFfx1N3HljC5wqCIAi6lKGqkLeS9E5me1tSVuGda2aXm9nzeE3DOYC1074f\nAE+Z2YFm9oyZXQ6c18S550zbdWY21syeNrMLzeylzDECdjazJ83sLuBCmluDDoIgCBrRZTPkobqG\nfBuwJ73XkN/MvH6s8sLM3pP0NrBAaloWuC8n796iJzaziZLOB26SdDNwC3C5mb2aOWxsWneuMD5z\n/jqMBObKtW2ftiAIgk7lYeCRXNvU/j9tl60hD1WFPNnMxtTZ/2HuvVGiNcDMdpN0Kl5VajvgSEkb\nm1lFsbd4/pMpJxNTEATBQPL5tGV5mZ4ief1ExCEPeZ4G1sy1rV3twHqY2SNmdqyZrQc8
DvxvGYML\ngiAICtJlJuuhqpA/IWnB3DZfwb5nAMtLOkbSMpK2BXZO+xrGD0taXNKvJK0jaVFJm+IJqJ9o7aME\nQRAELTETPWbrZrYCmk/SXilSZoqkeySt1eD4WSQdJWmspKmSnpe0SzMfZ6iarDcHXsm1PQ2sSHWl\n+nGbmY2VtA1wIrAP8C+85vFvKRbL/B6wPLATMB++PnyamZ3Z5GcIgiAIOhBJ2+E64vu4j9FI4EZJ\ny5rZGzW6XQF8CtgVeA5YmCYnvUNOIZvZrvgHrrW/jzHCzObNvf8r8NfKe0k/B14ysw/S/vOB8zPH\nHwYcll5PALauc/6Pj820nQqcWu9zBUEQBE1SMUG30q8+I4EzzOwCAEl74iGsuwHH5Q+WtDnwZWBJ\nM5uUml9sdlhD1WTdFpJ+KGlNSUtI2hH4Gc2FPgVBEASDTT+sIUuaGVgDuLXSZmaGR9R8sUa3rYD7\ngQMkvSTpaUnHS5q1mY8z5GbIJbEM8AtgHvwp5njgmEEdURAEQdAc/eNlPX864rVc+2vAcjX6LInP\nkEDNHNgAACAASURBVKcC30oyfgfMC+xedFgzpEI2s32BfQd7HH2Yg/a+kUkflTSQa0uQ8XIJMpZu\nW8Llr+9TwjhgW51Vipz2KVR0rAHzNj6kEFNKkDFbCTIuK0EGlHdd2uTKehGdA00733G/lRfooeLU\n1Uq/cpkJmA78r5m9CyBpX+AKST8ys0IXY4ZUyEEQBEEXUGAN+ZJ/wiV39G57a3LdLm8A04AFc+0L\nAq/2PRxw596XK8o48SSevOqzuJNXQ0IhB0EQBF3L9hv4luXB52CNkdWPN7MPJT2Apzu+FkCS0vtf\n1zjNXcA2kkZksjQuh8+aX6rRpw8zpFNXEARB0AX0X2KQk4D/J2knScsDvwdGkJx/JR2dUihXuBj4\nL3CupBUkrY97Y59d1FwNMUMuFUlLAc8AK5lZJAoJgiDoT/ppDdnMLpc0P3A4bqp+GNjMzF5PhywE\nLJI5frKkTfBcoffhyvky4JBmhjVkFbKkc/EMW0ZPkQkDlklVngaLhtm+giAIghLox1zWZvZbPGFU\ntX19cmGY2WhgsxZG8zFDViEnbgB2oXfVp9fzB0ma2czyBR/6CzU+JAiCIGib/ksMMigM9TXk983s\ndTObkNlM0h2STpF0qqQ3SFm5JM0j6RxJr0uaJOlmSStVhEk6QtJ9ad1gbDrmj5JGZI6RpIMkPZvy\nlY6RtH9mTAYsI+kfkiZLekhS08UrgiAIghmLoa6Q67Er8C6wDrB3arsKLzi8CV7x6THgFklzZvot\nB3wtbVsBGwP/l9l/Ah7DfCiwAl6seEJmv4AjgV8BqwLPAxclL70gCIKgLPqxuMRg0KET98JsJemd\nzPvrzWy79PopM/t5ZYekDYCVgYXM7KPUth+eVWVrelJnGrCLmU1Nx1yEu7sflhT33sAeZnZxOn4M\ncE9uXMea2U2p/y9xh4AlcOUcBEEQlEGX1UMe6gr5NmBPetZts+He9+eOXRWYG5iYm6zOCiyVef98\nRRknxgMLpNefw6/ZbQ3G9Viuv5KM+gp5ykjQXL3bZtnetyAIgo7lMbwsfJap1Q4sly5bQ+7QYRVm\nspnVyjOXz8UyBzAO2JC+jlcTM6/zzl9Gj4GjaB65rIyK13VjI8lsJ8Pw1QueIgiCoFNYOW1ZxgP9\nXJU2ZshDlgeBTwMfmFmriZZHAx/gJuwLahwTYU9BEAQDQefksi6FGUkh34gHbF8j6UDgWeAzeI3L\ny8zskUYCzOw9SccDJ0qaBtyNB40vb2bnpcPCeSsIgiBomm5VyH1mqSkcanPc+/k8vDzWeOB2entJ\nN2IUPks+ElgYeIXewePVZsgxaw6CICibMFl3BtUypWT2rV+j/V1gn7RV238IuVRnZnYicGLmveHK\n+Mgq/Z8j91Wb2X/zbUEQBEEJhFNXEARBEHQAsYYc
BEEQBB1AmKyDfuPdd4G32xBQVrru75Ug44ES\nZNzStoRt9bsSxgHzfvRR2zLeHP5sCSMp41+2aPReI+YtQcZs7YuYf5f2ZQC88YcShLR/n7ifaLss\n0PiQQrzZRt9PlDSGOnSZQu7QiXsQBEEQzFjEDDkIgiAYmoRTVxAEQRAMPjYTWAvmZ+tQ23CHDsuR\ntGAqofiMpCmSxqfSintKKmHxKQiCIBiqTBsG04a3sHXoGnLHzpAlLYFnwnoTOBDPXP4+njD1+8BL\npDrHTcqd2czK8n4KgiAIBonpSSG30q8T6eQZ8u/wjFhrmNmfzOxpMxtrZn8xs63M7K8AkuaSdJak\nCZLeknSLpFUqQiSNkvSQpN0lPU9yMZX0d0m/lnSypDclvZqOGSHpHElvp5n55hlZM6VzPS/pPUlP\nSeqVZETSuZL+LGk/Sa9IekPS6ZI69BYIgiAYmkwbJj4aNlPT27RhnZnhuCMVsqR5gU2A03OlEKtx\nJTAfsBmwOl5E4hZJc2eOWRqvefxt4POZ9p2A14G1gF8DvweuAO4CVgNuAi6QNGs6fia8YtR3gBWA\nw4CjJG2TG9NXgSWBr6Rz7JK2IAiCIKhKp5qsl8aLNIzONkp6Ha9fDHA6brJeE1ggY4beX9K3gW2A\ns1LbzMCOZpYPqnvEzH6VZB8DHAS8bmZnp7bDgR8CqwD3mtlHuBKu8IKkdYFt8QeDCm8Ce6c0m6Ml\nXYdXiDq76SsRBEEQVGXasGFMG978vHLasOmUEzNeLp2qkGuxFj5LvRiPOl8V+CTwptTLBDErsFTm\n/QtVlDHAo5UXZjZd0n/xStuVtteS3I+j7CXtBewKLIpnNZgFeCgn9z9JGVcYD6zU+OMdBMyVa9sm\nbUEQBJ3Kw0C+YF4j42b7TB82jGnDmlfI04eJUMjFeRavkLRcttHMxgJIqqQamgOvtrQBfcseTsq8\nnlzjPHnnLqvSBsm0L+l/gOOBkcA9wDvA/sDaBeQWuGuOprdFPQiCYCjwefr+dr0MnNavZ53GTExr\nIe3WtH4YSxl0pEI2szcl3QzsLek0M6uV6+9BYCFgmpm9OABDWxe4y8zOqDRIWqrO8UEQBEE/MY1h\nfNRFCrkjnboSP8IfGO6XtK2k5SUtK+l7wPLAR2Z2Cz5TvVrSJpIWk7SupCMlrd4PY3oGWFPSppKW\nSWvMa/XDeYIgCIIZjI6cIQOY2fOSVgMOBn4FfBaPQ34COA4PiwLYAjgKOAf4FPAqcDvwWqNTtNB2\nBm6XuTS1XwL8Jo0hCIIgGECmM4xpLaix6f0wljLoWIUM7lQF/CRttY6ZDPw0bdX2H0Zvz+hK+4ZV\n2pas0jYs8/oDYPe0Zfl55phdq8gYWWv8QRAEQWu0vobcmSr5/7d35/F2Tff/x1/vhKihpaiE789M\nUFSJKqGUqBhrnqrE2GpjCqVaQfDFN2oeS41RQ4l5SJDUUDOZxBCEJEJiSA1JRELu/fz+WOsk++57\nzrln2Cf33JPP8/E4D/fsvdba69wk1llrr/351POStXPOOVdQmCGX/2ouYRCX1FfShBi2+UVJBW9P\nStpGUnPq1SSprDyYdT1Dds455wpprnCG3NzGti5J+wMXEcI0v0x4suYxSd3NbFqBagZ0Jzx9Ew6Y\nfVpOv3xAritDgDeqqJ9V4vkBGbSRRfL6LD7PvRm0AZ8vUk2i9uCy1ndOynY8Z1bdRn09f/lW9U1M\ne6n6NgDomkEbIzNo4+8ZtJFVuP5q/g3OyagPhc2lU0W7rOe2vTjcD7jWzAYBSDoa2AU4nLCHqZDP\nzGx62R2KfMnaOeeciyQtCvQAhueOxUBPw4AtilUFRsccBo/HKI5l8Rmyc865DqmZRSrcZV10yXp5\noDOtn9T5hFSwqoSpwO+BVwlRJI8CnpK0mZmNLrVfPiBnTNI2wJPAMtUsXTjnnCuulHvIQ+6YzpA7\nZrQ4NvOrbHdZ
m9k7tMy98GIMGtUP6FNqOw05IEvqSnh+eWfC88tfEsJx3gbcUiTyV1byPc/snHMu\nQ6U89rTDgT9khwN/2OLYWyO/4aAeEwtVmUYI5pXeVNCVEOeiVC8DW5ZRvvEGZEmrA88TMi6dCrxO\n2F2wIWHH3IeELFHpeovEbE7OOec6gMpDZxauY2bfSRpByND3IIBClqFehDS9pfopYSm7ZI24qesa\n4Fugh5ndY2Zvm9lEM3vIzHYzs4cB4nNiR0t6QNJMwowaSRtIelTSDEkfSxokablc4wr+Iul9SbMk\njZK0d6HOSFpc0hBJ/5H0gxp/duecW2jkInWV+yrhOeSLgaMkHSJpXcLW9yWAmwEknS/pllxhScdL\n+rWkNSWtL+lSYFtCmuCSNdSALGlZ4FfAlWZWSu6vMwnPxWwA3ChpacLOuhHAJkBvQurFuxJ1/gr8\nljDb/jFwCXCrpF/k6c8yhJ15Bmzv95Sdc67+mdldwJ+AswnpdX8C9Dazz2KRbsDKiSpdCM8tvwY8\nRViR7WVmT5Vz3UZbsl6LsPU8eXMdSZ8RciRDGKz/En++zcyS33JOA0aa2emJY0cCH0haC/iAkLS4\nl5nlHn6cGAfj3wP/SVx2ReBfwNvAQb4c7pxz2cpF3qqkXlvM7Grg6gLnDku9/xshNW9VGm1ALuRn\nhNWA2wlb0nNGpMptBGwnaUbquAFrEr4FLQE8Ee8p5CxKy4gAAp4AXgIOiM+wlWAwsHjq2KZ4Qinn\nXH0bS9iuk1TKImV1Ko/UVZ+Lw402II8nDJ4tnhUzs4kAktK7q79OvV+KcBP/FMKgmjSVsAwBYff2\nlNT5dFiah4G9gfVp/Te1gH2AVUor6pxzdWND5v/vMWcqcF1Nr1p5cgkfkGvOzD6X9ARwjKQrKni8\naSSwFzDJzFo9qCbpTcLAu6qZPVusK4Qd3l8DwyX90swyiBHonHMupxa7rNtTfX5NqM4fCV80XpW0\nn6R1JXWX9FtgXYoH8r2KEIT5TkmbSlpDUm9JN0qSmc0ELgQuibvv1pC0saRjJB2caEcAZnYy4dnn\nf0sqFOHFOedcBWq4y7pdNNQMGcDM3pe0MWE39HmEwCBzgDcJN91zN+lb3dc1s6mStgQGAo8R7jdP\nAobm7gOb2emSPiXMgNcgBB0ZGa81r6lEmydK6sz8mfL4LD+vc865xtBwAzKAmX0CHB9fhcrk/Ypk\nZu8RbuYWa/8K4IoC556Gll+/zKxoX5xzzpXP7yE755xzdaC5wseefMnaOeecy1BThfmQfYbsSrAv\nIUBYpcoJs1rM/2TQRhb5O9bOoI2sErVXH9fleP636jbOsS+qbuN0/bDtQiVZL6N2qvVURu2slUEb\nZeUSKOBXGbRxbgZtAKyWUTu10RQ3dVVSrx7V59cE55xzbiHjM2TnnHMdUqPdQ/YZcgliZqhfZ13W\nOedc5XK7rMt/1efQt9DPkCXdBCxtZnsVKdYNqP7mnXPOucw0WqSuhX5ALkbSomb2nZl92t59cc45\n11JzhZu6fMm6A5D0pKQrJF0SUzYOjcfnLUNLWlTSlZKmSPpG0gRJf0419SNJ90r6WtI7knZb0J/F\nOecaXaMtWddnr9rXIYRQmz2Bo/OcPx7YlRDNqztwEDAxVeYM4E5C+pNHgdskLVOj/jrnnGsAvmTd\n2rtmdmqR8yvHMs/H95PzlLnJzO4CkPRX4DhgM+DxTHvqnHMLsUbbZe0Dcmsj2jh/M/CEpLcJS9oP\nm9kTqTJjcz+Y2SxJ04EV2r50P2Dp1LED48s55+rVi8BLqWOzan7V5gpjWTfX6eKwD8itfV3spJmN\nkrQasBOwPXCXpGFmtm+iWDo8lFHS7YFLqC5Sl3POtYfN4ytpInBWTa86t8Jd1pXUWRB8QK5AzIt8\nN3C3pHuAoZKWMbMv27lrzjm30Gi0XdY+IJdJUj9gKjCKMPPdD5jqg7Fzzi1Ynn
6xMVkZ52cApxAi\n0TcBrwA7t9FWW+0755xbyC30A7KZHZb4edsCZTonfr4euL5Ie62+rpnZslV20znnXIrvsnbOOefq\ngOdDds455+pAo+VD9gHZOedch+RL1q6GngY+rKL+jIz6kUU7EzNoY1IGbeyZQRsAIzNo4wdVt3C6\nqu+FvdO/+kYAdd8yg1Y+yaCN7TNoA+D/ZdDGNRm0MTGDNvpk0AYU2S5TgqkZ9aGwWgYGkdQX+BMh\n298Y4Fgze6WEelsCTwFjzayswBL1uZDunHPOtRNJ+wMXAWcCGxMG5MckLd9GvaWBW4BhlVzXB2Tn\nnHMdUg2zPfUDrjWzQWY2jpBoaBZweBv1/g7cRoglWjYfkBMk3STp3sT7JyVd3J59cs45l19TDJ1Z\n7qvYMrekRYEewPDcMTMzwqx3iyL1DgNWp4p4oQ1zD1nSTcDSZraXpCeBUWZ2YpXN7knruNTOOefq\nQI1CZy4PdKb1BodPgHXyVZC0NnAesJWZNUuVbfZomAG5FjwcpnPO1a9SQme+c8co3rljdItjc76a\nnVkfJHUiLFOfaWbv5Q5X0lbDDchxprwNsLWkEwhhK1cnbF++DtiOsGvuA+BqM7u8SFstZtqSfgsc\nT/iW9DXwb+AEM/ssnt8GeJKw7XMg8GNgNHComb2b/ad1zjlXTPcDN6b7gRu3OPbpyA/5V4+C/+uf\nRgiL3DV1vCvwcZ7y3wc2BX4q6ap4rBMgSd8CO5jZU6X0tRHvIR8HvAD8g/ALXBGYTPisk4G9gfUI\n6/znStqnjLYXAfoDPwF2B1YFbspT7n8JmwJ6AHOBGyv5IM455wprLnszV3gVW7I2s++AEUCv3DGF\nNehewPN5qkwHNgB+CmwUX38HxsWf04miC2q4GbKZzYjfSmblZq7RXFrebJ8kqSchW9PgEtu+OfF2\nYpyBvyRpCTPLZeM24K9m9iyApP8DHpbUxcy+rexTOeecS6th6MyLgZsljQBeJkywlgBuBpB0PrCS\nmfWJG77eTFaW9Ckw28zeKqdfDTcgFxMf9D4MWAVYHOhCSKNYav0ehOfSNgJ+yPwVhlUI34ZyxiZ+\nzj0dvwJtRv24gfBnnrR1fDnnXL0aC7yeOpbdfdpCmlikwtCZxeuY2V3xmeOzCSuto4HeiUleN2Dl\nsi/choVmQJZ0APA3wjedF5mfRnGzEusvAQwFhgC/AT4jLFkPJQzsScmd2bnUiyXcHjgCWLOU7jjn\nXB3ZML6SphK27dROLSN1mdnVwNUFzh2W73ji/FlU8PhTow7I30KrP6WewHNmdm3ugKRyRr91gWWB\nv5jZR7F+SYO5c8657JWyy7pQvXpUn72q3kTg55JWlbRcvCH/LrCppB0krS3pbOBnZbT5AWGgP07S\n6pJ+TdjglZZvu3sGEYidc841skYdkC8kbFt/E/iUsNZ/LXAvcCdhyXpZ4KpCDUQ27wezacChwD7A\nG4Tl7pOK1WnjmHPOuSrUYpd1e2qYJevkmn585jdfKpoj4ivptHxtxPfbpd7/C/hXqn7nxPmnSS2V\nm9mY9DHnnHPVq+Eu63bRMAOyc865hUtThaEzK7nvvCD4gOycc65Dyi1ZV1KvHvmAXFfWpfWjA+UY\n23aRknyUUTvVyiKvx30ZtJGV6e3dAQDUPV9wufK9yLZVt7F53kB35ZqYQRuQzb+fZTNoI4t/f7dl\n0AaEcA2VWiyjPhTmu6ydc845lzmfITvnnOuQcvmQK6lXj3xAds451yHVKB9yu2nXATmmSlzazPZq\nz34455zreBrtHrLPkJ1zznVIjbbLum6+JkjqLek/kr6QNE3SQ5LWSJxfVVKzpP0lPSfpG0ljJW2d\nKNNJ0vWS3pc0S9I4ScelrnOTpPsknSRpSrzWlZI6J8p0kXShpA8lzZT0gqRtEudXkfSgpM/j+bGS\ndkyc30DSo5JmSPpY0iBJy9Xut+ecc66jq5
sBGVgSuAjYBNiOEPoy3zMrFxCyNv0UeAF4UNIP47lO\nwGRgb2A9QraNcyXtk2pjW2AN4JfAIYSQmIcmzl8F/JyQK3lD4G5gSCIZxdWEDE9bERJT/xmYCSBp\naWA4IcH1JkBvQurFdIQv55xzVchF6ir35UvWbTCze5PvJR0JfCrpx2aWTP58hZndH8v8AdiREA7z\nQjObS8uUV5Mk9SQMrIMTxz8HjomJpd+R9AjQC7hB0iqEwXllM/s4lr9Y0k6EXMr9CbGxByf6NTHR\n9jHASDM7PfVZPpC0lpmNL+sX45xzLi+P1FUjktYmDKY/B5YnzHYNWIWQJCLnxdwPZtYk6VXCbDjX\nTl/CwLkK4an2LsCo1OXeiINxzlTCTJf4386EgTqZpakLMC3+fDlwjaTewDDgHjPLRRXYCNhO0ozU\nNY2Q7LjIgHwW8P3Usd2BPQpXcc65djcaGJM6NrvmV220e8h1MyADDwETgCOBKYQB+Q3CQFgSSQcQ\nlrP7EQbuGYSsTOm8xekQUMb85fulgLmE5ebmVLmZAGZ2g6ShwC7ADsBfJJ1oZlfF+g/G66bTLk4t\n/gnOpLpIXc451x5+Gl9JHwFX1PSqzRXusm72JevCJC0LdAeOMLPn4rGtChTfHHg2lukM9CDMWAF6\nAs+Z2bWJttds1UJxowgz5K65vuRjZh8B1wHXSToPOIpw73kksBcwyczSA7pzzrmMNFU4Q67XJet6\n+ZrwBfBf4HeS1pS0HWGDV748wn0l7SFpHcLmqmVgXkDcd4FNJe0gaW1JZwM/K6cjMXXj7cAgSXtK\nWk3SZpJOjfeRkXRJvMZqkjYhbBLLLatfRQhoe6ekTSWtEXeQ35haAnfOOefmae8BuRMwN97PPYAw\n2x1LGIz/VKDOqfE1mjAj3s3MPo/nrgXuBe4kLFkvSxggy3UoMAi4EBgX29wU+CCe7wxcSRiEH41l\n+gKY2VRCLuZOwGPAa8DFwBep+9bOOeeq4Luss7UCYVaLmQ1n/saqnPS6ggFvmdnm+Rozs28JO66P\nSJ06LVHmsDz1+qXeNxF2WJ2VLhvPH5fveOL8e0D6USvnnHMZ8l3WGZC0DOEZ3m0Iy84lV61Nj5xz\nznU0vss6GzcSloAvNLOHyqjnS77OOecA32WdiUqSSZjZJFovYTeY+4GXKq/+0wHZdGN0Ru3UhfQT\nbg4+yaSVzTmz6jbep9UdpLKtwXVVtxF8lFE79eCbjNrpU0Xd16n1Y09z6UTnCoaFuXU6INdnr5xz\nzrmFTHtv6nLOOecq0swiFeZDrs+hrz575ZxzzrWh0e4h12ev2hBTKKaTUewTUzL2K1TPOedc42iK\nA3L5r7aHPkl9JU2I48qLkgoGmZK0paRnYzrfWZLeknRCuZ+nIWbIMZvSFcDvzWxQhW10js8fO+ec\n6wCamzvT1FzBDLmNOpL2JwSo+h3wMiE/wmOSupvZtDxVviaMQa/Fn7cihFWeaWbXl9qvDjlDTpJ0\nCnAZsH9uMJbURdLlkj6J327+I2nTRJ1tJDVL2lHSq5JmE6JrIWl3SSNivfGSzogxs3N1+0l6TdJM\nSR9IukrSkonzfSR9EUNrvilphqQhkrousF+Kc84tBJqaOjF3bueyX01NbQ59/YBrzWyQmY0DjgZm\nAYfnK2xmo83sX2b2lpl9YGa3EyI1/qKcz9OhB2RJ/0eIwrWLmT2YOPU3YE/gYGBjQsrDx2JAkqTz\ngT8T0je+JukXwC3AJcC6wO8J+/7/mqjTBBwL/Bg4hBDHemCq3SWAk4CDCH8gqxDCcDrnnKtjkhYl\nhHEenjsWwx4PA7YosY2NY9mnyrl2R16y3pmQLLiXmT2VOyhpCcK3mUPM7PF47CjgV4SQmhcl2jg9\nhuzM1T0DON/M/hkPTYrHLgDOATCzyxP1P5B0OnANcEzi+CKE5fOJsd0rgdOr/cDOOefma5rbGeZW\nEDpzbt
El6+UJMS/SD+x/AqxTrKKkycCPYv0BZnZTsfJpHXlAHkP4xZ0taScz+zoeX5PwuZ7PFTSz\nuZJeJsyE5x0GRqTa3AjoKal/4lhnoIuk75nZbEnbE5JbrAv8IF5rsdz5WGdWbjCOphLidrdhKPC9\n1LEN8BzJzrn69iAhpX3SjJpftbmpMxQfXJk7+B6aBt/T8uBX02vVpa2ApQhpggdKGm9m/yq1ckce\nkD8iJHB4ChgqacfEoFyqdPmlgDMI2Z1aiIPxqoS/dVcRlrE/JyxJXw90AXIDcjo8lFFSHO4dgRVL\n7rxzztWHX8dX0ut5jmWrqakT1saArD32Y5E99mtxrHnMaJq326ZQlWmEW5PpfT9dgY+LXStGlAR4\nQ1I3YABQ8oDcoe8hm9lkQoKKboR7xEsC7xEGxC1z5SQtQsiL/EYbTY4E1jGz99OveL4HIDP7k5m9\nbGbjgf/J+GM555wrQdPczsz9rvxXsSVrM/uOsHraK3cs5rLvRWLltQSdgcXK+TwdeYYMgJl9KGkb\nwkz5MWAnwj3dv0n6ApgMnAIsTkhqkZNvxno28FC8DzAYaCYsY29gZqcTNoctKuk4wkx5K8LGL+ec\nc43jYuBmSSOY/9jTEsDNAJLOB1Yysz7x/R+BD4Bxsf42hI29l5Zz0Q4/IAOY2ZQ4KD9JuBHbmzDg\nDgK+D7wK7GBmXyWr5WnncUm7EpatTyHMtMcRlqQxs9cknRjPnQc8Q7ifXNGzz8455ypnzZ2xpgqG\nsTaeQzazuyQtT5ikdQVGA73N7LNYpBuwcqJKJ8JTO6sBcwkrtSebWVmZTzrkgGxmrVLEmNlUwkar\nnBPiK1/9pymQOcrMngCeKHLtywjPPSfdljh/C+HRqWSdBwpdzznnXIXmdmpzU1fBem0ws6uBqwuc\nOyz1/krgyvI70lKHHJCdc845SthlXbBeHfIB2TnnXMfUJJhbwgMs+erVIR+Q68qytN5pX4bRN2fU\nj90yaGNYBm1kkWQ9q4il6RgBHVlWyeurf8BgjRZxeipzGb+rug2A4zkzg1YWzaCNZTNoI6u/r7e0\nXaSgqRn1oYgmwh3bSurVoQ792JNzzjnXKHyG7JxzrmPyGXLjizkwj0u8b5ZUUsiZcso655yrwtwq\nXnWoIQdkSTfFgbFJ0hxJ70o6XVKln7cbMCTLPjrnnKvSXEK0iHJfdTogN/KS9RDgUEK2hp0Iz5PN\nIWRuKouZfZppz5xzzlWvmcqWn5uz7kg2GnKGHM0xs8/MbHKMljKMkK4RSXtLel3S7Lg8fWKxhpLL\n0JIWlXSlpCmSvon1/5yq8iNJ90r6WtI7krLYtuyccy4pdw+53JffQ253swlpFDchZN+4nZDb8Ezg\nHEmHlNjO8cCuhExT3YGDgImpMmcAdxLyJj4K3CZpmWo/gHPOucbVyEvW88Qcxr2By4ETgWFmdl48\nPV7S+sDJlBaTemXgXTPLZf2YnKfMTWZ2V7z2X4HjgM2Axyv/FM4551qodIOW30Ne4HaTNIPwpL4I\n8aYHAM8C96fKPgccL0lm1irpRMrNwBOS3iYksng4xr9OGpv7wcxmSZoOrNB2lwcTklIlbUrIHOmc\nc/VqLCH/cdLsfAWz1WCPPTXygPxv4GjCnropZtYMENJaVs7MRklajbBRbHvgLknDzGzfRLHv0tUo\n6fbAPsAqVfXPOecWvA3jK2kqUFayo/L5gNxhfG1mE/IcfwvYMnVsK+CdEmbHAJjZTOBu4G5JFMSx\n+gAAG2VJREFU9wBDJS1jZl9W1WPnnHOl8wG5w7sIeFlSf8Lmrp5AX8Jsuk2S+hG++o0izHz3A6b6\nYOyccwuYD8gdW1xy3o+QeLo/YXDtb2a3JoulqyV+ngGcAqxF+GN9Bdi5SN1Cx5xzzrl5GnJATieP\nznP+PuC+IufXSL3vnPj5euD6InVbJdo0syzStzjnnEvKReqqpF4dasgB
2Tnn3EKgicqWn33J2jnn\nnMuQ30N2zjnn6oAPyK5mOu8L2qTy+nMHZNSRTzJo4wcZtPFNBm1kdfs+i9/JzzNo46UM2vhlBm1A\nNn2p/s/4eM7MoB9gp5xVdRu6IIu+ZPF3LZvfCVT/O3Gl8wHZOedcx+QzZOecc64ONFgs66qyPUm6\nKaYmbJI0R9K7kk6XVHG7klaNbf6kmr4555xrcA2WfjGLGfIQ4FDge4T4zlcDc4ALym1IUi4RhAfS\ncM45V1yDLVlnkQ95jpl9ZmaTzew6YBiwO4CkvSW9Lmm2pAmSTkxWjMf6S7pF0peESOTvx9Oj40z5\n37Hsk5IuTtW/T9KNiffdJD0iaZak8ZL2i9c4Lp5vNfuWtHQ8tnXi2AaSHpU0Q9LHkgZJWi5xfh9J\nr8XrTJP0uKTFE+ePlPSmpG/if/9Q9W/ZOedcS7nAIOW+GnHJuoDZQBdJmxBiRd8ObEDY9neOpENS\n5U8CRgMbE8JZbkaYJW8HdAP2KuPat8Y6WxNSJ/0B+FGqTNHZt6SlgeHACGATQh7lFYBcfuNu8TNd\nD6wLbAPcG/uMpIMIaR7/Es//FThb0sFlfA7nnHPtSFLfOKH7RtKLkgrmwZW0Z5yYfSrpK0nPS9qh\n3GtmuqlL0vaEAexy4ERgmJmdF0+Pl7Q+cDIwKFFtuJldkmijOf74uZl9Wsa11wV6AT3MbFQ8diTw\nbrpoG00dA4w0s9MTbR8JfCBpLeD7QGfgPjObHIu8kag/ADjJzB6I7yfFz3004QuDc865LNQoUpek\n/QmJiH4HvAz0Ax6T1N3MpuWpsjXwOGEi9iVwOPCQpM3MbEyp3cpiQN5N0gwgd//3NsKg9Cxwf6rs\nc8DxkpRIdTgigz4AdAe+yw3GAGb2nqQvymxnI2C7+JmSDFgTeIKQa/l1SY8R/hAGm9mXkpaIZW6Q\nlIx33Znwh+Sccy4rtbuH3A+41swGAUg6GtiFMNC22h9lZv1Sh06TtDuwG7BAB+R/E2Z/3wFTzKwZ\nQGprIjrP1yWWa6b17HbRUi+SaINUO+k2lgIeJGR0Sl9vavx8v5K0BbADcCxwrqTNmB/l4EjCt6qk\ntv8KNPUDLd3ymA6ETge2WdU559rPWOD11LHZtb9sDQbkuLm4B5Bb3cXMTNIwYItSmlcYAL8PfF5O\nt7IYkL82swl5jr8FbJk6thXwTmJ2nM+38b/prEmfASvm3sRHqzYgfCEAeBtYRNLGiSXrtYAfptog\ntpP71rIxLe8rjyTct56U+3KRj5m9ALwg6RxgErCnmV0qaQqwppndWeQz5tf5kuoidTnnXLvYML6S\nphL26dZQbWbIyxPGn3TItE+AdUq8wsnAksS9R6WqZWCQi4CXJfUnbO7qCfQlzKaL+ZQw09xR0kfA\nbDObThh4L5K0M/Ae4R71MrlKZva2pOHAP+Ku5rnAhcAs4oBrZrMlvQicKmki0BU4J3X9qwgz3Dsl\nXUD4hrM2sD9wBPAzwr3qx2NfNyf8Ab4Z658JXCZpOjAUWAzYFFjGzC4t4ffmnHOuFKWkX3zrDhh3\nR8tjc76qVY+Q9BvgdODXBe43F1SzAdnMRknaj7Bzuj/h61J/M0tubGo1UzazJknHAmfEuv8h7Li+\nEfgJcAvhj+ES5s+Ocw4GbgCeBj4m7HBen5ZrJ4cTdki/SphVn0IYXHPXnyppS2Ag8BhhQJ0EDI3L\nFtMJN/CPJwRsngScaGaPx/o3SPo6tnsBYUl+LOCDsXPOLWjrHRheSZ+MhFt7FKoxjTCH7po63pUw\nrhQk6QDCssA+ZvZkuV2takA2s8PaOH8fcF+R82sUOH4jYQBOHptL2AF9TJH2PgF2zb2X9P8IjyyN\nT5QZR1g6T2qxPG5m7xEem8p3jXGEACgFxeXq8pesnXPOla4Gu6zN7DtJIwgroQ/CvHvCvQhPEOUl\n6UDCZG9/MxtaQa8aK5a1pG0Jm7LG
AisRZqjvA8+0Z7+cc87VQO12WV8M3BwH5txjT0sANwNIOh9Y\nycz6xPe/ieeOA16RlJtdfxNvuZakoQZkwo7p84DVgRmEx6wONLM6DZTmnHOuYjUakM3sLknLE26b\ndiUEr+ptZrmNwd2AlRNVjiKstF4VXzm3EG6TlqShBuR4Hze91c8551wjKmVTV6F6bTCzqwm5GfKd\nOyz1ftsKetFKQw3IHV7T47QOLNYe1sqgjfFtF+lQls2gjSz+bH+QQRuvZdAGwP9k0MakDNo4LYM2\nQBdU/7u1M06qvh9nP9B2oTad13aRkpQb6iFpAQwvNYrU1V5qEcvaOeecc2XyGbJzzrmOydMvLjzy\npWt0zjlXJ3IDcrkvH5ArI+mmOCg2SfpW0vuSBkpabAFc/gPCbrp0kFbnnHPtrcHyIXeUJeshwKFA\nF0LQ70GERBF/qeVFY8ztklNAOuecW4CaqWy2WzBLQfuq+xlyNMfMPjOzj8zsQUIKxF8BSPplnEHP\n2yIpaaN4bJX4fhVJD0r6XNJMSWMl7RjPLSPptphYepaktyXlHvZusWQtqZOk6+MsfZakcZKOS3Y0\nzujvk3SSpCmSpkm6UlI6WYZzzjk3T0eZIc8jaQNCFqmJ8ZCRJyZ26tjVhM+6FSHZxI+BmfHc/wLr\nAr2B/xKe+Vm8QDudgMnA3oSkEz2B6yRNMbPBiXLbAlOAX8b27gJGEeJsO+ecy0LunnAl9epQRxmQ\nd5M0g9DfxQiLFH8so/7KwGAzy2Vkmpg6NyqXspFw3zhpXk7kGE/7rMS5SZJ6AvsByQH5c+CYuOT9\njqRHCHFQfUB2zrmsNNgu644yIP+bkLZxKUJM0blmdn8Z9S8HrpHUGxgG3GNmY+O5a4B7JPUgZH26\nP+Y6zktSX+AwYBXCTLoLYfab9EYq5/NUQu7mNgwihEtN6knrtNLOOVdPxhBSCCTNzlcwWzWM1NUe\nOsqA/LWZTQCQdAQwRtJhZnYT82/PK1G+RXiZmBJxKLALsAMhH/JJZnaVmQ2N95p3JtyXHi7pSjM7\nJd2JmFrrb4QvBS8S4mWfAmyWKpr+K2KUdL/+EEIYbuec60g2iq+kKRSIPJkd39TVvuLM8zzg3Pjo\n02eEwXjFRLGN89T7yMyuM7N9CJk8jkqc+6+Z3WpmhwAnAL8rcPmewHNmdq2ZjTGz94E1M/lgzjnn\nyuPPIdeFuwm/0r6EoMmTgQGS1pK0C3BisrCkSyTtIGk1SZsQNl29Gc+dJenXktaUtD4hn/Kb5Pcu\nsGlsa21JZwM/q8kndM45t1DpkANyTKd4JWG5eFHgQMJO6THAybSONt85ln8TeBQYRxjMAb4lzLjH\nAE8Rvj8dmLxc4udrgXuBOwlL1svSMtWWc865BaWS2XGlO7MXgLq/h5xOc5U4PhAYGN8+D/w0VaRz\nouxxFGBm5wLnFjg3KdXOt8AR8ZV0WqJMq/6aWb9C13fOOVch39TlnHPO1YEG29TlA7JzzrmOyZ9D\ndvVrrYzaGZdBG9/PoI0skmxVsp6VTxbPgr+VQRsfZdBGVv/sx2fQxqJtF2nTgxm0ATC96hZ09mVV\nt2H77l59P+4+s+o2qrcA1oUbLFJXh9zU5ZxzzjUanyE755zrmBpsU5fPkEskaUI6s1MWZZ1zzlUo\nt6mr3Fedbuqq6wFZ0vKSrpE0SdJsSVMlDZG0RUbttxo4JfWR9EWe4psC12VxXeeccxlosEhd9b5k\nfS+hjwcDE4CuhKxJy9XwmiJPOkcz+28Nr+mcc65cDbbLum5nyJKWJuQv/rOZPWNmk83sVTMbaGYP\n58pIulbSx5K+kfSapJ0Tbewt6fU4u54g6cTEuSeBVYFLJDVLapK0DXAjsHTi2BmxfIvZtKQBiZn7\nh5IuTX2EJSXdIGl6LHcUzjnnspO7h1zuy+8hl21mfO0hqUv6pCQBQ4EtgN8A6xHCZjbF8z2AfwG3\n
E1IfngmcI+mQ2MRewIfA6UA3QnKK5wjJJaYTZuMrAhfmufY+sdxRhGeN9qB17rETgVcIEcSuJqR/\nXLv8X4NzzrmFQd0uWZtZk6Q+wD+AP0gaCTwN3BlzGf+KcF93XTN7L1abmGiiHzDMzM6L78fH5BEn\nA4PM7AtJTcBMM/s0V0nSV+Hy9lmR7q1MyHE8PMbV/hB4NVXmETP7e/x5oKR+hKQW75bxa3DOOVdI\nEy0T75ZTrw7V7YAMYGb3SXoE+AWwObATcHJc/l0B+DAxGKetB9yfOvYccLwkxTSOlbqbMEOeEPMs\nPwo8FAfnnPSM+ePY5yIGAUukjvUkm6AUzjlXK2OB11PHZtf+spUOrD4gVyYmdBgeX+dK+gdwFnmW\nkhdgnz6U1B3YnjBTv4rwRWHrxKCcfjrOaPMWwSHA6tl21jnnam7D+EqaSs0fTGkizxbcEpTw2JOk\nvsCfCLc0xwDHmtkrBcp2Ay4irNquBVxmZifmK1tMPd9DLuQtwjRyDLCypELxIt+i9dRyK+CdxOz4\nWxLZnIoca8XM5pjZI2Z2AmEpegta/410zjlXKzXa1CVpf8IAeyawMWG8eUzS8gWqLAZ8CpwDjK70\n49TtgCxpWUnDJR0kaUNJq0nal3AP+H4z+w/wDHCPpO3j+R0l9Y5NXAT0ktRf0trxfnRf4G+Jy0wE\ntpa0kqTlEseWkrSdpOUkLZ6nb30kHS5pfUmrEx7LmgVMqsGvwjnn3ILVD7jWzAaZ2TjgaML/4w/P\nV9jMJplZPzP7J1UERa/bAZmww/pFwr3apwk3Kc4CrgWOjWX2Iuxkvh14g5AfuROAmY0C9gP2j3UH\nAP3N7NbENc4AVgPeI3y7wcxeAP5O2KH9KeELALRcGPmSsMP6WcI3p+2AXc3sizxlKXLMOedcpSqJ\n0pV7FSBpUaAH4TYpEHb5AsMIK6E1U7f3kOO949Piq1CZL4Eji5y/D7ivyPmXCMsR6eN9CbPp5LE1\nEj8/ADxQpN018hzbpFB555xzFcp+qrM84bblJ6njnwDrZH61hLodkJ1zzrnq3RFfSV+1R0fa5AOy\nc865BnZgfCWNJKxK5zWNsKjdNXW8K+Hx1ZrxAbmurEi4pV2pp7LpBntm0MZDGbTxVgZtrJpBGwAv\nZ9DGshm08U31Tezz5+rbABg8IoNGns6gjYxsMKD6Nl4/t+omdPdlVbexgWVzq/N1PV5F7Y45vJjZ\nd5JGEPImPAjzIkP2Ai6v5bU75m/MOeecq52LgZvjwPwyYdf1EsDNAJLOB1Yysz65CpI2IsQNWwr4\nUXz/rZmVPLPwAdk551wHlXsQuZJ6hZnZXfGZ47MJS9Wjgd6JkMrdCCGUk0Yxf4vZJoQcC5OAVpt8\nC/EBuQQxM9SoUiKvlFPWOedcNXIJjiupV5yZXU1IDJTv3GF5jlX9GHFdD8jxG8o5wM6EbylfEL6p\nnB2fF15Q9qSyr2HOOedqpjYz5PZS1wMycC+hjwcDEwiDci9guWKVshafd3bOOVdXmqhscK3P7BJ1\nG6lL0tKE2NN/NrNnzGyymb1qZgPN7OFYplnS0ZIelTRL0nuS9k6183+S3pb0dTx/tqTOifNnShol\n6beSJkj6UtIdkpZMlHlS0sWJ93+U9I6kbyR9LOmuVPc7SRoo6b+Spko6sya/JOecW6jVKJh1O6nb\nAZkQOnMmsIekLkXKnU1Ih/gT4DbgTknJaCrTCWmU1gOOI0T26pdqY01gd8LS+C7ANsCp+S4maVPg\nMqA/0B3oTYipndQn9n0z4BTgDEm9inwG55xzC7m6HZBjGsM+8fWlpGclnSspnVHpLjO7yczGm9kZ\nwKvMj3WNmZ1nZi+Z2Qdm9ggh6cR+qTYE9DGzt8zsOeBWwtJ4PisTBttH4qx9jJldmSrzmpmdY2bv\nxdjZrxZpzznnXEUaa4Zc1/eQzew+SY8AvwA2B3YCTpF0hJkNis
VeTFV7Adgo9yam0TqWMAteivCZ\n03HTJprZrMT7qcAKBbr1BGEr+wRJQ4GhwH1mlozY8FqqTrH2Ei6NXUzaIb6cc65ejSHk8EmavQCu\n21j3kOt6QIZ5SSaGx9e5kv5ByPo0qGhFQNIWwD+B04HHCQPxgUD6kaT0Nj2jwOqBmc2UtAnwS8JI\neRYwQNKmZpZLu1Vyey2dAKzbdjHnnKsrG5GYB0VTKPDUUIYaa5d13S5ZF/EWsGTi/eap85szP+bi\nFoTZ7/+Z2Ugze4/qYlMCYGbNZvZvMzuV8LdwNUIKRueccwtMboZc7stnyGWRtCxhs9aNhCXgGcDP\nCPmJ708U3TeGN3sW+G0sk3to+11glbhs/QqwK7BHlf3ahRB55RnCc9G7EO5Bj6umXeecc+VqrBly\n3Q7IhI1TLxLWcdcEFgUmA9cC5yfKnQkcAFxFuFd7gJm9DWBmD0m6BLgCWAx4hLAre0CZfUlm3PwS\n2Cte93uEQf8AMxuXp6xzzjlXkrodkOO949Piq5gpZta7SDun0voRpssT588i3AdO1rmM8GhT7v12\niZ+fA7Ytcr1WS9dmlkX6JOeccy3ULnRme6jbAdk555wrzpes64kvDzvn3ELLH3uqG2bWue1Szjnn\nGpPPkF3NPE/IoVGh5Qdk041pGbVTtW/aLtKmdLCC9vRJe3cgGDygvXuQ8IcM2rgmgzaA10dm007V\nPq+6hdf1SAb9gFtbbq8py0RCAAhXOh+QnXPOdVCNtWSdaWAQSdtIapL0gyzbdc4551prrFjWJQ/I\nkh6UNKTAuV9Iagb+C6yYCCFZSrsTJB1XannnnHMuWHgjdd0ADJa0kplNSZ07DHjFzF7PrmvOOedc\nMY21qaucJeuHgWnAocmDkpYE9gGuj0vWzckla0lbSXpG0ixJkyRdJmnxeO5JYFXgklivKR4/VNIX\nknaQ9KakGZKGSOqaaHdTSY9L+kzSl5KekrRxqm/Nkn4n6SFJX8e2Npe0pqQnJc2U9Jyk1VP1dpc0\nQtI3ksZLOkNS58T5AfGzzJb0oaRLE+e6SLowHp8p6QVJ25Txe3bOOVeShXTJOuYnHkRqQCbkFu4E\n3JkrmjshaU1gCCEm9QbA/sCWQC5/8F7Ah4TNeN2AFRNtLAGcBBxESL+4CnBh4rrfB24GegI/B94B\nHo1fEJL6x3IbEZJO3A78HTgX6EGIQz0vn7GkXwC3AJcQUi/9npCT+a/x/D6EcJ5HAWsRYmMnt/Je\nFfuzH7Bh/OxD4u/COeecy6vcTV03AmtJ2jpx7FBgsJnNyFP+VOCfZnaFmb1vZrnY1H0kdTGzLwiL\n+TPN7FMz+zRRdxHg92Y2ysxGEwbNXrmTZvakmd1uZu/G2NVHEwbx9Gz0RjO7x8zGAxcQMjP908yG\nxXqXEVIp5pwBnG9m/zSzSWY2PB47Op5fmRAze7iZfWhmr5rZDQCSVo6/j33N7Hkzm2BmFwPPMT/h\nhXPOuUxUcv+40nCbtVfWY09m9rak54HDgWckrUWYvfYvUGUjYENJv00cU/zv6sDbRS43y8wmJt5P\nBVaY14i0AmGWu0083hlYnDCTTkrOXnMPgr6eOvY9SUuZ2czY556Skp+pM9BF0vcIM94TgAmShgKP\nAg/FFYQNY9l3JClRvwthub8NQwn5KpI2iM0651x9eiG+kmYtkCs31j3kSp5DvgG4XFJfwqxvvJn9\np0DZpQjZmS5j/kCc80Eb10n/li3VxiDgh8Cxsa05hOxQXYq0Y0WO5VYLliLMiO9Nd8jMZgMfSuoO\nbA/8ipCB+0/xPvFShD/pTYDmVPWZrT5hKzsyf9XeOec6hi3iK2kiCyIwSGM9h1zJgHwXcCnh3u7B\nhHumhYwEfmxmxcJPfUuYVZarJ/AHM3sM5i0XL19CvbbiX48E1jGz9ws2YDaHkMrxEUlXE3IhbwiM\nInyWrjErlHPOuZpprBly2Y
FBzOxrwqB8PmEj1i2pIslZ7EDC8u8VkjaStFbcwXxFosxEYGtJK0la\nroyuvAscLGldST8H/klpqyTpmXr62NnAIXFn9Y9j+/tLOgdAUh9Jh0taP+7OPjhed5KZvUvYNDZI\n0p6SVpO0maRTJe1UxmcrIIMwkHPuqL6NTMJR1ksbWbXTSG1k1U4WbdyXQRv18lmyaqc+2kgvUbeP\nxnoOudJIXTcAywBDzezj1Ll5M1AzG0u4x7s28Axh9jkA+ChR/gzCRqv3gOSmrrYcTliyHkH4UnBZ\nnvr5ZsNFj5nZ48CuhOXolwl/704gfHEA+JKww/pZYAywHbBr3KAGYVPXIMKO8HGEpe9NaXuJvgQZ\nPOadyYCcxePm9dJGVu00UhtZtZNFG1kMyPXyWbJqpz7aqI8BuXYk9Y2Bq76R9KKkn7VR/pfxcdnZ\nkt6R1Kfca1YUyzrulm61zGxmT6ePm9kIws3RQm29BGycOnYLqZm3mT2QbNvMxhAeL0q6N1Un3ZdJ\nefqXr89PAE8U6O8DwANFPk8TcFZ8Oeecq5naLFlL2h+4CPgdYWLWD3hMUncza7VBV9JqhFgdVwO/\nIewxul7SlDielCTTWNbOOefcglOzJet+wLVmNsjMxhEee51FWJnN5w/A+2Z2ipm9bWZXAYNjOyXz\nAdk551wHlX2kLkmLEoJGDc8dMzMDhtF6M3nO5vF80mNFyufl6RfrQ3z4uK1HlWcTHscuYG4J+Vyb\nvyqhXJFrlNKPktRLG/XUl3ppY0H25bU2zk8voUxH+ftaT30prY2JRc7NauN8IuFBOrhChj6msh3T\nRf9fuzzhNmY6gfknwDoF6nQrUP4HkhaLT+a0SWHgd+1J0m+A29q7H845VwMHmdntWTYoaRVCKOQl\nqmhmDtDdzFpsuJW0ImHj8RZxj1Pu+EBgazNrNeuV9DYhKuTAxLGdCPeVlyh1QPYZcn14jPBc90TC\nV1fnnOvovkd4guaxrBs2sw8krUdpsScKmZYejHPHCTeZu6aOdyVMyfP5uED56aUOxuADcl0ws/8S\nnl92zrlG8nytGo6DaQaPk7Zq9ztJIwi5Ex4EiKGQewGXF6j2ApCONbEDZT4d5pu6nHPOuZYuBo6S\ndIikdQkZApcgZA5E0vmSko/m/h1YQ9JASetI+iMhLfHF5VzUZ8jOOedcgpndJWl5QuTGrsBooLeZ\nfRaLdCNk/suVnyhpF0La3uMIaYWPMLP0zuuifFOXc845Vwd8ydo555yrAz4gO+ecc3XAB2TnnHOu\nDviA7JxzztUBH5Cdc865OuADsnPOOVcHfEB2zjnn6oAPyM4551wd8AHZOeecqwM+IDvnnHN1wAdk\n55xzrg78fziXG3mmK1+9AAAAAElFTkSuQmCC\n",
"text/plain": [
"<matplotlib.figure.Figure at 0x1105fd630>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"# Keep track of correct guesses in a confusion matrix\n",
"confusion = torch.zeros(n_categories, n_categories)\n",
"n_confusion = 10000\n",
"\n",
"# Just return an output given a line\n",
"def evaluate(line_tensor):\n",
" hidden = rnn.init_hidden()\n",
" \n",
" for i in range(line_tensor.size()[0]):\n",
" output, hidden = rnn(line_tensor[i], hidden)\n",
" \n",
" return output\n",
"\n",
"# Go through a bunch of examples and record which are correctly guessed\n",
"for i in range(n_confusion):\n",
" category, line, category_tensor, line_tensor = random_training_pair()\n",
" output = evaluate(line_tensor)\n",
" guess, guess_i = category_from_output(output)\n",
" category_i = all_categories.index(category)\n",
" confusion[category_i][guess_i] += 1\n",
"\n",
"# Normalize by dividing every row by its sum\n",
"for i in range(n_categories):\n",
" confusion[i] = confusion[i] / confusion[i].sum()\n",
"\n",
"# Set up plot\n",
"fig = plt.figure()\n",
"ax = fig.add_subplot(111)\n",
"cax = ax.matshow(confusion.numpy())\n",
"fig.colorbar(cax)\n",
"\n",
"# Set up axes\n",
"ax.set_xticklabels([''] + all_categories, rotation=90)\n",
"ax.set_yticklabels([''] + all_categories)\n",
"\n",
"# Force label at every tick\n",
"ax.xaxis.set_major_locator(ticker.MultipleLocator(1))\n",
"ax.yaxis.set_major_locator(ticker.MultipleLocator(1))\n",
"\n",
"plt.show()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"You can pick out bright spots off the main axis that show which languages it guesses incorrectly, e.g. Chinese for Korean, and Spanish for Italian. It seems to do very well with Greek, and very poorly with English (perhaps because of overlap with other languages)."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Running on User Input"
]
},
{
"cell_type": "code",
"execution_count": 20,
"metadata": {
"collapsed": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"> Dovesky\n",
"(-0.87) Czech\n",
"(-0.88) Russian\n",
"(-2.44) Polish\n",
"\n",
"> Jackson\n",
"(-0.74) Scottish\n",
"(-2.03) English\n",
"(-2.21) Polish\n",
"\n",
"> Satoshi\n",
"(-0.77) Arabic\n",
"(-1.35) Japanese\n",
"(-1.81) Polish\n"
]
}
],
"source": [
"def predict(input_line, n_predictions=3):\n",
" print('\\n> %s' % input_line)\n",
" output = evaluate(Variable(line_to_tensor(input_line)))\n",
"\n",
" # Get top N categories\n",
" topv, topi = output.data.topk(n_predictions, 1, True)\n",
" predictions = []\n",
"\n",
" for i in range(n_predictions):\n",
" value = topv[0][i]\n",
" category_index = topi[0][i]\n",
" print('(%.2f) %s' % (value, all_categories[category_index]))\n",
" predictions.append([value, all_categories[category_index]])\n",
"\n",
"predict('Dovesky')\n",
"predict('Jackson')\n",
"predict('Satoshi')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The final versions of the scripts [in the Practical PyTorch repo](https://github.com/spro/practical-pytorch/tree/master/char-rnn-classification) split the above code into a few files:\n",
"\n",
"* `data.py` (loads files)\n",
"* `model.py` (defines the RNN)\n",
"* `train.py` (runs training)\n",
"* `predict.py` (runs `predict()` with command line arguments)\n",
"* `server.py` (serve prediction as a JSON API with bottle.py)\n",
"\n",
"Run `train.py` to train and save the network.\n",
"\n",
"Run `predict.py` with a name to view predictions: \n",
"\n",
"```\n",
"$ python predict.py Hazaki\n",
"(-0.42) Japanese\n",
"(-1.39) Polish\n",
"(-3.51) Czech\n",
"```\n",
"\n",
"Run `server.py` and visit http://localhost:5533/Yourname to get JSON output of predictions."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Exercises\n",
"\n",
"* Try with a different dataset of line -> category, for example:\n",
" * Any word -> language\n",
" * First name -> gender\n",
" * Character name -> writer\n",
" * Page title -> blog or subreddit\n",
"* Get better results with a bigger and/or better shaped network\n",
" * Add more linear layers\n",
" * Try the `nn.LSTM` and `nn.GRU` layers\n",
" * Combine multiple of these RNNs as a higher level network"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"**Next**: [Generating Shakespeare with a Character-Level RNN](https://github.com/spro/practical-pytorch/blob/master/char-rnn-generation/char-rnn-generation.ipynb)"
]
}
],
"metadata": {
"anaconda-cloud": {},
"celltoolbar": "Raw Cell Format",
"kernelspec": {
"display_name": "Python [conda root]",
"language": "python",
"name": "conda-root-py"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.5.2"
}
},
"nbformat": 4,
"nbformat_minor": 1
}
================================================
FILE: char-rnn-classification/data.py
================================================
import torch
import glob
import unicodedata
import string

all_letters = string.ascii_letters + " .,;'-"
n_letters = len(all_letters)

def findFiles(path): return glob.glob(path)

# Turn a Unicode string to plain ASCII, thanks to http://stackoverflow.com/a/518232/2809427
def unicodeToAscii(s):
    return ''.join(
        c for c in unicodedata.normalize('NFD', s)
        if unicodedata.category(c) != 'Mn'
        and c in all_letters
    )

# Read a file and split into lines
def readLines(filename):
    lines = open(filename).read().strip().split('\n')
    return [unicodeToAscii(line) for line in lines]

# Build the category_lines dictionary, a list of lines per category
category_lines = {}
all_categories = []
for filename in findFiles('../data/names/*.txt'):
    category = filename.split('/')[-1].split('.')[0]
    all_categories.append(category)
    lines = readLines(filename)
    category_lines[category] = lines

n_categories = len(all_categories)

# Find letter index from all_letters, e.g. "a" = 0
def letterToIndex(letter):
    return all_letters.find(letter)

# Turn a line into a <line_length x 1 x n_letters> tensor,
# or an array of one-hot letter vectors
def lineToTensor(line):
    tensor = torch.zeros(len(line), 1, n_letters)
    for li, letter in enumerate(line):
        tensor[li][0][letterToIndex(letter)] = 1
    return tensor
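# A quick sanity check of the helpers above (illustrative; runs only when
# this file is executed directly, and assumes ../data/names/*.txt exists
# as in this repo):
if __name__ == '__main__':
    print(n_categories)                  # one category per name file
    print(unicodeToAscii('Ślusàrski'))   # strips accents: Slusarski
    print(letterToIndex('a'))            # 0
    print(lineToTensor('Jones').size())  # torch.Size([5, 1, 58])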
================================================
FILE: char-rnn-classification/model.py
================================================
import torch
import torch.nn as nn
from torch.autograd import Variable

class RNN(nn.Module):
    def __init__(self, input_size, hidden_size, output_size):
        super(RNN, self).__init__()
        self.hidden_size = hidden_size
        self.i2h = nn.Linear(input_size + hidden_size, hidden_size)
        self.i2o = nn.Linear(input_size + hidden_size, output_size)
        self.softmax = nn.LogSoftmax()

    def forward(self, input, hidden):
        combined = torch.cat((input, hidden), 1)
        hidden = self.i2h(combined)
        output = self.i2o(combined)
        output = self.softmax(output)
        return output, hidden

    def initHidden(self):
        return Variable(torch.zeros(1, self.hidden_size))
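A single step of this module is just a concatenation followed by two linear maps. As a rough sketch with plain Python lists (the tiny weights below are made up for illustration and carry no trained meaning):

```python
def linear(weights, bias, vec):
    # y[j] = sum_k weights[j][k] * vec[k] + bias[j]
    return [sum(w * v for w, v in zip(row, vec)) + b
            for row, b in zip(weights, bias)]

def rnn_step(inp, hidden, W_i2h, b_i2h, W_i2o, b_i2o):
    combined = inp + hidden  # list concatenation, like torch.cat((input, hidden), 1)
    # Same order as the module's forward: (output, new hidden)
    return linear(W_i2o, b_i2o, combined), linear(W_i2h, b_i2h, combined)

# Hypothetical sizes: 2-dim input, 1-dim hidden, 2-dim output
W_i2h = [[0.5, 0.0, 1.0]]
b_i2h = [0.0]
W_i2o = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
b_i2o = [0.0, 0.0]

out, hid = rnn_step([1.0, 0.0], [0.2], W_i2h, b_i2h, W_i2o, b_i2o)
```

The real module also applies `LogSoftmax` to the output and learns the weights by backpropagation; the sketch only shows the data flow of one recurrence step.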
================================================
FILE: char-rnn-classification/predict.py
================================================
from model import *
from data import *
import sys

rnn = torch.load('char-rnn-classification.pt')

# Just return an output given a line
def evaluate(line_tensor):
    hidden = rnn.initHidden()
    for i in range(line_tensor.size()[0]):
        output, hidden = rnn(line_tensor[i], hidden)
    return output

def predict(line, n_predictions=3):
    output = evaluate(Variable(lineToTensor(line)))

    # Get top N categories
    topv, topi = output.data.topk(n_predictions, 1, True)
    predictions = []

    for i in range(n_predictions):
        value = topv[0][i]
        category_index = topi[0][i]
        print('(%.2f) %s' % (value, all_categories[category_index]))
        predictions.append([value, all_categories[category_index]])

    return predictions

if __name__ == '__main__':
    predict(sys.argv[1])
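`predict` relies on `topk` to pick out the N most likely categories. The same selection can be sketched in plain Python (the scores below are hypothetical log-probabilities, not real model output):

```python
def top_k(scores, k):
    # (value, index) pairs for the k largest scores, best first
    return sorted(((v, i) for i, v in enumerate(scores)),
                  reverse=True)[:k]

best = top_k([-1.2, -0.3, -2.5, -0.8], 2)
```

Each returned index is then mapped through `all_categories` to recover the language name, just as the script does.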
================================================
FILE: char-rnn-classification/server.py
================================================
from bottle import route, run
from predict import *

@route('/<input_line>')
def index(input_line):
    return {'result': predict(input_line, 10)}

run(host='localhost', port=5533)
================================================
FILE: char-rnn-classification/train.py
================================================
import torch
from data import *
from model import *
import random
import time
import math

n_hidden = 128
n_epochs = 100000
print_every = 5000
plot_every = 1000
learning_rate = 0.005 # If you set this too high, it might explode. If too low, it might not learn

def categoryFromOutput(output):
    top_n, top_i = output.data.topk(1) # Tensor out of Variable with .data
    category_i = top_i[0][0]
    return all_categories[category_i], category_i

def randomChoice(l):
    return l[random.randint(0, len(l) - 1)]

def randomTrainingPair():
    category = randomChoice(all_categories)
    line = randomChoice(category_lines[category])
    category_tensor = Variable(torch.LongTensor([all_categories.index(category)]))
    line_tensor = Variable(lineToTensor(line))
    return category, line, category_tensor, line_tensor

rnn = RNN(n_letters, n_hidden, n_categories)
optimizer = torch.optim.SGD(rnn.parameters(), lr=learning_rate)
criterion = nn.NLLLoss()

def train(category_tensor, line_tensor):
    hidden = rnn.initHidden()
    optimizer.zero_grad()

    for i in range(line_tensor.size()[0]):
        output, hidden = rnn(line_tensor[i], hidden)

    loss = criterion(output, category_tensor)
    loss.backward()
    optimizer.step()

    return output, loss.data[0]

# Keep track of losses for plotting
current_loss = 0
all_losses = []

def timeSince(since):
    now = time.time()
    s = now - since
    m = math.floor(s / 60)
    s -= m * 60
    return '%dm %ds' % (m, s)

start = time.time()

for epoch in range(1, n_epochs + 1):
    category, line, category_tensor, line_tensor = randomTrainingPair()
    output, loss = train(category_tensor, line_tensor)
    current_loss += loss

    # Print epoch number, loss, name and guess
    if epoch % print_every == 0:
        guess, guess_i = categoryFromOutput(output)
        correct = '✓' if guess == category else '✗ (%s)' % category
        print('%d %d%% (%s) %.4f %s / %s %s' % (epoch, epoch / n_epochs * 100, timeSince(start), loss, line, guess, correct))

    # Add current loss avg to list of losses
    if epoch % plot_every == 0:
        all_losses.append(current_loss / plot_every)
        current_loss = 0

torch.save(rnn, 'char-rnn-classification.pt')
================================================
FILE: char-rnn-generation/README.md
================================================
# Practical PyTorch: Generating Shakespeare with a Character-Level RNN
## Dataset
Download [this Shakespeare dataset](https://raw.githubusercontent.com/karpathy/char-rnn/master/data/tinyshakespeare/input.txt) (from [Andrej Karpathy's char-rnn](https://github.com/karpathy/char-rnn)) and save it as `shakespeare.txt`.
## Jupyter Notebook
The [Jupyter Notebook version of the tutorial](https://github.com/spro/practical-pytorch/blob/master/char-rnn-generation/char-rnn-generation.ipynb) describes the model and steps in detail.
## Python scripts
Run `train.py` with a filename to train and save the network:
```
> python train.py shakespeare.txt
Training for 2000 epochs...
(10 minutes later)
Saved as shakespeare.pt
```
After training the model will be saved as `[filename].pt` — now run `generate.py` with that filename to generate some new text:
```
> python generate.py shakespeare.pt --prime_str "Where"
Where, you, and if to our with his drid's
Weasteria nobrand this by then.
AUTENES:
It his zersit at he
```
### Training options
```
Usage: train.py [filename] [options]

Options:
--n_epochs         Number of epochs to train
--print_every      Log learning rate at this interval
--hidden_size      Hidden size of GRU
--n_layers         Number of GRU layers
--learning_rate    Learning rate
--chunk_len        Length of chunks to train on at a time
### Generation options
```
Usage: generate.py [filename] [options]

Options:
-p, --prime_str      String to prime generation with
-l, --predict_len    Length of prediction
-t, --temperature    Temperature (higher is more chaotic)
```
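The `--temperature` option controls how the next character is sampled: logits are divided by the temperature before the softmax, so values below 1 sharpen the distribution (more conservative text) and values above 1 flatten it. A minimal sketch of that sampling step in pure Python (the function name is illustrative, not the script's actual code):

```python
import math
import random

def sample_with_temperature(logits, temperature=0.8):
    # Scale logits, apply a numerically stable softmax,
    # then draw an index from the resulting distribution.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    r = random.random()
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return i
    return len(probs) - 1
```

At a very low temperature this approaches greedy argmax decoding; at a high temperature it approaches uniform sampling, which is why the option is described above as "more chaotic".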
================================================
FILE: char-rnn-generation/char-rnn-generation.ipynb
================================================
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"\n",
"\n",
"# Practical PyTorch: Generating Shakespeare with a Character-Level RNN\n",
"\n",
"[In the RNN classification tutorial](https://github.com/spro/practical-pytorch/blob/master/char-rnn-classification/char-rnn-classification.ipynb) we used an RNN to classify text one character at a time. This time we'll generate text one character at a time.\n",
"\n",
"```\n",
"> python generate.py -n 500\n",
"\n",
"PAOLTREDN:\n",
"Let, yil exter shis owrach we so sain, fleas,\n",
"Be wast the shall deas, puty sonse my sheete.\n",
"\n",
"BAUFIO:\n",
"Sirh carrow out with the knonuot my comest sifard queences\n",
"O all a man unterd.\n",
"\n",
"PROMENSJO:\n",
"Ay, I to Heron, I sack, againous; bepear, Butch,\n",
"An as shalp will of that seal think.\n",
"\n",
"NUKINUS:\n",
"And house it to thee word off hee:\n",
"And thou charrota the son hange of that shall denthand\n",
"For the say hor you are of I folles muth me?\n",
"```\n",
"\n",
"This one might make you question the series title — \"is that really practical?\" However, these sorts of generative models form the basis of machine translation, image captioning, question answering and more. See the [Sequence to Sequence Translation tutorial](https://github.com/spro/practical-pytorch/blob/master/seq2seq-translation/seq2seq-translation.ipynb) for more on that topic."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Recommended Reading\n",
"\n",
"I assume you have at least installed PyTorch, know Python, and understand Tensors:\n",
"\n",
"* http://pytorch.org/ For installation instructions\n",
"* [Deep Learning with PyTorch: A 60-minute Blitz](https://github.com/pytorch/tutorials/blob/master/Deep%20Learning%20with%20PyTorch.ipynb) to get started with PyTorch in general\n",
"* [jcjohnson's PyTorch examples](https://github.com/jcjohnson/pytorch-examples) for an in-depth overview\n",
"* [Introduction to PyTorch for former Torchies](https://github.com/pytorch/tutorials/blob/master/Introduction%20to%20PyTorch%20for%20former%20Torchies.ipynb) if you are a former Lua Torch user\n",
"\n",
"It would also be useful to know about RNNs and how they work:\n",
"\n",
"* [The Unreasonable Effectiveness of Recurrent Neural Networks](http://karpathy.github.io/2015/05/21/rnn-effectiveness/) shows a bunch of real life examples\n",
"* [Understanding LSTM Networks](http://colah.github.io/posts/2015-08-Understanding-LSTMs/) is about LSTMs specifically but also informative about RNNs in general\n",
"\n",
"Also see these related tutorials from the series:\n",
"\n",
"* [Classifying Names with a Character-Level RNN](https://github.com/spro/practical-pytorch/blob/master/char-rnn-classification/char-rnn-classification.ipynb) uses an RNN for classification\n",
"* [Generating Names with a Conditional Character-Level RNN](https://github.com/spro/practical-pytorch/blob/master/conditional-char-rnn/conditional-char-rnn.ipynb) builds on this model to add a category as input"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Prepare data\n",
"\n",
"The file we are using is a plain text file. We turn any potential Unicode characters into plain ASCII by using the `unidecode` package (which you can install via `pip` or `conda`)."
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {
"collapsed": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"file_len = 1115394\n"
]
}
],
"source": [
"import unidecode\n",
"import string\n",
"import random\n",
"import re\n",
"\n",
"all_characters = string.printable\n",
"n_characters = len(all_characters)\n",
"\n",
"file = unidecode.unidecode(open('../data/shakespeare.txt').read())\n",
"file_len = len(file)\n",
"print('file_len =', file_len)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"To make inputs out of this big string of data, we will be splitting it into chunks."
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {
"collapsed": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
" will continue that I broach'd in jest.\n",
"I can, Petruchio, help thee to a wife\n",
"With wealth enough and young and beauteous,\n",
"Brought up as best becomes a gentlewoman:\n",
"Her only fault, and that is faults en\n"
]
}
],
"source": [
"chunk_len = 200\n",
"\n",
"def random_chunk():\n",
" start_index = random.randint(0, file_len - chunk_len)\n",
" end_index = start_index + chunk_len + 1\n",
" return file[start_index:end_index]\n",
"\n",
"print(random_chunk())"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Build the Model\n",
"\n",
"This model will take as input the character for step $t-1$ and is expected to output the next character, for step $t$. There are three layers: one linear layer that encodes the input character into an internal state, one GRU layer (which may itself have multiple layers) that operates on that internal state and a hidden state, and a decoder layer that outputs the probability distribution."
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"import torch\n",
"import torch.nn as nn\n",
"from torch.autograd import Variable\n",
"\n",
"class RNN(nn.Module):\n",
" def __init__(self, input_size, hidden_size, output_size, n_layers=1):\n",
" super(RNN, self).__init__()\n",
" self.input_size = input_size\n",
" self.hidden_size = hidden_size\n",
" self.output_size = output_size\n",
" self.n_layers = n_layers\n",
" \n",
" self.encoder = nn.Embedding(input_size, hidden_size)\n",
" self.gru = nn.GRU(hidden_size, hidden_size, n_layers)\n",
" self.decoder = nn.Linear(hidden_size, output_size)\n",
" \n",
" def forward(self, input, hidden):\n",
" input = self.encoder(input.view(1, -1))\n",
" output, hidden = self.gru(input.view(1, 1, -1), hidden)\n",
" output = self.decoder(output.view(1, -1))\n",
" return output, hidden\n",
"\n",
" def init_hidden(self):\n",
" return Variable(torch.zeros(self.n_layers, 1, self.hidden_size))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Inputs and Targets"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Each chunk will be turned into a tensor, specifically a `LongTensor` (used for integer values), by looping through the characters of the string and looking up the index of each character in `all_characters`."
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {
"collapsed": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Variable containing:\n",
" 10\n",
" 11\n",
" 12\n",
" 39\n",
" 40\n",
" 41\n",
"[torch.LongTensor of size 6]\n",
"\n"
]
}
],
"source": [
"# Turn string into list of longs\n",
"def char_tensor(string):\n",
" tensor = torch.zeros(len(string)).long()\n",
" for c in range(len(string)):\n",
" tensor[c] = all_characters.index(string[c])\n",
" return Variable(tensor)\n",
"\n",
"print(char_tensor('abcDEF'))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Finally we can assemble a pair of input and target tensors for training, from a random chunk. The input will be all characters *up to the last*, and the target will be all characters *from the first*. So if our chunk is \"abc\" the input will correspond to \"ab\" while the target is \"bc\"."
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"def random_training_set(): \n",
" chunk = random_chunk()\n",
" inp = char_tensor(chunk[:-1])\n",
" target = char_tensor(chunk[1:])\n",
" return inp, target"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Evaluating\n",
"\n",
"To evaluate the network we will feed one character at a time, use the outputs of the network as a probability distribution for the next character, and repeat. To start generation we pass a priming string to start building up the hidden state, from which we then generate one character at a time."
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"def evaluate(prime_str='A', predict_len=100, temperature=0.8):\n",
" hidden = decoder.init_hidden()\n",
" prime_input = char_tensor(prime_str)\n",
" predicted = prime_str\n",
"\n",
" # Use priming string to \"build up\" hidden state\n",
" for p in range(len(prime_str) - 1):\n",
" _, hidden = decoder(prime_input[p], hidden)\n",
" inp = prime_input[-1]\n",
" \n",
" for p in range(predict_len):\n",
" output, hidden = decoder(inp, hidden)\n",
" \n",
" # Sample from the network as a multinomial distribution\n",
" output_dist = output.data.view(-1).div(temperature).exp()\n",
" top_i = torch.multinomial(output_dist, 1)[0]\n",
" \n",
" # Add predicted character to string and use as next input\n",
" predicted_char = all_characters[top_i]\n",
" predicted += predicted_char\n",
" inp = char_tensor(predicted_char)\n",
"\n",
" return predicted"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Training"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"A helper to print the amount of time passed:"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"import time, math\n",
"\n",
"def time_since(since):\n",
" s = time.time() - since\n",
" m = math.floor(s / 60)\n",
" s -= m * 60\n",
" return '%dm %ds' % (m, s)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The main training function"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"def train(inp, target):\n",
" hidden = decoder.init_hidden()\n",
" decoder.zero_grad()\n",
" loss = 0\n",
"\n",
" for c in range(chunk_len):\n",
" output, hidden = decoder(inp[c], hidden)\n",
" loss += criterion(output, target[c])\n",
"\n",
" loss.backward()\n",
" decoder_optimizer.step()\n",
"\n",
" return loss.data[0] / chunk_len"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Then we define the training parameters, instantiate the model, and start training:"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {
"collapsed": false,
"scrolled": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"[0m 19s (100 5%) 2.1267]\n",
"Wh! 'Lod at the to Cell I dy\n",
"Whapesfe show dous that,\n",
"But thes lo he ther, letrst surave and and cod a \n",
"\n",
"[0m 38s (200 10%) 1.9876]\n",
"Whan the she ciching, doove whath that he gone prie hasigrow nice knotat by wiith haye! ha coll, and i \n",
"\n",
"[0m 59s (300 15%) 2.0772]\n",
"Whurgre of nowif for of agand witeling in fromound be noyed th well and fort and withen a custrone fri \n",
"\n",
"[1m 19s (400 20%) 1.9062]\n",
"Why sleemer chome, I\n",
"tence lord thou let not mories, Wherly me cloonger on wit, me cre wort if thing i \n",
"\n",
"[1m 39s (500 25%) 1.9632]\n",
"Whank of winded than inderreast, hids for hink marry, I son will now my be tor think that I be uncient \n",
"\n",
"[2m 0s (600 30%) 1.9364]\n",
"What to youre\n",
"Good the dorsentemat.\n",
"What the not what a meifery part is be of look\n",
"Whait of the hall w \n",
"\n",
"[2m 20s (700 35%) 1.8673]\n",
"Whes Bester,\n",
"Bars, and and most man\n",
"ingeld my tiement make I lesiefoden as do you same to muse woke o' \n",
"\n",
"[2m 40s (800 40%) 2.1523]\n",
"Whe my bone a me but mast at the face.\n",
"Whe he frend him cope a be to with I comes or he God his for ma \n",
"\n",
"[3m 1s (900 45%) 1.8042]\n",
"Whis our namure.\n",
"\n",
"TRANIO:\n",
"May platis the lord,\n",
"I wis he we but he hards paron's we for the surven neav \n",
"\n",
"[3m 21s (1000 50%) 1.9770]\n",
"Whis, is at ell demes sy host is in\n",
"The revention eart-aly, his the couth stare.\n",
"The streath, the so h \n",
"\n",
"[3m 42s (1100 55%) 1.9771]\n",
"Which the called these what mace all bries,\n",
"Gow the from ceart repise--tring be of the\n",
"Hee he that, of \n",
"\n",
"[4m 3s (1200 60%) 1.7054]\n",
"What that hays how the frow he dresers gard.\n",
"\n",
"BAPTISTA:\n",
"That was on a prain their with to goe, all me\n",
" \n",
"\n",
"[4m 23s (1300 65%) 1.6584]\n",
"Whe time, like\n",
"Those paurstriet.\n",
"\n",
"SICINIUS:\n",
"Glow a and elfers; rother's Rome servest enon't is may thu \n",
"\n",
"[4m 44s (1400 70%) 1.7370]\n",
"When him these;\n",
"There and of Have the in of the do best veath and hever the chaw, not pites with at my \n",
"\n",
"[5m 6s (1500 75%) 1.6769]\n",
"Wher he have live the courtas,\n",
"I here that whils him I shee my like deated,\n",
"To countert a hardor of so \n",
"\n",
"[5m 26s (1600 80%) 1.7480]\n",
"Wh for the grone them with are\n",
"Belent dis are couch of my to tell ding.\n",
"\n",
"Sir:\n",
"What the deatred thou as \n",
"\n",
"[5m 48s (1700 85%) 1.7725]\n",
"Why.\n",
"\n",
"CUMETEL:\n",
"I carcithy place, did the forling like grease in ratenforer;\n",
"Which ot chatuse, be thy p \n",
"\n",
"[6m 8s (1800 90%) 1.6781]\n",
"What feath wifiten,\n",
"Thou kind Maner'd my king: I'll thou\n",
"Reven's my streathence,\n",
"By civery sow'd king' \n",
"\n",
"[6m 28s (1900 95%) 1.5265]\n",
"What so srome the and any strand?\n",
"\n",
"BAPTISTA:\n",
"Not bother hear are a common int.\n",
"\n",
"QUEEN MIRGANSIO:\n",
"I say \n",
"\n",
"[6m 49s (2000 100%) 1.5479]\n",
"Why, ruse the tort,\n",
"And whese a to the vill bear not tell not the the borwading.\n",
"\n",
"JULIET:\n",
"In be our no \n",
"\n"
]
}
],
"source": [
"n_epochs = 2000\n",
"print_every = 100\n",
"plot_every = 10\n",
"hidden_size = 100\n",
"n_layers = 1\n",
"lr = 0.005\n",
"\n",
"decoder = RNN(n_characters, hidden_size, n_characters, n_layers)\n",
"decoder_optimizer = torch.optim.Adam(decoder.parameters(), lr=lr)\n",
"criterion = nn.CrossEntropyLoss()\n",
"\n",
"start = time.time()\n",
"all_losses = []\n",
"loss_avg = 0\n",
"\n",
"for epoch in range(1, n_epochs + 1):\n",
" loss = train(*random_training_set()) \n",
" loss_avg += loss\n",
"\n",
" if epoch % print_every == 0:\n",
" print('[%s (%d %d%%) %.4f]' % (time_since(start), epoch, epoch / n_epochs * 100, loss))\n",
" print(evaluate('Wh', 100), '\\n')\n",
"\n",
" if epoch % plot_every == 0:\n",
" all_losses.append(loss_avg / plot_every)\n",
" loss_avg = 0"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Plotting the Training Losses\n",
"\n",
"Plotting the historical loss from all_losses shows the network learning:"
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"[<matplotlib.lines.Line2D at 0x11079f780>]"
]
},
"execution_count": 10,
"metadata": {},
"output_type": "execute_result"
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAg0AAAFkCAYAAACjCwibAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAAPYQAAD2EBqD+naQAAIABJREFUeJzt3Xl8VNX5x/HvYd8XUVlEq4JawAVZBAQFQUDBam1FjRtg\nfwqIRdGq1brUDXfUqohb645Wq+KGgKggIqJBtFpQFEQQZVMCsggh5/fHk+tMhklyJ5lkbpLP+/Wa\n12Tu3OXM3MncZ855zjnOey8AAIDiVMt0AQAAQMVA0AAAAEIhaAAAAKEQNAAAgFAIGgAAQCgEDQAA\nIBSCBgAAEApBAwAACIWgAQAAhELQAAAAQilV0OCc+6tzLs85N76Y9fo457Kdc1udc18654aW5rgA\nAKD8lThocM51lXSupE+KWW9vSa9KmiHpEEl3S3rYOde/pMcGAADlr0RBg3OugaQnJf2fpPXFrD5K\n0hLv/aXe+y+89/dJel7S2JIcGwAAZEZJaxruk/SK9/6tEOt2l/RmwrKpknqU8NgAACADaqS6gXPu\nVEkdJXUJuUkLSasSlq2S1Mg5V9t7/0uSYzSTNFDSN5K2plpGAACqsDqS9pY01Xu/Lp07TilocM61\nlnSXpKO999vTWZAEAyU9VYb7BwCgsjtd0tPp3GGqNQ2dJe0mab5zzuUvqy7pSOfc+ZJqe+99wjY/\nSGqesKy5pA3JahnyfSNJTz75pNq1a5diERFFY8eO1Z133pnpYiBNOJ+VC+ezclm4cKHOOOMMKf9a\nmk6pBg1vSjooYdmjkhZKujlJwCBJ70s6NmHZgPzlhdkqSe3atVOnTp1SLCKiqHHjxpzLSoTzWblw\nPiuttDfvpxQ0eO83Sfpf/DLn3CZJ67z3C/Mfj5O0h/c+GIthoqTRzrlbJP1TUj9JJ0kaVMqyAwCA\ncpSOESETaxdaStrz1ye9/0bSYElHS1og62r5J+99Yo8KAAAQYSn3nkjkve+b8Hh4knVmyfIhAABA\nBcXcEygXWVlZmS4C0ojzWblwPhEWQQPKBV9KlQvns3LhfCIsggYAABAKQQMAAAiFoAEAAIRC0AAA\nAEIhaAAAAKEQNAAAgFAIGgAAQCgEDQAAIBSCBgAAEApBAwAACIWgAQAAhELQAAAAQiFoAAAAoRA0\nAACAUAgaAABAKAQNAAAgFIIGAAAQCkEDAAAIhaABAACEQtAAAABCIWgAAAChEDQAAIBQCBoAAEAo\nBA0AACAUggYAABAKQQMAAAiFoAEAAIRC0AAAAEIhaAAAAKEQNAAAgFAIGgAAQCgEDQAAIBSCBgAA\nEApBAwAACIWgAQAAhELQAAAAQiFoAAAAoRA0AACAUAgaAABAKAQNAAAgFIIGAAAQCkEDAAAIhaAB\nAACEQtAAAABCIWgAAAChEDQAAIBQCBoAAEAoBA0AACAUggYAABBKSkGDc26kc+4T51xO/m2Oc+6Y\nItbv7ZzLS7jtcM7tXvqiAwCA8lQjxfWXS7pM0mJJTtIwSZOdcx299wsL2cZL2l/Sxl8XeL869aIC\nAIBMSilo8N6/lrDoSufcKEndJRUWNEjSGu/9hlQLBwAAoqPEOQ3OuWrOuVMl1ZP0flGrSlrgnFvp\nnJvmnDu8pMcEAACZk2rzhJxzB8qChDqyJocTvfeLCln9e0kjJH0kqbakcyS945w7zHu/oGRFBgAA\nmZBy0CBpkaRDJDWWdJKkx51zRyYLHLz3X0r6Mm7RXOdcG0ljJQ0t7kAXXjhWTZo0LrAsKytLWVlZ\nJSg2AACVy6RJkzRp0qQCy3JycsrseM57X7odODdd0lfe+1Eh179VUk/vfc8i1ukkKXv27Gz17Nmp\nVOUDAKAqmT9/vjp37ixJnb3389O573SM01BN1vQQVkdZs0Wxtm8vUXkAAEAZSKl5wjk3TtIUSd9K\naijpdEm9JQ3If/4mSa2890PzH18gaamkz2U5
EOdIOkpS/zDHI2gAACA6Us1p2F3SY5JaSsqR9Kmk\nAd77t/KfbyFpz7j1a0m6Q1IrSZvz1+/nvZ8V5mDbtqVYOgAAUGZSHafh/4p5fnjC49sk3VaCckmS\ncnNLuiUAAEi3SM89QU0DAADRQdAAAABCiXTQQPMEAADREemggZoGAACig6ABAACEEumggXEaAACI\nDoIGAAAQSqSDBponAACIjkgHDdQ0AAAQHQQNAAAgFIIGAAAQSqSDBnIaAACIjkgHDdQ0AAAQHQQN\nAAAglEgHDTRPAAAQHZEOGqhpAAAgOggaAABAKJEOGmieAAAgOiIdNFDTAABAdBA0AACAUCIdNNA8\nAQBAdEQ6aKCmAQCA6CBoAAAAoUQ6aKB5AgCA6Ih00EBNAwAA0RHpoIGaBgAAoiPSQQM1DQAARAdB\nAwAACCXSQQPNEwAAREekg4bc3EyXAAAABCIdNFDTAABAdBA0AACAUCIdNOTmSt5nuhQAAECKeNAg\n0YMCAICoiHzQ8MsvmS4BAACQKkDQQF4DAADREPmggZoGAACiIfJBAzUNAABEQ+SDBmoaAACIBoIG\nAAAQSuSDBponAACIhsgHDdQ0AAAQDQQNAAAglMgHDTRPAAAQDZEPGqhpAAAgGiIfNFDTAABANEQ+\naKCmAQCAaCBoAAAAoUQ6aKheneYJAACiItJBQ82a1DQAABAVkQ4aatUiaAAAICpSChqccyOdc584\n53Lyb3Occ8cUs00f51y2c26rc+5L59zQsMerWZPmCQAAoiLVmoblki6T1ElSZ0lvSZrsnGuXbGXn\n3N6SXpU0Q9Ihku6W9LBzrn+Yg9E8AQBAdNRIZWXv/WsJi650zo2S1F3SwiSbjJK0xHt/af7jL5xz\nvSSNlTS9uOMRNAAAEB0lzmlwzlVzzp0qqZ6k9wtZrbukNxOWTZXUI8wxatWieQIAgKhIqaZBkpxz\nB8qChDqSNko60Xu/qJDVW0halbBslaRGzrna3vsi6xFIhAQAIDpSDhokLZLlJzSWdJKkx51zRxYR\nOJTYihVj9frrjXX88bFlWVlZysrKSvehAACocCZNmqRJkyYVWJaTk1Nmx3Pe+9LtwLnpkr7y3o9K\n8txMSdne+4vilg2TdKf3vmkR++wkKbtz52wdcEAnPfVUqYoIAECVMX/+fHXu3FmSOnvv56dz3+kY\np6GapNqFPPe+pH4Jywao8ByIAmieAAAgOlJqnnDOjZM0RdK3khpKOl1Sb1kgIOfcTZJaee+DsRgm\nShrtnLtF0j9lAcRJkgaFOV7NmtLWramUEAAAlJVUcxp2l/SYpJaSciR9KmmA9/6t/OdbSNozWNl7\n/41zbrCkOyWNkbRC0p+894k9KpJq2FAqw6YZAACQglTHafi/Yp4fnmTZLNlAUClr1EhaurQkWwIA\ngHSL9NwTjRpJP/6Y6VIAAAAp4kFD48bSTz9JpezgAQAA0iDSQUPDhtZ7YsuWTJcEAABEOmho3Nju\naaIAACDzIh00NGpk9wQNAABkXqSDhqCm4aefMlsOAAAQ8aChYUO7p6YBAIDMI2gAAAChRDpoqFHD\nmigIGgAAyLxIBw2StMsu5DQAABAFkQ8amjalpgEAgCiIfNCwyy4EDQAARAFBAwAACCXyQUPTpuQ0\nAAAQBZEPGqhpAAAgGggaAABAKBUiaNiwQcrNzXRJAACo2iIfNDRtavfr12e2HAAAVHWRDxp22cXu\naaIAACCzCBoAAEAokQ8aguYJul0CAJBZkQ8aqGkAACAaIh801K0r1a5N0AAAQKZFPmhwjrEaAACI\ngsgHDRJDSQMAEAUVImigpgEAgMwjaAAAAKEQNAAAgFAqRNBATgMAAJlXIYIGahoAAMi8ChU0eJ/p\nkgAAUHVViKBh111tauycnEyXBACAqqtCBA0tWtj9Dz9kthwAAFRlBA0AACAUggYAABBKhQgaGjaU\n6tWTvv8+
0yUBAKDqqhBBg3NW20BNAwAAmVMhggaJoAEAgEyrMEFDy5YEDQAAZFKFCRpatCCnAQCA\nTKpQQQM1DQAAZE6FCRpatpTWrpW2b890SQAAqJoqTNDQooXNPbF6daZLAgBA1VShggaJJgoAADKl\nwgQNLVvaPUEDAACZUWGCht12s0Ge6EEBAEBmVJigoWZNmyKbmgYAADKjwgQNEgM8AQCQSRUqaIgf\nq2H6dOnddzNbHgAAqpIKFzR8/72UlyedfbZ0ww2ZLhEAAFVHhQoaguaJWbOkFSuk777LdIkAAKg6\nUgoanHOXO+fmOec2OOdWOededM7tX8w2vZ1zeQm3Hc653VMtbNA88eST9njFilT3AAAASirVmoYj\nJN0jqZukoyXVlDTNOVe3mO28pP0ktci/tfTepzy2Y4sW0ubN0jPPSG3bSjk50saNqe4FAACUREpB\ng/d+kPf+Ce/9Qu/9fyUNk7SXpM4hNl/jvV8d3EpQ1l8HeNq0SbrkEvubJgoAAMpHaXMamshqEX4s\nZj0naYFzbqVzbppz7vCSHCwYSrpzZ+noo+1vmigAACgfJQ4anHNO0l2SZnvv/1fEqt9LGiHpj5L+\nIGm5pHeccx1TPWarVlL16tKZZ9rfEkEDAADlpUYptp0gqb2knkWt5L3/UtKXcYvmOufaSBoraWhR\n244dO1aNGzcusOzaa7M0enSWatSwoaUJGgAAVdWkSZM0adKkAstycnLK7HjOe5/6Rs7dK+l3ko7w\n3n9bgu1vldTTe5804HDOdZKUnZ2drU6dOhW6n06dpMMOkyZOTLUEAABUTvPnz1fnzp0lqbP3fn46\n951yTUN+wHCCpN4lCRjydZQ1W5RK69bUNAAAUF5SChqccxMkZUk6XtIm51zz/KdyvPdb89cZJ2kP\n7/3Q/McXSFoq6XNJdSSdI+koSf1LW/jWraU5c0q7FwAAEEaqNQ0jZb0l3klYPlzS4/l/t5S0Z9xz\ntSTdIamVpM2SPpXUz3s/K9XCJqKmAQCA8pNS0OC9L7a3hfd+eMLj2yTdlmK5QmndWlq3TtqyRapb\n3PBSAACgVCrU3BOJWre2ewZ4AgCg7FWKoKGwJoq335YGDZJK0EEEAAAkqNBBwx572H1hQcMzz0hT\npkhr15ZfmQAAqKwqdNBQv77UtGksaPjgA2n79tjzs2fb/VdflX/ZAACobCp00CDFelC89ZbUvbv0\n6KO2fN066X/5g1sTNAAAUHqVImj45hvpwgvt8Qsv2H0wfkPt2tLixRkpGgAAlUpp5p6IhNatpUce\nkfLypLPOkiZNknJyrGmiVSupTRtqGgAASIcKX9Owxx4WMAwfLl1/veU0TJkivfee1KuXtN9+BA0A\nAKRDhQ8aDjlE2nVXadw4aa+9bBKrZ56RPvzQgoa2ba15gm6XAACUToUPGn7/e+n776UWLezxiSdK\nkydL27bFgob166Uff7Tnn37aaiIAAEBqKnzQIEk14jIzTjzR7hs0kA46yIIGyZoo8vIsYTIrywIN\nAAAQXqUIGuK1b295DD16WDARBA2LF0vZ2dKaNdLWrdKYMZktJwAAFU2F7z2RyDnp2WdjE1g1bCg1\nb241DUuWSI0bS/feK515pvTyy9Lxx2e2vAAAVBSVrqZBkg49VPrtb2OP27a1oGHKFKl/f+n006XB\ng622gQRJAADCqZRBQ6L99rMhpj/4QDr2WKuNuOACadkyaeHCTJcOAICKoUoEDUFNg/fSMcfYssMP\nt5yHmTMzWzYAACqKKhM0SDamQ6tW9nf9+lKXLgQNAACEVSWChv32s/tBgwou793bggbyGgAAKF6V\nCBratZOOPNISIOP17i398ENsQqu8vPIvGwAAFUWVCBrq1rUahQ4dCi7v2VOqVs2e27ZN6tZNuuyy\nzJQRAICoq3TjNKSiUSObq2LmTGnlSumjj2y46VtuyXTJAACIniodNEjWRP
HPf0obN1qPijlzrKdF\nkDwJAABMlWieKErv3tJPP0kHHGAjRNaoIU2dmulSAQAQPVU+aOjTx26PPy41a2Z5DgQNAADsrMoH\nDQ0bSm+/bbkNkjRwoD3eti2z5QIAIGqqfNCQaOBA6eefLbcBAADEEDQk6NhR2m03migAAEhE0JCg\nWjVpwABp8mSrcQAAAIagIYmRI6Vvv5U6d5YWLMh0aQAAiAaChiR69ZLmz5fq1ZO6d7dBnwAAqOoI\nGgqx//7S++/bZFeXXsqkVgAAEDQUoU4dadw464I5fXqmSwMAQGYRNBTjuONswKe//pVZMAEAVRtB\nQzGck26+Wfr4Y2nixEyXBgCAzCFoCKFXL+nss6XRo6WTTpJ++CHTJQIAoPwRNIT08MPSpEnSrFk2\nG2Zubvr2vXq1tGhR+vYHAEBZIGgIyTnp1FNtpMilS6VXX7Xl3kvnnCM98UTJ933JJdKRRzLfBQAg\n2ggaUnTooVK3brH8hhkzrBbiiiuk7dtT319envTGG9KaNdIrr6S3rAAApBNBQwmMHGk1Dl9/Lf3t\nb1KbNtKKFdLzz6e+r08+seaJJk0s+AAAIKoIGkrg5JPtIn/aadK8eVbr0L+/NH58wUGgfvlFuv56\n6bbbpHXrku9r6lSpfn3pxhvt7+XLy+c1AACQKoKGEqhXTxo61AKG3r2lfv2kiy6y4aZnz7Z1Vq2S\n+vaVbrhBuuoqaY89pOHDdx6Seto0qU8f6ayzLHj417/K/eUAABAKQUMJjRpltQ033WRJkgMHSu3b\nW1LkccfZFNtffy3NnGlNF9deayNLdu0q9eghffedzaI5e7Zt26CBdMop0iOPSDt2ZPrVAQCwM4KG\nEjrgAOmnnywAkCxwuPVWqWVLqXp16fjjpQ8/tAmvdt1VuuwyCyImT7Yg4oQTpClTLHly4EDbx9Ch\nNrtmdnbmXhcAAIWpkekCVCaDB9utMEEwsddeNjT1sGHSb35jk2JJ1iujdm2bKOuww8qlyAAAhEZN\nQwZ07Cg9+aS0ebPVMjhny2vVkrp0kebMia27dauUk5N8P9u2WbIlAADlgaAhQ0480XIcbryx4PLD\nDy8YNIwZIw0YkHwf555rtRTLlpVdOQEACBA0ZFCfPpbvEO/wwy3nYflyq0l47jnrcbFxY8H11qyR\nnn7a5sE4+mjmwwAAlD2ChogJEivnzJHeektav95GjZw3r+B6jz1mzRrvvWfNHD17SmeeKV14ofXM\nAAAg3QgaIqZ5cxth8v33bYTJtm2ta+f778fW8V568EHpj3+0LpxvvmnDWy9bJj36qM1lAQBAuhE0\nRNDhh9v4Di++KA0ZYr0q4oOGmTOlxYstp0GS2rWzAGPWLBsP4rnnqG0AAKQfQUME9eghLVgg/fij\ndNJJ9nju3NgQ1Q89ZAmQvXvvvO3w4VLdutL996e/XPfeK51xhk0R/tNP6d8/ACDaUgoanHOXO+fm\nOec2OOdWOededM7tH2K7Ps65bOfcVufcl865oSUvcuV3+OF2v88+1uzQo4cFEF9+aU0Qzz0njRgR\n66oZr1EjCxwmTpS2bElfmXJzpb//3WbkPO006cADrTtoSXz3nY2e+fnn6SsfAKDspVrTcISkeyR1\nk3S0pJqSpjnn6ha2gXNub0mvSpoh6RBJd0t62DnXvwTlrRIOPFBq2tQmxnLOmiecsyaKG2+0HIcR\nIwrf/s9/tiDjoossgOjfv/AJs8KaM8f28dpr0rvvSitXWu1HSdx2m7RwoXU5BQBUHCmNCOm9HxT/\n2Dk3TNJqSZ0lzS5ks1GSlnjvL81//IVzrpeksZKmp1TaKqJ6dRuCumVLe9y4sdShg3WxfPtt6eab\nba6KwrRtK/3+91bbcPDBNjT1iBFWQ5GsdiLeo4/a/bBhBZe/9JKVp2tXe9ysmTRjhnUbTcXq1ZbE\nKUn//W9q2wIAMqu0OQ1NJHlJPxaxTn
dJbyYsmyqpRymPXam1aWOzaQZ69JCmT7dxHUaNKn77p56S\n1q6VPvnEciD+8x/p8ceL3mbzZmnsWOmKK6ybZ8B7CxpOOEGqVs1uRx1lQUOq7rrLtj/2WIIGAKho\nShw0OOecpLskzfbe/6+IVVtIWpWwbJWkRs652iU9flUTjN9w+eUFg4nC1K1rtQGSJVMOGyadf75N\nwX3EEZafkOjpp21ciO+/L9j08N//SkuXWu1FoF8/Gztiw4bwr2H9eum++yzo6dPH9hsfnKTb/PkW\nPAEA0qM0E1ZNkNReUs80lWUnY8eOVePGjQssy8rKUlZWVlkdMrJOPNEu3EE3y1TdfbcFA199JdWo\nId1wgwUQ++5rz3tvvSMGD7YRKF94IZaQ+dJLlmB51FGx/fXrZ1N4v/uubTNunDVfDB8eW2fBAmse\nqZYfmj72mCVPXnSR1YD8/LMldu6zT2ybvDybl+PUU20ujuJs3mz5G61bF1y+Y4e9vi+/tBEzmzcv\nfl8//mivswbTuAGoICZNmqRJkyYVWJZT2IRF6eC9T/km6V5JyyTtFWLdmZLGJywbJumnIrbpJMln\nZ2d7pN/mzd43b+79n/4UW/buu95L3k+d6v2IEd7vvbf3eXn23KGHen/qqQX3kZfnfevW3o8d6/2b\nb9q2u+3m/S+/2PPZ2bbsqadi2wwe7P3RR9vfK1bY85MnF9zv1Km2/Nlnw72WCy7wftdd7TXFe/xx\n20/Nmt6PG1f8fnbs8L5VK+/Hjw93XACIquzsbC9LHejkS3CNL+qWcvOEc+5eSSdIOsp7/22ITd6X\n1C9h2YD85ciAunWlSy+1X/7ffGPL7r1X2n9/+1X+xz/a8gULpCeekD7+2HpyxHPOahumTLEky3bt\nbD6M116L7U+SXn/d7rdvt0Gp+va1x61aWQ+RTz8tuN/nnrP72YWl1cbZsUN65hnL3Qi2k2zmz6uv\nlv7wB+n006UHHrB1i/LJJ9Yj5MMPiz8uAFRVqY7TMEHS6ZJOk7TJOdc8/1Ynbp1xzrnH4jabKGlf\n59wtzrkDnHPnSTpJ0vg0lB8lNGKEXbRHj7ZZNJ991mbUrFbN8g2aNpWuuUY65xxrcojPZwj06yct\nWmQTbL30kvWseOQR65o5aZK0++7S1KnW5JCdbc0RQdDgnDVdxCdD5ubaKJjVqoULGmbNklatkn7z\nm4KDWT34oPUYueEGy59YtszGlyjKO+/YPWNHoKTOO88+c0BllmpNw0hJjSS9I2ll3C3+d2hLSXsG\nD7z330gaLBvXYYGsq+WfvPeJPSpQjurXt9qG11+3X+qTJsV6ZdSsKR1/vPTKK1KnTnZBTtZVs18/\n6x561VVWS3H22VbzcOONFijcf7/tOzvbJt9q2FDq3Dm2/UEHFQwa3nnHAo4RI+yXf2KS5datts9g\nxs9nn5X23lsaP94SNz/+2C76V19t+Qzt2lkgE7yGogRjRixaZMFLuq1bZwFORfT995kuQcXw+uvS\nhAllm9wLZFy62zvScRM5DeVixw7vP/00lrsQ74MPvO/Xz/vvvy96H19/Hdt+/Xrv69SxXIIzzvB+\n2zbvGzXy/rrrbF/HHVdw2wce8L56de+3bLHHQS7FokW2jzfeKLj+tdfa8lNP9X77dstluPRS+3uP\nPbw/4QTLszj4YCtL4KGHvHfO+2++Sf4acnO9b9zY+z59bP9ffJF8vbw870eN8v7MM4t+T5IZNMj7\nNm12zr2IuoUL7b2bNy/TJYm2X37xvlo1+/y8916mS4OqLlI5Dag8qlWzX/vJahEOO8xmz2zRouh9\n7LtvbPvGja17p2TNHjVr2miUL71kU3gHTROBgw+2XIOFC+3+hRds+/33l3bbrWATxZIl0k03Sb16\nWR7DsGFWi3Hyydbb4dxzpcmTreZjyhQrS+DUUy2P48knk7+GBQuknBwrs1R4E8Udd1iNxZNPpvbr\n++
ef7b38+mvp+uvDbxcFc+ZYz5o5czJdkrKRl2dNWaX17bexGoYXXij9/oCoImhAWl11lXW/7NbN\nHh9zjI2XsHXrzkFDhw52P3eu9M9/WiLlkCEWhPTqZd05AxdeaIHEG29YnsVTT9kAWJ062fMjR0qn\nnGLPt2pV8DgNGlhS5BNPxCb9ivfOOxZU/O530i67JA8apk6VLrvMmnCqV7dZRQOPP26BT2GmT5e2\nbbNA57bbih/Uat06m5sj/hiZkp1t9/PnZ7YcZeWxx6Tf/rb087QsWWL3AwZY0JDscwZUCumuukjH\nTTRPVBrLl1uVbbNm1hySaN997XnJ+x49Yk0dd9xhTR2//OL900/b888/b89t2uR9r17e3313+HIE\nXTk/+GDn5+K7gvbq5X1WVsHnp0zxvmFD74891poyBg3yvmdPe+6dd2y/DRp4/9JLyY999tnet2vn\n/datdt+9e/ImocDVV9s+d93V+zVrwr/GstCtm5XlwAMzW46ycvLJRTdJhXX//d7XqOH966/b/ubP\nT0/5gJKgeQIVVuvW0iGHWNJktSSftnvuse6Zn39uTRhBU8cRR1jtxJgxNh13VpbVFkg2Iua779pz\nYfXrZ4NPPfFEweW5ubavYA6NDh0K1jRMmGCDVx15pCVeVq9uzR3vvWdV0pdfbjORDhxoPUyuusq6\nlwby8qwb6nHHSbVrWxPL3Lk2yFYyGzdK//iHdRXdscMGwiqNtWutlqQkcnMtIbVdO6tJSeesqVHg\nfSwBtrRNFEuWWC+eo4+2CeVookClle4oJB03UdNQqSxf7v3atalts22b9/Xq2a+2P//ZfuGX1sUX\n26/39eu9v+su7wcMsOTE+OS1f/zD+1q1LLny5ZftuQsuKHj8nBzva9e25E7J+2nTrBblhhsssbNL\nF+//9z9bd948W2fmTHu8bp3fadCreLfeagNSLV/u/b/+ZeteeaX311zj/Z13pv6a//pXS2T8/PPY\nso0bY4NwFeXTT+34d9xReC1NYVautBqcn35Kvczl5b//jdVyPfJI6fb1xz9637+//X3WWd63b1/6\n8kXdF1/Y/0aYzxLKV1nWNGQ8QEhaKIIGeBvJ8Z57iq7KT8Unn9gnvn59u7gPHuz9RRfZBSM4xowZ\nts6iRd537ep9797J93XiibZe374Fy/fBB94fcIA1rdx5p13wmzSxICTQpo192SYKRur8v/+zx3l5\ndjGqUcOCHcn7BQsKrh+/32Q6d7btgiaXDRvs+H/+c9HbeW9Bi3PWRFKjhlXBF2bMGO9ffDH2+K67\n7Lhvvrk1ix3RAAAbbklEQVTzugsXWk+XlSuLL0NZuvtuCxB32cX7v/+9dPvq2NF6/3hvAaGU+aal\nspSbG2u6evnlTJcGiQgagDQ54wzvzznHuoom88MP9l8xapT/dVjtZP7zH7ugJvv1vWmTBQWSdcNL\nzJHIyrL8jXg//WQX0urVvV+8uOBzeXlW87LbbhbkBMt69PD+sMNiXVYTrV1rZezVy+4XLvT+3HOt\nXJ06Jd8m3vnnWwDkvfeHHGLvWzKrV9v+u3ePLQtqYR56qOC6ubm2XtAtN5NOOMGCwm7dLO+kpPLy\nrGvxLbfY4y+/9Em7DJe1lSvtsxXf3bis3H23nfPmzXceYh6ZR9AAlJO8PPvlKdmv9MJqOfLyvF+2\nrOh9zZhh83Yk/tq+806ridi2zR5/8oklhDZpUvSvtgsusC/p7dvtgiRZkHHWWcnL+e9/2zpffx0b\nv0Kyi2TNmpaYGUiWpNqjRyzgGT7c3o9knnjC/1rNv2SJNd/UqGGPr7ii4Lp33WUXmxEj7PlZswo+\nv3699889l7w88e6/3/ulS4tepyjB2BzXXuv9SSfFEmFLYu1aey3PPWeP8/Js39dfX/J9lsSYMb5c\nfvkvW2a1deed5/1NN3lft641eSE6SIQEyolzsa6gf/tb8jEsgvX2
2qvoffXta10V+yXMvNK1qyV5\nfvaZXWrPOsuSO+fPt26fhRk61EaVnDrVurV27WrdPR9/3GYxTTR9unUn3HdfS9j89FPrAnvnnZas\n+dlntt6339rsnvHdKnNzbfyKYATPTp2sq+i2bTsfZ8oUS5asW1f697+tfLm5Nnvp0qWx9ZYula64\nwsbDmDDBxgIZPTo2AueKFZYAO2SIjWtRmDVrrOvrww/Hlq1YYYmx8UmoRQnG5ujb185jaRIhg+6W\nwYyxzklduthssWH88kvqo5C+/badywcftMerVsX+DrrJlpUrr7RxUG66yZKCt2yxMVIk6X//i32u\nKpIVK+x/JZiLB4VjEmAgQffuNoT1CSeUzf4PPdR6YcybF+uh8NprBacIT6ZjRxuM65JLrDfDiy9a\nj43sbBsS/OSTC45R8eab1mtDkv70J2n9ehvqu3FjO/78+RYUTJkibdpk98G4F4sW2cUgPmjYts0u\nCh07xo6xY4eNjTFqlLR4sQ28ddBBduvUSfrii9i6d95pxx43znrS3HefBQ5du9q4HC++aOXae28b\nP2HAgOTvQzDQ1Mcfx5Y9/7z1xBk0yC6mxXnrLQvUDjvM3r9vv7UArrAgsSiJQYMUC+jCOPpoG+js\nvvuKXzcnxyaUmzHDzvWbb9p4JdOm2WBqHTumN2hYvNjO4eDB9t5s2WLn6dJLLdBs1Ejq2VN6+mnr\nPXLMMdKuu1rvoIo0xfxzz9nrnDfPPn8oHDUNQIKgW2SyLqLpUK+eXVTnzZMeekjac0/rslkc56xW\nYuFCG/zp+ONt+TXXSHXqFLzofP21/bLv398e165tv/JbtLAagfbtYxeX6dPtPn4wreC5Qw+1+0MO\nseMnXpA+/FD68Ue7WJ96qv2C/89/LFhJrGlYsEDq3dvmIJHs1/h//iMdeKAFLPvua+/7yJHWZTEn\nJ/n78N57dh9fMzJvnt3/+9/Jt9m6teDj2bOlww+XatWymoatW617akksWWITvDVpElvWpYvNmrpy\nZdHb5uZa2R95RFq9uvhj3X23vf6XXrKJ2Pr1s2BxwgSrtenXL3wNR1Hefts+o/vvb7VfwYRuU6fa\nCKdDhsTWzcqy5cccI+23n5XrlVdKX4bCrFljQVk6g6MXX7T7IAAsTm6utHlz+o5foaS7vSMdN5HT\ngErunHOsF0ODBtadMqyVK609+dlnCy6/8ELLxdi0yR7ff7/lO+TkJN/P0KHWNTQ31/umTW3wrQYN\nYr0xzjrL+w4dCm5z2GGWd/GPf8TWu+oq2z431xIyGza0dvU5c7x/7DH7e9Mma+dv2tS6pRZnxQpL\nIE1MogwcfrjlDEixuVHatrVusE2a7NwF8IUXrFzx7e4HHBDrwfLRR7avxK+bHTuKn3vFe+vtkpjv\nsWyZ7XPy5J33GZ+oGMyzIhXfg2PjRjvH558fW7ZuneXD1K1rCakvvGD7+u67wvfz1luW8FuUQYO8\n/+1vrUfMgQd6/7vf2fKsLO8POqjguqtWWS+Uo46yc92zp83jUpwdO6xLcHG5QYn+8hf/6xw0JTV8\nuHUl9t7KH8wbEvRcKs6NN1pycFSRCAlUMg89ZP99zqX+pZls0qslS+yL7/777Yv7iCNio1YmE4xH\n8d57Vo5bbrH7jz6yi26TJtZdNN6aNdb7wjkLKF580QKP+C/voUO93313CyJmzbJ9fv65BQLJLqKF\nGTgwefm3bLFyjx1r+3v99djYF5deavevvVZwm8GDY6/Newt4atb0/t577fGqVfZ8fJdR7y2wq1+/\n+N4Ifft6P2RIwWV5edbb5aqrCi4fM8YmVwsSV59/3o49ZIitX1hPGO+9v/12SzBN/LwsX+79++/b\n30GwUlgy5Hff2T4uv7zo17TXXvZ+em9dkp2zhN0GDZIneH71VSyx9tlnrQyffFL0MYKA6brril4v\n3sqVlkTctq2dwzBBXaLVq+1/
pV4929/DD9vjPn3sXIYxdGjBoDVqSIQEKpmuXe3+mGOKT6hMVLfu\nzsv22Uc68UTp1ltjVbd//Wvh+whyFO6+2+bmGD3amjhmzbL2/vXrY5OPBXbdVXrgAatOb9HCjvfR\nR9Kxx8bWueMOq8quXj2Wo7F0aSw57sADw73GYcOsGn7x4oLLP/rIyn3GGZYfMX9+rDr+7LMtmS2+\niWLdutiImIsW2f2331rC5H772ePddrPmm/hkyMcft6ajTZuKTsqUrEo7Pp9Bsqacrl2t+SYwc6aN\n9vndd7HX9fnnUrNmluexdq3NqZLM1q323p555s6fl9atLQ9Hsqau3XYrvOr+gQesav3TTwt/PRs2\n2HsRnKvTTrNzf/LJOzdNBNq0sfdQss9F69b2WosSnLeg6SOMcePs8z99uuVMPPJI+G0Dr71mdTs1\na0rXXWdNPT172nw5X38dbh9BU1LQVJaqn36yZtBNm0q2fSYRNAAZ0KGD9RS49NL07XPsWLtAV6tm\nF6sgCTKZjh1tveeftyG069e3L81337U8gzZtLDkvmS5d7EL61luWAPn738eea9bMelJIlqhXq1Ys\naKhfP3yS2Qkn2ORhF19syZaB996zIOfggy3f4uOP7bU2bmxBwJAhdhH45Rdb//nn7QLRuHEsKTO4\nYAdBQ9ATJggaPvvM8iqGDbPXMmVK4eXcvNm2a9Mm+fv00Ud2/M2bLRk1SCz94AO7//xz+yy0bWs5\nKrffnrwHyJNPWg+JogLB4LV07pw8r2HbNgsaatUqetK0YBj1IGioU8fO8xdf2Pt+wAFFl6FmTem8\n8ywAWr8+tnz9ertYBoIyzpkTO19FWbbMyn/JJfY5Ou00aeLE1HueTJ5sQdaVV1pgOG2aBTpt2kjL\nlyfvIZQoCBriZ+KdO9f2tXJl8ROW3Xqr5RidfLKVf+tW+8z95S+pvZZ4K1bY52fjxpLvI5R0V12k\n4yaaJ4ASmTMnltdQnPbtrYo1mPjryitt5Mldd/X+ssvSU5799rMBqYYNs5yIVLz2mlUbB9Xk3nt/\n3HGxMRUuusja8084wQaT8j42NPRjj9njI4+04cKPOirWhHDPPdbEET80eL9+seePOMLa8TdtsqHH\nW7VKPg7GkiU2SFbt2t5/9tnOz7/yipXlkkuszHXq2MBP++/v/ejRtk779jbegffef/yxvd6grT3e\nwIHhx5L429+8b9Fi5+XBSJWXX273heW7PPiglSO+GeyHH6z848aFK8OKFdakEZwH7+0cDBgQe9yz\np+VNxA+zXpS//MVyOoLclOzs5M1KRdm82Zolbr7ZmoJat/a/ji8SjAb75ZfF72fPPW3drl3t8YYN\nsWHvJcsJKUxOjuXk9OtnTUVnnBEbXXO33cK/lkRXX23NRxs2kNMAoAyccYZ9AwTzZEybFvvS+/DD\n9BxjwAAbcrtz55KNujh+vJXn9tvtC3+XXWKJo8GgUk2bxtro8/Ls4l+zpuV3OOf9o496P3KkDW7l\nveUV/Pa3BY8zbJiNVPn557bPINH0zTf9TsN3e28jhTZpYkFLYTNarl1r+R0tWtggX08+acvPOsve\nj19+sYvGfffFtvnzn+2Lf/ny2LJNmywwGT8+3HtWWDJk9+52ofr4Y19gvpVEY8ZYYJNoyZLU5pk4\n4ojYxXPxYjtmjRp20dy+3S6yt95q5+/aa4ve1/btNrBZ4vDnPXtaYmbYcgWBXPCZnzzZ8nS8t8HC\nJJvVtih5eXY+OnSwZOOff47lfcyebfk2iYOnxQvml1mxwj6bkvctW8aCueKSVJPZts32EQxlTtAA\nIO2eecZ6IgS/ojdutC/B3/wmffN9jBhhX+p164a/6MXLy7MLhRTrmTF9uj0XXOAlu1AGtm3z/g9/\nsOW1a9tF6q677Jfyjh02xflxxxU8ztVXW43CRRdZTUvwhb91qyVD3nRTbN0HHrD3adCgkk3Idd
99\ndvH88EMr4zvvxJ5bv94ujvGJla++austXBhu/8mSIWfOtGUvvWSvqXp17ydOTL59374250lp3XOP\nvc4ff7Taj7p1rQzPPx+rEXrnHaspOuqogtv+/LPV8nz1lT0O3oPES8KCBXaMxKTdwpxzjtV+Jft8\n5+buHMQls369lSWYwn7GDAuQggnLgknq4oeY37LF9r91q13c4wPoGTMsofKLLwp+vlMRBIpBcEvQ\nAKBcHHdceoc/vvlm+wVW0i/DwBdfWJPJgAGx5pfc3NiFKP6XufcWOJx9tl14vLdfj5L9mmzb1n4N\nxnv4YStns2axbQLHH2/NHFu2xOYUOf/84icLK0zQxTPYV+LEVkEzwowZ9nj0aO/33jt8IJeXZxem\n3/3OgqRNm+xC2aNHrEmmXbtYE0mi3XdPrRtwYVautPf0wQetx8jIkXbc4cNjk6Hl5MSGVQ96juTl\neX/yyf7XIc9zc22o74MPTv4eXHedBUHF1Y7t2GG1PonnN17btrH5XQoTzC0yY4bVNp11lj1++ml7\nfutWa/66557Ycffe24LPAw+MzQOTKPg8lyS47t+/4NwvBA0AKqSg+11ZdU/r3t0ukMVdUIOq55df\ntgvMhAkFn49vmgmqrgMTJ9o27dsXvBiU1LZtdpFs1sxqFRLl5Vn+R7du9ve++9oEaql4+WXLS7j4\nYguQ6tSxLo6Bk0+2QChR0P00mEejtHr3tiBEsl/gf/mLXbhHjYo1EQXNJUGNy7hx/tfciyCnpVat\nwi+m27ZZbsnee9t8K7m5Vrvx/PMFc02CmoG5cwsv78CBVvNRlNmz/a9diYPuvI0aFcwB6drV+zPP\ntL+DIHH0aKsBS5yPJV6XLhZUpSJo+nn00dgyggYAFVJQVdusWfqaPOI98EC4fv47dtiFc/Ron7TW\nIxgzoFevnbddtsyCho4drVo9HXr2tOMVNi5AkEtx222xYCdV99wTC4Ruv73gc9ddZ/khiefkrbd8\nSk0hxZkwwfZ34IF2rLff9r/moQSznO7YYY/79LFaHedi41tccon/NRdi1arCj7N4cWxm1ZYt7XxJ\nljtwxx3W5CBZzVdRzjsvNnjVxo3J34egKWDNGmu2kmJ5EYHRo2NB0fXXW1ARTFBXlKImhivMxRfb\n+xcftBA0AKiQ1qyxb5kwIwSWtYMPtnwNyftvvin43ObNltGeONJm4IsvUksCLM5FF1k5xowpfJ2+\nfe2Xdq1a1sZfElddZYmo8T1FvLceB8mSJYNBv0ra9JLohx/swn3XXfZ427bYaJ5Brx3vremieXOr\nZr/xxtgsp1u2WMBxyinhjvfBB5YDM3Gi5UNcfHEscLrgguID19tvt2aEvDzL66hTZ+dAceJEOy87\ndlhQ7FzB/AXvY6Ohrl9veUN/+EO48o8fb8dMPF+FWbPGEmcTezsRNACokPLyLIExfujjTAnayWvX\nTj71dlGjMaZb0GzzwAOFrzN3rq1Tmmm7C/PVV7bvN96wv0eNssDo3HPTPzzy4sUFL4JDhvgie28k\n2rq18J4IYUybZsOXFzfduvexYCqYer5hQwta4n/FX3ddwWallSt33s/Chf7XZp5q1SyvI4yghikI\nUnv0sB4f8ce65BLrVum9NeHUr2+jXMZjREgAFZJzNgDP6NGZLklsUKI2bZJPRlanTvmV5cgjbfTG\nI44ofJ1u3aTrry/dgD+F2WcfG2xr+nSb1Oz++21SshdfDD9qZ1ht29oIoYEhQ2xyr/jZUotSu3Zs\ntMmS6N/fprkPMwFdMEjXyJH298yZNhhY/CBsq1dLu+8ee9yy5c772X9/mwH0ppukvLxwM69KNkmY\nZINvPfOM9P77NrBX4JFHpNtus5lOv//eZnY9/3wbBbS8EDQAKFOnnWbDO2daUIZgJMhMatHCRpIM\nRs8szJVXhpsBNVXVqtlIlHfcEZvyfNQoG8q6S5f0Hy/ekC
E2amK9emV7nJIIhj7/5hubPfbQQ+0i\nfe+99h5JOwcNyVSrZu/j/Pn2Pu+5Z7jj77673T791M5NtWoW2AWjok6ZYvubOdOGgvfeRk0tTwQN\nAKqEKAUNUdCtmw3VPW2aBS/jx9vF8rzzyv7YyeZPiYIGDaTmza1W6rTTbNnQoXYfTMUeJmiQpMMO\ns/uwtQyBgw6yGoVPP5Wuvtqmns/Otvu5c6ULLogNK17etQySVKN8DwcAmbH//jYvQvv2mS5JNNx8\ns03Y1KRJbFmqk6dVRjfdZEFD0KTSqJHVFARzcqxebU05xSlp0HDwwdKMGXaMK66wGoepU20Ol7w8\nmyCudWsLLtq2TW3f6UDQAKBKaNDAfrEVN+FSVVGvXjSbCDJt+PCdl3XoEAsaVq0KV9MweLD0z39K\nffumdvwgr+Hiiy3I7dfPaoO+/tqea93ans9Ukx/NEwCqjIMOslkegVS0b29BQ26uTbceJmioVcsC\nkDAJmPGOP95yWU45xR4PHGgJka++WnAa+kwhaAAAoAgdOljzQDB9epigoaSaNbNeM0FwO3CgJUKu\nW0fQAABA5HXoYD0VZs2yx2UZNCTaZx9L3m3YUOrZs/yOWxhyGgAAKEKQPPv223bfvHn5Hn/MGEvA\nrFmzfI+bDEEDAABFaNjQelAEQUN51jRI1rUyKmieAACgGB06SMuXW4+T+vUzXZrMIWgAAKAYHTrY\nfXnXMkQNQQMAAMUgaDAEDQAAFCNIhiRoAAAARSJoMAQNAAAUo2FDm9grmD67qqLLJQAAIbz3XtXu\nOSERNAAAEErTppkuQebRPAEAAEIhaAAAAKEQNAAAgFAIGgAAQCgEDQAAIBSCBgAAEApBAwAACIWg\nAQAAhELQAAAAQiFoQLmYNGlSpouANOJ8Vi6cT4SVctDgnDvCOfeyc+4751yec+74Ytbvnb9e/G2H\nc66KzxVWtfClVLlwPisXzifCKklNQ31JCySdJ8mH3MZL2k9Si/xbS+/96hIcGwAAZEjKE1Z579+Q\n9IYkOedcCpuu8d5vSPV4AAAgGsorp8FJWuCcW+mcm+acO7ycjgsAANKkPKbG/l7SCEkfSaot6RxJ\n7zjnDvPeLyhkmzqStHDhwnIoHspDTk6O5s+fn+liIE04n5UL57Nyibt21kn3vp33YdMSkmzsXJ6k\n33vvX05xu3ckLfPeDy3k+dMkPVXiggEAgNO990+nc4flUdOQzDxJPYt4fqqk0yV9I2lreRQIAIBK\noo6kvWXX0rTKVNDQUdZskZT3fp2ktEZHAABUIXPKYqcpBw3OufqS2sqSGyVpX+fcIZJ+9N4vd87d\nJKlV0PTgnLtA0lJJn8uin3MkHSWpfxrKDwAAyklJahq6SHpbNvaCl3RH/vLHJJ0tG4dhz7j1a+Wv\n00rSZkmfSurnvZ9VwjIDAIAMKFUiJAAAqDqYewIAAIRC0AAAAEKJXNDgnBvtnFvqnNvinJvrnOua\n6TKheM65a5JMTPa/hHWuyx8VdLNzbrpzrm2myouCwkxEV9z5c87Vds7d55xb65zb6Jx7nonpMqe4\nc+qc+1eS/9nXE9bhnEaAc+5y59w859wG59wq59yLzrn9k6xX5v+jkQoanHOnyJImr5F0qKRPJE11\nzu2a0YIhrM8kNVdsYrJewRPOucsknS/pXEmHSdokO7e1MlBO7KzIiehCnr+7JA2W9EdJR8qSn/9T\ntsVGEcJMLjhFBf9nsxKe55xGwxGS7pHUTdLRkmpKmuacqxusUG7/o977yNwkzZV0d9xjJ2mFpEsz\nXTZuxZ67ayTNL+L5lZLGxj1uJGmLpJMzXXZuO52rPEnHp3L+8h//IunEuHUOyN/XYZl+TVX9Vsg5\n/ZekF4rYhnMa0ZukXfPPQ6+4ZeXyPxqZmgbnXE1JnSXNCJZ5e1VvSuqRqXIhJfvlV4V+7Zx70jm3\npyQ55/aR/YqJP7cbJH
0gzm3khTx/XWRduOPX+ULSt+IcR1mf/OruRc65Cc65XeKe6yzOaVQ1kdUe\n/SiV7/9oZIIGWeRUXdKqhOWrZG8Gom2upGGSBkoaKWkfSbPyBwNrIfuAc24rpjDnr7mkbflfVIWt\ng2iZIuksSX0lXSqpt6TXnXPBwH0txDmNnPzzc5ek2d77IG+s3P5HMzWMNCoZ7338GOefOefmSVom\n6WRJizJTKgCF8d7/O+7h5865/0r6WlIf2QB+iKYJktqr6PmbykyUahrWStohi4biNZf0Q/kXB6Xh\nvc+R9KVsyPEfZPkpnNuKKcz5+0FSLedcoyLWQYR575fKvoeDjHvOacQ45+6VNEhSH+99/PxN5fY/\nGpmgwXu/XVK2pH7BsvxqmH4qo4k3UHaccw1kXz4r87+MflDBc9tIlgnMuY24kOcvW1JuwjoHSNpL\n0vvlVliUmHOutaRmik0myDmNkPyA4QRJR3nvv41/rjz/R6PWPDFe0qPOuWzZ9NljJdWT9GgmC4Xi\nOeduk/SKrEliD0nXStou6Zn8Ve6SdKVz7ivZlOfXy3rGTC73wmInxU1Ep2LOn/d+g3PuEUnjnXM/\nSdoo6R+S3vPezyvXFwNJRZ/T/Ns1su52P+Svd4usdnCqxDmNEufcBFl32OMlbXLOBTUKOd77rfl/\nl8//aKa7jiTpSnJe/gveIot+umS6TNxCnbdJ+R/QLbJs3Kcl7ZOwzt9l3YI2y76Y2ma63Nx+PTe9\nZV2vdiTc/hn2/EmqLetLvjb/C+k5Sbtn+rVV1VtR51Q24/AbsoBhq6Qlku6XtBvnNHq3Qs7jDkln\nJaxX5v+jTFgFAABCiUxOAwAAiDaCBgAAEApBAwAACIWgAQAAhELQAAAAQiFoAAAAoRA0AACAUAga\nAABAKAQNAAAgFIIGAAAQCkEDAAAI5f8BEXt19l83XNQAAAAASUVORK5CYII=\n",
"text/plain": [
"<matplotlib.figure.Figure at 0x106e39da0>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"import matplotlib.pyplot as plt\n",
"import matplotlib.ticker as ticker\n",
"%matplotlib inline\n",
"\n",
"plt.figure()\n",
"plt.plot(all_losses)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Evaluating at different \"temperatures\"\n",
"\n",
"In the `evaluate` function above, every time a prediction is made the outputs are divided by the \"temperature\" argument passed. Using a higher number makes all actions more equally likely, and thus gives us \"more random\" outputs. Using a lower value (less than 1) makes high probabilities contribute more. As we turn the temperature towards zero we are choosing only the most likely outputs.\n",
"\n",
"We can see the effects of this by adjusting the `temperature` argument."
]
},
{
"cell_type": "code",
"execution_count": 11,
"metadata": {
"collapsed": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Thoo head strant me reporce\n",
"O and hears of thou provand of treech.\n",
"\n",
"LUCI death in that to tellon is head thing come thou that to not him with your firsure but,\n",
"They here thyse of yet in thou thy meat to\n"
]
}
],
"source": [
"print(evaluate('Th', 200, temperature=0.8))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Lower temperatures are less varied, choosing only the more probable outputs:"
]
},
{
"cell_type": "code",
"execution_count": 12,
"metadata": {
"collapsed": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"This commanderence the forself to the the to the the the to the to the the formands\n",
"What to the strange the boy the the have the the to the to to the formands\n",
"That the the the the the the sorn the to th\n"
]
}
],
"source": [
"print(evaluate('Th', 200, temperature=0.2))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Higher temperatures more varied, choosing less probable outputs:"
]
},
{
"cell_type": "code",
"execution_count": 13,
"metadata": {
"collapsed": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"That,\n",
"henct wto Haste's, norsee'd stave brYiry's is dsem.\n",
"Hell hurss Heamous halloR:\n",
"Tht a readerty the!\n",
"\n",
"KuWhrate.\n",
"\n",
"VLOMAY, mere's no, toojecur' kong.\n",
"\n",
"DUKE VIx whJos ivistomzliben\n",
"The vrieglad bloot, \n"
]
}
],
"source": [
"print(evaluate('Th', 200, temperature=1.4))"
]
},
{
"cell_type": "markdown",
"metadata": {
"collapsed": true
},
"source": [
"# Exercises\n",
"\n",
"* Train with your own dataset, e.g.\n",
" * Text from another author\n",
" * Blog posts\n",
" * Code\n",
"* Increase number of layers and network size to get better results"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"**Next**: [Generating Names with a Conditional Character-Level RNN](https://github.com/spro/practical-pytorch/blob/master/conditional-char-rnn/conditional-char-rnn.ipynb)"
]
}
],
"metadata": {
"anaconda-cloud": {},
"kernelspec": {
"display_name": "Python [conda root]",
"language": "python",
"name": "conda-root-py"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.5.2"
}
},
"nbformat": 4,
"nbformat_minor": 1
}
================================================
FILE: char-rnn-generation/generate.py
================================================
# https://github.com/spro/practical-pytorch
import torch
from helpers import *
from model import *
def generate(decoder, prime_str='A', predict_len=100, temperature=0.8):
    hidden = decoder.init_hidden()
    prime_input = char_tensor(prime_str)
    predicted = prime_str

    # Use priming string to "build up" hidden state
    for p in range(len(prime_str) - 1):
        _, hidden = decoder(prime_input[p], hidden)
    inp = prime_input[-1]

    for p in range(predict_len):
        output, hidden = decoder(inp, hidden)

        # Sample from the network as a multinomial distribution
        output_dist = output.data.view(-1).div(temperature).exp()
        top_i = torch.multinomial(output_dist, 1)[0]

        # Add predicted character to string and use as next input
        predicted_char = all_characters[top_i]
        predicted += predicted_char
        inp = char_tensor(predicted_char)

    return predicted

if __name__ == '__main__':
    # Parse command line arguments
    import argparse
    argparser = argparse.ArgumentParser()
    argparser.add_argument('filename', type=str)
    argparser.add_argument('-p', '--prime_str', type=str, default='A')
    argparser.add_argument('-l', '--predict_len', type=int, default=100)
    argparser.add_argument('-t', '--temperature', type=float, default=0.8)
    args = argparser.parse_args()

    decoder = torch.load(args.filename)
    del args.filename
    print(generate(decoder, **vars(args)))
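The `div(temperature).exp()` line above is where the temperature knob acts. A minimal sketch of the same arithmetic in pure Python (toy logits, no PyTorch; the values here are made up for illustration) shows why low temperatures collapse onto the most likely character while high temperatures flatten the distribution:

```python
import math

# Toy unnormalized scores ("logits") over four characters (hypothetical values)
logits = [2.0, 1.0, 0.5, 0.1]

def sampling_dist(logits, temperature):
    # Same transform as output.div(temperature).exp() in generate(),
    # normalized so the weights form a probability distribution
    weights = [math.exp(score / temperature) for score in logits]
    total = sum(weights)
    return [w / total for w in weights]

sharp = sampling_dist(logits, 0.2)  # low temperature: almost all mass on argmax
flat = sampling_dist(logits, 1.4)   # high temperature: closer to uniform
```

With these toy logits, temperature 0.2 puts over 99% of the probability mass on the first character, while 1.4 spreads it out — exactly the behavior seen in the notebook's sampling experiments.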
================================================
FILE: char-rnn-generation/helpers.py
================================================
# https://github.com/spro/practical-pytorch
import unidecode
import string
import random
import time
import math
import torch
from torch.autograd import Variable
# Reading and un-unicode-encoding data
all_characters = string.printable
n_characters = len(all_characters)
def read_file(filename):
    file = unidecode.unidecode(open(filename).read())
    return file, len(file)

# Turning a string into a tensor
def char_tensor(string):
    tensor = torch.zeros(len(string)).long()
    for c in range(len(string)):
        tensor[c] = all_characters.index(string[c])
    return Variable(tensor)

# Readable time elapsed
def time_since(since):
    s = time.time() - since
    m = math.floor(s / 60)
    s -= m * 60
    return '%dm %ds' % (m, s)
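`char_tensor` above is just an index lookup into `string.printable`. A dependency-free sketch of the same mapping (the tensor wrapping is dropped, so plain lists come back):

```python
import string

all_characters = string.printable  # 100 printable ASCII characters

def char_indexes(s):
    # The same lookup char_tensor performs before wrapping in a LongTensor
    return [all_characters.index(c) for c in s]

# string.printable starts with the digits 0-9, so 'a' sits at index 10
print(char_indexes('abc'))  # [10, 11, 12]
```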
================================================
FILE: char-rnn-generation/model.py
================================================
# https://github.com/spro/practical-pytorch
import torch
import torch.nn as nn
from torch.autograd import Variable
class RNN(nn.Module):
    def __init__(self, input_size, hidden_size, output_size, n_layers=1):
        super(RNN, self).__init__()
        self.input_size = input_size
        self.hidden_size = hidden_size
        self.output_size = output_size
        self.n_layers = n_layers

        self.encoder = nn.Embedding(input_size, hidden_size)
        self.gru = nn.GRU(hidden_size, hidden_size, n_layers)
        self.decoder = nn.Linear(hidden_size, output_size)

    def forward(self, input, hidden):
        input = self.encoder(input.view(1, -1))
        output, hidden = self.gru(input.view(1, 1, -1), hidden)
        output = self.decoder(output.view(1, -1))
        return output, hidden

    def init_hidden(self):
        return Variable(torch.zeros(self.n_layers, 1, self.hidden_size))
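Most of this model's parameters live in the GRU. As a rough sanity check, a small pure-Python helper counts them from the parameter shapes documented for PyTorch's `nn.GRU`, using the training script's defaults of `hidden_size=50`, `n_layers=2`; the helper itself is illustrative, not part of the repo:

```python
def gru_param_count(input_size, hidden_size, n_layers):
    # Per layer, a GRU holds three gates' worth of weights:
    #   weight_ih: (3*hidden, in), weight_hh: (3*hidden, hidden),
    #   plus bias_ih and bias_hh vectors of size 3*hidden each
    total = 0
    in_size = input_size
    for _ in range(n_layers):
        total += 3 * hidden_size * in_size       # weight_ih_l{k}
        total += 3 * hidden_size * hidden_size   # weight_hh_l{k}
        total += 2 * 3 * hidden_size             # bias_ih_l{k}, bias_hh_l{k}
        in_size = hidden_size  # upper layers consume the layer below's output
    return total

print(gru_param_count(50, 50, 2))  # 30600
```

Because the embedding maps characters to `hidden_size`-dimensional vectors, the GRU's input size equals its hidden size here.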
================================================
FILE: char-rnn-generation/train.py
================================================
# https://github.com/spro/practical-pytorch
import torch
import torch.nn as nn
from torch.autograd import Variable
import argparse
import os
from helpers import *
from model import *
from generate import *
# Parse command line arguments
argparser = argparse.ArgumentParser()
argparser.add_argument('filename', type=str)
argparser.add_argument('--n_epochs', type=int, default=2000)
argparser.add_argument('--print_every', type=int, default=100)
argparser.add_argument('--hidden_size', type=int, default=50)
argparser.add_argument('--n_layers', type=int, default=2)
argparser.add_argument('--learning_rate', type=float, default=0.01)
argparser.add_argument('--chunk_len', type=int, default=200)
args = argparser.parse_args()
file, file_len = read_file(args.filename)
def random_training_set(chunk_len):
    start_index = random.randint(0, file_len - chunk_len)
    end_index = start_index + chunk_len + 1
    chunk = file[start_index:end_index]
    inp = char_tensor(chunk[:-1])
    target = char_tensor(chunk[1:])
    return inp, target

decoder = RNN(n_characters, args.hidden_size, n_characters, args.n_layers)
decoder_optimizer = torch.optim.Adam(decoder.parameters(), lr=args.learning_rate)
criterion = nn.CrossEntropyLoss()

start = time.time()
all_losses = []
loss_avg = 0

def train(inp, target):
    hidden = decoder.init_hidden()
    decoder.zero_grad()
    loss = 0

    for c in range(args.chunk_len):
        output, hidden = decoder(inp[c], hidden)
        loss += criterion(output, target[c])

    loss.backward()
    decoder_optimizer.step()

    return loss.data[0] / args.chunk_len

def save():
    save_filename = os.path.splitext(os.path.basename(args.filename))[0] + '.pt'
    torch.save(decoder, save_filename)
    print('Saved as %s' % save_filename)

try:
    print("Training for %d epochs..." % args.n_epochs)
    for epoch in range(1, args.n_epochs + 1):
        loss = train(*random_training_set(args.chunk_len))
        loss_avg += loss

        if epoch % args.print_every == 0:
            print('[%s (%d %d%%) %.4f]' % (time_since(start), epoch, epoch / args.n_epochs * 100, loss))
            print(generate(decoder, 'Wh', 100), '\n')

    print("Saving...")
    save()

except KeyboardInterrupt:
    print("Saving before quit...")
    save()
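The input/target split in `random_training_set` is the usual next-character shift: the target is the input sequence moved one step ahead. A one-line sketch makes the offset concrete:

```python
def input_target_pair(chunk):
    # Input drops the last character; target drops the first, so position i
    # of the target is the character that follows position i of the input
    return chunk[:-1], chunk[1:]

inp, target = input_target_pair('hello')
print(inp, target)  # hell ello
```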
================================================
FILE: conditional-char-rnn/conditional-char-rnn.ipynb
================================================
{
"cells": [
{
"cell_type": "markdown",
"metadata": {
"nbpresent": {
"id": "9a73330c-27c1-4957-8e95-c3b42bc14a71"
}
},
"source": [
"\n",
"\n",
"# Practical PyTorch: Generating Names with a Conditional Character-Level RNN\n",
"\n",
"[In the last tutorial](https://github.com/spro/practical-pytorch/blob/master/char-rnn-classification/char-rnn-classification.ipynb) we used a RNN to classify names into their language of origin. This time we'll turn around and generate names from languages. This model will improve upon the RNN we used to [generate Shakespeare one character at a time](https://github.com/spro/practical-pytorch/blob/master/char-rnn-generation/char-rnn-generation.ipynb) by adding another input (representing the language) so we can specify what kind of name to generate.\n",
"\n",
"```\n",
"> python generate.py Russian\n",
"Rovakov\n",
"Uantov\n",
"Shavakov\n",
"\n",
"> python generate.py German\n",
"Gerren\n",
"Ereng\n",
"Rosher\n",
"\n",
"> python generate.py Spanish\n",
"Salla\n",
"Parer\n",
"Allan\n",
"\n",
"> python generate.py Chinese\n",
"Chan\n",
"Hang\n",
"Iun\n",
"```\n",
"\n",
"Being able to \"prime\" the generator with a specific category brings us a step closer to the [Sequence to Sequence model](https://github.com/spro/practical-pytorch/blob/master/seq2seq-translation/seq2seq-translation.ipynb) used for machine translation."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Recommended Reading\n",
"\n",
"I assume you have at least installed PyTorch, know Python, and understand Tensors:\n",
"\n",
"* http://pytorch.org/ For installation instructions\n",
"* [Deep Learning with PyTorch: A 60-minute Blitz](https://github.com/pytorch/tutorials/blob/master/Deep%20Learning%20with%20PyTorch.ipynb) to get started with PyTorch in general\n",
"* [jcjohnson's PyTorch examples](https://github.com/jcjohnson/pytorch-examples) for an in depth overview\n",
"* [Introduction to PyTorch for former Torchies](https://github.com/pytorch/tutorials/blob/master/Introduction%20to%20PyTorch%20for%20former%20Torchies.ipynb) if you are former Lua Torch user\n",
"\n",
"It would also be useful to know about RNNs and how they work:\n",
"\n",
"* [The Unreasonable Effectiveness of Recurrent Neural Networks](http://karpathy.github.io/2015/05/21/rnn-effectiveness/) shows a bunch of real life examples\n",
"* [Understanding LSTM Networks](http://colah.github.io/posts/2015-08-Understanding-LSTMs/) is about LSTMs specifically but also informative about RNNs in general\n",
"\n",
"I also suggest the previous tutorials:\n",
"\n",
"* [Classifying Names with a Character-Level RNN](https://github.com/spro/practical-pytorch/blob/master/char-rnn-classification/char-rnn-classification.ipynb) for using an RNN to classify text into categories\n",
"* [Generating Shakespeare with a Character-Level RNN](https://github.com/spro/practical-pytorch/blob/master/char-rnn-generation/char-rnn-generation.ipynb) for using an RNN to generate one character at a time"
]
},
{
"cell_type": "markdown",
"metadata": {
"nbpresent": {
"id": "cc294dae-dd8f-4288-8d3c-bb9fd3ad19bc"
}
},
"source": [
"# Preparing the Data\n",
"\n",
"See [Classifying Names with a Character-Level RNN](https://github.com/spro/practical-pytorch/blob/master/char-rnn-classification/char-rnn-classification.ipynb) for more detail - we're using the exact same dataset. In short, there are a bunch of plain text files `data/names/[Language].txt` with a name per line. We split lines into an array, convert Unicode to ASCII, and end up with a dictionary `{language: [names ...]}`."
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {
"collapsed": false,
"nbpresent": {
"id": "6a9d80df-1d38-4c41-849c-95e38da98cc7"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"O'Neal\n"
]
}
],
"source": [
"import glob\n",
"import unicodedata\n",
"import string\n",
"\n",
"all_letters = string.ascii_letters + \" .,;'-\"\n",
"n_letters = len(all_letters) + 1 # Plus EOS marker\n",
"EOS = n_letters - 1\n",
"\n",
"# Turn a Unicode string to plain ASCII, thanks to http://stackoverflow.com/a/518232/2809427\n",
"def unicode_to_ascii(s):\n",
" return ''.join(\n",
" c for c in unicodedata.normalize('NFD', s)\n",
" if unicodedata.category(c) != 'Mn'\n",
" and c in all_letters\n",
" )\n",
"\n",
"print(unicode_to_ascii(\"O'Néàl\"))"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {
"collapsed": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"# categories: 18 ['Arabic', 'Chinese', 'Czech', 'Dutch', 'English', 'French', 'German', 'Greek', 'Irish', 'Italian', 'Japanese', 'Korean', 'Polish', 'Portuguese', 'Russian', 'Scottish', 'Spanish', 'Vietnamese']\n"
]
}
],
"source": [
"# Read a file and split into lines\n",
"def read_lines(filename):\n",
" lines = open(filename).read().strip().split('\\n')\n",
" return [unicode_to_ascii(line) for line in lines]\n",
"\n",
"# Build the category_lines dictionary, a list of lines per category\n",
"category_lines = {}\n",
"all_categories = []\n",
"for filename in glob.glob('../data/names/*.txt'):\n",
" category = filename.split('/')[-1].split('.')[0]\n",
" all_categories.append(category)\n",
" lines = read_lines(filename)\n",
" category_lines[category] = lines\n",
"\n",
"n_categories = len(all_categories)\n",
"\n",
"print('# categories:', n_categories, all_categories)"
]
},
{
"cell_type": "markdown",
"metadata": {
"nbpresent": {
"id": "4ff5f52a-2523-47f0-beba-f6c29d412e5f"
}
},
"source": [
"# Creating the Network\n",
"\n",
"This network extends [the last tutorial's RNN](#Creating-the-Network) with an extra argument for the category tensor, which is concatenated along with the others. The category tensor is a one-hot vector just like the letter input.\n",
"\n",
"We will interpret the output as the probability of the next letter. When sampling, the most likely output letter is used as the next input letter.\n",
"\n",
"I added a second linear layer `o2o` (after combining hidden and output) to give it more muscle to work with. There's also a dropout layer, which [randomly zeros parts of its input](https://arxiv.org/abs/1207.0580) with a given probability (here 0.1) and is usually used to fuzz inputs to prevent overfitting. Here we're using it towards the end of the network to purposely add some chaos and increase sampling variety.\n",
"\n",
""
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {
"collapsed": true,
"nbpresent": {
"id": "597a765d-634b-41a8-a0c6-be5c019da150"
}
},
"outputs": [],
"source": [
"import torch\n",
"import torch.nn as nn\n",
"from torch.autograd import Variable\n",
"\n",
"class RNN(nn.Module):\n",
" def __init__(self, input_size, hidden_size, output_size):\n",
" super(RNN, self).__init__()\n",
" self.input_size = input_size\n",
" self.hidden_size = hidden_size\n",
" self.output_size = output_size\n",
" \n",
" self.i2h = nn.Linear(n_categories + input_size + hidden_size, hidden_size)\n",
" self.i2o = nn.Linear(n_categories + input_size + hidden_size, output_size)\n",
" self.o2o = nn.Linear(hidden_size + output_size, output_size)\n",
" self.softmax = nn.LogSoftmax()\n",
" \n",
" def forward(self, category, input, hidden):\n",
" input_combined = torch.cat((category, input, hidden), 1)\n",
" hidden = self.i2h(input_combined)\n",
" output = self.i2o(input_combined)\n",
" output_combined = torch.cat((hidden, output), 1)\n",
" output = self.o2o(output_combined)\n",
" return output, hidden\n",
"\n",
" def init_hidden(self):\n",
" return Variable(torch.zeros(1, self.hidden_size))"
]
},
{
"cell_type": "markdown",
"metadata": {
"nbpresent": {
"id": "8ff6da45-57cd-46ca-b14a-3f560ce4d345"
}
},
"source": [
"# Preparing for Training\n",
"\n",
"First of all, helper functions to get random pairs of (category, line):"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"import random\n",
"\n",
"# Get a random category and random line from that category\n",
"def random_training_pair():\n",
" category = random.choice(all_categories)\n",
" line = random.choice(category_lines[category])\n",
" return category, line"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"For each timestep (that is, for each letter in a training word) the inputs of the network will be `(category, current letter, hidden state)` and the outputs will be `(next letter, next hidden state)`. So for each training set, we'll need the category, a set of input letters, and a set of output/target letters.\n",
"\n",
"Since we are predicting the next letter from the current letter for each timestep, the letter pairs are groups of consecutive letters from the line - e.g. for `\"ABCD<EOS>\"` we would create (\"A\", \"B\"), (\"B\", \"C\"), (\"C\", \"D\"), (\"D\", \"EOS\").\n",
"\n",
"\n",
"\n",
"The category tensor is a [one-hot tensor](https://en.wikipedia.org/wiki/One-hot) of size `<1 x n_categories>`. When training we feed it to the network at every timestep - this is a design choice, it could have been included as part of initial hidden state or some other strategy."
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {
"collapsed": false,
"nbpresent": {
"id": "cf311809-10bf-40f7-87e1-1952342f7f35"
}
},
"outputs": [],
"source": [
"# One-hot vector for category\n",
"def make_category_input(category):\n",
" li = all_categories.index(category)\n",
" tensor = torch.zeros(1, n_categories)\n",
" tensor[0][li] = 1\n",
" return Variable(tensor)\n",
"\n",
"# One-hot matrix of first to last letters (not including EOS) for input\n",
"def make_chars_input(chars):\n",
" tensor = torch.zeros(len(chars), n_letters)\n",
" for ci in range(len(chars)):\n",
" char = chars[ci]\n",
" tensor[ci][all_letters.find(char)] = 1\n",
" tensor = tensor.view(-1, 1, n_letters)\n",
" return Variable(tensor)\n",
"\n",
"# LongTensor of second letter to end (EOS) for target\n",
"def make_target(line):\n",
" letter_indexes = [all_letters.find(line[li]) for li in range(1, len(line))]\n",
" letter_indexes.append(n_letters - 1) # EOS\n",
" tensor = torch.LongTensor(letter_indexes)\n",
" return Variable(tensor)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"For convenience during training we'll make a `random_training_set` function that fetches a random (category, line) pair and turns them into the required (category, input, target) tensors."
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# Make category, input, and target tensors from a random category, line pair\n",
"def random_training_set():\n",
" category, line = random_training_pair()\n",
" category_input = make_category_input(category)\n",
" line_input = make_chars_input(line)\n",
" line_target = make_target(line)\n",
" return category_input, line_input, line_target"
]
},
{
"cell_type": "markdown",
"metadata": {
"nbpresent": {
"id": "53fb987f-4f42-4bf8-81ae-280ebdd19aee"
}
},
"source": [
"# Training the Network\n",
"\n",
"In contrast to classification, where only the last output is used, we are making a prediction at every step, so we are calculating loss at every step.\n",
"\n",
"The magic of autograd allows you to simply sum these losses at each step and call backward at the end. But don't ask me why initializing loss with 0 works."
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {
"collapsed": false,
"nbpresent": {
"id": "df50f546-6d02-4383-beab-90378f16576b"
}
},
"outputs": [],
"source": [
"def train(category_tensor, input_line_tensor, target_line_tensor):\n",
" hidden = rnn.init_hidden()\n",
" optimizer.zero_grad()\n",
" loss = 0\n",
" \n",
" for i in range(input_line_tensor.size()[0]):\n",
" output, hidden = rnn(category_tensor, input_line_tensor[i], hidden)\n",
" loss += criterion(output, target_line_tensor[i])\n",
"\n",
" loss.backward()\n",
" optimizer.step()\n",
" \n",
" return output, loss.data[0] / input_line_tensor.size()[0]"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"To keep track of how long training takes I am adding a `time_since(t)` function which returns a human readable string:"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"import time\n",
"import math\n",
"\n",
"def time_since(t):\n",
" now = time.time()\n",
" s = now - t\n",
" m = math.floor(s / 60)\n",
" s -= m * 60\n",
" return '%dm %ds' % (m, s)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Training is business as usual - call train a bunch of times and wait a few minutes, printing the current time and loss every `print_every` epochs, and keeping store of an average loss per `plot_every` epochs in `all_losses` for plotting later."
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {
"collapsed": false,
"nbpresent": {
"id": "81fde336-785e-461b-a751-718a5f6bff88"
},
"scrolled": true
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"0m 28s (5000 5%) 1.8674\n",
"0m 53s (10000 10%) 2.4155\n",
"1m 20s (15000 15%) 3.4203\n",
"1m 45s (20000 20%) 1.3962\n",
"2m 12s (25000 25%) 1.7427\n",
"2m 38s (30000 30%) 2.9514\n",
"3m 4s (35000 35%) 2.8836\n",
"3m 31s (40000 40%) 1.6728\n",
"3m 57s (45000 45%) 2.5014\n",
"4m 22s (50000 50%) 1.9687\n",
"4m 48s (55000 55%) 1.5595\n",
"5m 16s (60000 60%) 2.3830\n",
"5m 43s (65000 65%) 1.5155\n",
"6m 10s (70000 70%) 1.7967\n",
"6m 37s (75000 75%) 1.8564\n",
"7m 3s (80000 80%) 1.9873\n",
"7m 30s (85000 85%) 1.9569\n",
"7m 56s (90000 90%) 1.7553\n",
"8m 22s (95000 95%) 2.3103\n",
"8m 48s (100000 100%) 1.7575\n"
]
}
],
"source": [
"n_epochs = 100000\n",
"print_every = 5000\n",
"plot_every = 500\n",
"all_losses = []\n",
"loss_avg = 0 # Zero every plot_every epochs to keep a running average\n",
"learning_rate = 0.0005\n",
"\n",
"rnn = RNN(n_letters, 128, n_letters)\n",
"optimizer = torch.optim.Adam(rnn.parameters(), lr=learning_rate)\n",
"criterion = nn.CrossEntropyLoss()\n",
"\n",
"start = time.time()\n",
"\n",
"for epoch in range(1, n_epochs + 1):\n",
" output, loss = train(*random_training_set())\n",
" loss_avg += loss\n",
" \n",
" if epoch % print_every == 0:\n",
" print('%s (%d %d%%) %.4f' % (time_since(start), epoch, epoch / n_epochs * 100, loss))\n",
"\n",
" if epoch % plot_every == 0:\n",
" all_losses.append(loss_avg / plot_every)\n",
" loss_avg = 0"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Plotting the Network\n",
"\n",
"Plotting the historical loss from all_losses shows the network learning:"
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {
"collapsed": false,
"scrolled": true
},
"outputs": [
{
"data": {
"text/plain": [
"[<matplotlib.lines.Line2D at 0x1102b8f60>]"
]
},
"execution_count": 10,
"metadata": {},
"output_type": "execute_result"
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAg0AAAFkCAYAAACjCwibAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAAPYQAAD2EBqD+naQAAIABJREFUeJzt3XecVOX1x/HvoYs00YCoiBo1YAnKqhDsNZZYMLYVI5JY\nAc0PNZY0C2rUREUFYkw0wUaMsXdjUOyKgA17xQZEwQXpu3t+f5y9zuyy5c6yuzO7fN6v176WuXPL\nM3OHvWfOc57nmrsLAACgLq3y3QAAANA8EDQAAIBUCBoAAEAqBA0AACAVggYAAJAKQQMAAEiFoAEA\nAKRC0AAAAFIhaAAAAKkQNAAAgFRyChrM7GQze9XMSip+njOzfWtZf4iZPWZmc7PW32fVmw0AAJpa\nrpmGTyWdLWmApCJJkyXda2b9alh/F0mPSdqvYpsnJN1vZv3r11wAAJAvtqo3rDKzryWd6e5/T7n+\nG5L+6e4XrdKBAQBAk2pT3w3NrJWkIyR1lPR8ym1MUmdJ8+p7XAAAkB85Bw1mtpUiSOggaaGkIe7+\ndsrNfyVpTUn/quMYa0v6saSPJS3NtY0AAKzGOkjaSNKj7v51Q+64PpmGtyX1l9RV0mGSbjKzXeoK\nHMzsaEm/k3SQu39VxzF+LOnWerQNAACEoZJua8gd5hw0uHuppA8rHs4wsx0k/VLSKTVtY2ZHSbpe\n0mHu/kSKw3wsSbfccov69aupxhLNyejRo3XVVVfluxloIJzPloXz2bK89dZbOuaYY6SKa2lDqndN\nQ5ZWktrX9KSZFUv6m6Qj3f2RlPtcKkn9+vXTgAEDVr2FyLuuXbtyLlsQzmfLwvlssRq8ez+noMHM\nLpH0sKRZioLGoZJ2lbRPxfN/kLSeuw+reHy0pH9IOk3SVDPrWbGrJe6+oCFeAAAAaBq5ztPQQ9JE\nRV3D44q5GvZx98kVz68rqXfW+idIai1pvKQvsn7GrkKbAQBAHuSUaXD34+t4fniVx7vXp1EAAKDw\ncO8JNIni4uJ8NwENiPPZsnA+kRZBA5oEf5RaFs5ny8L5RFoEDQAAIBWCBgAAkApBAwAASIWgAQAA\npELQAAAAUiFoAAAAqRA0AACAVAgaAABAKgQNAAAgFYIGAACQCkEDAABIhaABAACkQtAAAABSIWgA\nAACpEDQAAIBUCBoAAEAqBA0AACAVggYAAJAKQQMAAEiFoAEAAKRC0AAAAFIhaAAAAKkQNAAAgFQI\nGgAAQCoEDQAAIBWCBgAAkApBAwAASIWgAQAApELQAAAAUiFoAAAAqRA0AACAVHIKGszsZDN71cxK\nKn6eM7N969hmNzObZmZLzexdMxu2ak0GAAD5kGum4VNJZ0saIKlI0mRJ95pZv+pWNrONJD0g6b+S\n+ku6WtLfzGzverYXAADkSZtcVnb3B6ss+q2ZnSJpkKS3qtnkFEkfuvtZFY/fMbOdJI2W9J9cGwsA\nAPKn3jUNZtbKzI6S1FHS8zWsNkjS41WWPSrpR/U9LgAAyI+cMg2SZGZbKYKEDpIWShri7m/XsPq6\nkuZUWTZHUhcza+/uy3I9PgAAyI/6ZBreVtQn7CDpz5JuMrO+DdoqAABQcHLONLh7qaQPKx7OMLMd\nJP1SUb9Q1WxJPass6ylpQZosw+jRo9W1a9dKy4qLi1VcXJxrswEAaHEmTZqkSZMmVVpWUlLSaMcz\nd1+1HZj9V9In7v7zap67VNJ+7t4/a9ltkrq5+/617HOApGnTpk3TgAEDVql9AACsTqZPn66ioiJJ\nKnL36Q2575wyDWZ2iaSHJc2S1FnSUEm7Stqn4vk/SFrP3ZO5GK6TNNLMLpN0o6Q9JR0mqcaAAQAA\nFKZcuyd6SJooqZekEkmvSdrH3SdXPL+upN7Jyu7+sZkdIOkqSadJ+kzSL9y96ogKAABQ4HKdp+H4\nOp4fXs2ypxQTQQEAgGaMe08AAIBUCBoAAEAq
BA0AACAVggYAAJAKQQMAAEiFoAEAAKRC0AAAAFIh\naAAAAKkQNAAAgFQIGgAAQCoEDQAAIBWCBgAAkApBAwAASIWgAQAApFLQQUNpab5bAAAAEgUdNKxY\nke8WAACAREEHDcuX57sFAAAgQdAAAABSKeigge4JAAAKR0EHDcuW5bsFAAAgUdBBA5kGAAAKR0EH\nDdQ0AABQOAgaAABAKgUdNNA9AQBA4SjooIFCSAAACkdBBw1kGgAAKBwFHTRQ0wAAQOEo6KCBTAMA\nAIWjoIMGMg0AABQOggYAAJAKQQMAAEiFoAEAAKRS0EEDhZAAABSOgg4ayDQAAFA4cgoazOxcM3vJ\nzBaY2Rwzu9vMNk+x3VAze8XMFpnZF2Z2g5l1r2s7ggYAAApHrpmGnSVdK2mgpL0ktZX0mJmtUdMG\nZrajpImS/ippC0mHSdpB0vV1HYygAQCAwtEml5Xdff/sx2Z2nKS5kookPVPDZoMkfeTu4ysef2Jm\nf5F0Vl3HI2gAAKBwrGpNQzdJLmleLes8L6m3me0nSWbWU9Lhkh6sa+cUQgIAUDjqHTSYmUkaK+kZ\nd3+zpvXc/TlJx0i63cyWS/pS0nxJo+o6BpkGAAAKR07dE1VMUNQo7FjbSma2haSrJZ0v6TFJvST9\nSdJfJB1f27avvTZaBx3UtdKy4uJiFRcX17vRAAC0FJMmTdKkSZMqLSspKWm045m7576R2ThJB0ra\n2d1n1bHuTZI6uPsRWct2lPS0pF7uPqeabQZImlZUNE0vvzwg5/YBALC6mj59uoqKiiSpyN2nN+S+\nc+6eqAgYDpa0e10BQ4WOkkqrLCtX1EJYbRvSPQEAQOHIdZ6GCZKGSjpa0iIz61nx0yFrnUvMbGLW\nZvdL+qmZnWxmG1dkGa6W9KK7z67teBRCAgBQOHKtaThZkSF4ssry4ZJuqvh3L0m9kyfcfaKZdZI0\nUlHL8I2k/0o6p66DkWkAAKBw5DpPQ52ZCXcfXs2y8ZLGV7N6rQgaAAAoHNx7AgAApELQAAAAUino\noIFCSAAACkdBBw1kGgAAKBwFHTSsWCHVY+4pAADQCAo6aJDINgAAUCgKPmhYtizfLQAAABJBAwAA\nSImgAQAApELQAAAAUin4oGHp0ny3AAAASM0gaCDTAABAYSBoAAAAqRA0AACAVAgaAABAKgUfNFAI\nCQBAYSj4oIFMAwAAhYGgAQAApELQAAAAUinooKFNG4IGAAAKRUEHDe3aUQgJAEChKPiggUwDAACF\ngaABAACkQtAAAABSKeigoW1bggYAAApFQQcNFEICAFA4Cj5oINMAAEBhIGgAAACpEDQAAIBUCj5o\nWLIk360AAABSgQcNnTpJCxbkuxUAAEBqBkHDN9/kuxUAAEBqBkFDSUm+WwEAAKQcgwYzO9fMXjKz\nBWY2x8zuNrPNU2zXzswuNrOPzWypmX1oZsfVtV3nzgQNAAAUijY5rr+zpGslvVyx7R8kPWZm/dy9\ntpLFOyR9T9JwSR9I6qUUAUtS01BWJrVunWNLAQBAg8opaHD3/bMfV2QL5koqkvRMdduY2b6KYGMT\nd08qFGalOV7nzvF74UKpW7dcWgoAABraqtY0dJPkkubVss6BiszE2Wb2mZm9Y2Z/NLMOde28U6f4\nTTEkAAD5l2v3xHfMzCSNlfSMu79Zy6qbKDINSyUdImkdSX+W1F3SL2o7RpJpoK4BAID8q3fQIGmC\npC0k7VjHeq0klUs62t2/lSQzO13SHWY2wt1rnPPxmmtGS+qqESOktdeOZcXFxSouLl6FZgMA0DJM\nmjRJkyZNqrSspBG/aZu7576R2ThFt8PO7l5rfYKZ/UPSYHffPGtZX0kzJW3u7h9Us80ASdP+859p\n2nvvAbr3Xumgg3JuJgAAq53p06erqKhIkorcfXpD7jvnmoaKgOFgSbvXFTBUeFbSembWMWvZDxTZ\nh89q25Ca
BgAACkeu8zRMkDRU0tGSFplZz4qfDlnrXGJmE7M2u03S15L+bmb9zGwXSZdLuqG2rgkp\n7j3Rvj01DQAAFIJcMw0nS+oi6UlJX2T9HJG1Ti9JvZMH7r5I0t6KkRZTJd0s6V5Jv0xzwG7dyDQA\nAFAIcp2noc4gw92HV7PsXUk/zuVYia5dyTQAAFAICvreExKZBgAACkXBBw1kGgAAKAwFHzSQaQAA\noDAUfNBApgEAgMJA0AAAAFIp+KCB7gkAAApDwQcNZBoAACgMBR80dOsmLVsmLV2a75YAALB6K/ig\noWvX+E22AQCA/Cr4oKFbt/hNXQMAAPlV8EEDmQYAAApDwQcNZBoAACgMBR80kGkAAKAwFHzQ0Llz\n/CbTAABAfhV80NC6tdSlC5kGAADyreCDBim6KMg0AACQX80iaOjWjUwDAAD51iyCBjINAADkX7MI\nGsg0AACQf80iaOCmVQAA5F+zCBq4PTYAAPnXLIIGMg0AAORfswkayDQAAJBfzSJo6NZNWrBAKi/P\nd0sAAFh9NYugoUcPyV2aOzffLQEAYPXVLIKGPn3i96xZ+W0HAACrs2YRNGy4Yfz+5JP8tgMAgNVZ\nswgauneX1lyTTAMAAPnULIIGs8g2kGkAACB/mkXQIEVdA5kGAADyp9kEDWQaAADIr2YTNJBpAAAg\nv5pV0DBvnvTtt/luCQAAq6ecggYzO9fMXjKzBWY2x8zuNrPNc9h+RzNbYWbTc21oMuySbAMAAPmR\na6ZhZ0nXShooaS9JbSU9ZmZr1LWhmXWVNFHS47k2UspM8ERdAwAA+dEml5Xdff/sx2Z2nKS5kook\nPVPH5tdJulVSuaSDczmuJK23ntS6NZkGAADyZVVrGrpJcknzalvJzIZL2ljSBfU9UJs20vrrk2kA\nACBfcso0ZDMzkzRW0jPu/mYt620m6RJJO7l7eWxWPxtuSKYBAIB8qXfQIGmCpC0k7VjTCmbWStEl\ncZ67f5Asru8B+/Qh0wAAQL7UK2gws3GS9pe0s7t/WcuqnSVtJ2kbMxtfsaxV7MKWS9rH3Z+saePR\no0era9eu3z1+6y1p/vxiScX1aTYAAC3KpEmTNGnSpErLSkpKGu145u65bRABw8GSdnX3D+tY1yT1\nq7J4pKTdJf1U0sfuvqSa7QZImjZt2jQNGDDgu+V/+Ys0cqS0dGnUOAAAgMqmT5+uoqIiSSpy95yn\nOKhNTpdeM5ug+Jp/kKRFZtaz4qkSd19asc4lktZ392EeEcmbVfYxV9JSd38r18b26SOVlUlffJGZ\ntwEAADSNXEdPnCypi6QnJX2R9XNE1jq9JPVuiMZVlQQK1DUAAND0cgoa3L2Vu7eu5uemrHWGu/se\ntezjAncfUNPztUkmeProo/psDQAAVkWzufeEJK25ZmQb3qxxgCcAAGgszSpokKQtt5Rmzsx3KwAA\nWP0QNAAAgFSaZdDw0UfSokX5bgkAAKuXZhk0SDHREwAAaDrNLmjoVzFV1Btv5LcdAACsbppd0NCp\nk7TRRtQ1AADQ1Jpd0CBJW21F0AAAQFNrlkEDIygAAGh6zTZomDVLWrgw3y0BAGD10WyDBomZIQEA\naErNMmjo21cyo4sCAICm1CyDho4dpU02YdglAABNqVkGDZK0ww7SAw9IpaX5bgkAAKuHZhs0nHGG\n9N570u2357slAACsHppt0FBUJP3kJ9KYMVJZWb5bAwBAy9dsgwZJ+v3vpXfeke64I98tAQCg5WvW\nQcP220v77RfZhvLyfLcGAICWrVkHDZL0m9/EfA2PPJLvlgAA0LI1+6Bh8ODIOFx1Vb5bAgBAy9bs\ngwYzafRo6fHHpddfz3drAABouZp90CBJhx0mbbCBNHZsvlsCAEDL1SKChrZtpVGjpFtvlebOzXdr\nAABomVpE0CBJJ54Yv2++Ob/tAACgpWoxQcNaa0kHHCBNmpTvlgAA0DK1mK
BBkoqLpWnTYnppAADQ\nsFpU0HDAAVKnTmQbAABoDC0qaFhjDemQQyJocM93awAAaFlaVNAgRRfF229Lr76a75YAANCytLig\nYe+9pe7dpX/+M98tAQCgZWlxQUPbtnETq8mT890SAABalhYXNEjSjjtKM2ZIixfnuyUAALQcLTJo\nGDxYKi2VXn453y0BAKDlyCloMLNzzewlM1tgZnPM7G4z27yObYaY2WNmNtfMSszsOTPbZ9WaXbut\ntoqhl88915hHAQBg9ZJrpmFnSddKGihpL0ltJT1mZmvUss0ukh6TtJ+kAZKekHS/mfXPvbnptG4t\nDRpE0AAAQENqk8vK7r5/9mMzO07SXElFkp6pYZvRVRb9xswOlnSgpEYbGDl4sDR+fMzXYNZYRwEA\nYPWxqjUN3SS5pHlpNzAzk9Q5l23qY/Bg6euvmVIaAICGUu+goeLiP1bSM+7+Zg6b/krSmpL+Vd9j\npzFwYGQY6KIAAKBhrEqmYYKkLSQdlXYDMzta0u8kHe7uX63CsevUrZu05ZYEDQAANJScahoSZjZO\n0v6Sdnb3L1Nuc5Sk6yUd5u5PpNlm9OjR6tq1a6VlxcXFKi4uTtXOwYOlZ5+t/rnFi6WZM6Xtt0+1\nKwAACs6kSZM0qcpdGktKShrteOY53tmpImA4WNKu7v5hym2KJf1N0pHu/kCK9QdImjZt2jQNGDAg\np/Zlu/VW6ZhjpM8+k9Zfv/JzZ54pXXGF9O9/Sz/9ab0PAQBAQZk+fbqKiookqcjdpzfkvnOdp2GC\npKGSjpa0yMx6Vvx0yFrnEjObmPX4aEkTJZ0haWrWNl0a5iXUbP/9pTZtpHvuqby8tFS65Rapa1fp\nZz9jEigAANLItabhZEldJD0p6YusnyOy1uklqXfW4xMktZY0vso2Y+vV4hystZa0xx7SXXdVXv7o\no9KcOdJDD0n9+0sHHijNndvYrQEAoHnLdZ6GOoMMdx9e5fHuuTaqIQ0ZIo0aFcMv1147lk2cGLNG\n/uhH0T2xwQbSf/8bt9UGAADVa5H3nsh28MFSebl0//3xeP586b77pGHDYkjm+utL3/se8zkAAFCX\nFh809OoVoyjuvjse/+tf0ooV0tChmXU224ygAQCAurT4oEGKLopHH5VGjJDOOkvad98IJhKbbSa9\n+27m8eTJ0kknNX07AQAoZKtF0HDooVJZWYyiGDlSuvHGys9XzTT885/S9ddLixY1bTsBAChk9Zrc\nqbnZeGPpo4+kddeNIZhVbbZZ1DokxZKvvBLL335biqGuAABgtcg0SDFCorqAQYqgQYpsQ2mp9Prr\n8XjmzKZpGwAAzcFqkWmoS3bQ0KmTtHRpPH4zl9twAQDQwhE0KAKFXr0q1zUMHkzQAABAttWme6Iu\nyQiKV16JGojBg2vvnliyRDr7bKkR7wsCAEBBIWiokIygmDFD2nZbaYstonhy8eLq13/4Yenyy2Ok\nBQAAqwOChgpJ0PDKK5mgwV16553q13/44fh9551N10YAAPKJoKHC5ptLCxfG0MskaJAyXRSPPy69\n9Vb82z2Chh49pCeekObNy0+bAQBoSgQNFZIRFFIEDZ07S717RzHk3LlxD4vhwyNgmDlT+vzz6J4o\nLc3c1yKtb75p2LYDANAUCBoqfP/78btHj8wU01tsEUHDlVfGMMwXX5SefjqyDB07SkceGQWTVW+9\nXZtXX40bZL32WsO/BgAAGhNBQ4U11ojMwrbbxt0vpQgapk6Vxo+XfvUracstI7vwyCPS7rtLHTrE\nFNWPPip9+2264zz3XGQn7rij8V4LAACNgaAhyy9/KZ14YubxlltKX3wRt9Y+44wIHB58UJoyJW56\nJUXQsGyZ9NBD6Y6RTFF9zz0N23YAABobQUOWM86IICCRFEOOGBFdCsXFMR11WZm0337x3MYbx/0p\nbr013TFmzJB69pTeeEN6//2GbT8AAI
2JoKEWRUXSr38tnXNOPG7XTrrwwsgyJDUQkvTzn0cG4ssv\na99fcl+LU0+Nrg2yDQCA5oSgoRbt2kkXXxx3vkwMH56ZoyExdGis+/e/176/d9+Ngsodd5T22Ue6\n++6GbzMAAI2FoKEBdO0qHXGEdMMNUf9Qk6SeoX9/6ZBDpOefl2bPbpo2AgCwqggaGsgJJ0gffhiT\nPdXklVekPn2ktdaSDjwwRmnkOscDAAD5QtDQQAYPlvr1k377W+mii6RrrpFWrKi8zowZ0jbbxL/X\nWSf+/fzzTd9WAADqg6ChgZhF0eRnn0njxsXwzexJn9wz97VIbLNNTPYEAEBzQNDQgI45Rvr006hT\nGDRIuvnmzHNffCF99VUm0yDFv994Y+WMBAAAhYigoZEcc0zMHPm//8XjpAgyO2jo319avrzmO2kC\nAFBICBoayZFHRpfF7bfH4yeflLp1kzbcMLNO//7xOwkoavPtt9KECbWPzgAAoDERNDSSddaJWSNv\nvjmmmL7yyphZMrmvhRRDNTfeuHJdg3v1+xs/Xho5Upo2rfbj3nxz3IETAICGRtDQiH72M+mll2IO\nhwMOiNkkq+rfP5NpmDo1hmP+/vcxCVSirEy67rr494wZNR/v3/+Wjj1Wuvba+rf566+5dTcAoHoE\nDY3oJz+JbMImm8S9KVq3XnmdbbaJoME9hmm2aiVdemnlkRWPPCJ9/LHUpYs0fXr1x5o7VzrllNi+\ntrkiajJnjnT66XFvjaOPzn17AEDLR9DQiNZYI+ZhePJJqXPn6tfZZpsYVfHGG3G77HPOiWzCGmtI\nP/6x9NFHUctQVBSzSFYXNLhLJ50UXR8XXSS9/HJu2YIvv4ybc91wgzRwYAQd2ZmOXMydK915Z+Vl\nCxZIixfXvl15ubRoUf2OCQBoGgQNjaxfP6l795qfT4ohTz89LpzHHRe35H70UalTJ2nPPeNeFyNG\nRODw2msrD9G84464+dWf/xwFmOXl0lNPpW/j2WdHhuK996Srr46A4bnncn6p3+3r8MOlhQszy4YM\nkU47rfJ6VYOSCROkzTeP0SSri/oGZgCQLwQNedanT3RhPP543Ja7R49Y3qNHdEssXBijLo46KiaG\nWrZMevvtzPYLF0qjR0cW4qc/jcLKPn2kyZPTHf/ZZ6N48g9/iGNuvXXcBvy//839tXz+eXTDuMfd\nPKWox3jhhci2ZK/XvXvlwOTpp2Mui4ceyv24NbnppnhPaiouzafFi6X11os2AkBzQdCQZ2aZuRtO\nOqnyc5tuGhf1hx6SOnbMrJfdRTFmjDR/vnTVVZn97blnJmhYtCiGfb78cnyL//hjadKk6IqYMkUa\nNUrabru4vbcUGYc99qhf0HDttXHL7zZtMvUY770XF8gPPsjMWfHEE9KSJZVrL5ICz2RCLPeo0cie\nICtXDz0Us3KmDaCa0tSpcd7+9KfCDGoAoDpt8t0ASDvtJM2bJ+2228rPbb555t+dO8fj6dOlYcOk\nN9+MYOH886WNNsqst8ce0o03RnHjiSdK990Xy1u1yszzYJa5WL3wQjyX2HNP6eSToy6iW7d0r2Hh\nwhjhcdJJ0mOPZUaEZI/2ePHFKA5Nuk6mTs1s+/77ERTdf3+8Fy+8EPt7/fUYhVIf774bvy+9NF5T\nIXn22TgHr78e78euu+a7RQBQt5wyDWZ2rpm9ZGYLzGyOmd1tZpun2G43M5tmZkvN7F0zG1b/Jrc8\nF1wQF9TsORxqMmBABA3l5fFNfKONpDPPrLzO7rvH70MPjYvwnXdGQea4cRFAzJ0b3/TffDMu7gMH\nVt5+zz1j/1Om1N2esrKY0fK88yKr8ctfRp1GkmmYMSMmtOrRIwIBKS6SrVplgobXXosA5tJLY3+3\n3SadcUZkV154IQopc+UeQcPAgdH18/LLue+jMT37rLTXXlHzsipDZLGy88/PBIwAGpi7p/6R9JCk\nn0
nqJ2lrSQ9I+ljSGrVss5GkbyVdLukHkkZKWiFp71q2GSDJp02b5qjs8svd11zTffx4d8l98uTq\n1+vbN54fM6Z+x9l4Y/dTT635+bIy97/8xb179ziO5P5//xfPXXGF+xpruJeWuu+9t/vBB7sfdJD7\nnnu6f/llrHvEEfH788/dx41zb9vWfdky9333jW3N3O+4I9a5//7q21Be7n7hhe6HHhr/zvb557Ht\nXXe5f//77kOGuE+d6n799e4ff1y/96ShlJW5d+vmfsEFcR5bt3afNSu/bWop5s2L837aafluCZA/\n06ZNc0kuaYDncI1P87NqG0vrSCqXtFMt61wm6bUqyyZJeqiWbQgaavD443HW2rVzP/74mtebMCEu\n+mVl9TvO8ce7b7FF9c/NmuW+447RjuHDo01z5qzcxrffdl9nHffzz3e/5BL3zp3dJ02K5158MX7f\nc4/7L37hvs02se2tt8byX/wiAoE+fTLBSLbycvdf/zoTsPznP5Wff+KJWP7WWxHcJOtJ7vvtV7/3\npKHMnBntePxx9wUL4n359a9z38+UKe4zZlT/XHl5/c99czZ1ary3m22W75YA+dOYQcOqFkJ2q2jY\nvFrWGSTp8SrLHpX0o1U89mopubX22mtLf/xjzeudckpmsqj62Gef6L74+OPKy5ctiyGUs2bFiIgb\nb4zujGTUh5QZRvrgg5k7ew4aFLUL118fBZ7bby/17BldFDNmZF7XoYdKZ50VozmSos7Hq356FLNm\nXnJJFBL27y9dcUXl5997L177JptEkectt0QXzcSJMYT1xRfr977U17ffZuahePbZmOhr4MCoUxk6\nNNqXS0Hku+/GNOVnnFH9c337xqia1c3778fv996L4lsADay+0YYkU3RPTKljvXcknV1l2X6SyiS1\nr2EbMg21OOWUlb9ZN7QFC9w7dHD/059WPnb79u51nZr113ffeuv41vfJJ7G/Vq3i8c9/Huv85Cfu\nu+8eWZNrrql+P7fdFtt8+WVm2TXXxLLLL4/HN90Uj994I7POmWdGt0RVpaWRQUmyDbffHu2cPbv2\n1/Pgg3Wvc9ll7qefXv1zQ4ZEe0pK3IcNcx8wIPPc5MnR/hdeqH3/iWXL3IuKogunY0f35cszzz3z\nTHQZtWkTWZ7S0nT7dHdfssT9hBPc//739NsUmjFj3Lt0ie6ua6/Nd2uA/CjI7glJf5b0oaRedaxX\n76Bhl1128QMPPLDSz2233dbgbzCqd8gh7oMGZR4nXQd/+Uvd2+6/f6y71lqZeoMf/jCWTZwYjy+4\nIC58kvuX7J+vAAAcmklEQVRTT1W/n9mz4/lbb43Hd9wR25x5ZmadZcvc11svE4y4Rw1FTd0QSRfJ\nqadmjn/zzTW/ls8+i4CnpoAgsdlm8XqrdgssXBiBluT+s5+5b7pp5XqR0lL3Hj3czzij9v0nzj03\ngoJrr419vvRSLJ89O+pBdtnF/aGH4rmnn063z8WL3ffZJ7ZZYw33Dz9Mt12hGTYsPrO77x6fwWyl\npe7bbRefw8MOc3/ggbw0EWhQt91220rXyV122aWwggZJ4yR9ImnDFOtOkXRllWXHSZpfyzZkGgrA\nLbfEJ2TWLPdvvolvrkcdtXLRYXXOPTe23WOPzLITT4xlH30Ujx9+2L+rMygpqXlfW2/t/uMfR/1E\n27buRx+98oX50ksjY5FkA/r2df/lL6vfX2mpe79+/l3B3JZbRg1FTS65JNbdaqua15k1K/NaZs6s\n/Nydd8by88/PrPPPf1Ze5+STo36jrvd2ypQIdC65JIKlDh3cr7wynrv++ghu/ve/eH969qwcXNVk\nyZI4Tx07ut97r/uGG8b7neY85+L++90vuqhh91nVjju6H3NMZKHWWCNeW+Kdd+K9P+igCNy23rpx\n2wLkS0FlGioChk8lbZJy/UslvVpl2W2iELLgffNNXIjHjnU/++y4
qHz2Wbptb789Pl3Z386ffNJ9\n6NDMxeh//4t1Nt209n2dfnqst+GGmYtlVV9/HQHFVVe5r1gR/x4/vuZ9vvyy+403RltGjaq+K8M9\nnt9ss8gEJKM93OMb/PrrZ7pNJk6M51u3XjkTc+yxEZi4x+uX3D/9tPI6//2vf1cg6u7+7bcrt2X+\nfPfevSOTkHQ77LprjB5xj2/Wu+ySWf/4490337zm9yB5fccdF8HHlCmx7P77oy0NndTbZZfIkPzv\nfw2732w9e0Zw9vrr8RoefTTz3N13+3ddXUmX1ty5jdeWbJddFscEcvHww/G3LVcFEzRImiBpvqSd\nJfXM+umQtc4lkiZmPd5I0kLFKIofSBohabmkvWo5DkFDgfjJT+Jbefv27r//ffrtkm91taX93WNo\n52GH1b7OvHkRcNTVP3/IIZF+fv99r3ZERU3+/e/KF/Irr4xhoosXuz/7bDx3++3xDT/p70+GjCZ1\nFcOGxQiQoqIIEhIrVrivvXZmdMSiRREgVLVihfv3vhfdFr/5Tab7IVFeHlmerl2jRiTxm9/EdgsW\nxDlKsg7u7vfd59+NIKlJMnS36gXtsMPce/XKBGjl5dHdU1sgVpv58yOgStu9le1vf3P/3e/qXm/B\ngtj/LbdEe9dfv/LIm4svznSXffpprPuvf+XWlvooLY3z1r595boboDbffhv/Z6obPVaXQgoayitq\nEar+HJu1zt8lTa6y3S6SpklaIuk9ST+r4zgEDQXiH/+IT8m660bffC7uust96dLa15kyxf3NN+vf\nvmzJvA5jx/p3BZhpzJ2budgsXBhzKEgRhAwfHt0GZWURkBQXxzfldu1iqGTfvnER6t07MiKnnea+\nySaVX1/aIseTTop127Z13377mI8jmVMiubhX/fafdPFcfHH8/uCDzHOLF0eK/rLLqj/e889HcFJd\nN07yTf322+NxEjwdcEDdr6M6ybnZYovKXVZVff55DJtMrFgRwUu7dpH5qs2MGZXf6+HD3fv3zzx/\n9NHRfZHYbLMo7G1sr7wS7eraNQpgswtX3SMwy6VgFS3Tt9/GF6TE009n/vbm+vkomKChqX4IGgrH\nvHlRjf6Pf+S7JXVbsiTauu66kW7PZZ6CLbeMdP64cVEXkEy6JLmfd16s85vfRF3HFVfEhT0pqExS\n3Q88kOmW+eKL2OaMM6I9adoyc2ZkE954I2o81l8/uhxuuSWyHNVNWFRSEu3t3DkK/Ko6+GD3wYNX\nXl5e7v6jH0VmpOpFLLHTTu677Rb/Li6O17XBBnW/juoMHx7v8V//Gu3NHg3jHu9l797+Xc3H44/H\n8nvuySyr6zOYBCZffRWPkxqPpKtnm22iriZx4okR9DW2a66JoOepp+IzlXyeEjvtlK72BPXz8cdN\nk1FaVT//eeXC86uuynz2H3sst30RNCCv6soWFJLhw+NTnWuR28iRkSHYbDP3ww+PZX/7W6T+k8LN\nJGuw1lqRvi8rizqLtdeOi0FJSdR8SHEBW7EiaiVOOKF+r+Xee2NfZvG6ago8ttkm1qsuhZ8Us779\nduXlSYbikUdqPn4y3HXy5MhIDBoUj3PtYy0vj8DpzDNj26pdL+4RHG29dRSNDhwYWZ3y8shsbLed\n+8471z0p16WXxrf5pGbmtdeivUnXVocOkYVKJEFfEuBlW748sjZps1W1OfzwTIbjV7+K2qDkm+Oy\nZRGAbrBB/QtPly/PPQtYl7KyljM52OjR8ZnLrhN66in3996rfbtXX226L0tJd5pZdOW5R0HvwIFR\n8zVsWG77I2gAUkpmo0yKA9NKvqVKkYpPZP/hXL7cvVOnyhfb3/8+Hmd/Q9hoo+iHHDky/lilnX+h\nOqecEhmQ2tKTo0ZFG6r777J0aRQHjhiRWVZe7r7DDpFpqO1CtXRpBE3du8eFLumieOKJ3F7D9OmZ\n4MM9AoSddqq8Tu/eUWzrHhf5
pJupVauogRg/vu4iyuOPj8xJorQ0ztell8YFouo3tmRK8+wun88/\nj6nMkzlFWreOz0Yuvvkm8xlKAqZzzonHScFr0iWXBDZSvE/1cc45UfDakBf5k0+OIbir6qOPom23\n3LLq+6qvwYMrf/7Ky6Ow+Ygjat/uqKPic79iReO3Mfl8SvFlwT2yYKNGRWFv587R3ZgWQQOQUmlp\nFFfmes+NOXPif8MOO9R+IT3kkMguJBfxDz+M7bKngR46NOoRpEiRN7YXX4wREDW1+4IL4o9f0l/6\n4IOeulD0nHNi3ZNOij+e7dtX/raexkUXxR+9pKgyGWmSjESZP9+/qylJ7LtvLOvYMTI4s2dnAoia\n7Lab+5FHVl62xx5xzpKsTdXRP/36RbBRXh7rrLNOzPkxblwEGMXFEazceWdkmk44ITIUtRk9Otr6\n1luZi8GDD8ZzX38dj5N5R5JM0Jprxnmqyfjx8fqqnuPy8shmSTGxV02+/TbeuzRZwy++iOxHq1a1\nZ5WWL68cYFe1eHHUcJjF+U/qc8rK6jciIPH111EwXN09ZJYsiYt9ci+X5csjwyRl/iYk9Tq9etX8\nf6asLD4LUgR2DeXdd6t/z/72t3i/e/aMz09JSabwOvkMJfVFaRA0ADkoKam5n742p59e97fozz5b\neTTCvfdWHrr35z/H/6y6JoNqKnPmRJ/65ZdH3cTGG8c3/TTp8E8+ifkpktc8YEDlSbTcI5g44IBI\noVZN9b/7btQyDBmSWZZ8w0/+CCYFX6++mlknyU5kH2uvvWovotxgg6g7yXbuufFN/5JLKnddJEaM\niO6mDTeM4+2/f+VzuWJFZqRMknnIzipVtXx5ZGeSbNeNN1ZOObtHYW0ykddZZ8XjI4+snCXJlgz7\nlWKocLa3347lrVpVziZde23ldS+4oPKFsza//W0U0EorzyeS7Y9/jHWqS/NnD+V98snIJO2xR2Qe\nkllgn3xy5e2efDIumtlZk3Hj4tv2okUR+A4Y4DUW5b7wQjz3hz/E45dfjsc9e8bcI+6ZQmkpRlpV\nJ/n8Se433FDze5CroUPj/19VxxwT53/YMPdtt81k215/PZ7fYQf3Aw9MfxyCBqAZWbQohpoWUkX8\n8OHxzalTpwgCskdZ5Lqfqhe3q66KC2P37vGNcuTIKNo86KBY3rNnZg6IxMYbZ4aSTZgQ3+arzr9x\nzz2ZbIR7FFGaVd9FsXhx/DWrOgV2kmHYeefojqlqypToMx4xIgLG6gKp5cvjQvPkkzGktUOHmoPS\n5Hhnnhm/t9228ggO98h8JMHPvvvGxSCZbbW6eVCSbqG2bVcefnfFFZH9GTEigpXlyzPTkn//+/HN\nu6QkAqMePWLd2vryFy+Oz8mpp0aNSfbw4WzLlkUfvBTnL3HXXRFUrb22f1ck7B5ZreQ19O4d66y1\nVuVam3vuycycevfdsezLLzPL+vSJ+p3u3ePutknxcbZktFcyX0nSrfXb30aRdGlpfC633bbyEOqq\nLr00sj+bbtqwI2yKinyluqBk9NUZZ0R7zCJzmd01khRmV1d/Ux2CBgCr5NVX44/OEUesWtHc2LHx\nRzz5Y/bppxGIjBwZfwhHjow+7K22ir7k666rPCtj4uij49uTe/xRTia/qs1HH8VfrOpulf7GG/Fc\n1Wmzk2nIpdpn/UzrmWe81vqDIUPim3BpabwH0sq3mL/wwsx8Eb16RXZk3rw4P9ddt/I+TzwxLiqn\nnRYBWHYf+x57RIFo8s34gQfiwrrFFnGBvvDC+Nbdrl3MndKnT9Qq1JRluv76uGi9/35kQXr0qL5W\n4uabMxfyJItUXh71PNttFxmN7Im13GPZySdHzcf8+dE1tMkmkUUYNSpef1I0OmhQ7O/cc+PzNXVq\nvM7u3aN2p7zcfc8946Ke3eWSdKe1bh3HGDYsLtTJN/dp0yJ4GDMmRhsNH55p+4svZgL9PfeMrN
Ox\nx8braQjl5ZmaqOyuwQ8+iGX33Zf5jK+9duXhwd98E9mfiy9eeb8TJsR5zkbQAGCVzZ696lNDJ99i\nk0K+IUPiwlfXHApVjRsXF7UlS6Kr5Kij6t4mKWCr2gXhHt/QpOpvKrbxxvFc9sRX9bVoUeVZP5cv\njwvVs89Gt0abNu5XXx3PPfBAHPff/668j2TGzSR1ngwH3H33qFvIvgguXhzdKr/+dWb9pAj3m2/i\neOPGxXvTt2/UY0gxB8fZZ0dWpHv3uFhnt2nUqKgJ+OyzCPS6dYvz2KlTZELcM7eXf/nl6OL64Q+j\n62DZssie7LtvdHt07RqBzEsvxfrJcNm6fPRRZDN69Yrg4f/+Ly7aSRsffDD2nd3Nlx0wzZwZ5+KP\nf8wsO+SQzBTx//pXvCcjRsR5a9s2goik2HnkyMxstEkh9Jgx8Z63bx8ZtGuuie2q1oL84Q8RsGX/\nf3rjjfgcjh9ffddLMrJKikxGomoXVp8+sU7V+VOGDYv3qWoQt/328T5kZw8JGgAUhKSQb9KkTEFj\nLgVaiWnT/LvMQNeuUXOQxkEHxbfAbKWlcXGoaUhmMsdEbcNLc/HDH0bxpHvsU4o/+tttFxeYpPuk\nvDyG9lXtpkouHqeeGr+TFP1tt8V+eveOmoQvv4yagmSdJDA45phYP7nQJUOCk5R9cXE8Xrgw6jza\ntKlcNDhmTLznrVrFxbF79/hGf/75ERQk7Vm+PLqbzj8/3vNu3eL1bbppJjh47rlMkPKrX0UXyaqO\nNigri8xTp07R9qpTrmc79NDoekr07Zu5n8yQIdG25AZ5AwfGxbVTp3htyXv7+eeRFUqOl9xr5o03\n4nVJlSccmzMnU+icFDV+9VV0JyRZjlatVp6ILRk5s956mWHd7pHNyL7r7XHHxXpVZ9NNan+yg7Ky\nssxxs4d2EzQAKBgbbBCp2w4dolCxPtmLFSvij91pp3mNXQ7VufjiuJBlX4iT+SSS+3ZUdfXV8XxS\nUb+qfvGLTJ3CCSdE7cDYsXEhyb4Y1KS8PC6ua68d72H2a3nzzSiWS4Z8duxYufDyooti2fjxUdiX\n3a0za1ZkK7Lnlpg6tfpixoULoyvkT3+q/WZxhx4aXRutWsVF76WXogti++3jdaxYEQHI+efH8pNO\nqvv1p5EEpHXNT/DHP2b6/pcvj4v+hAkRwCTf6pMg6IwzvFIB5eefx+PDD4/fU6ZkiizXWy9e3+LF\nsc8//zlzzNNPjy6OPn0yAdwFF0T3wezZcSE/9tgIHrKzTOPHR9A1alRm1tiknmH06JVfe9WC6/Ly\nyKJkjxBKRlbstVfsOzn3BA0ACkZy2/Pttqu+XiGt3XbLTNld3fC56iTf1pJ7OCRZhqq3wc42f36k\ngBvKddfFBWHBgigaPOusWP711+nrRZLbkNfUXz57dgRDJ51U+V4ln34aIynatIntc7kfTH389a9x\nnOxM0NKllV/nIYdErYVU/X1V6mPZssjE1HWL9qeeiuPOmJEZSTJ5cqYbrWvXTDr/rrti2RVXZLbf\nZJNYloysmDkzArnsYGWbbTKjeD77LJ4///wYjdSuXQRr66wT3R2J0tLMcN3ks33aafFZTepBknvq\nSJW7M5YtqzmIvuKKylOqJ6/p3XcjY5S0gaABQMEYMya+Ka/qbInJ7dO7dEmfrUjGryfD4JJ5DmrK\nMjSGpGslud9HfY6dFOxVHb6aVllZ1FA09qyNy5fHBay24yT3RWmIrolcJTUm112XmXL8iy/iwtu5\ns/vee2fWnT8/MjFJd457pivg+eczy6ZOrTxq54QToktq+fLIIHTvHhftuXPjAv7DH0YmpuqIpJKS\nysWt++wT07q/9ZZ/181w7LHR3ZP2859kFpJRIxdeGBmr8vL4f9m+fbz+xgwaWgkAcnDOOdL770sb\nbrhq+xk8OH5vtZVklm6bLl2kLbeUXnhBWrZM+t3vpJ/8RN
phh1VrSy623lpq3166/HKpd29p++1z\n38e228bvH/6wfm1o1Ur63vfid2Nq2zbe39qOs/fe8fvQQ6U2bRq3PVV17Bjn48UXpbffjs/HuutK\n7dpJl10mjR6dWbdbN2nyZGmjjTLLRo2K9QYNyizbbjtpvfUqP545U+rXT7rpJumii6SuXeP9P/xw\n6bXX4vcmm1RuW5cu0sCB0n/+E4/feUfq21fafHOpUyfpiSekO+6Qhg9P//n//vejbVOmxOPXX4/X\nbyadeqp0yy1Sz56p3756aeJTDKC5a9Mm/iCuquQP9dZb577dCy9I114rzZolPfjgqrclF23bStts\nExeq445L/wc/26BBcWFLAqfmbNNNpfPOk4YOzc/xBw6Unn46Khj69cucj1NOqXvboqL4qc0uu8Tv\nLbeU7rqrcqB36qnSnXdKZ59d/bZ77y1dc4307bfxWf3BDyIA23ZbaezYCHyHDau7nQkzabfdKgcN\nSdDWtat02GHp91VfZBoA5MU668Qf9iOOyG27gQPjm9+YMdLJJ8eFoqltt138/ulP67f9hhtK//tf\n/bIUhcZMOv98abPN8nP8gQOlt96SXnopvsk3tL59pQULpHvvXTkzNHCgVFKSyRxVtdde0vz50j//\nGUFN0r6iImnRImnffaX118+tPbvuKk2bJn31lfTee5Gpa0pkGgDkzYQJuW8zaJBUXh4Xq/POa/g2\npXHoofEHe1UyBQ2RrUFcuN2lN9+UfvazxjlGx441P9euXc3PDRwode4sjRsXj3/wg/idZDd+/vPc\n27LrrlJZmXTDDfE710zdqiLTAKBZ6dcvagnGjIl+5XzYYw/p0Uel1q3zc3xk9O2bCcAaI9OwKtq2\nje6EV1+NzFr37rH8oIOkSy6J37nafPOoW0gC7qbONBA0AGhWWreWPv44+pOBVq0y3TyFFjRI0UUh\nVW5bly7SuedGUJErs8g2zJoVRZ2dOzdIM1MjaADQ7DT2qAE0L4MGRYFu1REMhSApVEy6JhrCrrvG\n76bumpCoaQAANHOnnZYZkVJo+vaN2obddmu4fRI0AABQTz16xHwShcgshgg3pC22kI4+WjrkkIbd\nbxoEDQAANCNm0q235ufY9AwCAIBUCBoAAEAqBA0AACAVggYAAJAKQQMAAEiFoAEAAKRC0AAAAFIh\naAAAAKkQNAAAgFQIGgAAQCoEDQAAIBWCBjSJSZMm5bsJaECcz5aF84m0cg4azGxnM7vPzD43s3Iz\nOyjFNkPN7BUzW2RmX5jZDWbWvX5NRnPEH6WWhfPZsnA+kVZ9Mg1rSnpF0ghJXtfKZrajpImS/ipp\nC0mHSdpB0vX1ODYAAMiTnG+N7e6PSHpEkszMUmwySNJH7j6+4vEnZvYXSWflemwAAJA/TVHT8Lyk\n3ma2nySZWU9Jh0t6sAmODQAAGkjOmYZcuftzZnaMpNvNrEPFMe+TNKqWzTpI0ltvvdXYzUMTKSkp\n0fTp0/PdDDQQzmfLwvlsWbKunR0aet/mXmdZQs0bm5VLOsTd76tlnS0k/UfSFZIek9RL0p8kTXX3\n42vY5mhJt9a7YQAAYKi739aQO2yKoOEmSR3c/YisZTtKelpSL3efU802a0v6saSPJS2tdwMBAFj9\ndJC0kaRH3f3rhtxxo3dPSOooaXmVZeWKkRfVFlJWvMgGjY4AAFiNPNcYO63PPA1rmll/M9umYtEm\nFY97Vzz/BzObmLXJ/ZJ+amYnm9nGFVmGqyW96O6zV/kVAACAJpFz94SZ7SrpCa08R8NEd/+5mf1d\nUh933yNrm5GSTpa0saRvJP1X0jnu/uWqNB4AADSdVappAAAAqw/uPQEAAFIhaAAAAKkUXNBgZiPN\n7CMzW2JmL5jZ9vluE+pmZudV3MAs++fNKutcWHHDssVm9h8z2zRf7UVlaW5EV9f5M7P2ZjbezL4y\ns4Vm9m8z69F0rwLZ6j
qnZvb3av7PPlRlHc5pATCzc83sJTNbYGZzzOxuM9u8mvUa/f9oQQUNZnak\nYhKo8yRtK+lVSY+a2Tp5bRjSekNST0nrVvzslDxhZmcrZgE9UXHDskWKc9suD+3Eymq9EV3K8zdW\n0gGSfippF0nrSbqzcZuNWqS5ueDDqvx/trjK85zTwrCzpGslDZS0l6S2kh4zszWSFZrs/6i7F8yP\npBckXZ312CR9JumsfLeNnzrP3XmSptfy/BeSRmc97iJpiaQj8t12flY6V+WSDsrl/FU8XiZpSNY6\nP6jY1w75fk2r+08N5/Tvku6qZRvOaYH+SFqn4jzslLWsSf6PFkymwczaSipSDMeUJHm8qscl/Shf\n7UJONqtIhX5gZrdkzd2xseJbTPa5XSDpRXFuC17K87edYrK47HXekTRLnONCtltFuvttM5tgZt2z\nnisS57RQdVNkj+ZJTft/tGCCBkXk1FpS1Wml5yjeDBS2FyQdp5j+O5mT4ykzW1Nx/lyc2+Yqzfnr\nKWl5xR+qmtZBYXlY0rGS9pB0lqRdJT1kZslMveuKc1pwKs7PWEnPuHtSN9Zk/0ebYhpprAbc/dGs\nh2+Y2UuSPpF0hKS389MqADVx939lPZxpZq9L+kDSbooJ/FCYJkjaQtKO+Th4IWUavpJUpoiGsvWU\nxHTTzYy7l0h6V9KmivNn4tw2V2nO32xJ7cysSy3roIC5+0eKv8NJxT3ntMCY2ThJ+0vazSvPqNxk\n/0cLJmhw9xWSpknaM1lWkYbZU4104w00HjPrpPjj80XFH6PZqnxuuygqgTm3BS7l+ZsmqbTKOj+Q\ntKGk55ussag3M9tA0tqSkosR57SAVAQMB0va3d1nZT/XlP9HC6174kpJ/zCzaZJekjRacZfMf+Sz\nUaibmf1RcXOyTyStL+kCSSsk/bNilbGSfmtm7ytueT5GMTLm3iZvLFZSUXuyqTJ3nt3EzPpLmufu\nn6qO8+fuC8zsBklXmtl8SQslXSPpWXd/qUlfDCTVfk4rfs5TDLebXbHeZYrs4KMS57SQmNkExXDY\ngyQtMrMko1Di7ksr/t00/0fzPXSkmqEkIype8BJF9LNdvtvET6rzNqniA7pEUY17m6SNq6xzvmJY\n0GLFH6ZN891ufr47N7sqhl6VVfm5Me35k9ReMZb8q4o/SHdI6pHv17a6/tR2TiV1kPSIImBYKulD\nSX+W9D3OaeH91HAeyyQdW2W9Rv8/yg2rAABAKgVT0wAAAAobQQMAAEiFoAEAAKRC0AAAAFIhaAAA\nAKkQNAAAgFQIGgAAQCoEDQAAIBWCBgAAkApBAwAASIWgAQAApPL/J59eIFK2fFMAAAAASUVORK5C\nYII=\n",
"text/plain": [
"<matplotlib.figure.Figure at 0x106e5bdd8>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"import matplotlib.pyplot as plt\n",
"import matplotlib.ticker as ticker\n",
"%matplotlib inline\n",
"\n",
"plt.figure()\n",
"plt.plot(all_losses)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Sampling the Network\n",
"\n",
"To sample we give the network a letter and ask what the next one is, feed that in as the next letter, and repeat until the EOS token.\n",
"\n",
"* Create tensors for input category, starting letter, and empty hidden state\n",
"* Create a string `output_str` with the starting letter\n",
"* Up to a maximum output length,\n",
" * Feed the current letter to the network\n",
" * Get the next letter from highest output, and next hidden state\n",
" * If the letter is EOS, stop here\n",
" * If a regular letter, add to `output_str` and continue\n",
"* Return the final name\n",
"\n",
"*Note*: Rather than supplying a starting letter every time we generate, we could have trained with a \"start of string\" token and had the network choose its own starting letter."
]
},
{
"cell_type": "code",
"execution_count": 13,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"max_length = 20\n",
"\n",
"# Generate given a category and starting letter\n",
"def generate_one(category, start_char='A', temperature=0.5):\n",
" category_input = make_category_input(category)\n",
" chars_input = make_chars_input(start_char)\n",
" hidden = rnn.init_hidden()\n",
"\n",
" output_str = start_char\n",
" \n",
" for i in range(max_length):\n",
" output, hidden = rnn(category_input, chars_input[0], hidden)\n",
" \n",
" # Sample as a multinomial distribution\n",
" output_dist = output.data.view(-1).div(temperature).exp()\n",
" top_i = torch.multinomial(output_dist, 1)[0]\n",
" \n",
" # Stop at EOS, or add to output_str\n",
" if top_i == EOS:\n",
" break\n",
" else: \n",
" char = all_letters[top_i]\n",
" output_str += char\n",
" chars_input = make_chars_input(char)\n",
"\n",
" return output_str\n",
"\n",
"# Get multiple samples from one category and multiple starting letters\n",
"def generate(category, start_chars='ABC'):\n",
" for start_char in start_chars:\n",
" print(generate_one(category, start_char))"
]
},
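{
"cell_type": "markdown",
"metadata": {},
"source": [
"The `temperature` argument in `generate_one` divides the outputs before exponentiating, so a small temperature sharpens the sampling distribution toward the most likely letter while a large one flattens it toward uniform. A minimal sketch with made-up logits (illustrative values, not from the trained model):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"import torch\n",
"\n",
"fake_logits = torch.tensor([2.0, 1.0, 0.5, -1.0])  # illustrative values\n",
"\n",
"def sampling_dist(logits, temperature):\n",
"    # Exponentiate the scaled logits; torch.multinomial normalizes implicitly,\n",
"    # but we normalize here to inspect the probabilities directly\n",
"    probs = (logits / temperature).exp()\n",
"    return probs / probs.sum()\n",
"\n",
"print(sampling_dist(fake_logits, 0.2))  # near-greedy\n",
"print(sampling_dist(fake_logits, 2.0))  # close to uniform"
]
},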
{
"cell_type": "code",
"execution_count": 14,
"metadata": {
"collapsed": false,
"scrolled": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Riberkov\n",
"Urtherdez\n",
"Shimanev\n"
]
}
],
"source": [
"generate('Russian', 'RUS')"
]
},
{
"cell_type": "code",
"execution_count": 15,
"metadata": {
"collapsed": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Gomen\n",
"Ester\n",
"Ront\n"
]
}
],
"source": [
"generate('German', 'GER')"
]
},
{
"cell_type": "code",
"execution_count": 16,
"metadata": {
"collapsed": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Sandar\n",
"Per\n",
"Alvareza\n"
]
}
],
"source": [
"generate('Spanish', 'SPA')"
]
},
{
"cell_type": "code",
"execution_count": 17,
"metadata": {
"collapsed": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Cha\n",
"Hang\n",
"Ini\n"
]
}
],
"source": [
"generate('Chinese', 'CHI')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The final versions of the scripts [in the Practical PyTorch repo](https://github.com/spro/practical-pytorch/tree/master/conditional-char-rnn) split the above code into a few files:\n",
"\n",
"* `data.py` (loads files)\n",
"* `model.py` (defines the RNN)\n",
"* `train.py` (runs training)\n",
"* `generate.py` (runs `generate()` with command line arguments)\n",
"\n",
"Run `train.py` to train and save the network.\n",
"\n",
"Then run `generate.py` with a language to view generated names: \n",
"\n",
"```\n",
"$ python generate.py Russian\n",
"Alaskinimhovev\n",
"Beranivikh\n",
"Chamon\n",
"```"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Exercises\n",
"\n",
"* Adjust the `temperature` argument to see how generation is affected \n",
"* Try with a different dataset of category -> line, for example:\n",
" * Fictional series -> Character name\n",
" * Part of speech -> Word\n",
" * Country -> City\n",
"* Use a \"start of sentence\" token so that sampling can be done without choosing a start letter\n",
"* Get better results with a bigger network"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"**Next**: [Translation with a Sequence to Sequence Network and Attention](https://github.com/spro/practical-pytorch/blob/master/seq2seq-translation/seq2seq-translation.ipynb)"
]
}
],
"metadata": {
"anaconda-cloud": {},
"kernelspec": {
"display_name": "Python [conda root]",
"language": "python",
"name": "conda-root-py"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.5.2"
},
"nbpresent": {
"slides": {
"10393c05-7962-4245-9228-8b7db4eb79a1": {
"id": "10393c05-7962-4245-9228-8b7db4eb79a1",
"prev": "22628fc4-8309-4579-ba36-e5b01a841473",
"regions": {
"335fd672-4ee6-4b7
SYMBOL INDEX (105 symbols across 17 files)
FILE: char-rnn-classification/data.py
function findFiles (line 9) | def findFiles(path): return glob.glob(path)
function unicodeToAscii (line 12) | def unicodeToAscii(s):
function readLines (line 20) | def readLines(filename):
function letterToIndex (line 36) | def letterToIndex(letter):
function lineToTensor (line 41) | def lineToTensor(line):
FILE: char-rnn-classification/model.py
class RNN (line 5) | class RNN(nn.Module):
method __init__ (line 6) | def __init__(self, input_size, hidden_size, output_size):
method forward (line 15) | def forward(self, input, hidden):
method initHidden (line 22) | def initHidden(self):
FILE: char-rnn-classification/predict.py
function evaluate (line 8) | def evaluate(line_tensor):
function predict (line 16) | def predict(line, n_predictions=3):
FILE: char-rnn-classification/server.py
function index (line 5) | def index(input_line):
FILE: char-rnn-classification/train.py
function categoryFromOutput (line 14) | def categoryFromOutput(output):
function randomChoice (line 19) | def randomChoice(l):
function randomTrainingPair (line 22) | def randomTrainingPair():
function train (line 33) | def train(category_tensor, line_tensor):
function timeSince (line 51) | def timeSince(since):
FILE: char-rnn-generation/generate.py
function generate (line 8) | def generate(decoder, prime_str='A', predict_len=100, temperature=0.8):
FILE: char-rnn-generation/helpers.py
function read_file (line 16) | def read_file(filename):
function char_tensor (line 22) | def char_tensor(string):
function time_since (line 30) | def time_since(since):
FILE: char-rnn-generation/model.py
class RNN (line 7) | class RNN(nn.Module):
method __init__ (line 8) | def __init__(self, input_size, hidden_size, output_size, n_layers=1):
method forward (line 19) | def forward(self, input, hidden):
method init_hidden (line 25) | def init_hidden(self):
FILE: char-rnn-generation/train.py
function random_training_set (line 26) | def random_training_set(chunk_len):
function train (line 42) | def train(inp, target):
function save (line 56) | def save():
FILE: conditional-char-rnn/data.py
function unicode_to_ascii (line 20) | def unicode_to_ascii(s):
function read_lines (line 27) | def read_lines(filename):
function random_training_pair (line 43) | def random_training_pair():
function make_category_input (line 48) | def make_category_input(category):
function make_chars_input (line 54) | def make_chars_input(chars):
function make_target (line 62) | def make_target(line):
function random_training_set (line 68) | def random_training_set():
FILE: conditional-char-rnn/generate.py
function generate_one (line 26) | def generate_one(category, start_char='A', temperature=0.5):
function generate (line 50) | def generate(category, start_chars='ABC'):
FILE: conditional-char-rnn/model.py
class RNN (line 7) | class RNN(nn.Module):
method __init__ (line 8) | def __init__(self, category_size, input_size, hidden_size, output_size):
method forward (line 20) | def forward(self, category, input, hidden):
method init_hidden (line 28) | def init_hidden(self):
FILE: conditional-char-rnn/train.py
function train (line 19) | def train(category_tensor, input_line_tensor, target_line_tensor):
function time_since (line 33) | def time_since(t):
function save (line 54) | def save():
FILE: reinforce-gridworld/helpers.py
function interpolate (line 1) | def interpolate(i, v_from, v_to, over):
class SlidingAverage (line 4) | class SlidingAverage:
method __init__ (line 5) | def __init__(self, name, steps=100):
method add (line 12) | def add(self, n):
method value (line 21) | def value(self):
method __str__ (line 25) | def __str__(self):
method __gt__ (line 28) | def __gt__(self, value): return self.value > value
method __lt__ (line 29) | def __lt__(self, value): return self.value < value
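This helper file is small enough to reconstruct: the body of `interpolate` leaks verbatim into one of the notebook previews further down, and `SlidingAverage`'s behavior follows from its method names. The `SlidingAverage` internals below are an assumption based on those names:

```python
def interpolate(i, v_from, v_to, over):
    """Linearly anneal from v_from to v_to over `over` steps, then hold
    at v_to (body matches the snippet shown in the file previews)."""
    return (v_from - v_to) * max(0, (1 - i / over)) + v_to

class SlidingAverage:
    """Running average over the last `steps` values (internals are an
    assumption reconstructed from the method names in the index)."""
    def __init__(self, name, steps=100):
        self.name = name
        self.steps = steps
        self.ns = []

    def add(self, n):
        self.ns.append(n)
        if len(self.ns) > self.steps:
            self.ns.pop(0)

    @property
    def value(self):
        if not self.ns:
            return 0
        return sum(self.ns) / len(self.ns)

    def __str__(self):
        return '%s=%.4f' % (self.name, self.value)

    def __gt__(self, value): return self.value > value
    def __lt__(self, value): return self.value < value

print(interpolate(0, 1.0, 0.1, 100))    # 1.0
print(interpolate(200, 1.0, 0.1, 100))  # 0.1
```

Annealing like this is commonly used for an exploration rate: it decays linearly for `over` episodes and then stays at the floor value.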
FILE: reinforce-gridworld/reinforce-gridworld.py
class Grid (line 55) | class Grid():
method __init__ (line 56) | def __init__(self, grid_size=8, n_plants=15):
method reset (line 60) | def reset(self):
method visible (line 84) | def visible(self, pos):
class Agent (line 93) | class Agent:
method reset (line 94) | def reset(self):
method act (line 97) | def act(self, action):
class Environment (line 109) | class Environment:
method __init__ (line 110) | def __init__(self):
method reset (line 114) | def reset(self):
method record_step (line 127) | def record_step(self):
method visible_state (line 135) | def visible_state(self):
method step (line 144) | def step(self, action):
class Policy (line 174) | class Policy(nn.Module):
method __init__ (line 175) | def __init__(self, hidden_size):
method forward (line 184) | def forward(self, x):
function select_action (line 197) | def select_action(e, state):
function run_episode (line 210) | def run_episode(e):
function finish_episode (line 230) | def finish_episode(e, actions, values, rewards):
FILE: seq2seq-translation/masked_cross_entropy.py
function sequence_mask (line 5) | def sequence_mask(sequence_length, max_len=None):
function masked_cross_entropy (line 19) | def masked_cross_entropy(logits, target, length):
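`sequence_mask` builds a boolean mask from a batch of sequence lengths so that `masked_cross_entropy` can zero out loss terms past each sequence's end. The repo's version operates on torch tensors; the underlying logic, sketched here in pure Python, is just "entry [i][j] is True while position j is inside sequence i":

```python
def sequence_mask(sequence_lengths, max_len=None):
    """Pure-Python sketch of the masking logic; the repo's version
    produces the same pattern as a torch tensor."""
    if max_len is None:
        max_len = max(sequence_lengths)
    return [[j < length for j in range(max_len)]
            for length in sequence_lengths]

print(sequence_mask([1, 3]))  # [[True, False, False], [True, True, True]]
```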
FILE: seq2seq-translation/seq2seq-translation-batched.py
class Lang (line 76) | class Lang:
method __init__ (line 77) | def __init__(self, name):
method index_words (line 84) | def index_words(self, sentence):
method index_word (line 88) | def index_word(self, word):
method trim (line 97) | def trim(self, min_count=3):
function unicode_to_ascii (line 117) | def unicode_to_ascii(s):
function normalize_string (line 124) | def normalize_string(s):
function read_langs (line 130) | def read_langs(lang1, lang2, reverse=False):
function filter_pair (line 159) | def filter_pair(p):
function filter_pairs (line 163) | def filter_pairs(pairs):
function prepare_data (line 166) | def prepare_data(lang1_name, lang2_name, reverse=False):
function indexes_from_sentence (line 209) | def indexes_from_sentence(lang, sentence):
function pad_seq (line 212) | def pad_seq(seq, max_length):
function random_batch (line 216) | def random_batch(batch_size=3):
class EncoderRNN (line 248) | class EncoderRNN(nn.Module):
method __init__ (line 249) | def __init__(self, input_size, hidden_size, n_layers=1, dropout=0.1):
method forward (line 260) | def forward(self, input_seqs, input_lengths, hidden=None):
class Attn (line 267) | class Attn(nn.Module):
method __init__ (line 268) | def __init__(self, method, hidden_size):
method forward (line 281) | def forward(self, hidden, encoder_outputs):
method score (line 305) | def score(self, hidden, encoder_output):
class LuongAttnDecoderRNN (line 337) | class LuongAttnDecoderRNN(nn.Module):
method __init__ (line 338) | def __init__(self, attn_model, hidden_size, output_size, n_layers=1, d...
method forward (line 359) | def forward(self, input_seq, last_context, last_hidden, encoder_outputs):
function train (line 510) | def train(input_batches, input_lengths, target_batches, target_lengths, ...
function as_minutes (line 609) | def as_minutes(s):
function time_since (line 614) | def time_since(since, percent):
function evaluate (line 621) | def evaluate(input_seq, max_length=MAX_LENGTH):
function evaluate_randomly (line 667) | def evaluate_randomly():
function show_plot_visdom (line 679) | def show_plot_visdom():
function show_attention (line 686) | def show_attention(input_sentence, output_words, attentions):
function evaluate_and_show_attention (line 705) | def evaluate_and_show_attention(input_sentence):
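Among the batching helpers above, `pad_seq` (line 212) is a one-liner worth sketching: it right-pads index sequences so variable-length sentences can be stacked into one batch tensor. The script defines a PAD token constant alongside SOS/EOS; its value of 0 here is an assumption:

```python
PAD_token = 0  # padding id; the constant exists in the repo, value assumed

def pad_seq(seq, max_length):
    """Right-pad a list of word indexes with PAD_token up to max_length."""
    seq += [PAD_token for _ in range(max_length - len(seq))]
    return seq

batch = [pad_seq(s, 5) for s in ([4, 5, 6], [7, 8, 9, 10, 11])]
print(batch)  # [[4, 5, 6, 0, 0], [7, 8, 9, 10, 11]]
```

Padding to the longest sequence in the batch is what makes `input_lengths` necessary later: the encoder packs the batch so the RNN skips the padded positions.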
Condensed preview — 50 files, each showing path, character count, and a content snippet.
[
{
"path": ".gitignore",
"chars": 69,
"preview": "*.swp\n*.swo\n*.pt\n.ipynb_checkpoints\n__pycache__\ndata/eng-*.txt\n*.csv\n"
},
{
"path": "LICENSE",
"chars": 1081,
"preview": "The MIT License (MIT)\n\nCopyright (c) 2017 Sean Robertson\n\nPermission is hereby granted, free of charge, to any person ob"
},
{
"path": "README.md",
"chars": 4063,
"preview": "**These tutorials have been merged into [the official PyTorch tutorials](https://github.com/pytorch/tutorials). Please g"
},
{
"path": "char-rnn-classification/.gitignore",
"chars": 48,
"preview": "*.pt\n*.swp\n*.swo\n__pycache__\n.ipynb_checkpoints\n"
},
{
"path": "char-rnn-classification/char-rnn-classification.ipynb",
"chars": 93252,
"preview": "{\n \"cells\": [\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": ["
},
{
"path": "char-rnn-classification/predict.py",
"chars": 826,
"preview": "from model import *\nfrom data import *\nimport sys\n\nrnn = torch.load('char-rnn-classification.pt')\n\n# Just return an outp"
},
{
"path": "char-rnn-classification/server.py",
"chars": 181,
"preview": "from bottle import route, run\nfrom predict import *\n\n@route('/<input_line>')\ndef index(input_line):\n return {'result'"
},
{
"path": "char-rnn-classification/train.py",
"chars": 2222,
"preview": "import torch\nfrom data import *\nfrom model import *\nimport random\nimport time\nimport math\n\nn_hidden = 128\nn_epochs = 100"
},
{
"path": "char-rnn-generation/README.md",
"chars": 1610,
"preview": "# Practical PyTorch: Generating Shakespeare with a Character-Level RNN\n\n## Dataset\n\nDownload [this Shakespeare dataset]("
},
{
"path": "char-rnn-generation/char-rnn-generation.ipynb",
"chars": 42205,
"preview": "{\n \"cells\": [\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": ["
},
{
"path": "conditional-char-rnn/train.py",
"chars": 1792,
"preview": "# Practical PyTorch: Generating Names with a Conditional Character-Level RNN\n# https://github.com/spro/practical-pytorch"
},
{
"path": "data/names/Arabic.txt",
"chars": 13190,
"preview": "Khoury\nNahas\nDaher\nGerges\nNazari\nMaalouf\nGerges\nNaifeh\nGuirguis\nBaba\nSabbagh\nAttia\nTahan\nHaddad\nAswad\nNajjar\nDagher\nMalo"
},
{
"path": "data/names/Chinese.txt",
"chars": 1246,
"preview": "Ang\nAu-Yong\nBai\nBan\nBao\nBei\nBian\nBui\nCai\nCao\nCen\nChai\nChaim\nChan\nChang\nChao\nChe\nChen\nCheng\nCheung\nChew\nChieu\nChin\nChong\n"
},
{
"path": "data/names/Czech.txt",
"chars": 3916,
"preview": "Abl\nAdsit\nAjdrna\nAlt\nAntonowitsch\nAntonowitz\nBacon\nBallalatak\nBallaltick\nBartonova\nBastl\nBaroch\nBenesch\nBetlach\nBiganska"
},
{
"path": "data/names/Dutch.txt",
"chars": 2351,
"preview": "Aalsburg\nAalst\nAarle\nAchteren\nAchthoven\nAdrichem\nAggelen\nAgteren\nAgthoven\nAkkeren\nAller\nAlphen\nAlst\nAltena\nAlthuis\nAmels"
},
{
"path": "data/names/English.txt",
"chars": 27125,
"preview": "Abbas\nAbbey\nAbbott\nAbdi\nAbel\nAbraham\nAbrahams\nAbrams\nAckary\nAckroyd\nActon\nAdair\nAdam\nAdams\nAdamson\nAdanet\nAddams\nAdderle"
},
{
"path": "data/names/French.txt",
"chars": 2162,
"preview": "Abel\nAbraham\nAdam\nAlbert\nAllard\nArchambault\nArmistead\nArthur\nAugustin\nBabineaux\nBaudin\nBeauchene\nBeaulieu\nBeaumont\nBélan"
},
{
"path": "data/names/German.txt",
"chars": 5553,
"preview": "Abbing\nAbel\nAbeln\nAbt\nAchilles\nAchterberg\nAcker\nAckermann\nAdam\nAdenauer\nAdler\nAdlersflügel\nAeschelman\nAlbert\nAlbrecht\nAl"
},
{
"path": "data/names/Greek.txt",
"chars": 1981,
"preview": "Adamidis\nAdamou\nAgelakos\nAkrivopoulos\nAlexandropoulos\nAnetakis\nAngelopoulos\nAntimisiaris\nAntipas\nAntonakos\nAntoniadis\nAn"
},
{
"path": "data/names/Irish.txt",
"chars": 1857,
"preview": "Adam\nAhearn\nAodh\nAodha\nAonghuis\nAonghus\nBhrighde\nBradach\nBradan\nBraden\nBrady\nBran\nBrannon\nBrian\nCallaghan\nCaomh\nCarey\nCa"
},
{
"path": "data/names/Italian.txt",
"chars": 5642,
"preview": "Abandonato\nAbatangelo\nAbatantuono\nAbate\nAbategiovanni\nAbatescianni\nAbbà\nAbbadelli\nAbbascia\nAbbatangelo\nAbbatantuono\nAbba"
},
{
"path": "data/names/Japanese.txt",
"chars": 7622,
"preview": "Abe\nAbukara\nAdachi\nAida\nAihara\nAizawa\nAjibana\nAkaike\nAkamatsu\nAkatsuka\nAkechi\nAkera\nAkimoto\nAkita\nAkiyama\nAkutagawa\nAmag"
},
{
"path": "data/names/Korean.txt",
"chars": 423,
"preview": "Ahn\nBaik\nBang\nByon\nCha\nChang\nChi\nChin\nCho\nChoe\nChoi\nChong\nChou\nChu\nChun\nChung\nChweh\nGil\nGu\nGwang \nHa\nHan\nHo\nHong\nHung\nHw"
},
{
"path": "data/names/Polish.txt",
"chars": 1149,
"preview": "Adamczak\nAdamczyk\nAndrysiak\nAuttenberg\nBartosz\nBernard\nBobienski\nBosko\nBroż\nBrzezicki\nBudny\nBukoski\nBukowski\nChlebek\nChm"
},
{
"path": "data/names/Portuguese.txt",
"chars": 549,
"preview": "Abreu\nAlbuquerque\nAlmeida\nAlves\nAraújo\nAraullo\nBarros\nBasurto\nBelo\nCabral\nCampos\nCardozo\nCastro\nCoelho\nCosta\nCrespo\nCruz"
},
{
"path": "data/names/Russian.txt",
"chars": 85168,
"preview": "Ababko\nAbaev\nAbagyan\nAbaidulin\nAbaidullin\nAbaimoff\nAbaimov\nAbakeliya\nAbakovsky\nAbakshin\nAbakumoff\nAbakumov\nAbakumtsev\nAb"
},
{
"path": "data/names/Scottish.txt",
"chars": 752,
"preview": "Smith\nBrown\nWilson\nCampbell\nStewart\nThomson\nRobertson\nAnderson\nMacdonald\nScott\nReid\nMurray\nTaylor\nClark\nRoss\nWatson\nMorr"
},
{
"path": "data/names/Spanish.txt",
"chars": 2231,
"preview": "Abana\nAbano\nAbarca\nAbaroa\nAbascal\nAbasolo\nAbel\nAbelló\nAberquero\nAbreu\nAcosta\nAgramunt\nAiza\nAlamilla\nAlbert\nAlbuquerque\nA"
},
{
"path": "data/names/Vietnamese.txt",
"chars": 339,
"preview": "Nguyen\nTron\nLe\nPham\nHuynh\nHoang\nPhan\nVu\nVo\nDang\nBui\nDo\nHo\nNgo\nDuong\nLy\nAn\nan\nBach\nBanh\nCao\nChau\nChu\nChung\nChu\nDiep\nDoan\n"
},
{
"path": "glove-word-vectors/glove-word-vectors.ipynb",
"chars": 12222,
"preview": "{\n \"cells\": [\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": ["
},
{
"path": "reinforce-gridworld/reinforce-gridworld.ipynb",
"chars": 321021,
"preview": "{\n \"cells\": [\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": ["
}
]