[
  {
    "path": ".other/technical_requirements.md",
    "content": "**Software and hardware specifications**\n========================================\n\n<table>\n<thead>\n<tr class=\"header\">\n<th><strong>Chapter number</strong></th>\n<th><strong>Software required<br />\n(with version)</strong></th>\n<th><strong>Hardware specifications</strong></th>\n<th><strong>OS required</strong></th>\n</tr>\n</thead>\n<tbody>\n<tr>\n<td rowspan=\"2\">1-12</td>\n<td>Python 3.8</td>\n<td rowspan=\"2\">x86/AMD64 system</td>\n<td rowspan=\"2\">Windows,<br />\nany Linux distro,<br />\nor macOS</td>\n</tr>\n<tr>\n<td>Any modern web browser,<br />such as Firefox, Edge, Safari,<br />or Chrome (recommended)</td>\n</tr>\n</tbody>\n</table>\n\n**\\*Note**: The code in this book is in the form of Python notebooks and can be executed in Google Colaboratory. If you wish to execute the notebooks on your local machine, the Python package requirements are listed in the following section.\n\n**Package specifications**\n==========================\n\n| **Package required** | **Version**           | **Installation command (pip)**   |\n|----------------------|-----------------------|----------------------------------|\n| Transformers         | 4.1.1 or higher       | `pip install transformers`       |\n| gensim               | 3.8.3 or higher       | `pip install gensim`             |\n| TensorFlow           | 2.4.0 or higher       | `pip install tensorflow`         |\n| NumPy                | 1.19.5 or higher      | `pip install numpy`              |\n| SciPy                | 1.6.0 or higher       | `pip install scipy`              |\n| pandas               | 1.2.0 or higher       | `pip install pandas`             |\n| Matplotlib           | 3.3.3 or higher       | `pip install matplotlib`         |\n| scikit-learn         | 0.24.0 or higher      | `pip install scikit-learn`       |\n| toposort             | 1.6.0 or higher       | `pip install toposort`           |\n| SentencePiece        | 0.1.94 or higher      | `pip install sentencepiece`      |\n| Trax                 | 1.3.7 or higher       | `pip install trax`               |\n| AllenNLP             | 1.0.0 or higher       | `pip install allennlp`           |\n| AllenNLP Models      | 1.0.0 or higher       | `pip install allennlp-models`    |\n| farm-haystack        | 0.6.0 or higher       | `pip install farm-haystack`      |\n| PyTorch              | 1.6.0+cu101 or higher | `pip install torch==1.6.0+cu101` |\n\n**\\*Note**: This isn’t an exhaustive list of all the packages required to run the code in this book, but only the essential and most commonly used ones. You will likely encounter a few more required packages as you read through the book, which you can install using `pip` or `conda`.\n"
  },
  {
    "path": "Chapter01/Multi_Head_Attention_Sub_Layer.ipynb",
    "content": "{\n  \"nbformat\": 4,\n  \"nbformat_minor\": 0,\n  \"metadata\": {\n    \"colab\": {\n      \"name\": \"Multi-Head Attention Sub-Layer.ipynb\",\n      \"provenance\": [],\n      \"collapsed_sections\": []\n    },\n    \"kernelspec\": {\n      \"name\": \"python3\",\n      \"display_name\": \"Python 3\"\n    },\n    \"accelerator\": \"GPU\",\n    \"widgets\": {\n      \"application/vnd.jupyter.widget-state+json\": {\n        \"946c90b82f7f46caa25c885668b75eab\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HBoxModel\",\n          \"state\": {\n            \"_view_name\": \"HBoxView\",\n            \"_dom_classes\": [],\n            \"_model_name\": \"HBoxModel\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"box_style\": \"\",\n            \"layout\": \"IPY_MODEL_4191af78535e4da8bb797690eff84e00\",\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"children\": [\n              \"IPY_MODEL_9ce3d57b96b64da0b15e3f3626bacb30\",\n              \"IPY_MODEL_f8da2c91156342a69d9b262f4f993aa4\"\n            ]\n          }\n        },\n        \"4191af78535e4da8bb797690eff84e00\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": 
null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"9ce3d57b96b64da0b15e3f3626bacb30\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"FloatProgressModel\",\n          \"state\": {\n            \"_view_name\": \"ProgressView\",\n            \"style\": \"IPY_MODEL_97370923218945c5b80ab468751ac8a7\",\n            \"_dom_classes\": [],\n            \"description\": \"Downloading: 100%\",\n            \"_model_name\": \"FloatProgressModel\",\n            \"bar_style\": \"success\",\n            \"max\": 230,\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": 230,\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"orientation\": \"horizontal\",\n            \"min\": 0,\n            
\"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_0ba4a91f472e4c41ba80ab4025288446\"\n          }\n        },\n        \"f8da2c91156342a69d9b262f4f993aa4\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HTMLModel\",\n          \"state\": {\n            \"_view_name\": \"HTMLView\",\n            \"style\": \"IPY_MODEL_15aa4b6f8f784c74804107be249126b9\",\n            \"_dom_classes\": [],\n            \"description\": \"\",\n            \"_model_name\": \"HTMLModel\",\n            \"placeholder\": \"​\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": \" 230/230 [00:01&lt;00:00, 185B/s]\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_edea457617ed4792aeeb65292019ceb4\"\n          }\n        },\n        \"97370923218945c5b80ab468751ac8a7\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"ProgressStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"ProgressStyleModel\",\n            \"description_width\": \"initial\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"bar_color\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"0ba4a91f472e4c41ba80ab4025288446\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            
\"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"15aa4b6f8f784c74804107be249126b9\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"DescriptionStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"DescriptionStyleModel\",\n            \"description_width\": \"\",\n            \"_view_module\": 
\"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"edea457617ed4792aeeb65292019ceb4\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": 
null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        }\n      }\n    }\n  },\n  \"cells\": [\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"aXACkAtfNpG0\",\n        \"colab_type\": \"text\"\n      },\n      \"source\": [\n        \"# The Attention Mechanism\\n\",\n        \"Copyright 2020, Denis Rothman, MIT License. Denis Rothman rewrote the reference notebook entirely in basic Python with no frameworks. Three more steps and a Hugging Face transformer example were added. The original images were taken out, redesigned by Denis Rothman for educational purposes, and inserted in the book's descriptions of the multi-head attention sub-layer.\\n\",\n        \"\\n\",\n        \"[The Reference Colaboratory Notebook was written by Manuel Romero](https://colab.research.google.com/drive/1rPk3ohrmVclqhH7uQ7qys4oznDdAhpzF)\\n\",\n        \"\\n\",\n        \"[A Medium article was written by Raimi Karim](https://towardsdatascience.com/illustrated-self-attention-2d627e33b20a)\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"veRoFjFRNXwJ\",\n        \"colab_type\": \"code\",\n        \"colab\": {}\n      },\n      \"source\": [\n        \"import numpy as np\\n\",\n        \"from scipy.special import softmax\"\n      ],\n      \"execution_count\": 0,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"JLe9lWCJNogW\",\n        \"colab_type\": \"code\",\n        \"outputId\": \"733e039b-343e-4161-9919-19b3a1ec130f\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 90\n        }\n      },\n      \"source\": [\n        \"print(\\\"Step 1: Input : 3 inputs, d_model=4\\\")\\n\",\n        \"x =np.array([[1.0, 0.0, 1.0, 0.0],   # Input 1\\n\",\n        \"             [0.0, 2.0, 0.0, 2.0],   # Input 2\\n\",\n        \"        
     [1.0, 1.0, 1.0, 1.0]])  # Input 3\\n\",\n        \"print(x)\"\n      ],\n      \"execution_count\": 0,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Step 1: Input : 3 inputs, d_model=4\\n\",\n            \"[[1. 0. 1. 0.]\\n\",\n            \" [0. 2. 0. 2.]\\n\",\n            \" [1. 1. 1. 1.]]\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"JZImwtHPN91V\",\n        \"colab_type\": \"code\",\n        \"outputId\": \"07706940-e200-4956-b957-fe9681139d0d\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 126\n        }\n      },\n      \"source\": [\n        \"print(\\\"Step 2: weights 3 dimensions x d_model=4\\\")\\n\",\n        \"print(\\\"w_query\\\")\\n\",\n        \"w_query =np.array([[1, 0, 1],\\n\",\n        \"                   [1, 0, 0],\\n\",\n        \"                   [0, 0, 1],\\n\",\n        \"                   [0, 1, 1]])\\n\",\n        \"print(w_query)\"\n      ],\n      \"execution_count\": 0,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Step 2: weights 3 dimensions x d_model=4\\n\",\n            \"w_query\\n\",\n            \"[[1 0 1]\\n\",\n            \" [1 0 0]\\n\",\n            \" [0 0 1]\\n\",\n            \" [0 1 1]]\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"7kRBS7MUOFgV\",\n        \"colab_type\": \"code\",\n        \"outputId\": \"8b0bcc03-88b1-4e8d-a483-dacc91ffa9ee\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 108\n        }\n      },\n      \"source\": [\n        \"print(\\\"w_key\\\")\\n\",\n        \"w_key =np.array([[0, 0, 1],\\n\",\n        \"                 [1, 1, 
0],\\n\",\n        \"                 [0, 1, 0],\\n\",\n        \"                 [1, 1, 0]])\\n\",\n        \"print(w_key)\"\n      ],\n      \"execution_count\": 0,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"w_key\\n\",\n            \"[[0 0 1]\\n\",\n            \" [1 1 0]\\n\",\n            \" [0 1 0]\\n\",\n            \" [1 1 0]]\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"Napm2VtkOIEN\",\n        \"colab_type\": \"code\",\n        \"outputId\": \"7331eb08-64d5-4a36-eeef-0a0a556f130b\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 108\n        }\n      },\n      \"source\": [\n        \"print(\\\"w_value\\\")\\n\",\n        \"w_value = np.array([[0, 2, 0],\\n\",\n        \"                    [0, 3, 0],\\n\",\n        \"                    [1, 0, 3],\\n\",\n        \"                    [1, 1, 0]])\\n\",\n        \"print(w_value)\"\n      ],\n      \"execution_count\": 0,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"w_value\\n\",\n            \"[[0 2 0]\\n\",\n            \" [0 3 0]\\n\",\n            \" [1 0 3]\\n\",\n            \" [1 1 0]]\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"JqapIgfDOQ7d\",\n        \"colab_type\": \"code\",\n        \"outputId\": \"fd610d7a-968a-47e6-d614-40ad03c1d172\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 108\n        }\n      },\n      \"source\": [\n        \"print(\\\"Step 3: Matrix multiplication to obtain Q,K,V\\\")\\n\",\n        \"\\n\",\n        \"print(\\\"Queries: x * w_query\\\")\\n\",\n        \"Q=np.matmul(x,w_query)\\n\",\n        
\"print(Q)\"\n      ],\n      \"execution_count\": 0,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Step 3: Matrix multiplication to obtain Q,K,V\\n\",\n            \"Queries: x * w_query\\n\",\n            \"[[1. 0. 2.]\\n\",\n            \" [2. 2. 2.]\\n\",\n            \" [2. 1. 3.]]\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"NmfMln1Wmv73\",\n        \"colab_type\": \"code\",\n        \"outputId\": \"065b63ba-7584-4302-97cd-d5e1765470ed\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 108\n        }\n      },\n      \"source\": [\n        \"print(\\\"Step 3: Matrix multiplication to obtain Q,K,V\\\")\\n\",\n        \"\\n\",\n        \"print(\\\"Keys: x * w_key\\\")\\n\",\n        \"K=np.matmul(x,w_key)\\n\",\n        \"print(K)\"\n      ],\n      \"execution_count\": 0,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Step 3: Matrix multiplication to obtain Q,K,V\\n\",\n            \"Keys: x * w_key\\n\",\n            \"[[0. 1. 1.]\\n\",\n            \" [4. 4. 0.]\\n\",\n            \" [2. 3. 
1.]]\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"v3Asv-8mOWkN\",\n        \"colab_type\": \"code\",\n        \"outputId\": \"2ec71310-0486-46f4-d9f5-d12a1a6ad0e6\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 90\n        }\n      },\n      \"source\": [\n        \"print(\\\"Values: x * w_value\\\")\\n\",\n        \"V=np.matmul(x,w_value)\\n\",\n        \"print(V)\"\n      ],\n      \"execution_count\": 0,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Values: x * w_value\\n\",\n            \"[[1. 2. 3.]\\n\",\n            \" [2. 8. 0.]\\n\",\n            \" [2. 6. 3.]]\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"gfgRAHUuOp5c\",\n        \"colab_type\": \"code\",\n        \"outputId\": \"ad02f055-11e0-4b9a-eb15-b66e4846c95e\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 90\n        }\n      },\n      \"source\": [\n        \"print(\\\"Step 4: Scaled Attention Scores\\\")\\n\",\n        \"k_d=1   #square root of k_d=3 rounded down to 1 for this example\\n\",\n        \"attention_scores = (Q @ K.transpose())/k_d\\n\",\n        \"print(attention_scores)\"\n      ],\n      \"execution_count\": 0,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Step 4: Scaled Attention Scores\\n\",\n            \"[[ 2.  4.  4.]\\n\",\n            \" [ 4. 16. 12.]\\n\",\n            \" [ 4. 12. 
10.]]\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"hg2t6KuNOjzM\",\n        \"colab_type\": \"code\",\n        \"outputId\": \"c0610f91-cd1d-4b0f-b5ce-f6445481186a\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 90\n        }\n      },\n      \"source\": [\n        \"print(\\\"Step 5: Scaled softmax attention_scores for each vector\\\")\\n\",\n        \"attention_scores[0]=softmax(attention_scores[0])\\n\",\n        \"attention_scores[1]=softmax(attention_scores[1])\\n\",\n        \"attention_scores[2]=softmax(attention_scores[2])\\n\",\n        \"print(attention_scores[0])\\n\",\n        \"print(attention_scores[1])\\n\",\n        \"print(attention_scores[2])\"\n      ],\n      \"execution_count\": 0,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Step 5: Scaled softmax attention_scores for each vector\\n\",\n            \"[0.06337894 0.46831053 0.46831053]\\n\",\n            \"[6.03366485e-06 9.82007865e-01 1.79861014e-02]\\n\",\n            \"[2.95387223e-04 8.80536902e-01 1.19167711e-01]\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"R4Es7A7NOvjD\",\n        \"colab_type\": \"code\",\n        \"outputId\": \"b86060fe-1292-47c5-93f6-ddeeca1bfb62\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 199\n        }\n      },\n      \"source\": [\n        \"print(\\\"Step 6: attention value obtained by score1/k_d * V\\\")\\n\",\n        \"print(V[0])\\n\",\n        \"print(V[1])\\n\",\n        \"print(V[2])\\n\",\n        \"print(\\\"Attention 1\\\")\\n\",\n        \"attention1=attention_scores[0].reshape(-1,1)\\n\",\n        \"attention1=attention_scores[0][0]*V[0]\\n\",\n      
  \"print(attention1)\\n\",\n        \"\\n\",\n        \"print(\\\"Attention 2\\\")\\n\",\n        \"attention2=attention_scores[0][1]*V[1]\\n\",\n        \"print(attention2)\\n\",\n        \"\\n\",\n        \"print(\\\"Attention 3\\\")\\n\",\n        \"attention3=attention_scores[0][2]*V[2]\\n\",\n        \"print(attention3)\"\n      ],\n      \"execution_count\": 0,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Step 6: attention value obtained by score1/k_d * V\\n\",\n            \"[1. 2. 3.]\\n\",\n            \"[2. 8. 0.]\\n\",\n            \"[2. 6. 3.]\\n\",\n            \"Attention 1\\n\",\n            \"[0.06337894 0.12675788 0.19013681]\\n\",\n            \"Attention 2\\n\",\n            \"[0.93662106 3.74648425 0.        ]\\n\",\n            \"Attention 3\\n\",\n            \"[0.93662106 2.80986319 1.40493159]\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"uBDKhaCvOzXj\",\n        \"colab_type\": \"code\",\n        \"outputId\": \"138901d8-0aa9-4db9-b8b1-76ad557e6688\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 54\n        }\n      },\n      \"source\": [\n        \"print(\\\"Step 7: summed the results to create the first line of the output matrix\\\")\\n\",\n        \"attention_input1=attention1+attention2+attention3\\n\",\n        \"print(attention_input1)\"\n      ],\n      \"execution_count\": 0,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Step 7: summed the results to create the first line of the output matrix\\n\",\n            \"[1.93662106 6.68310531 1.59506841]\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"iEjgRcqHO4ik\",\n        
\"colab_type\": \"code\",\n        \"outputId\": \"675a154b-a305-4c0c-e314-353541abfd3e\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 635\n        }\n      },\n      \"source\": [\n        \"print(\\\"Step 8: Step 1 to 7 for inputs 1 to 3\\\")\\n\",\n        \"#We assume we have 3 results with learned weights (they were not trained in this example)\\n\",\n        \"#We assume we are implementing the original Transformer paper. We will have 3 results of 64 dimensions each\\n\",\n        \"attention_head1=np.random.random((3, 64))\\n\",\n        \"print(attention_head1)\"\n      ],\n      \"execution_count\": 0,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Step 8: Step 1 to 7 for inputs 1 to 3\\n\",\n            \"[[0.05750794 0.25966685 0.80912647 0.00841755 0.53786959 0.05089332\\n\",\n            \"  0.17938191 0.91091697 0.20593063 0.27634727 0.33869867 0.25488968\\n\",\n            \"  0.88673807 0.56544205 0.69075114 0.56069125 0.92579273 0.46042461\\n\",\n            \"  0.78471374 0.93064241 0.99626239 0.13662306 0.72892312 0.52327088\\n\",\n            \"  0.90128711 0.28245531 0.05630861 0.55857421 0.50998676 0.59709355\\n\",\n            \"  0.40038745 0.70580749 0.18971837 0.78544634 0.35815199 0.57527984\\n\",\n            \"  0.38283035 0.94917395 0.25450774 0.85725663 0.27262613 0.5720429\\n\",\n            \"  0.38092713 0.34721503 0.38857267 0.50218029 0.74035216 0.37789311\\n\",\n            \"  0.12812721 0.42074447 0.39534834 0.4927362  0.65353466 0.86485487\\n\",\n            \"  0.22989766 0.87239043 0.64613354 0.89034403 0.29338559 0.1671029\\n\",\n            \"  0.1675619  0.70683457 0.03683821 0.37657364]\\n\",\n            \" [0.08308343 0.01529261 0.34000535 0.48559272 0.25036425 0.98195061\\n\",\n            \"  0.72015388 0.03838282 0.18674587 0.33203929 0.82965726 0.6962791\\n\",\n            \"  0.49038184 
0.97126469 0.25373185 0.18486967 0.38352481 0.68254099\\n\",\n            \"  0.01014604 0.51217341 0.17219508 0.14178547 0.74892979 0.12190071\\n\",\n            \"  0.0090985  0.09704158 0.70447804 0.21374912 0.72523093 0.89713875\\n\",\n            \"  0.28817021 0.56472583 0.59136866 0.7711216  0.78839121 0.03607145\\n\",\n            \"  0.33438564 0.99970048 0.80579864 0.79923327 0.57124039 0.64183951\\n\",\n            \"  0.11464931 0.703289   0.64033748 0.5799896  0.14488077 0.90946673\\n\",\n            \"  0.4189947  0.99825172 0.28607413 0.6801013  0.16240732 0.25219133\\n\",\n            \"  0.30470031 0.30292756 0.15999459 0.52230381 0.82012623 0.33586634\\n\",\n            \"  0.25613996 0.60354742 0.26006038 0.23281006]\\n\",\n            \" [0.37977727 0.7429604  0.38837932 0.18434243 0.84440271 0.53955069\\n\",\n            \"  0.40121556 0.83114666 0.48845808 0.58768546 0.4097926  0.29445373\\n\",\n            \"  0.22750019 0.9520429  0.99964437 0.57829693 0.32369595 0.60769326\\n\",\n            \"  0.76116892 0.14857116 0.07462658 0.01199289 0.37147371 0.80177111\\n\",\n            \"  0.60845313 0.33410248 0.06017335 0.447363   0.31500924 0.95988807\\n\",\n            \"  0.41506716 0.33740287 0.38991258 0.23478571 0.57808465 0.48520973\\n\",\n            \"  0.48241035 0.35030686 0.90598744 0.1296871  0.57966373 0.98736092\\n\",\n            \"  0.43859306 0.5358377  0.25181342 0.0195783  0.51178364 0.26981021\\n\",\n            \"  0.04674047 0.97762416 0.72747288 0.75616534 0.68105477 0.06914679\\n\",\n            \"  0.14054312 0.42816012 0.66792325 0.03168237 0.68685758 0.43487164\\n\",\n            \"  0.08064005 0.23444144 0.60360253 0.21423994]]\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"QI50dkZ1O630\",\n        \"colab_type\": \"code\",\n        \"outputId\": \"7d467842-f837-4e41-e099-534549b6fc05\",\n        
\"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 54\n        }\n      },\n      \"source\": [\n        \"print(\\\"Step 9: We assume we have trained the 8 heads of the attention sub-layer\\\")\\n\",\n        \"z0h1=np.random.random((3, 64))\\n\",\n        \"z1h2=np.random.random((3, 64))\\n\",\n        \"z2h3=np.random.random((3, 64))\\n\",\n        \"z3h4=np.random.random((3, 64))\\n\",\n        \"z4h5=np.random.random((3, 64))\\n\",\n        \"z5h6=np.random.random((3, 64))\\n\",\n        \"z6h7=np.random.random((3, 64))\\n\",\n        \"z7h8=np.random.random((3, 64))\\n\",\n        \"print(\\\"shape of one head\\\",z0h1.shape,\\\"dimension of 8 heads\\\",64*8)\"\n      ],\n      \"execution_count\": 0,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Step 9: We assume we have trained the 8 heads of the attention sub-layer\\n\",\n            \"shape of one head (3, 64) dimension of 8 heads 512\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"3n87LE92_Puf\",\n        \"colab_type\": \"code\",\n        \"outputId\": \"55d00415-ebea-43a6-b4c5-ff13e02c3052\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 90\n        }\n      },\n      \"source\": [\n        \"print(\\\"Step 10: Concatenation of heads 1 to 8 to obtain the original 8x64=512 output dimension of the model\\\")\\n\",\n        \"output_attention=np.hstack((z0h1,z1h2,z2h3,z3h4,z4h5,z5h6,z6h7,z7h8))\\n\",\n        \"print(output_attention)\"\n      ],\n      \"execution_count\": 0,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Step 10: Concatenation of heads 1 to 8 to obtain the original 8x64=512 output dimension of the model\\n\",\n            \"[[0.46950893 0.88546586 0.47615937 
... 0.08285802 0.16577096 0.61094461]\\n\",\n            \" [0.31638247 0.24246402 0.30390966 ... 0.42283366 0.62127905 0.64414042]\\n\",\n            \" [0.1922683  0.7017995  0.60116595 ... 0.20012387 0.16264044 0.93645276]]\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"PJLl4Jf3fPLh\",\n        \"colab_type\": \"text\"\n      },\n      \"source\": [\n        \"And now with Hugging Face in one line!\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"CZIRvcRmfTPb\",\n        \"colab_type\": \"code\",\n        \"colab\": {}\n      },\n      \"source\": [\n        \"#@title Transformer Installation\\n\",\n        \"!pip -qq install transformers\"\n      ],\n      \"execution_count\": 0,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"cNwLYc-SfXdF\",\n        \"colab_type\": \"code\",\n        \"outputId\": \"d1314cc6-74d6-45cf-b8d6-0a903e58ac60\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 85,\n          \"referenced_widgets\": [\n            \"946c90b82f7f46caa25c885668b75eab\",\n            \"4191af78535e4da8bb797690eff84e00\",\n            \"9ce3d57b96b64da0b15e3f3626bacb30\",\n            \"f8da2c91156342a69d9b262f4f993aa4\",\n            \"97370923218945c5b80ab468751ac8a7\",\n            \"0ba4a91f472e4c41ba80ab4025288446\",\n            \"15aa4b6f8f784c74804107be249126b9\",\n            \"edea457617ed4792aeeb65292019ceb4\"\n          ]\n        }\n      },\n      \"source\": [\n        \"#@title Retrieve pipeline of modules and choose English to French translation\\n\",\n        \"from transformers import pipeline\\n\",\n        \"translator = pipeline(\\\"translation_en_to_fr\\\")\\n\",\n        \"#One line of code!\\n\",\n        \"print(translator(\\\"It is easy to translate 
languages with transformers\\\", max_length=40))\"\n      ],\n      \"execution_count\": 0,\n      \"outputs\": [\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"application/vnd.jupyter.widget-view+json\": {\n              \"model_id\": \"946c90b82f7f46caa25c885668b75eab\",\n              \"version_minor\": 0,\n              \"version_major\": 2\n            },\n            \"text/plain\": [\n              \"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=230.0, style=ProgressStyle(description_…\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\n\",\n            \"[{'translation_text': 'Il est facile de traduire des langues avec des transformateurs.'}]\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    }\n  ]\n}"
  },
  {
    "path": "Chapter01/positional_encoding.ipynb",
    "content": "{\n  \"nbformat\": 4,\n  \"nbformat_minor\": 0,\n  \"metadata\": {\n    \"colab\": {\n      \"name\": \"positional_encoding.ipynb\",\n      \"provenance\": [],\n      \"collapsed_sections\": [],\n      \"toc_visible\": true\n    },\n    \"kernelspec\": {\n      \"name\": \"python3\",\n      \"display_name\": \"Python 3\"\n    }\n  },\n  \"cells\": [\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"7fjcTlyE3WvR\"\n      },\n      \"source\": [\n        \"# A Positional Encoding Example\\n\",\n        \"Copyright 2021 Denis Rothman, MIT License\\n\",\n        \"\\n\",\n        \"Reference 1 for positional encoding:\\n\",\n        \"Attention Is All You Need paper, page 6, Google Brain and Google Research\\n\",\n        \"\\n\",\n        \"\\n\",\n        \"Reference 2 for word embedding:\\n\",\n        \"https://www.geeksforgeeks.org/python-word-embedding-using-word2vec/\\n\",\n        \"Reference 3 for cosine similarity:\\n\",\n        \"scikit-learn cosine similarity documentation\\n\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"JKJ8Saf6vR9b\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"654ce4ae-0115-46d4-a186-ee2581f1ee4f\"\n      },\n      \"source\": [\n        \"!pip install gensim==3.8.3\\n\",\n        \"import torch\\n\",\n        \"import nltk\\n\",\n        \"nltk.download('punkt')\"\n      ],\n      \"execution_count\": 6,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Requirement already satisfied: gensim==3.8.3 in /usr/local/lib/python3.7/dist-packages (3.8.3)\\n\",\n            \"Requirement already satisfied: scipy>=0.18.1 in /usr/local/lib/python3.7/dist-packages (from gensim==3.8.3) (1.4.1)\\n\",\n            \"Requirement already satisfied: smart-open>=1.8.1 in /usr/local/lib/python3.7/dist-packages (from 
gensim==3.8.3) (4.2.0)\\n\",\n            \"Requirement already satisfied: numpy>=1.11.3 in /usr/local/lib/python3.7/dist-packages (from gensim==3.8.3) (1.19.5)\\n\",\n            \"Requirement already satisfied: six>=1.5.0 in /usr/local/lib/python3.7/dist-packages (from gensim==3.8.3) (1.15.0)\\n\",\n            \"[nltk_data] Downloading package punkt to /root/nltk_data...\\n\",\n            \"[nltk_data]   Package punkt is already up-to-date!\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"execute_result\",\n          \"data\": {\n            \"text/plain\": [\n              \"True\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          },\n          \"execution_count\": 6\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"PGXgeOyS5qBP\"\n      },\n      \"source\": [\n        \"# Upload the text.txt file to Google Colaboratory using the file manager\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"7o7EeDUUu0Sh\"\n      },\n      \"source\": [\n        \"import math\\n\",\n        \"import numpy as np\\n\",\n        \"from nltk.tokenize import sent_tokenize, word_tokenize \\n\",\n        \"import gensim \\n\",\n        \"from gensim.models import Word2Vec \\n\",\n        \"from sklearn.metrics.pairwise import cosine_similarity\\n\",\n        \"import matplotlib.pyplot as plt\\n\",\n        \"import warnings \\n\",\n        \"warnings.filterwarnings(action = 'ignore') \\n\",\n        \"\\n\",\n        \"\\n\",\n        \"dprint=0 # prints outputs if set to 1, default=0\\n\",\n        \"\\n\",\n        \"# 'text.txt' file \\n\",\n        \"sample = open(\\\"text.txt\\\", \\\"r\\\") \\n\",\n        \"s = sample.read() \\n\",\n        \"\\n\",\n        \"# processing escape characters \\n\",\n        \"f = 
s.replace(\\\"\\\\n\\\", \\\" \\\") \\n\",\n        \"\\n\",\n        \"data = [] \\n\",\n        \"\\n\",\n        \"# sentence parsing \\n\",\n        \"for i in sent_tokenize(f): \\n\",\n        \"\\ttemp = [] \\n\",\n        \"\\t# tokenize the sentence into words \\n\",\n        \"\\tfor j in word_tokenize(i): \\n\",\n        \"\\t\\ttemp.append(j.lower()) \\n\",\n        \"\\tdata.append(temp) \\n\",\n        \"\\n\",\n        \"# Creating Skip Gram model \\n\",\n        \"#model2 = gensim.models.Word2Vec(data, min_count = 1, size = 512,window = 5, sg = 1) \\n\",\n        \"#model = Word2Vec(sentences=common_texts, vector_size=100, window=5, min_count=1, workers=4)\\n\",\n        \"model2 = gensim.models.Word2Vec(data, min_count = 1, size = 512,window = 5, sg = 1)\\n\",\n        \"\\n\",\n        \"# 1-The 2-black 3-cat 4-sat 5-on 6-the 7-couch 8-and 9-the 10-brown 11-dog 12-slept 13-on 14-the 15-rug.\\n\",\n        \"word1='black'\\n\",\n        \"word2='brown'\\n\",\n        \"pos1=2\\n\",\n        \"pos2=10\\n\",\n        \"a=model2[word1]\\n\",\n        \"b=model2[word2]\\n\",\n        \"\\n\",\n        \"if(dprint==1):\\n\",\n        \"        print(a)\\n\",\n        \"\\n\",\n        \"# compute cosine similarity\\n\",\n        \"dot = np.dot(a, b)\\n\",\n        \"norma = np.linalg.norm(a)\\n\",\n        \"normb = np.linalg.norm(b)\\n\",\n        \"cos = dot / (norma * normb)\\n\",\n        \"\\n\",\n        \"aa = a.reshape(1,512) \\n\",\n        \"ba = b.reshape(1,512)\\n\",\n        \"cos_lib = cosine_similarity(aa, ba)\"\n      ],\n      \"execution_count\": 7,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"xlTeXmatz7bP\"\n      },\n      \"source\": [\n        \"A positional encoding example written in basic Python, with a few lines of code for the sine and cosine functions.\\n\",\n        \"I also added a PyTorch method, inspired by pytorch.org, to explore these methods.\\n\",\n
        \"The main idea to keep in mind is that we add small values to the word embedding output so that the positions are taken into account. As long as the cosine similarity displayed at the end of the notebook, for example, shows that the positions have been taken into account, the method applies. Depending on the Transformer model, this method can be fine-tuned, or other methods can be used.\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"EmBUq9MzxQxz\"\n      },\n      \"source\": [\n        \"pe1=aa.copy()\\n\",\n        \"pe2=aa.copy()\\n\",\n        \"pe3=aa.copy()\\n\",\n        \"paa=aa.copy()\\n\",\n        \"pba=ba.copy()\\n\",\n        \"d_model=512\\n\",\n        \"max_print=d_model\\n\",\n        \"max_length=20\\n\",\n        \"\\n\",\n        \"for i in range(0, max_print,2):\\n\",\n        \"                pe1[0][i] = math.sin(pos1 / (10000 ** ((2 * i)/d_model)))\\n\",\n        \"                paa[0][i] = (paa[0][i]*math.sqrt(d_model))+ pe1[0][i]\\n\",\n        \"                pe1[0][i+1] = math.cos(pos1 / (10000 ** ((2 * i)/d_model)))\\n\",\n        \"                paa[0][i+1] = (paa[0][i+1]*math.sqrt(d_model))+pe1[0][i+1]\\n\",\n        \"                if dprint==1:\\n\",\n        \"                        print(i,pe1[0][i],i+1,pe1[0][i+1])\\n\",\n        \"                        print(i,paa[0][i],i+1,paa[0][i+1])\\n\",\n        \"                        print(\\\"\\\\n\\\")\\n\",\n        \"\\n\",\n        \"#print(pe1)\\n\",\n        \"# A method in PyTorch using torch.exp and math.log:\\n\",\n        \"max_len=max_length\\n\",\n        \"pe = torch.zeros(max_len, d_model)\\n\",\n        \"position = torch.arange(0, max_len, dtype=torch.float).unsqueeze(1)\\n\",\n        \"div_term = torch.exp(torch.arange(0, d_model, 2).float() * (-math.log(10000.0) / d_model))\\n\",\n        \"pe[:, 0::2] = torch.sin(position * div_term)\\n\",\n     
   \"pe[:, 1::2] = torch.cos(position * div_term)\\n\",\n        \"#print(pe[:, 0::2])\"\n      ],\n      \"execution_count\": 8,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"pgrXed2FwHDC\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"54dcdded-8470-47fc-999e-67294ee67dd2\"\n      },\n      \"source\": [\n        \"\\n\",\n        \"for i in range(0, max_print,2):\\n\",\n        \"                pe2[0][i] = math.sin(pos2 / (10000 ** ((2 * i)/d_model)))\\n\",\n        \"                pba[0][i] = (pba[0][i]*math.sqrt(d_model))+ pe2[0][i]\\n\",\n        \"            \\n\",\n        \"                pe2[0][i+1] = math.cos(pos2 / (10000 ** ((2 * i)/d_model)))\\n\",\n        \"                pba[0][i+1] = (pba[0][i+1]*math.sqrt(d_model))+ pe2[0][i+1]\\n\",\n        \"               \\n\",\n        \"                if dprint==1:\\n\",\n        \"                        print(i,pe2[0][i],i+1,pe2[0][i+1])\\n\",\n        \"                        print(i,paa[0][i],i+1,paa[0][i+1])\\n\",\n        \"                        print(\\\"\\\\n\\\")\\n\",\n        \"\\n\",\n        \"print(word1,word2)\\n\",\n        \"cos_lib = cosine_similarity(aa, ba)\\n\",\n        \"print(cos_lib,\\\"word similarity\\\")\\n\",\n        \"cos_lib = cosine_similarity(pe1, pe2)\\n\",\n        \"print(cos_lib,\\\"positional similarity\\\")\\n\",\n        \"cos_lib = cosine_similarity(paa, pba)\\n\",\n        \"print(cos_lib,\\\"positional encoding similarity\\\")\\n\",\n        \"\\n\",\n        \"if dprint==1:\\n\",\n        \"        print(word1)\\n\",\n        \"        print(\\\"embedding\\\")\\n\",\n        \"        print(aa)\\n\",\n        \"        print(\\\"positional encoding\\\")\\n\",\n        \"        print(pe1)\\n\",\n        \"        print(\\\"encoded embedding\\\")\\n\",\n        \"        print(paa)\\n\",\n        \"\\n\",\n        \"        
print(word2)\\n\",\n        \"        print(\\\"embedding\\\")\\n\",\n        \"        print(ba)\\n\",\n        \"        print(\\\"positional encoding\\\")\\n\",\n        \"        print(pe2)\\n\",\n        \"        print(\\\"encoded embedding\\\")\\n\",\n        \"        print(pba)\\n\",\n        \"\\n\"\n      ],\n      \"execution_count\": 9,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"black brown\\n\",\n            \"[[0.9998703]] word similarity\\n\",\n            \"[[0.8600013]] positional similarity\\n\",\n            \"[[0.96135795]] positional encoding similarity\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    }\n  ]\n}"
  },
  {
    "path": "Chapter01/text.txt",
    "content": "The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the 
brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the 
street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch 
near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the 
brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the 
street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch 
near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the 
brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the 
street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch 
near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the 
brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the 
street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch 
near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\nThe black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.\n\n\n"
  },
  {
    "path": "Chapter02/BERT_Fine_Tuning_Sentence_Classification_DR.ipynb",
    "content": "{\n  \"nbformat\": 4,\n  \"nbformat_minor\": 0,\n  \"metadata\": {\n    \"colab\": {\n      \"name\": \"BERT_Fine_Tuning_Sentence_Classification_DR.ipynb\",\n      \"provenance\": [],\n      \"collapsed_sections\": [],\n      \"toc_visible\": true\n    },\n    \"kernelspec\": {\n      \"name\": \"python3\",\n      \"display_name\": \"Python 3\"\n    },\n    \"accelerator\": \"GPU\"\n  },\n  \"cells\": [\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"jNKaJz5j_ylj\"\n      },\n      \"source\": [\n        \"# BERT Fine-Tuning Sentence Classification\\n\",\n        \"Copyright 2020 Denis Rothman. The text cells were removed and replaced by a title within each cell. The titles of the cells refer to the titles of the sections of the book. The descriptions of the cells have been rewritten for educational purposes.\\n\",\n        \"\\n\",\n        \"Contributor: George Mihaila\\n\",\n        \"\\n\",\n        \"[Reference Notebook by Chris McCormick and Nick Ryan](https://colab.research.google.com/drive/1pTuQhug6Dhl9XalKB0zUGf4FIdYFlpcX)\\n\",\n        \"\\n\",\n        \"[Reference Article by Chris McCormick and Nick Ryan](https://mccormickml.com/2019/07/22/BERT-fine-tuning/)\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"DEfSbAA4QHas\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"d8560cba-271c-451a-cc53-a088af8ce80e\"\n      },\n      \"source\": [\n        \"#@title Activating the GPU\\n\",\n        \"# Main menu->Runtime->Change Runtime Type\\n\",\n        \"import tensorflow as tf\\n\",\n        \"device_name = tf.test.gpu_device_name()\\n\",\n        \"if device_name != '/device:GPU:0':\\n\",\n        \"  raise SystemError('GPU device not found')\\n\",\n        \"print('Found GPU at: {}'.format(device_name))\"\n      ],\n      \"execution_count\": 26,\n      \"outputs\": [\n       
 {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Found GPU at: /device:GPU:0\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"0NmMdkZO8R6q\"\n      },\n      \"source\": [\n        \"#@title Installing the Hugging Face PyTorch Interface for Bert\\n\",\n        \"# !pip install pytorch-pretrained-bert pytorch-nlp\\n\",\n        \"!pip install -q transformers\"\n      ],\n      \"execution_count\": 27,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"Ok002ceNB8E7\"\n      },\n      \"source\": [\n        \"#@title Importing the modules\\n\",\n        \"import torch\\n\",\n        \"from torch.utils.data import TensorDataset, DataLoader, RandomSampler, SequentialSampler\\n\",\n        \"from keras.preprocessing.sequence import pad_sequences\\n\",\n        \"from sklearn.model_selection import train_test_split\\n\",\n        \"from transformers import BertTokenizer, BertConfig\\n\",\n        \"from transformers import AdamW, BertForSequenceClassification, get_linear_schedule_with_warmup\\n\",\n        \"from tqdm import tqdm, trange\\n\",\n        \"import pandas as pd\\n\",\n        \"import io\\n\",\n        \"import numpy as np\\n\",\n        \"import matplotlib.pyplot as plt\\n\",\n        \"% matplotlib inline\"\n      ],\n      \"execution_count\": 28,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"oYsV4H8fCpZ-\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 35\n        },\n        \"outputId\": \"5a2b9ada-305c-4c38-f5c1-9daf64758ff9\"\n      },\n      \"source\": [\n        \"#@title Specifying CUDA as the device for Torch\\n\",\n        \"device = torch.device(\\\"cuda\\\" if torch.cuda.is_available() else \\\"cpu\\\")\\n\",\n        \"n_gpu 
= torch.cuda.device_count()\\n\",\n        \"torch.cuda.get_device_name(0)\"\n      ],\n      \"execution_count\": 29,\n      \"outputs\": [\n        {\n          \"output_type\": \"execute_result\",\n          \"data\": {\n            \"application/vnd.google.colaboratory.intrinsic+json\": {\n              \"type\": \"string\"\n            },\n            \"text/plain\": [\n              \"'Tesla P4'\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          },\n          \"execution_count\": 29\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"JpfK9OOJy1OY\"\n      },\n      \"source\": [\n        \"@article{warstadt2018neural,\\n\",\n        \"    title={Neural Network Acceptability Judgments},\\n\",\n        \"    author={Warstadt, Alex and Singh, Amanpreet and Bowman, Samuel R},\\n\",\n        \"    journal={arXiv preprint arXiv:1805.12471},\\n\",\n        \"    year={2018}\\n\",\n        \"}\\n\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"_UkeC7SG2krJ\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"1d576ac9-fb08-4715-bb7b-dd0b55983bac\"\n      },\n      \"source\": [\n        \"#@title Loading the Dataset\\n\",\n        \"#source of dataset : https://nyu-mll.github.io/CoLA/\\n\",\n        \"df = pd.read_csv(\\\"in_domain_train.tsv\\\", delimiter='\\\\t', header=None, names=['sentence_source', 'label', 'label_notes', 'sentence'])\\n\",\n        \"df.shape\"\n      ],\n      \"execution_count\": 30,\n      \"outputs\": [\n        {\n          \"output_type\": \"execute_result\",\n          \"data\": {\n            \"text/plain\": [\n              \"(8551, 4)\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          },\n          \"execution_count\": 30\n        }\n      ]\n    },\n    {\n      \"cell_type\": 
\"code\",\n      \"metadata\": {\n        \"id\": \"AQfTaYDo42zu\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 376\n        },\n        \"outputId\": \"abf38bbc-0bc2-42e8-9709-b25614e6f3d3\"\n      },\n      \"source\": [\n        \"df.sample(10)\"\n      ],\n      \"execution_count\": 31,\n      \"outputs\": [\n        {\n          \"output_type\": \"execute_result\",\n          \"data\": {\n            \"text/html\": [\n              \"<div>\\n\",\n              \"<style scoped>\\n\",\n              \"    .dataframe tbody tr th:only-of-type {\\n\",\n              \"        vertical-align: middle;\\n\",\n              \"    }\\n\",\n              \"\\n\",\n              \"    .dataframe tbody tr th {\\n\",\n              \"        vertical-align: top;\\n\",\n              \"    }\\n\",\n              \"\\n\",\n              \"    .dataframe thead th {\\n\",\n              \"        text-align: right;\\n\",\n              \"    }\\n\",\n              \"</style>\\n\",\n              \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n              \"  <thead>\\n\",\n              \"    <tr style=\\\"text-align: right;\\\">\\n\",\n              \"      <th></th>\\n\",\n              \"      <th>sentence_source</th>\\n\",\n              \"      <th>label</th>\\n\",\n              \"      <th>label_notes</th>\\n\",\n              \"      <th>sentence</th>\\n\",\n              \"    </tr>\\n\",\n              \"  </thead>\\n\",\n              \"  <tbody>\\n\",\n              \"    <tr>\\n\",\n              \"      <th>1171</th>\\n\",\n              \"      <td>r-67</td>\\n\",\n              \"      <td>1</td>\\n\",\n              \"      <td>NaN</td>\\n\",\n              \"      <td>john is as tall as that man .</td>\\n\",\n              \"    </tr>\\n\",\n              \"    <tr>\\n\",\n              \"      <th>5605</th>\\n\",\n              \"      <td>c_13</td>\\n\",\n              \"      
<td>1</td>\\n\",\n              \"      <td>NaN</td>\\n\",\n              \"      <td>i expect soon to see the results .</td>\\n\",\n              \"    </tr>\\n\",\n              \"    <tr>\\n\",\n              \"      <th>6605</th>\\n\",\n              \"      <td>g_81</td>\\n\",\n              \"      <td>1</td>\\n\",\n              \"      <td>NaN</td>\\n\",\n              \"      <td>john hummed , and mary sang , at equal volumes .</td>\\n\",\n              \"    </tr>\\n\",\n              \"    <tr>\\n\",\n              \"      <th>3537</th>\\n\",\n              \"      <td>ks08</td>\\n\",\n              \"      <td>1</td>\\n\",\n              \"      <td>NaN</td>\\n\",\n              \"      <td>they can smile .</td>\\n\",\n              \"    </tr>\\n\",\n              \"    <tr>\\n\",\n              \"      <th>8483</th>\\n\",\n              \"      <td>ad03</td>\\n\",\n              \"      <td>1</td>\\n\",\n              \"      <td>NaN</td>\\n\",\n              \"      <td>alison ran</td>\\n\",\n              \"    </tr>\\n\",\n              \"    <tr>\\n\",\n              \"      <th>4709</th>\\n\",\n              \"      <td>ks08</td>\\n\",\n              \"      <td>1</td>\\n\",\n              \"      <td>NaN</td>\\n\",\n              \"      <td>the news was dealt with carefully .</td>\\n\",\n              \"    </tr>\\n\",\n              \"    <tr>\\n\",\n              \"      <th>7690</th>\\n\",\n              \"      <td>sks13</td>\\n\",\n              \"      <td>1</td>\\n\",\n              \"      <td>NaN</td>\\n\",\n              \"      <td>i sent money to mary .</td>\\n\",\n              \"    </tr>\\n\",\n              \"    <tr>\\n\",\n              \"      <th>7515</th>\\n\",\n              \"      <td>sks13</td>\\n\",\n              \"      <td>0</td>\\n\",\n              \"      <td>*</td>\\n\",\n              \"      <td>mary wrote a letter to himself last year .</td>\\n\",\n              \"    </tr>\\n\",\n              \"    
<tr>\\n\",\n              \"      <th>2443</th>\\n\",\n              \"      <td>l-93</td>\\n\",\n              \"      <td>1</td>\\n\",\n              \"      <td>NaN</td>\\n\",\n              \"      <td>a flowering plant is on the windowsill .</td>\\n\",\n              \"    </tr>\\n\",\n              \"    <tr>\\n\",\n              \"      <th>5680</th>\\n\",\n              \"      <td>c_13</td>\\n\",\n              \"      <td>1</td>\\n\",\n              \"      <td>NaN</td>\\n\",\n              \"      <td>the canadian bought himself a barbecue .</td>\\n\",\n              \"    </tr>\\n\",\n              \"  </tbody>\\n\",\n              \"</table>\\n\",\n              \"</div>\"\n            ],\n            \"text/plain\": [\n              \"     sentence_source  ...                                          sentence\\n\",\n              \"1171            r-67  ...                     john is as tall as that man .\\n\",\n              \"5605            c_13  ...                i expect soon to see the results .\\n\",\n              \"6605            g_81  ...  john hummed , and mary sang , at equal volumes .\\n\",\n              \"3537            ks08  ...                                  they can smile .\\n\",\n              \"8483            ad03  ...                                        alison ran\\n\",\n              \"4709            ks08  ...               the news was dealt with carefully .\\n\",\n              \"7690           sks13  ...                            i sent money to mary .\\n\",\n              \"7515           sks13  ...        mary wrote a letter to himself last year .\\n\",\n              \"2443            l-93  ...          a flowering plant is on the windowsill .\\n\",\n              \"5680            c_13  ...          
 the canadian bought himself a barbecue .\\n\",\n              \"\\n\",\n              \"[10 rows x 4 columns]\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          },\n          \"execution_count\": 31\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"GuE5BqICAne2\"\n      },\n      \"source\": [\n        \"#@title Creating sentence and label lists and adding BERT tokens\\n\",\n        \"sentences = df.sentence.values\\n\",\n        \"\\n\",\n        \"# Adding CLS and SEP tokens at the beginning and end of each sentence for BERT\\n\",\n        \"sentences = [\\\"[CLS] \\\" + sentence + \\\" [SEP]\\\" for sentence in sentences]\\n\",\n        \"labels = df.label.values\"\n      ],\n      \"execution_count\": 32,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"Z474sSC6oe7A\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"59ff8f5c-64ab-46a5-b74b-6fb24ded5cb8\"\n      },\n      \"source\": [\n        \"#@title Activating the BERT Tokenizer\\n\",\n        \"tokenizer = BertTokenizer.from_pretrained('bert-base-uncased', do_lower_case=True)\\n\",\n        \"tokenized_texts = [tokenizer.tokenize(sent) for sent in sentences]\\n\",\n        \"print (\\\"Tokenize the first sentence:\\\")\\n\",\n        \"print (tokenized_texts[0])\"\n      ],\n      \"execution_count\": 33,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Tokenize the first sentence:\\n\",\n            \"['[CLS]', 'our', 'friends', 'wo', 'n', \\\"'\\\", 't', 'buy', 'this', 'analysis', ',', 'let', 'alone', 'the', 'next', 'one', 'we', 'propose', '.', '[SEP]']\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": 
\"Cp9BPRd1tMIo\"\n      },\n      \"source\": [\n        \"#@title Processing the data\\n\",\n        \"# Set the maximum sequence length. The longest sequence in our training set is 47, but we'll leave room on the end anyway. \\n\",\n        \"# In the original paper, the authors used a length of 512.\\n\",\n        \"MAX_LEN = 128\\n\",\n        \"\\n\",\n        \"# Use the BERT tokenizer to convert the tokens to their index numbers in the BERT vocabulary\\n\",\n        \"input_ids = [tokenizer.convert_tokens_to_ids(x) for x in tokenized_texts]\\n\",\n        \"\\n\",\n        \"# Pad our input tokens\\n\",\n        \"input_ids = pad_sequences(input_ids, maxlen=MAX_LEN, dtype=\\\"long\\\", truncating=\\\"post\\\", padding=\\\"post\\\")\"\n      ],\n      \"execution_count\": 34,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"cDoC24LeEv3N\"\n      },\n      \"source\": [\n        \"#@title Create attention masks\\n\",\n        \"attention_masks = []\\n\",\n        \"\\n\",\n        \"# Create a mask of 1s for each token followed by 0s for padding\\n\",\n        \"for seq in input_ids:\\n\",\n        \"  seq_mask = [float(i>0) for i in seq]\\n\",\n        \"  attention_masks.append(seq_mask)\"\n      ],\n      \"execution_count\": 35,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"aFbE-UHvsb7-\"\n      },\n      \"source\": [\n        \"#@title Splitting data into train and validation sets\\n\",\n        \"# Use train_test_split to split our data into train and validation sets for training\\n\",\n        \"\\n\",\n        \"train_inputs, validation_inputs, train_labels, validation_labels = train_test_split(input_ids, labels, \\n\",\n        \"                                                            random_state=2018, test_size=0.1)\\n\",\n        \"train_masks, validation_masks, _, _ = train_test_split(attention_masks, 
input_ids,\\n\",\n        \"                                             random_state=2018, test_size=0.1)\"\n      ],\n      \"execution_count\": 36,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"jw5K2A5Ko1RF\"\n      },\n      \"source\": [\n        \"#@title Converting all the data into torch tensors\\n\",\n        \"# Torch tensors are the required datatype for our model\\n\",\n        \"\\n\",\n        \"train_inputs = torch.tensor(train_inputs)\\n\",\n        \"validation_inputs = torch.tensor(validation_inputs)\\n\",\n        \"train_labels = torch.tensor(train_labels)\\n\",\n        \"validation_labels = torch.tensor(validation_labels)\\n\",\n        \"train_masks = torch.tensor(train_masks)\\n\",\n        \"validation_masks = torch.tensor(validation_masks)\"\n      ],\n      \"execution_count\": 37,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"GEgLpFVlo1Z-\"\n      },\n      \"source\": [\n        \"#@title Selecting a Batch Size and Creating an Iterator\\n\",\n        \"# Select a batch size for training. For fine-tuning BERT on a specific task, the authors recommend a batch size of 16 or 32\\n\",\n        \"batch_size = 32\\n\",\n        \"\\n\",\n        \"# Create an iterator of our data with torch DataLoader. 
This helps save on memory during training because, unlike a for loop, \\n\",\n        \"# with an iterator the entire dataset does not need to be loaded into memory\\n\",\n        \"\\n\",\n        \"train_data = TensorDataset(train_inputs, train_masks, train_labels)\\n\",\n        \"train_sampler = RandomSampler(train_data)\\n\",\n        \"train_dataloader = DataLoader(train_data, sampler=train_sampler, batch_size=batch_size)\\n\",\n        \"\\n\",\n        \"validation_data = TensorDataset(validation_inputs, validation_masks, validation_labels)\\n\",\n        \"validation_sampler = SequentialSampler(validation_data)\\n\",\n        \"validation_dataloader = DataLoader(validation_data, sampler=validation_sampler, batch_size=batch_size)\\n\"\n      ],\n      \"execution_count\": 38,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"JzX6dkOHCv9F\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"7729a80b-bc93-41be-f999-063c77f5e2a4\"\n      },\n      \"source\": [\n        \"#@title Bert Configuration\\n\",\n        \"# Initializing a BERT bert-base-uncased style configuration\\n\",\n        \"#@title Transformer Installation\\n\",\n        \"try:\\n\",\n        \"  import transformers\\n\",\n        \"except:\\n\",\n        \"  print(\\\"Installing transformers\\\")\\n\",\n        \"  !pip -qq install transformers\\n\",\n        \"  \\n\",\n        \"from transformers import BertModel, BertConfig\\n\",\n        \"configuration = BertConfig()\\n\",\n        \"\\n\",\n        \"# Initializing a model from the bert-base-uncased style configuration\\n\",\n        \"model = BertModel(configuration)\\n\",\n        \"\\n\",\n        \"# Accessing the model configuration\\n\",\n        \"configuration = model.config\\n\",\n        \"print(configuration)\"\n      ],\n      \"execution_count\": 39,\n      \"outputs\": [\n        {\n          
\"output_type\": \"stream\",\n          \"text\": [\n            \"BertConfig {\\n\",\n            \"  \\\"attention_probs_dropout_prob\\\": 0.1,\\n\",\n            \"  \\\"gradient_checkpointing\\\": false,\\n\",\n            \"  \\\"hidden_act\\\": \\\"gelu\\\",\\n\",\n            \"  \\\"hidden_dropout_prob\\\": 0.1,\\n\",\n            \"  \\\"hidden_size\\\": 768,\\n\",\n            \"  \\\"initializer_range\\\": 0.02,\\n\",\n            \"  \\\"intermediate_size\\\": 3072,\\n\",\n            \"  \\\"layer_norm_eps\\\": 1e-12,\\n\",\n            \"  \\\"max_position_embeddings\\\": 512,\\n\",\n            \"  \\\"model_type\\\": \\\"bert\\\",\\n\",\n            \"  \\\"num_attention_heads\\\": 12,\\n\",\n            \"  \\\"num_hidden_layers\\\": 12,\\n\",\n            \"  \\\"pad_token_id\\\": 0,\\n\",\n            \"  \\\"type_vocab_size\\\": 2,\\n\",\n            \"  \\\"vocab_size\\\": 30522\\n\",\n            \"}\\n\",\n            \"\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"0z3-ZV0k2qk8\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"852674d6-31a6-40e5-eed1-5548bdb03584\"\n      },\n      \"source\": [\n        \"#@title Loading the Hugging Face Bert Uncased Base Model \\n\",\n        \"model = BertForSequenceClassification.from_pretrained(\\\"bert-base-uncased\\\", num_labels=2)\\n\",\n        \"model.cuda()\"\n      ],\n      \"execution_count\": 40,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Some weights of the model checkpoint at bert-base-uncased were not used when initializing BertForSequenceClassification: ['cls.predictions.bias', 'cls.predictions.transform.dense.weight', 'cls.predictions.transform.dense.bias', 'cls.predictions.decoder.weight', 'cls.seq_relationship.weight', 
'cls.seq_relationship.bias', 'cls.predictions.transform.LayerNorm.weight', 'cls.predictions.transform.LayerNorm.bias']\\n\",\n            \"- This IS expected if you are initializing BertForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).\\n\",\n            \"- This IS NOT expected if you are initializing BertForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\\n\",\n            \"Some weights of BertForSequenceClassification were not initialized from the model checkpoint at bert-base-uncased and are newly initialized: ['classifier.weight', 'classifier.bias']\\n\",\n            \"You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\\n\"\n          ],\n          \"name\": \"stderr\"\n        },\n        {\n          \"output_type\": \"execute_result\",\n          \"data\": {\n            \"text/plain\": [\n              \"BertForSequenceClassification(\\n\",\n              \"  (bert): BertModel(\\n\",\n              \"    (embeddings): BertEmbeddings(\\n\",\n              \"      (word_embeddings): Embedding(30522, 768, padding_idx=0)\\n\",\n              \"      (position_embeddings): Embedding(512, 768)\\n\",\n              \"      (token_type_embeddings): Embedding(2, 768)\\n\",\n              \"      (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n              \"      (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"    )\\n\",\n              \"    (encoder): BertEncoder(\\n\",\n              \"      (layer): ModuleList(\\n\",\n              \"        (0): BertLayer(\\n\",\n              \"          (attention): BertAttention(\\n\",\n              \"            (self): 
BertSelfAttention(\\n\",\n              \"              (query): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (key): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (value): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"            )\\n\",\n              \"            (output): BertSelfOutput(\\n\",\n              \"              (dense): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n              \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"            )\\n\",\n              \"          )\\n\",\n              \"          (intermediate): BertIntermediate(\\n\",\n              \"            (dense): Linear(in_features=768, out_features=3072, bias=True)\\n\",\n              \"          )\\n\",\n              \"          (output): BertOutput(\\n\",\n              \"            (dense): Linear(in_features=3072, out_features=768, bias=True)\\n\",\n              \"            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n              \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"          )\\n\",\n              \"        )\\n\",\n              \"        (1): BertLayer(\\n\",\n              \"          (attention): BertAttention(\\n\",\n              \"            (self): BertSelfAttention(\\n\",\n              \"              (query): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (key): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (value): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n        
      \"            )\\n\",\n              \"            (output): BertSelfOutput(\\n\",\n              \"              (dense): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n              \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"            )\\n\",\n              \"          )\\n\",\n              \"          (intermediate): BertIntermediate(\\n\",\n              \"            (dense): Linear(in_features=768, out_features=3072, bias=True)\\n\",\n              \"          )\\n\",\n              \"          (output): BertOutput(\\n\",\n              \"            (dense): Linear(in_features=3072, out_features=768, bias=True)\\n\",\n              \"            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n              \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"          )\\n\",\n              \"        )\\n\",\n              \"        (2): BertLayer(\\n\",\n              \"          (attention): BertAttention(\\n\",\n              \"            (self): BertSelfAttention(\\n\",\n              \"              (query): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (key): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (value): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"            )\\n\",\n              \"            (output): BertSelfOutput(\\n\",\n              \"              (dense): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n              \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"            
)\\n\",\n              \"          )\\n\",\n              \"          (intermediate): BertIntermediate(\\n\",\n              \"            (dense): Linear(in_features=768, out_features=3072, bias=True)\\n\",\n              \"          )\\n\",\n              \"          (output): BertOutput(\\n\",\n              \"            (dense): Linear(in_features=3072, out_features=768, bias=True)\\n\",\n              \"            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n              \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"          )\\n\",\n              \"        )\\n\",\n              \"        (3): BertLayer(\\n\",\n              \"          (attention): BertAttention(\\n\",\n              \"            (self): BertSelfAttention(\\n\",\n              \"              (query): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (key): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (value): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"            )\\n\",\n              \"            (output): BertSelfOutput(\\n\",\n              \"              (dense): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n              \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"            )\\n\",\n              \"          )\\n\",\n              \"          (intermediate): BertIntermediate(\\n\",\n              \"            (dense): Linear(in_features=768, out_features=3072, bias=True)\\n\",\n              \"          )\\n\",\n              \"          (output): BertOutput(\\n\",\n              \"            (dense): Linear(in_features=3072, out_features=768, bias=True)\\n\",\n          
    \"            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n              \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"          )\\n\",\n              \"        )\\n\",\n              \"        (4): BertLayer(\\n\",\n              \"          (attention): BertAttention(\\n\",\n              \"            (self): BertSelfAttention(\\n\",\n              \"              (query): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (key): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (value): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"            )\\n\",\n              \"            (output): BertSelfOutput(\\n\",\n              \"              (dense): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n              \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"            )\\n\",\n              \"          )\\n\",\n              \"          (intermediate): BertIntermediate(\\n\",\n              \"            (dense): Linear(in_features=768, out_features=3072, bias=True)\\n\",\n              \"          )\\n\",\n              \"          (output): BertOutput(\\n\",\n              \"            (dense): Linear(in_features=3072, out_features=768, bias=True)\\n\",\n              \"            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n              \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"          )\\n\",\n              \"        )\\n\",\n              \"        (5): BertLayer(\\n\",\n              \"          (attention): BertAttention(\\n\",\n              \"            (self): BertSelfAttention(\\n\",\n 
             \"              (query): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (key): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (value): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"            )\\n\",\n              \"            (output): BertSelfOutput(\\n\",\n              \"              (dense): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n              \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"            )\\n\",\n              \"          )\\n\",\n              \"          (intermediate): BertIntermediate(\\n\",\n              \"            (dense): Linear(in_features=768, out_features=3072, bias=True)\\n\",\n              \"          )\\n\",\n              \"          (output): BertOutput(\\n\",\n              \"            (dense): Linear(in_features=3072, out_features=768, bias=True)\\n\",\n              \"            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n              \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"          )\\n\",\n              \"        )\\n\",\n              \"        (6): BertLayer(\\n\",\n              \"          (attention): BertAttention(\\n\",\n              \"            (self): BertSelfAttention(\\n\",\n              \"              (query): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (key): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (value): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"            
)\\n\",\n              \"            (output): BertSelfOutput(\\n\",\n              \"              (dense): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n              \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"            )\\n\",\n              \"          )\\n\",\n              \"          (intermediate): BertIntermediate(\\n\",\n              \"            (dense): Linear(in_features=768, out_features=3072, bias=True)\\n\",\n              \"          )\\n\",\n              \"          (output): BertOutput(\\n\",\n              \"            (dense): Linear(in_features=3072, out_features=768, bias=True)\\n\",\n              \"            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n              \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"          )\\n\",\n              \"        )\\n\",\n              \"        (7): BertLayer(\\n\",\n              \"          (attention): BertAttention(\\n\",\n              \"            (self): BertSelfAttention(\\n\",\n              \"              (query): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (key): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (value): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"            )\\n\",\n              \"            (output): BertSelfOutput(\\n\",\n              \"              (dense): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n              \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"            )\\n\",\n              \"   
       )\\n\",\n              \"          (intermediate): BertIntermediate(\\n\",\n              \"            (dense): Linear(in_features=768, out_features=3072, bias=True)\\n\",\n              \"          )\\n\",\n              \"          (output): BertOutput(\\n\",\n              \"            (dense): Linear(in_features=3072, out_features=768, bias=True)\\n\",\n              \"            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n              \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"          )\\n\",\n              \"        )\\n\",\n              \"        (8): BertLayer(\\n\",\n              \"          (attention): BertAttention(\\n\",\n              \"            (self): BertSelfAttention(\\n\",\n              \"              (query): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (key): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (value): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"            )\\n\",\n              \"            (output): BertSelfOutput(\\n\",\n              \"              (dense): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n              \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"            )\\n\",\n              \"          )\\n\",\n              \"          (intermediate): BertIntermediate(\\n\",\n              \"            (dense): Linear(in_features=768, out_features=3072, bias=True)\\n\",\n              \"          )\\n\",\n              \"          (output): BertOutput(\\n\",\n              \"            (dense): Linear(in_features=3072, out_features=768, bias=True)\\n\",\n              \"            
(LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n              \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"          )\\n\",\n              \"        )\\n\",\n              \"        (9): BertLayer(\\n\",\n              \"          (attention): BertAttention(\\n\",\n              \"            (self): BertSelfAttention(\\n\",\n              \"              (query): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (key): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (value): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"            )\\n\",\n              \"            (output): BertSelfOutput(\\n\",\n              \"              (dense): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n              \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"            )\\n\",\n              \"          )\\n\",\n              \"          (intermediate): BertIntermediate(\\n\",\n              \"            (dense): Linear(in_features=768, out_features=3072, bias=True)\\n\",\n              \"          )\\n\",\n              \"          (output): BertOutput(\\n\",\n              \"            (dense): Linear(in_features=3072, out_features=768, bias=True)\\n\",\n              \"            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n              \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"          )\\n\",\n              \"        )\\n\",\n              \"        (10): BertLayer(\\n\",\n              \"          (attention): BertAttention(\\n\",\n              \"            (self): BertSelfAttention(\\n\",\n              \"  
            (query): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (key): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (value): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"            )\\n\",\n              \"            (output): BertSelfOutput(\\n\",\n              \"              (dense): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n              \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"            )\\n\",\n              \"          )\\n\",\n              \"          (intermediate): BertIntermediate(\\n\",\n              \"            (dense): Linear(in_features=768, out_features=3072, bias=True)\\n\",\n              \"          )\\n\",\n              \"          (output): BertOutput(\\n\",\n              \"            (dense): Linear(in_features=3072, out_features=768, bias=True)\\n\",\n              \"            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n              \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"          )\\n\",\n              \"        )\\n\",\n              \"        (11): BertLayer(\\n\",\n              \"          (attention): BertAttention(\\n\",\n              \"            (self): BertSelfAttention(\\n\",\n              \"              (query): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (key): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (value): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"            )\\n\",\n              
\"            (output): BertSelfOutput(\\n\",\n              \"              (dense): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n              \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"            )\\n\",\n              \"          )\\n\",\n              \"          (intermediate): BertIntermediate(\\n\",\n              \"            (dense): Linear(in_features=768, out_features=3072, bias=True)\\n\",\n              \"          )\\n\",\n              \"          (output): BertOutput(\\n\",\n              \"            (dense): Linear(in_features=3072, out_features=768, bias=True)\\n\",\n              \"            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n              \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"          )\\n\",\n              \"        )\\n\",\n              \"      )\\n\",\n              \"    )\\n\",\n              \"    (pooler): BertPooler(\\n\",\n              \"      (dense): Linear(in_features=768, out_features=768, bias=True)\\n\",\n              \"      (activation): Tanh()\\n\",\n              \"    )\\n\",\n              \"  )\\n\",\n              \"  (dropout): Dropout(p=0.1, inplace=False)\\n\",\n              \"  (classifier): Linear(in_features=768, out_features=2, bias=True)\\n\",\n              \")\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          },\n          \"execution_count\": 40\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"cJO7qtU_SsDy\"\n      },\n      \"source\": [\n        \"##@title Optimizer Grouped Parameters\\n\",\n        \"#This code is taken from:\\n\",\n        \"# 
https://github.com/huggingface/transformers/blob/5bfcd0485ece086ebcbed2d008813037968a9e58/examples/run_glue.py#L102\\n\",\n        \"\\n\",\n        \"# Don't apply weight decay to any parameters whose names include these tokens.\\n\",\n        \"# (Here, this BERT model doesn't have `gamma` or `beta` parameters, only `bias` terms)\\n\",\n        \"param_optimizer = list(model.named_parameters())\\n\",\n        \"no_decay = ['bias', 'LayerNorm.weight']\\n\",\n        \"# Separate the `weight` parameters from the `bias` parameters. \\n\",\n        \"# - For the `weight` parameters, this specifies a 'weight_decay_rate' of 0.1. \\n\",\n        \"# - For the `bias` parameters, the 'weight_decay_rate' is 0.0. \\n\",\n        \"optimizer_grouped_parameters = [\\n\",\n        \"    # Filter for all parameters whose names *don't* include 'bias' or 'LayerNorm.weight'.\\n\",\n        \"    {'params': [p for n, p in param_optimizer if not any(nd in n for nd in no_decay)],\\n\",\n        \"     'weight_decay_rate': 0.1},\\n\",\n        \"    \\n\",\n        \"    # Filter for parameters which *do* include those.\\n\",\n        \"    {'params': [p for n, p in param_optimizer if any(nd in n for nd in no_decay)],\\n\",\n        \"     'weight_decay_rate': 0.0}\\n\",\n        \"]\\n\",\n        \"# Note - `optimizer_grouped_parameters` only includes the parameter values, not \\n\",\n        \"# the names.\"\n      ],\n      \"execution_count\": 41,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"GLs72DuMODJO\"\n      },\n      \"source\": [\n        \"#@title The Hyperparameters for the Training Loop \\n\",\n        \"# optimizer = BertAdam(optimizer_grouped_parameters,\\n\",\n        \"#                      lr=2e-5,\\n\",\n        \"#                      warmup=.1)\\n\",\n        \"\\n\",\n        \"# Number of training epochs (authors recommend between 2 and 4)\\n\",\n        \"epochs = 4\\n\",\n        \"\\n\",\n        
\"optimizer = AdamW(optimizer_grouped_parameters,\\n\",\n        \"                  lr = 2e-5, # args.learning_rate - default is 5e-5, our notebook had 2e-5\\n\",\n        \"                  eps = 1e-8 # args.adam_epsilon  - default is 1e-8.\\n\",\n        \"                  )\\n\",\n        \"# Total number of training steps is number of batches * number of epochs.\\n\",\n        \"# `train_dataloader` contains batched data so `len(train_dataloader)` gives \\n\",\n        \"# us the number of batches.\\n\",\n        \"total_steps = len(train_dataloader) * epochs\\n\",\n        \"\\n\",\n        \"# Create the learning rate scheduler.\\n\",\n        \"scheduler = get_linear_schedule_with_warmup(optimizer, \\n\",\n        \"                                            num_warmup_steps = 0, # Default value in run_glue.py\\n\",\n        \"                                            num_training_steps = total_steps)\"\n      ],\n      \"execution_count\": 42,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"9cQNvaZ9bnyy\"\n      },\n      \"source\": [\n        \"#Creating the Accuracy Measurement Function\\n\",\n        \"# Function to calculate the accuracy of our predictions vs labels\\n\",\n        \"def flat_accuracy(preds, labels):\\n\",\n        \"    pred_flat = np.argmax(preds, axis=1).flatten()\\n\",\n        \"    labels_flat = labels.flatten()\\n\",\n        \"    return np.sum(pred_flat == labels_flat) / len(labels_flat)\"\n      ],\n      \"execution_count\": 43,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"6J-FYdx6nFE_\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"791a450e-3e14-435f-bdd8-77f54e520e99\"\n      },\n      \"source\": [\n        \"#@title The Training Loop\\n\",\n        \"t = [] \\n\",\n        \"\\n\",\n        \"# Store our loss and accuracy 
for plotting\\n\",\n        \"train_loss_set = []\\n\",\n        \"\\n\",\n        \"# trange is a tqdm wrapper around the normal python range\\n\",\n        \"for _ in trange(epochs, desc=\\\"Epoch\\\"):\\n\",\n        \"  \\n\",\n        \"  \\n\",\n        \"  # Training\\n\",\n        \"  \\n\",\n        \"  # Set our model to training mode (as opposed to evaluation mode)\\n\",\n        \"  model.train()\\n\",\n        \"  \\n\",\n        \"  # Tracking variables\\n\",\n        \"  tr_loss = 0\\n\",\n        \"  nb_tr_examples, nb_tr_steps = 0, 0\\n\",\n        \"  \\n\",\n        \"  # Train the data for one epoch\\n\",\n        \"  for step, batch in enumerate(train_dataloader):\\n\",\n        \"    # Add batch to GPU\\n\",\n        \"    batch = tuple(t.to(device) for t in batch)\\n\",\n        \"    # Unpack the inputs from our dataloader\\n\",\n        \"    b_input_ids, b_input_mask, b_labels = batch\\n\",\n        \"    # Clear out the gradients (by default they accumulate)\\n\",\n        \"    optimizer.zero_grad()\\n\",\n        \"    # Forward pass\\n\",\n        \"    outputs = model(b_input_ids, token_type_ids=None, attention_mask=b_input_mask, labels=b_labels)\\n\",\n        \"    loss = outputs['loss']\\n\",\n        \"    train_loss_set.append(loss.item())    \\n\",\n        \"    # Backward pass\\n\",\n        \"    loss.backward()\\n\",\n        \"    # Update parameters and take a step using the computed gradient\\n\",\n        \"    optimizer.step()\\n\",\n        \"\\n\",\n        \"    # Update the learning rate.\\n\",\n        \"    scheduler.step()\\n\",\n        \"    \\n\",\n        \"    \\n\",\n        \"    # Update tracking variables\\n\",\n        \"    tr_loss += loss.item()\\n\",\n        \"    nb_tr_examples += b_input_ids.size(0)\\n\",\n        \"    nb_tr_steps += 1\\n\",\n        \"\\n\",\n        \"  print(\\\"Train loss: {}\\\".format(tr_loss/nb_tr_steps))\\n\",\n        \"    \\n\",\n        \"    \\n\",\n        \"  # 
Validation\\n\",\n        \"\\n\",\n        \"  # Put model in evaluation mode to evaluate loss on the validation set\\n\",\n        \"  model.eval()\\n\",\n        \"\\n\",\n        \"  # Tracking variables \\n\",\n        \"  eval_loss, eval_accuracy = 0, 0\\n\",\n        \"  nb_eval_steps, nb_eval_examples = 0, 0\\n\",\n        \"\\n\",\n        \"  # Evaluate data for one epoch\\n\",\n        \"  for batch in validation_dataloader:\\n\",\n        \"    # Add batch to GPU\\n\",\n        \"    batch = tuple(t.to(device) for t in batch)\\n\",\n        \"    # Unpack the inputs from our dataloader\\n\",\n        \"    b_input_ids, b_input_mask, b_labels = batch\\n\",\n        \"    # Telling the model not to compute or store gradients, saving memory and speeding up validation\\n\",\n        \"    with torch.no_grad():\\n\",\n        \"      # Forward pass, calculate logit predictions\\n\",\n        \"      logits = model(b_input_ids, token_type_ids=None, attention_mask=b_input_mask)\\n\",\n        \"    \\n\",\n        \"    # Move logits and labels to CPU\\n\",\n        \"    logits = logits['logits'].detach().cpu().numpy()\\n\",\n        \"    label_ids = b_labels.to('cpu').numpy()\\n\",\n        \"\\n\",\n        \"    tmp_eval_accuracy = flat_accuracy(logits, label_ids)\\n\",\n        \"    \\n\",\n        \"    eval_accuracy += tmp_eval_accuracy\\n\",\n        \"    nb_eval_steps += 1\\n\",\n        \"\\n\",\n        \"  print(\\\"Validation Accuracy: {}\\\".format(eval_accuracy/nb_eval_steps))\"\n      ],\n      \"execution_count\": 44,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\rEpoch:   0%|          | 0/4 [00:00<?, ?it/s]\"\n          ],\n          \"name\": \"stderr\"\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Train loss: 0.4848619277793837\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          
\"output_type\": \"stream\",\n          \"text\": [\n            \"\\rEpoch:  25%|██▌       | 1/4 [03:06<09:20, 186.97s/it]\"\n          ],\n          \"name\": \"stderr\"\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Validation Accuracy: 0.8182870370370371\\n\",\n            \"Train loss: 0.2855956747942446\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\rEpoch:  50%|█████     | 2/4 [06:15<06:14, 187.34s/it]\"\n          ],\n          \"name\": \"stderr\"\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Validation Accuracy: 0.8283179012345678\\n\",\n            \"Train loss: 0.17187516562857075\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\rEpoch:  75%|███████▌  | 3/4 [09:23<03:07, 187.60s/it]\"\n          ],\n          \"name\": \"stderr\"\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Validation Accuracy: 0.82445987654321\\n\",\n            \"Train loss: 0.10979274041270566\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Epoch: 100%|██████████| 4/4 [12:31<00:00, 187.93s/it]\"\n          ],\n          \"name\": \"stderr\"\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Validation Accuracy: 0.8217592592592593\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\n\"\n          ],\n          \"name\": \"stderr\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"68xreA9JAmG5\",\n        
\"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 513\n        },\n        \"outputId\": \"c7a845b2-9633-406b-bb6b-9d45a966c02f\"\n      },\n      \"source\": [\n        \"#@title Training Evaluation\\n\",\n        \"plt.figure(figsize=(15,8))\\n\",\n        \"plt.title(\\\"Training loss\\\")\\n\",\n        \"plt.xlabel(\\\"Batch\\\")\\n\",\n        \"plt.ylabel(\\\"Loss\\\")\\n\",\n        \"plt.plot(train_loss_set)\\n\",\n        \"plt.show()\"\n      ],\n      \"execution_count\": 45,\n      \"outputs\": [\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"image/png\": \"iVBORw0KGgoAAAANSUhEUgAAA3wAAAHwCAYAAAD9+W2oAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjIsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+WH4yJAAAgAElEQVR4nOy9eZwkR33m/URWdfdcutBIIJDECCNzYxsLMGBsMGALH+B38euFlzWLP2Ds3QXv2l7bY3stMNh+sbENthcfnMbmXhuDQCBOHUgIoRGgG0kjaaSZkWY0M9LM9ExPd1dlxv6RFZkRkRGZkVWZXUc/389Hqq6qyMjIqurpeOr5HUJKCUIIIYQQQgghs0c07gUQQgghhBBCCGkHCj5CCCGEEEIImVEo+AghhBBCCCFkRqHgI4QQQgghhJAZhYKPEEIIIYQQQmYUCj5CCCGEEEIImVEo+AghhKwLhBBfEEL856bH1lzDC4QQe5qelxBCCPHRHfcCCCGEEB9CiGPa3U0AVgDEg/u/KqX8SOhcUsqXtjGWEEIImWQo+AghhEwsUsot6mchxC4Ar5dSfsUeJ4ToSin7a7k2QgghZBpgSCchhJCpQ4VGCiF+VwixD8AHhRCnCSE+J4Q4IIR4ePDz2doxlwshXj/4+bVCiKuEEH8xGHuPEOKlQ449TwhxpRBiUQjxFSHEu4UQHw68jicNznVYCHGLEOJl2nM/LYS4dTDvXiHE/xw8vnVwbYeFEA8JIb4uhODfc0IIIU74B4IQQsi08igAjwDwWABvQPo37YOD++cCOAHgf5cc/2wAtwPYCuDPAbxfCCGGGPtRAN8CcDqAtwD4pZDFCyHmAHwWwJcAnAngTQA+IoR4wmDI+5GGrZ4E4KkAvjZ4/LcA7AFwBoBHAvh9ADLknIQQQtYfFHyEEEKmlQTAm6WUK1LKE1LKQ1LKf5NSLkkpFwH8CYAfLzn+Xinle6WUMYAPATgLqYAKHiuEOBfAMwFcJKVclVJeBeDiwPX/CIAtAN4+OPZrAD4H4FWD53sAniyEOFlK+bCU8tva42cBeKyUsiel/LqUkoKPEEKIEwo+Qggh08oBKeWyuiOE2CSE+EchxL1CiKMArgRwqhCi4zl+n/pBSrk0+HFLzbGPBvCQ9hgA7A5c/6MB7JZSJtpj9wJ4zODnVwD4aQD3CiGuEEI8Z/D4OwDsBPAlIcTdQojtgecjhBCyDqHgI4QQMq3YrtZvAXgCgGdLKU8G8GODx31hmk3wAIBHCCE2aY+dE3js/QDOsfLvzgWwFwCklNdJKV+ONNzz0wA+OXh8UUr5W1L
KxwF4GYDfFEK8aMTrIIQQMqNQ8BFCCJkVTkKat3dYCPEIAG9u+4RSynsB7ADwFiHE/MCF+7nAw68FsATgd4QQc0KIFwyO/fhgrlcLIU6RUvYAHEUawgohxM8KIR4/yCE8grRNReI+BSGEkPUOBR8hhJBZ4V0ANgI4COCbAC5do/O+GsBzABwC8McAPoG0X2ApUspVpALvpUjX/HcAXiOl/N5gyC8B2DUIT/21wXkA4HwAXwFwDMA1AP5OSnlZY1dDCCFkphDM8yaEEEKaQwjxCQDfk1K27jASQgghVdDhI4QQQkZACPFMIcT3CSEiIcSFAF6ONOeOEEIIGTvdcS+AEEIImXIeBeBTSPvw7QHwX6SU3xnvkgghhJAUhnQSQgghhBBCyIzCkE5CCCGEEEIImVEo+AghhBBCCCFkRpm6HL6tW7fKbdu2jXsZhBBCCCGEEDIWrr/++oNSyjNCxk6d4Nu2bRt27Ngx7mUQQgghhBBCyFgQQtwbOpYhnYQQQgghhBAyo1DwEUIIIYQQQsiMQsFHCCGEEEIIITMKBR8hhBBCCCGEzCgUfIQQQgghhBAyo1DwEUIIIYQQQsiM0qrgE0JcKIS4XQixUwix3fH8O4UQ3x38d4cQ4nCb6yGEEEIIIYSQ9URrffiEEB0A7wbwEgB7AFwnhLhYSnmrGiOl/A1t/JsA/FBb6yGEEEIIIYSQ9UabDt+zAOyUUt4tpVwF8HEALy8Z/yoAH2txPYQQQgghhBCyrmhT8D0GwG7t/p7BYwWEEI8FcB6Ar3mef4MQYocQYseBAwcaXyghhBBCCCGEzCKTUrTllQD+VUoZu56UUr5HSnmBlPKCM844Y42XRgghhBBCCCHTSZuCby+Ac7T7Zw8ec/FKMJyTEEIIIYQQQhqlTcF3HYDzhRDnCSHmkYq6i+1BQognAjgNwDUtroUQQgghhBBC1h2tCT4pZR/AGwF8EcBtAD4ppbxFCPFWIcTLtKGvBPBxKaVsay2EEEIIIYQQsh5prS0DAEgpPw/g89ZjF1n339LmGgghhBBCCCFkvTIpRVsIIYQQQgghhDQMBR8hhBBCCCGEzCgUfGSq+Im/uBwXvuvKcS+DEEIIIYSQqaDVHL71zOJyD/PdCAvdzriXMlPcffD4uJdACCGEEELI1ECHryWe9pYv4Rf+np0mCCGEEEIIIeODgq9Fbtp7ZNxLIIQQQgghhKxjKPgIIYQQQgghZEah4COEEEIIIYSQGYWCjxBCCCGEEEJmFAq+CeU79z2MbdsvwX2Hlsa9FEIIIYQQQsiUQsE3ofyf6/cAAK6488CYV0IIIYQQQgiZVij4JoQ4kfj37+xBnEgAQCQGT0g5vkURQgghhBBCphoKvgnh/+zYjd/4xA340Dd2AQAikSq+hHqPEEIIIYQQMiQUfBPCsZU+AGD3w2nOXi74mlN8/ThBP04am48QQgghhBAy2VDwjYl9R5bxor+8HHfuXwQAbJzvAABOrMbGuCYdvh/+46/gWX/61eYmJIQQQgghhEw0FHxj4jPf3Yu7DhzHe79+NwBg00DwLQ0En3L4ZIMO35ETPTx0fLWx+QghhBBCCCGTDQXfmNg1aLew3EtDLBe6A4evlwq+gd4Lrtny/qvuwbbtlxQcQkIIIYQQQsj6hYJvDGz/txvxsW/dBwC47YGjAJBV5zyROXzp2NAcvn+84i4AwNHlXpNLXTM+/M17sffwiXEvgxBCCCGEkJmCgm8MfPGWfQCAR2yex10HjuHEaox+kjp9S6tp8Za6VTrVMOUMThPLvRj/69M347M33D/upawZ9x8+gW3bL8GNew6PeymEEEIIIWSGoeAbA2eftgkveMIZ+M2XfD8SCSwu99CPU8mmcviEyuFDmOJTuX4C7So+VU20SZS7GQ9RoebH33EZfvmD32p6Sa1zxR0HAAAfvfa+Ma+EEEIIIYTMMt1xL2AWqSq0kkiJbiTQGcRtxlKir0I6h8zhU1qpyTYONg8uLuNH334
ZHnv6Jtz54DHsevvPNDLvKCu+99AS7h3kQ04TLb5NhBBCCCGEZNDha4GqzXwiUwcvz9NDJviW7By+QNdLCb1+i53aDy/1sBonuPPBYwCaqyCq5gm91mF562dvxZ9ccmur56jLNIbgEkIIIYSQ6YGCrwWqXDYpJSKh5eklMmuIfsJqyxCcw6ccvhZFk31dxxuqCKqW3Lbp9YGr78F7v35Py2chhBBCCCFkcqDga4EqzRUnEpEQmqiTWg5fmiNXN4dPibFh8uBCGdSVyTi8VN7T7/p7H8ZyL0AUKsG3jsIcQ99XQgghhBBCRoGCrwWqNvOJlIj0HL4kz+FLZOrSiWxs4DkH4+IWVZPt8B1e8reAuO/QEl7x99/ARZ+5uXJe9Xq1mX84uTCmkxBCCCGEtAcFXwtU6RYp05BNoefwxbl9ttyPM/cvNE8uWYM8OHspR074BZ/qB3jz3qPB865HuUcIIYQQQkibUPC1QJVTlQxy+JTDJ7UqnUDq+ImajdfVsDaLttjOZZnDVwd1jU0VgSGEEEIIIYSkUPC1QGUOnzRz+NK2DLnDl0jN9Qpuy7AGOXzW1A+X5PBlbSUC5lVj1pPeW0/XSgghhBBCxgcFXwtUOnwJzKItienMSSmzXLzQnLysSucQSuLSm/fhaW/5YmWBFXvuspBO1QA+xLUbZe3TDtsyEEIIIYSQNqHga4HqHD7VliG9r1fpTO/nuXhxHCaCRnH4/vTzt2FxuY99R5Yr161TVqWzTuN4Ne/6k3uEEEIIIYS0CwVfC1S5WiqkU+XwJVIaQi1OZO1G6mrUMIIv1GWypy7L4ctDOgMcPnU7Q4rvTR/7Dj70jV3jXgYhhBBCCFnnUPC1QJXmSiQQRVoOXyLR06p06iGdem7f9/YdxSd37PbMWe7wNVEQxa4A+nCJ4MurjFbPm+crTpfiW+7FeOtnb80qkup89ob78eaLb/EeO11XSgghhBBCppXuuBcwi1TlomUhnZnDZwo1I6RTe/zCd30dAPCLF5zjmDO99eX8hTh/VSPsKVbjxD0QeXe5EGGTrEFIZxvtKv71+j34wNX3AAAu+rknDzUHU/gIIYQQQkib0OFriNd+8Ft42+duBRDSlkEVbUE2vqfl6sVSQmmpXizx4OIytm2/JHt+pe8vrpJ4NFhZaGio6LDDM8MKsoQLzTZ7CPZ8L8wIrPbTOddjsRlCCCGEEDIdUPA1xOW3H8D7r0rdniqrKk5U0RZVpVMi1tsyaDl8cSKx+6El4/hjy318466DeM7//1UsrfaN5/oeYdNEuwZb15TNWastwxo4fL3A4jd1UO9RNEypzTGJxN0PLeH4Sr96ICGEEEIImQko+FqgOodPmjl8UqJntGXIxUQvTrBlYc44/thKH3/2he/hgSPLuGP/scLcLkLaO7zun67DX335jtJ1h5wrJVzxrUVbhl6/eYcvF3zDz7HWbRme/+eX4dXvu3ZtT0oIIYQQQsYGBV8LVOfwmSGdUprtF2KtaqfLRVtc7mdKwT6XL62urL2DGMx198Hj+Juv3ukdZy8lJEoyRMTVbTI/DL2SfMNhUa9HNIriGwPf3X143EsghBBCCCFrBAVfw+jhmN4xg6ItelsGPRQzkVJz+IrzHV3u5UVRAsMsQ9s7VK277L5JeJhmSOuGUSkrMDMs6rUeJqSz7hV/b99R9Fu4BkIIIYQQMttQ8DXM0eWeIcJchUjSHD6ROWtxIg1BJg2HLymIuGPLfS2M0Hb4hq/SWYVdgKUsTLSOW7cmIZ0t5PDJBkI6Q7hj/yIufNfX8c6v+MNtCSGEEEIIcUHB1zAHj60aYscliuSgD59y+KQE+npIZ5KHC/YdjuHicj8Ti7aO87ZlKBFTtl751Lf34Ef+9KsFsWpPUVZVUz0VouGytgw1NNlyz1+p1EWrIZ0tJ+LtP7oMgKGYhBBCCCGkPhR8DXPo2Ioh0FzOmgrpVM5Q6vBZIZ2D4/qxLDp8K31vSKdPhJXl8NmKb/unbsK
+o8uFMMhCDl/JlDIL6azRlqGG4nviH16KS258IHj8agtFW7KQziEsPnWpYoo78f3av1yPy29/cNzLIIQQQgghJVDwNYAusg4eWzWEi0vEpIIvr9KZSGk4fImUmSPXT5KCsFpc7mXVHYtFW3w5fPUFT1XOXlmYaBamGXDarGhLrdUBX71tf/DYNhw+FdLZadnhU6Jwktr9SSlx6S378NoPXjfupRBCCCGEkBIo+BpAb+p98NiKIdBsUSSlRCLTypiG4NPGJUl+nDOkc6WfiYB+HCbCauXwaeGk9tp1yhy5ejl80rgNPq7G2FYcvjXK4RNaNddJYZLWQgghhBBC/FDwNYAuuhaXe3jgyInsvu1wqY1yRwitSicKIZ1qnCukM83hS3/uWSdoIodPhWHa4aHFkM4SwYdwEadG1BURdQRiO43X09uykE7fGtXjIeZgU3qyrqAunauxmQghhBBCSJt0x72AWUAXfH/xJbOSoi209GbdRg5fLDHfjbDaT9KQTt3h05RWJxI4pgs+y7nyhnQOIXhsh68Y4uk/tk6Y5rB9+HzDL715H57zuNNxyqa8YX07RVuq2zL0E4m5TjOSbdT2FU26ck2KR0IIIYQQ0h50+BqgrMebLcCyMMBIZM6QCulc6ETZ/SyHL85z+D7y+mfjyWednObwqZBOSxD6XLdh2jIUw1HN58uqdNYRcWrN1Q3ry9cDAPcfPoFf+/D1eOPHvm083kYfPnX9ZSGdK02EkjYU0tmkRGugywchhBBCCFkDKPgaoKwgii1i1F27aEucpA5fej8XE3GSi78NcxG2LHRxfCVGNHjnenFihHX6nLyyxuvCcqiycNJKhy8gpDOkSmegG1hoMu84vxJYux9aMh5vw+FTU3ZKFN+Kp32EWnmI95cVbamxNhdN9jkc1W0khBBCCCFrAwVfA/T6/s2v7ZK5QjqTJBUkc8rh00ReL0k0J0lg80IHx1fzoi29uFjh00Wdzb4aWdWHL6TxumvIdbseMnrKZTmDVQ6fdd/lMPpCDZXga7KgplqvLZh1mnAWm1pzsyGd6W3LBUoJIYQQQsiIUPA1gF04RUcJvm3bL8GbP3Oz0axbOXyx5fDdvn8R+4+upM9pRVs6kcCm+S6Or+Q5fP04sZq2j57Dp0RTpcNXomWkdavz//7DNfj5d1+tna9kcMn5y8JUbRGmRPlc1NxHXmri3cdKr0FnceSQTrpyhBBCCCHrjVYFnxDiQiHE7UKInUKI7Z4xvyiEuFUIcYsQ4qNtrqctysSULlI+dM29mUgRIq/uKKVEL84F30WfuQW3PXAUANDT2jKkDl8Xx1fjTNDYIZ3eKp1l4sjzeJxUNV4vc/hqVOmU1fPp40LOb6Octm5DBVSA/LUuDemsyOErcwebpkmHL3M3m5uSEEIIIYS0QGtVOoUQHQDvBvASAHsAXCeEuFhKeas25nwAvwfgeVLKh4UQZ7a1njYpyw+LE2n0gJOaSOhkOXypuJrvFPV3bAu++Q6WVvrZRtsO6Yy9OXzhTpOaocrhK228Hnw2Pd8vbFy+nvBzqPeo22DTPHX+0pDOBoq2NNeWoaGJGp6LEEIIIYS0R5sO37MA7JRS3i2lXAXwcQAvt8b8CoB3SykfBgAp5YMtrqc1ygRfIiVOrMba/fQ2DelMf1ZtGea6xbejFydGcZBNC10saYVA+klinH8Yh89GTVE4puCwFY+9ee8RvOXiWzSHL/x8VWNHcfjUazTnENXDEhTS2fcUbRlCMI3clqHBkE7qPUIIIYSQ6aDNPnyPAbBbu78HwLOtMd8PAEKIqwF0ALxFSnlpi2tqhbKm3nECLPX62X2jaIsW0qm3ZTCPzx2+TgRsnu9AylxI9GJpOHG+Vgnl+W6+tdev0vmq93wTiyt9/Pj3nwEgTBiEt2UoX1/Z+dR71GRIpzJNRYkH10RbBuUgjuqqNdlKIaRgDSGEEEIIGT/jbrzeBXA+gBcAOBvAlUKIp0kpD+uDhBBvAPAGADj
33HPXeo2V9CtCOo+vaA5fkm+Us6ItiUQ/SbIcPnNuaWyuNy2kb9nSqhJ8iXH+4XL43Jv2YkgnrPvVgivEhQus2RIkOLPqkdbjKrSy22DRljhAqDYS0qn68I04T5PN0n2vMyGEEEIImSzaDOncC+Ac7f7Zg8d09gC4WErZk1LeA+AOpALQQEr5HinlBVLKC84444zWFjwsvRIx5QvptHP4+lqVTnPuJK/SKQS2LHQAAMeWU9ewH0vDYfT12yvrw+ejyuFzOmwqlFPl5dUK6axw+CrWZ2ApkTb68CnxXrZqb0hnjfM0lsPX0DzNT0YIIYQQQtqiTcF3HYDzhRDnCSHmAbwSwMXWmE8jdfcghNiKNMTz7hbX1Aq9EhcnTiSWVt0hnWLw6veTBFLCWbRFylysqbYMAHB0uZeeO06MgizDhHSWrd1ci32/eIx6SC0prEpnmDi053LXoSkXvE26XEmuVL1jmgjpVIy6dtlohwgqPkIIIYSQaaA1wSel7AN4I4AvArgNwCellLcIId4qhHjZYNgXARwSQtwK4DIAvy2lPNTWmtpCCS5XOlMsZRZ+GQlobRlyh0+F/bkcPiBv+xBFApuV4DuRisie5fD5jCxdvNnCITyHr/z5dO7Bc8M0eq84pnD+kpDO4uNhlUDroNZTpqWb6MPXWEhng1efVyhtbEpCCCGEENICrebwSSk/D+Dz1mMXaT9LAL85+G9qWR0Iro1znUzcKZIkF3zdTpQJko6Ww3fw2CoA4NRNc875laCMBLBpENKp+sr1EzOHzyeadHGWSCCkdkllW4YSgRYHhDsqQqt02pO5BKdPfCWh56hBnDmTJTl8HgWujgkRTE2tudm2DHT4CCGEEEKmgVYbr68XlODaONcpPKeHdM5FIg/pjNL/AODeQ8cBAI89fbNzfuUApjl8pkbvxYkhKnz99volDp8Pu/G6fVjZPHkIZfV51DxVUae24HSd3yd4M1HZZGuCANfQt546LSWaqq5Z55xVqJnKKpSSyeEDV92D5739a+NeBiGEEELGAAVfA6iQyw0uwaeFdHZ0wac5fLsOLQEAtp2+yTl/Twvp3DTfKTxn9OHzhnTqLmDlJTnnCiraMpACSY2cuXwac+yhYyu45q48wrdQtKVE8NkypE4RmVDUS1r2evpyKuuso05Pw9J5RjvcnIsG31Tx1s/dir2HT4x7GYQQQggZAxR8DaActo3zRcGXJDBCOvUWC7ngOw4hgHMe4RN8ucOncvj051b7WrhmQNGWUKenyuFzncpu2h4U0ukRY//xPd/Eq977TW1uW3D6z+97vEmdEhLSWRViGuKQSet2WBp1+NiXgRBCCCFkKqDga4DSkE4pcWIQ0hkJrS2DEIhUMQ4JnHXyBmzoFo/X54+EyHL48ucsh8+zqR+mLUNVDh9QFJjqXlzHlcqKn5iDdz54bDCHO+TTJW7rNm8fhTD30uPw1ZBvw6w5Scx2IIOTNgYNPkIIIYSQ6YCCrwGUMHIJviSROJ41SZcw2jIIkRXtOOcRm+DrCb6ahXSmrRu6UW6rpA5fvbYMtggRnsohVVU6AWD7p27E4aVV77EhwiYrqOJ53pd/5w7p9M3RlE+mnT9RQrTM4WsupLPOQRddfDOedNGlZnXW8FOWcs/B4/jKbfsbmo0QQgghhLRJq1U61wsqpHODI6Qz1qp0rvaTLO9LiayOEOhLia1bFrIQT5sspDMSEEJgw1wHx1b62XPq+YVuFNR4PVQ3VDVeB4BP7tiDjXMd/NHLnzqY3DxfUNEWlffnE2vqdgSHzzfHKPgqf+rOny+nUq09qEqndRvCx7+1e3B+ic7gC4KmQjpf+BeXZz8zopMQQgghZLKhw9cA/awtQ/Hl1EM6V+PEcPjS2/SHuY7wCj49pBMwi8P0kzykc8NcxxvSmZQ4fN7rCgwDdY3KiraEHJ8JJ58b5haPruvI2x0I59gmQxF9c+rLqhKgIQyj0/LeffWF/jDnIYQQQgghkwkFXwP
04gRCAPOOHDy9D1+siTPluqgwzm4nKoR0qs10L1EiUQm+fGAvTrDST7LHldDqW9ZSvyS0z7dnLzh8AQJQCYy4Kk5Tn7ci3y9vcD5KSKe6bU71qPUU2kUY5/UI8MB1fOueh3DFHQ8O5gpfmyoGox/DvDtCCCGEkPUHBV8D9GKJuSjCXFSUTrGUONHLi2cs90y3LsTh6/VNkag7fGlbhjyHME4krrzjAB7/B1/ATXuO5OvQBZ8VZuhzaYpFW9zj9MOlJc5ChE0eslhe4MR+1tVy0NUqwjxHc8SesNWgkM7Ahfzd5Tvx3q/fk847xOoNt7Gphn6EEEIIIWRqoOBrgH6cYK4j0O04BF8iDRGy3E/FnxJZnUzwRZU5fEpP6g5fX8vh2zAQfJfffgAAcO09eQ870+EL2/jbAkEXb7q21cMn1Yg6bRnUoMqWCgF9ACsbr7eQw1c4V8B6UOFqKnwCthJRcf6GYON1QgghhJDJhoKvAXpxgm4nQsdRZjORpuBbGbh9StwprdSNokz8FeeXWVVPAEb7hsXlPpZWY0QCmO9GiKVEZ7AMfa8/TON12+HTBVfXV1LUc2wZoW6g/bQrpNPfHk4JrOYEUNYuovA6FcfY+MJUi+cYbm3q+vX52SydEEIIIWT9QcHXAL1EYq4jMOd0+MyNtgrpVOGZ6nauIyCsd0Md14sTw/3TQzrvPngc/3DFXZlDGCcyG6sLIl2TFNsyuK/Lbryuz+FyM9M1mzl8IQKryn3z5fi55s4b23vOUbmacEJcTG8RnUD1NWrRlbL3vQlYtGW6aPILD0IIIYRMBxR8DdDrJ5jrRJl400kSaWz6l3tmSKcSZ92yHL44QRTpgq/4tqn+fImU2Vh9g2/k8A3r8GniQ79W17LzHnXV58ny6ypCOgtFW5whneVz1FF8/3zNLlxz1yHv8962DNpJqltNlC9I19xD7dVZtIVoUO8RQggh6w/24WuAfiLR7QjMddxtGRKH4LNDOuc65SGd+nMLjgbv890IUSTQH4R/AmaooS4sQr/lL+bw5T/r16rncclsbPjOMmu74C3aYt4q6uTwDdOW4aLP3AIA2PX2n3Gvy1elM6BQSh7GWr4Gw+ErH2qgPi7JEO97HWjwTRfUe4QQQsj6gw5fA/zZK56OL/6PH3M6fL04MTb9J6yQTiVa5jqRNzxuNU6MuTc42j8owZjIXBzqYkJ3Ge1Nn6/wRrFKZ7XDp4bUKTaS57P5nneHh7rGe3v5VTxfxm0PHHU+HgeISH/RFnVT4fANncNXdHlZpJNMWkjng4vL+N4+9+8XIYQQQpqBgq8B5rsRNs13nW0ZlnvxoJCKyO4DeZVLJYy6UXnjdf2phUFI56/++OPwrPMeAQCY6wp0ojSHTxV30UXXMLlctmjTD3Nda9mx+Ryux8vz/aRHEK46eh64WjXocwyz3b1h92Hn4/3YHdNpNl53z1nVezCfaziHLnf4jNmCjyezyaR9At79tZ34tX+5ftzLIIQQQmYaCr4GcVXpXO4lSJK0Rx6ArEm6sFw4Xw4gMAjp1J5TYuvkDXM49xGbjONjmYeL+sTCjXuOBPVkKzh82v2Op2iL79hsDqcrN7j1TSYLP6TncAk+r8MXJrBc+F4qX56i7tr5hG+Vq5nNNbTDp46vn7tZ6zys2jJVTJjBh6XV2OhTSgghhJDmoeBrEFflyuVejETKrLKmncOnBEraeN09by9OjBy+7iB/LhIiE5LzSvAlSVZds5e4hcev/sv1+Psr7sru+/bspTl8kZ7DV8QndMoKraLTHMcAACAASURBVPgLnLiFlWu8eqwQphoYQunCV2kzr9Lpd/iqXMsqz2XUypqmszvSVE4o96aLYT7/bSIxeSKUEEIImTUo+BrE1ZZhuZcMBF80uJ8Kvo4VdtntRF63ZDVOjOeUsJSQ2Dg/EHxd1ZYBWBk4X8vaN+f2Zv+6XQ9lP/s27WU5fLq4dS27qniKTrYJrehZV7YxzAq/VObw+eeomts
m9oRlSscYm9CQzsQQj+VjdYT1hUK6Lu6s1zuTJq4SKZlbSgghhLQMBV+DOEM6+zHipOjw2VUUXRU+Ff04b6YO5O5aP87n7UYCnSh15Xp9aZwLKLp1x1f6lddTVnhFv1aXUPXn8Pkf87dUcDtpw81RH1/4q8rhK4R0BhRK8VX4LIwLXKMNG6+TqUBm/yOEEEJIS1DwNYjt8C10Iyz3YkgJbF5IO2AcGwityHL4XO6gol8I6RTZ4yqkM5FAN4oQS4nVOBV6qsl7+ry5qTq2Up030y80Xs/nKFuv63xljycVgk496ivIYoypcAmH2VtW5fCVC9HRHEc5rEMnivO30XidMZ1kFBjSSQghhLQPBV+D2EVXNs53sNJLEEuJM7YsAAAOLK4AAJRBpsRE1+EOKlZjaTReV25gL5HYNAjpjJN0TOxx+GJrU6UcvivvOICHllad57VronjbMjiO7dsnzNbh3935BF2VINTH+Iu2qFszr/HFf3UFPn/TA955y+bMQl7LQjor+vBV7XVHFWlmPuFIU5EZYNI+A1Iy0JgQQghpGwq+BpmzRNvGuU5WtOWkDV1smIsywWc3WS9zzHpxYrRs6A7EVq+fO3z9RKIjUoGx6szhK4Z0rvYTvOYD38Luh044zxsXHD7PtTqW7g3pdIi60Cqd5Tl85q0dZZrn+OWP7XzwGHY+eAy//+83FebTwzir3MrSxuvekM7yuV1zDVdhtN3tNA2+6WLS5FXq8E3WmgghhJBZg4KvQZwOXz9tyxAJgdM3L+DAsVTw2Xlv5Tl8ZuN1VaWzn0hsyBy+JHP4lOC77PYDeN7bvwagmId2bKVfCNksnLfQh8/t8Lly3Gwn74EjJ7Bt+yX49+/sKYzNWyaY7lv+vDq/f61qDq/QtG4B4IY9aX+9s07ZWLp+n2hTbSEK69IFX1V7ijpFW8qHGrhy+NoWf2TymbSPgJTM4COEEELahoKvQey2DJvmc4evEwFbt8yjNwh1tFswuFo6PPmskwGkffj08coN7Cemw9eNBBIpsdrPhdzew6l7Z2/2V/pJthYftnjS9aG+XlfPPfvYew4cBwB85Nr7CmPVvPoSe1o8aYgbVjkmcwDz528cCL6TBvmVxpoChFKew2efqnjsci/Gr/zzDuw6eNxYR9Vmd1j3w+7zmM411FRB5yHTwTjF1Vdu3W9EHQDM4SOEEELWAgq+BrHz8DbNdbMqnZEQOH2QxwcU3UDb4fvjn38q/vQ/PA1A2pbBcPi0Kp1K8MVJ2py9n0hDLCkcDzkbl5vHWA6ftl3U1+vurWc+porWLC4Xq4O68utW42LBmbJ9oR3SWTxHcY5b7z8KADg4cF11dHHrm1MJ3ZCQzqt3HsSXb92PP/rsLcaYeiGd4TtjkRVtKTqlZP0yrvDJb9/3MF7/zzvwJ5fcVljPMM7z5268H9u2X4LDnvxjQgghhORQ8DWI7dJtnO9kffiiSOD0zfPZc1FFDt8zzj0tK8hijz/zpFQ4nnnyBmycz8VfJASSxHT4FK6NnsuZ0yk4fNpdXYC6BJ9dtEWNP7rc865Nn6anXYO0xrmoLNriEIRLq6nboPIqdfSQzsrCK9bT+l27V19k9cer7sM3mmAzG69T8q13xvUJOLKU/t7vfnjJeFxKDLWo9155NwBg16GlipGEEEIIKcaykaHp2jl8WdGWNIRTd/jsSDjbHex2hCHy9J9f9KQz8fevfgZe/ORH4o79iwByhy+WHofPsdl3CUMdJQhv3nsEV+886G3LEOLwqTFKZOnkYiw/Rg83dQnCwhzZedNbO9SwrB3E4kofJ1bjrIm9vl7fsVJKr2A2XDXLnRQCuPTmfbjs9ge9cxtzee+Us1Z9+BjROV1MmuaXQ5aRUb979r+5hBBCCClCwdcgXSssc9PA4YsTiY4Q2LqlzOEzj42EMFw0ow2CEHjp084anCN9C/tK8HkcPpc2sfNpbJTo+dm/vQoA8P/80GO09ZSHdNpiqMxNzIu25I+5cvj
KFE+ow5f+LCGEMNZ0YHEF556+KZ/PEHzF+Q4srnjDMg2HT+X5ZWMEfu3D1zvHuhjWlROWkxh2tiHO0/iMpFXGJPi8PTblcGGm6vfKDo0nhBBCSBGGdDaICrVUpH344iyk8xFaSGcxh8+834mE0boh8mxs8sbrqahMq3Q63CyHajl03J//stCNHCGdmsNXEdJpP+YLiwTcbRlWnUVbvFM4XULjecfYJJE4ddMcAODBxWVzvQ6XTueO/ccK87nuJ9baC45YxV53WEdGnSakRcR6RkqJy29/0FtNddYYd1uGwsdfDqdB1RdCdPgIIYSQaij4GuRJZ52MHf/rxdn9TfMdLPdjJIOiLcqNAxwhnZbD1xECepSnb1+zYW7QhH1Q2CV1+HLnTm2IXE6Rq4CK4qQN3fAcvpJwSUVZC4jEIdYMh8/hABaoEIWGwze4jaXMRLidW1hVpVOF0urrc583dVyVgLXfxuqiLaPl8I3ax2/WufTmfXjtB6/DB7+xa9xLWRPG9RmQ2Rce5m+AhBxqTXT4CCGEkHAY0tkwW7U8vY1zHfRiCSHScMuNniIsQNHhiyJzM2M3aldsGDh8P/HEMxEN2jL0YomzTtmAB44sY8uG9C12OWxlOXybF7oFkaa7ILpADWnLUOrwoZij1+sXc89GKdqiy6V0HoE4ljh5Q/r62a+FfumuYqZ3PriI0zbNoROJgsjUBWCcSDz37V/FwWOpm1psCO+9pHQdI27Q2YevnAeOpM7u7ofWR/GPcX0CMsHneHwY17FPwUcIIYQEQ4evRVRT9NV+AiHgrboJFHP4ulEUFNK5Ya6Dq373hfjzX3h6HtLZT/D887fil5+3TcshKx67Gvtz+DbPFx0+3XXTQ6mcjdfr5PBlIZ35GFdbhlpFWzzn0MfGMm9rsWIJvqqQzrsePI7Hn7kFgChvvC6RiT0AWO4l1tDyzW5SsQ4f6qPTftEWbriniXG1ZSj7GmaYLzXK/l0jhBBCiAkFX4tsmssFXkeITFyo+zp2LkoUmZtpn8MHAGeftgkL3U7mNq3GCea7EbqDEE/A7e6UOXxbHCGdseHw1Wu8brdp0MmqcGrLMUM61W21w1cnhy9OZOa6Fh2+cmdsqdfHyRvmBqG25vP6PfvYxULoqHO5hbUWz1KFo/H6GPyd4yv+sOFJYL3p1XHro6LDPVwSn/o3Z9zXQwghhEwDFHwtsmHOdPT0kE5hvfJz3WIOnx6uFAW8U2r8ci/GXCdCJ4qyjZErz852tXQ2znWKDp92v6rxelEs+s/l2rS5qnSWfZtf1cjc7GcnszUqEd6L7fXqgq84X5ykrqsQwOGlHu4/fKKwFtd6jlkC6NhyH/uOmAVjdOSQDp2z8bp1/G0PHMXdB45hFMr00g27D+MH/uhL2Ku9NmS8jC+Hz/NFzJAhnUnJF1mEEEIIMaHgaxFD8EWmw1cI6bQUXaFKZ4AVoQTfiV6M+Y7t8BXzXcocvrmOKIigvibCqhqv2yGRISGd+ubNFHzV4VuuPEDXOfSfTYfPDG/V1+/aVMZJgm4kICDwhZv34blv/1phLYDpWgKpwNO55u5DeM0HrnUvuuR6Qkkc1634g3+/CX9+6e3Bc/3LNbtw894jxmNlH8t9R5fRTyQeOuavBkvWG3bRluFEqPr3gXqPEEIIqYZFW1pEVdAE0iqbZg6fObbraMtQdt+FEoVSAvPdCNEgp09KiUTr06coc/i6UbEtgy7aqtsy5D8nsqotQ1HQ6SGgISGdVQ6f69E4kdl7shqXhHR6HMwoEs7qqfoSbGd10RHieKhEEOnXXMcJcTZet45f6SdY6Zf3YtT5w8/cEjwWqH5PyNoz7rYMNlIOt6L835PJuh5CCCFkEqHga5Gu5tp1hCiEeJpjKwRfgMOnzzHXibJNf5xIJFKiGwno0qJM8M11I8NlA2yHryqkMzGeL8vhy9oy6M6YIxQxpA+fEmfOXCFrbCxl9p4Ucvik+2dFnKSvp6toiZkvWB7SCRT
Fpm8ddXAVbbHnSmS7vfkyId/eKUhdxhXSObgt/F5iuEIyWQ4fP1yEEEJIJRR8LdLp6Dl4Agtd3fEzdz62cKh63oVeyXO+G2Wb/X4iEQ8cPp2qkE5biOgOn+5IVuXwJVJWtGVQ4/Tj9efV5q5MNJohnWUvl57Dt9B1Cz57/TbxoNG9c35tvH3dzmqpJe/D0Dl8yN1e11zq/uj7Zf8Lnb+v3JVPCuN6J8rbMtQnq9I50qoIIYSQ9QFz+FpEd9wiYbpBVRGancgs2tIJKdqizTk3yOEDUqEmZdFFLBMa853U4dNdPj2nT5/L3Xg9/1lKdyXPPJSz+G29kUOXoPB8Ya7svP7iEPbPyqWb70RYsUM6S5wxtaZOJCqL6YQ4aKtxUlrUYhSqqnyOWqa/TFgnjvd1UhlXu4K1ZnyX6Q+1HmZNdPgIIYSQcCj4WqRMsFXl5HWEWbQlJIdPHzPfEVlIaRzL1JGy1ElZH765ToReLLG0mo/RwzSrGq/rTdtTh68oLu3qm9IQecXcs9K2DFpxGn1Oe4705/RcqpDNfDcyGr0DpjPnEgP9JEFnULTFHlfmqrmQ0p/jOGofvbI+fm07b/n0k7srX2ddGcaew+cLtR5WcI/7egghhJBpgIKvRfQcvrohmlEkjBDNsCqdkfGzCrvsJ0mWw6ez0isL6YzQ6yc4oQm+vsfhS5Lipk3Xd7GnSmdedqEYnmUKruKcPrKefna+WmKOUfMrwWeLX91h9BWl6QzaMpSOC9zI+vL4ht3OhjRel7Jd0ecK1SXjZXxtGcofH3ZddPgIIYSQaij4WqRTU7CVzREm+PKfu508JDRO8iqdOmXFQua6aQ7f0mpeZKRnOHzFxuv6xl53+GTiFkN2yJ8uPmKHUCnb29lzFPLVLIdPzd8ZhHSWN14vni8eOHz6+6JCXvVTh4hUAAWH0bWOOqhVuUJZs/uOx5okpJ0GWVvG/VaIQluG0XLx+NkihBBCqmHRlhYxc/jcY/7hP/0wbr3/iPtJpKGdMYpizYUuPuY6AlLmOXyJLLZ+CMnhM0I6NYdvzqjSmc5jCDajLYPH4ctEmnkf8IR01ija4nKz9J8LDl9JlU7XeeNEpnmZ2mOrcYKN6HirjZaxEscA5gqP1w0PVSgH2cxFtESwlCNvmMs+lWzLMHmMK1fRd1YznLv+l2IM6SSEEEKqocPXImYOn3szc+FTH4Xf/MkneOdQuqpO4/X05ygL8Yw9VTpL2zJ0IiTSbCPQ01SQq/G6r7Klr0pn5sY5BJ3L4Stty2DNWRA31h21nm4knBVJK6t0Dgq+6HtUVzPoULHjE9/S83MohnC155bVG+bF5R7+9qt3llZZ9ZG/r7UPXdccW+lj2/ZL8Mkduxufe9whncUcvsHtiPMSQgghxA8FX4vojlpIWwUXqnBLUJVOvQ9fJKwqncUcPp/ImO9GmBuc8OiJXva43ofP1ZbBdPhM8VbWh8+Vx2M6fMWfbMrCQ+1DJXIBGgmB+W7H4fBVhHRK6QjpTAqrDNVJPc/r02zRFvM5ier1vf0L38NffvkOfP6mB5zPl1bpzKqrcldeh31HTgAA/uGKu8a8ktE4vLSKN3/mZqz040z8F/vw8UsBQgghpG0o+FqkG+DwVaEERV2Hr9uJtBy+xF2l0yP4FjoR5gaC7p6Dx7PHzaItWkinLDp8dtEVV5XOMhfO5bDVabxeWqXTFdJpCS5bsNoox1R/V1Qenq/aaBm+92LYcEj1cTH7+BVDOqusleMDh9fnBts5Wcb81u0kM1lrVG+e+9m9h09g2/ZL8I27DtaeeS2F1Tu+eDs+dM29+Pdv780eK+TwjRj2y3BhQgghpBoKvhbpGFU6h5tDVeqMAibQ2zh0NYevFw9y+Aohne62DHPdCPODJvF/8aXbs8e9jdcHYsmozGkJttIcPpdDKIvjyvd2pii8++BxvPuynYU51Ei9aMtCJ8Jq31+lUwmlXpzgos/cjAcXl3PBp72kq06Hb8SQziH3s3m
VTm0ua0wSENKZibYhFpJXTJ3cTfmwznubiHK9hx27HgIAfOxb9UM+1zLnTX1BVHbGUT8aE/zRIoQQQiYGCr4WsRuvD4Ny6ToBx0eGw1ddpdPn2py6cS4L6ezFEi958iMBmJU3XY3Xfa0MEunu1Zfn8KX4QjrViNI+fA6n4B1fvD0LQ7X70VUWbdHXP3jq8tsP4J+vuRd/+Ombsx5+rpBORScSwSGd3rYMVihqXcqrdFYXbRllQz1qyf31Sl5h1f3CuQryhDLuHL7C4yOGdPKjRWaNOJH49Y99Bzfv9RdzI4SQulDwtYiraMvl//MFeN9rLgieI8py+KoFnx1CqkRbWqVTZmGaCpfIeMUzzsY//fKzsmMB4OQNc9k82bm05/cfXcG7vnKHt61Ckrhz+Gyh5z1emrcuzGp/5jnuO7RkOoZwhXT6q3TarR6UULZFuF20pSNEcLGTpkM6Xcern/Nwz+r5q8Iyy76HkJnrym15HSp7dGaKsP7cY38nvEVbhlsZ80PJrHH0RA8X33A/rhs4+YQQ0gStCj4hxIVCiNuFEDuFENsdz79WCHFACPHdwX+vb3M9a42rD9+2rZvx4oFjFjZHert5oVM5Vnf45uwcvkEbAR2XyPjvLzof556+yRCH6tz63mrOEqDv+sqdhitmh3S6cvhsoadv3VyN1+u0ZVDcc/A4fuwdl+GG3YeN+TLBJ9x9+NTzQuRzqo14dmxHGJvz3OFLn4+i8A2p7Q5ma9V/rrG3VblShrNpzy1rCADPwJC2DNySD4dXZDve2+A511AgZYVaUNKWQd3S4SMEwOi/E4QQ4qK1PnxCiA6AdwN4CYA9AK4TQlwspbzVGvoJKeUb21rHODEE35BJfMoZ27xQ/Vb5cvj6cRq6Z/fhc4V0qinmNQdv03zx3C7H0VXcBMhDOruRMFxCaQs9T0hnSANv9Zy9hvsPnyiOhRnSOdeNClUy1WZ6Loqyn9UlK3HWEcLIzVztm+vsCOF9TWxCHL46f/91Fy873lG0pTqkszqc1kfeE5E7lzrkIZ3u59VnbhqdU/tfjcK/ATWZwpeAkFJG/Z0ghBAXbTp8zwKwU0p5t5RyFcDHAby8xfNNHCGN16tY7qXFRDY7RJeNUaUziswcvsAqnUqY6iGdm+eL7mLX0SfCF76o+vAtdM1jss2ao7hH7BA65Tl8Zthl4RzGg/n4TlTu8HU7ohAKqUS4XbTFbsvQeA5fLYcvxdWWQY8IrJpylE0HS+63g/odHaI14tg2kTL7HbKqdFrPDzHz8IsiZAIZ/XeCEEKKtCn4HgNALyO3Z/CYzSuEEDcKIf5VCHFOi+tZc4wcviGLtpwYCL5NDtFVdr5uR2SOXj+RiGVYHz41QncDNzncxU4k8C+vexZ+4YfPzh5zRG0CAN700e+gn0hsmDOvwQ7D1P+8JVZIaNn8xpzWGJcIlcjzEVUOn+12qnN2NdGmQul62rF6mflCDl8kgtsyuEI6i3/wwzcAeWEP/eiiGK7cVMh8bF1Cci8nhUnaW+VVOt2LqirqUsZaXmf2BYP2z07R4Utvh/2MTNL7RkgT8DNNCGmDcRdt+SyAbVLKpwP4MoAPuQYJId4ghNghhNhx4MCBNV3gKOi96kLLvz/ncadjw1x+nNoIBYV0Gjl8InP0UoevGIbpasug8vzmKxy+SADPP/8MnHnSQvaYL3zx1geOInYIPtu58zl8mehwzj54Lts4ukMz7bF6SOdC19GWYfD8fDfS3In0OVX5sxuZIZ25w6fmjpzVSV24wmvtQ4cTXUWHL7sfUKUzH+um9HPtcV0niQnsypB9ieAP6Rze4RuXI+at0jlioufkfrIIGQ5GRhBC2qBNwbcXgO7YnT14LENKeUhKuTK4+z4AP+yaSEr5HinlBVLKC84444xWFtsGriqdVXzsDT+C773tpYXHQwSfXpSlE0WZo/fuy3ZitZ8UHD7XhlENmdPCL10On6shfFlOUS9Ost5+9nhX+X7d8AoJcfE
VbXEKPj2HT7irdGYhnVoOnxI3KqQzioShGFTz9tzh8xdjsXGN84nXxeUeXvmea3DfoSXvfC4XyC4+E9aHb/jNh7RuSTOo75GmqS2DwhbYIeHaZYz7eghpHH6mCSEt0Kbguw7A+UKI84QQ8wBeCeBifYAQ4izt7ssA3NbietacJnL4FC6XzcbM4cv78O249+H0MUfenY0SA3oO3xZHhVC1cVtazZ2xsvDFflwMKX3Wn3wV39192Aw7dOTz2cLQhU8UuvSW4fB13Dl8elhmVrFTXUuSO3z6FfX6xbYMrnYULlzhtfb1qtfpK7ftxzfvfgh/+eXbK+d1vbb6/FVhsqNsqNXnYZIdvkmkSmSr39FRRPhaIiC8gs71ZU8VZgVffrbIbDLslyCEEOKiNcEnpewDeCOALyIVcp+UUt4ihHirEOJlg2G/LoS4RQhxA4BfB/DattYzDvTKnMNW6VS4KmXadI2QzqjQd88WXC4yh0/P4XOcWzl7x1Z62WN2pUsAePrZpwBIi5K4XM5/uvoe4w+b2r/pQimP+gpx+AJCOrXHOyLtV5hIcyOpQkrnOnkOn7pmFaYZiYqQzo7wFmOxcQk+e+1qcxsU0ueq0lkYVL2lCHntvcdac5Awql4vV0GepuZum0IOnxK3NebQ3fBpyA8lpA78d5MQ0gattWUAACnl5wF83nrsIu3n3wPwe22uYVKwe+DVJagPnxHSKQpVOUPCSoUjh89VMCYXfP3sMTsnsBsJ/NRTHoUb9xxJQ0o7xfMfX42N3Z760dWOIKTxelBIp5RGpU0VarraT7BxcK15lU5XDl9+rLsPX4qvUM98pxhCGhL6mRfBCO/DlkgJKSUeXurl15E9F+6Q+N2m6mO4Ka9HVQhzNJLDt3ZvRsiZcocvfF1Gaxe6IGTGGDGtlRBCnIy7aMtM8siTFwqPDVulUzFM0Rbb0avn8OUfjW5U/JiocYvLueCzXSoh8jWlDl9xnuMrfbM65+CvnStsqzSks47DJ3MHzxZ89nHK/VPXA2h9+KyQTjuHz+fq2u0p7HP71q67kul5/C+InsP3yR278Yy3fRm3PbBojAnZZA/jwNjr5aa8HlV9uNTv1NQ4fKL4ZYW9nloOX9+R4EvIjMCiLYSQNmjV4VuvXPHbLyy0A3BonVqE9eHTfxYFR6+Ow6cXbXEdp8YdNxw+S/AhD3l0FY0BUofPbLKe3hp9+ALCCn1hML4cNb1KpzIe9XOq5+e0PnwKXfBFQzh8C3MRFlfMx1acRVvc99XL6Ot7CJiFWa688yAA4I79i4Pn0jF6aKsP6Xths/OUHm6se5KZJFFaFdI1jSGdmeCzHx/c1rmWnvZLPTnvGiHNMEoYPSGE+KDgawG7/QAwekin3qrBh+6gzXWiER0+PTzUP26hm19robWAyK87rdJZ/Ljt3L+Iee1cec+9YtiWLhzOOmUDHjiynI+R5vEKV6sIuy2DyzHJq3TmRVvU318V0tmNhLGDLRRt8Tp8xc9Hr+8OPdVR6wtpvK2LAvWzLRClrBYAVeG0orCFN+dPbyd34zKBXRmC35OJD+l0hGoXx9S3+Fz5vYTMCqP8fhNCiA+GdK4RoW0ZfIT08dMdpa7T4at+u119+FxiVT32rlf+IB5/5hYAjpBObVyZw3fdrocLj8cO10/98Nev/EF85r89zxjvK9riEht2W4bMDXOcs6uFdKpbvWiLUaWz0IfP7/DZrMbFnoi+PnyRY7026i1LpL+qYyLDt/+h/QTt+V3nXUte+Z5r8N4r7x7fAoYgD+l0v3DDVLa0jx0bdluGIUI6DcFHF4TMGJP8BRkhZHqh4FsjRm3LEHQOK6TTzr1zFU0pzOFoy6ALF+U0KtHyyJM34A3PfxyAomiJRC46e54qnTZZDp/RlsG8/dHHb8WZJ28wjvMXbSmew+/w5WPU+buRKIjJrC1DxwzpLPbha8nhE+b9MqTm8GX9BLMSntWbiyzEtqp/g+vY7HZ8G5hv3v0Q/uTz09X
tpeobft+XG5OM73M2TL6SEdI5PS8BIYQQMjYY0rlGjBrSGYIuMIQQ6HRshy8khy+9nfM4fBc+5VH4uR94NM46ZWP2mBKSK71i0ZaqHD6bTLg5i7YMBIvjtbTHKFx5bnruWpqHNxjrzOHTHb70B9V+IhLCyGHLHT5kc7twFm1x5PC5HDl1Xv08LpSoS6TU3D7TOZKOcxRRInd4h28IrbiuqXK98jDb+u/JOASSMH72FW0JX5jp8BEyW0xDKHwIN+89gsefucWZ4kIIWXvo8K0Ray34gGLOXojgygVfPlavNrlxvosXPemR5rwDcahX7AQGIZ0VVTpt1J84vaWftG5dl+HLNXOGdEqZCRi9tYIuMpNM8Amtgbg5TzeK3ILPqqZpM3yVzvRWTVtWtEU/Jsvhy9w6iTv3L2YhnfceOo4/u/R7ntdqMI/nXCFtGaZ727L2VImf3OFrfu62qBKvdfa2fcPhq3c9uw4ex7btl+BzN95f6zhC1ppp1nv7jizjZ//2Kvzhp28e91IIIQPo8K0Rw+bwfeq/PrfgnHnPIcoFXsgalDDVXTR9XtcUc4MHVRGVbiTQTySEyEMeV/pJoRG8C2fRFitc01UoRGpiRsfn8JlVOos5bnlIZ+QNoYsiOKt0ZsLUm8NX/MbT5fC5w1FlUA5XVolT6jl8uVB4yTuvzB57/Yd24M4Hj+EXLzgH523d7JyvFw+/+5im0MNJOJ2etgAAIABJREFUoPL9HcEBWMu3whCX1pcV+Xqk/nQQRn5vzTXdcv9RAMAlNz6An336o2seTUj7zMIXZUdO9AAA3919eMwrIYQoKPjWiGENvmece1rw2Ko2DGFVOh0FWqLy59V59h9NBd+ZJy3g/iPLEMjF4mo/LIfPJdwKIZ0Oo9AnynTRJERemVIv2qKuz8gbVM93hNdRsXMkV608PJ/DtyHQ4XO5MVK6exX60Kt0OnsSIg/XdBe4Qem5yt7RTLRPwc5lkjRp1VpGcfjGgRACiUw/3962DDUuxggvrvkarEGgBSEjMUt9+Pj7RsjkwJDONWLUKp0h2I6SnesWVqWz+JguXFz/gKt8v/uPnMCWhS42qSbxIl9TLw7N4RuIGSmztdhhX65ZfKFhusjJxao0HL6s6qVVKEY9lzuMlqCLzNc4D+lUz9dw+FyCz/EHP5H52kOLtqgXzGEiGnO4ZtPfj7rkYbYTvHOZwB1J/nq5X7dh8t6yY4dc06hkv7ueKp11MEKvh/xsTfJHkqxv+NkkhLQBBd8asRY5fFWCKiyHz+/gAe7rUEVb9h1ZxtYt85lAjITQKkqGCU49Z0w5aPYG17VGv8OX39dDN82iLcUcPiU4I6G3F7AFX2QI5Lzxej63C1cOn9203XUt6WO5s1Me0plXHlUhsL4cPaH9XHh+cOt1+Eo+17MQmjROfO9v7ooNM+eaxnS6fjSHZL9b4dP2HeHeoeRf+fBTSSaT/Oue6f2MTvPaCZlVGNK5RqyF4Ks6R5XLaB/+0dc/G485baPhHLpO0dFy+J657bSseIsQ5jlDBKfelqHbEViNiyFsahoVogkg+ytpiyR9c5jltWmP+9oyJIlM20oIkblbtubpFPrwmZtX/dp//6efiG/cdQiX335g6Cqd6vpCyvLrYZx2lc7COQI+mv0hcvimsX3AJFAllEd5XdfqnXje27+GvYdPAEg/XlXitc4GMRkhh0/PbSVkrXngyAncc+A4nvv4rZVjZ+Ez6sq3J4SMBzp8a8RahHRWnaOqD58tGJ/7+K147OmbraItxTn0Fg5btyzkhV9gFX9xnP8Xfvhs477uKKnrKYZ0po9f/bs/gbe+/CkA/HlNsSZUIt3hM0I683Nmxw3OL4TInJRiSKcwrk9VD1Sj9Nfq/EeehJM3zAFw9+ELqdKZrb1GmKVepbM6TcojCDFiH74Z2LisJXkOT0VI5xCv61q9F0rsKQo9IK31qNtdB4/jxj3lhR5Mh6+25Ks5npDm+Kl3Xon/733Xlo4
ZppARIYRUQcG3RqxF43VfkZDs+YpF+J6ODMFXfF537rZuWciKoAghjDW5HD5dLAJ6RU6Zjc//AKqQznTMo0/diB8851TjOHsDqG8O85xArS2D0NoyaMfGUqYFXYRZ3VJHF4tA7oKpabqWu6nmV83rdeo4fHkOX/F5fVw6h+bwOQ7Qi7qUhnR6NtalnyhuXIaiyuGzfx9qzj7UmkbFey0wPyN//dU78Tv/emPpXKNU6Rz1OEJG4ajVushF9tnkN2WEkAah4Fsj1iSk0yGo3v+fL8h+rgqp9OVj6al3LneqzOHTT+l6DeY79jf+A/cqkYWcP1fhB7sJecHhMwRf7vBlRVs67rYMSSIRDfL78pw52+FTV5lS6MOnXXxHiGyNoQ6fT/DpeY5VpIKuKGj1c5Tn4eXvh5OSj5TvdSPlVL1avs960NxjeiuyCrueoi3q+ZV+jBXH74KOWcG33joY0kkmHeY+E0LagIJvjfD1ZGubFz3pkTjrlA0Aqoum+Faou3TzjvwzXdicsrGbCQghzOt2CU57Pgngk9ftxjfvfijr22cXTXEJR19ekzOHTwuL9LVliGUa0hlF+vnt6zaLtmTtDbLnNWc0EtkTrtewTtGWkCqdeqhcnsPnGAfN4fPONlwO3zSVF5+kJVYVMhl3Dp/UXOYQ9Fxbn+BT0yVJ9XUZbRlqXlF++kl6xwnRmZ5/N31M89oJmVUo+NaIqnDLNlECqcrh87mQunBxFRzRG6pvmu9C3dUbrwPuHD5b/CRS4nf+7UbjvPamUJ8lEu4x+nzZ2CwnUBpFW3xtGdIqncK7wU7DQfP7/ZK2DJ1IZALI1YDe3YeviJTSW0TGPLY4xtmEXoa5HnU2+Prc6Rom96//JGZ0VTmj487he8+Vd+Nn//aqWsfknwHfK57/jlV9XkZz+CbxHSckZ4L/uawNf90ImRwo+NaIMRl8AHLhMWwOn75Jcgk+vQH5hvmOEdKpp+i5RO98xwxv1P/YdS3Bl7sExaqhdnN2he5MGW0ZXIJP7+81qNKph3QWcvg6wihC0bNcMDP3Me/n5xLe9rHp+dwCzXet9jg1RmTX7XYMy8iKtgyRwxfSPoK4MJ1i97NDOnwNvBl7D5/A3oeXap7X/bjtnieyut2E/lkc9mr4mSSTisxu+SElhDQHBd8aMa6QTiAXGMM6fDrzjvwzvfrnpjlN8AmrSqeraEvXfEzfxHYHajHbFEKFdObj9VYL9vGAWV3SrKiZh3S62jKoKp16H75Khy+r0qnEpDZWK9rStQrVdCPhyeFzi8DYUzXUHqduy0I6gYocv8G19Idx+OB+3Ug5Mt/xeZ4fd0hnvXn0EGpvDl92v3pm/Xe6tsOnjqt3GCFrxigO/qQwzWsnZFah4Fsj1qJoi/fcgQ5fyBKrHL5N851sHgGzSqfr/POeKp3pvOn42x5YxKFjK3lIp6NNRF7IxJy/56rSOXD4VI6hms5oyyBThy9ty+B21OzrydxEV0hnicO3Ya6D1TgpzO8u2qK1ZSgRYVIbr19TYZzUc/z888WeHL6QxuvTwOGlVeyp6Vq1RYXe837Wg+Zu4D1Jaiq+sqF2nmdYSGfx+FAYYkamhSn655MQMgVQ8K0R48zhU+eu7MMX4EK6Co7o826cNx0+W/T88vO2Vc6XjR8c+4Gr78GFf/11QHOrFOquz4Vz9uGDzNouGI9LiRv3HMa27Zfgjv2LWbinLz+wEwljA90r6cOX5vCl2K0oNsylrqke1nnD7sN4yTuvhI3UNsSl+2JtA63egr6r9YMxt2OawWNDOXwjOFFrhVrZ52/ahx/9s8vGuhZF1cs1yuvZRJiYRL3NqC7i7H9hcodPfVaqr093+IapVJqed3I/k2R9M03Frggh0wMF3xohxvhKKyFX5TKGSFJn0RbN4ds438mEmoD5jXqnI/Dmn3sKPvy6Z2eP2Q7fJ67bnf2sO2EHFleMJuLZmgttGcy/ks4+fDJ93M5tTCTw2RvuBwDcvPcoOpF
AJ4JWJKXo8Okb6LI+fKk4VCGdtsOXvgZ6L75P7NgNF4nMcw1Lq3SqW5k3u3ZV2jR6D3qKupSdq+wzMxWhSRO4uCoxMlIxnAYuVw/RDD2nb7j+OQWUOCyfbpTG63YIOCHjICj/eoo/pdO8dkJmFQq+NWKsDl+kbpvI4St3+DbNdfOQTqvxekdz/hS22/W/L9upzWu3bJCFEEK7aEuxD18xh08iFU2dTAgPxlp/hItVOs25Cw6fEnwq19AQfLoQdDt8eh5fWZGLONscu8eocfot4Hbp0pDOdnL4fMVuJolJXFpVlc5xt2UAZC2dnMh8++dvy5BfU5WIC+k/6UN9+TGBOp+sI8o+f7P02WRVXEImBwq+NaJKbLV6bit00UfIP86upuH6tdkhnZHlcgGmKxQS0qmQslhJtNiWwe/w5W0epOHw6W0Z9MOzHD7PBryrhWmm5zLbMugOX6Q1Xrcdvo1ZSGd1Upa+IS7N4dNeD/W29h1JX/o1uaZTh8RDJIxloUljklUh7s+4Nlfbtl+Cv/ryHc7n8tfNTe5m1z9vIzl8Sb33VEIr2lIM6jTWlfbhK5/PdPiCl4HBAggZO+XRGZnFN7XMkmglZFag4Fsj1vKLrlM3zRn3Q0M6QzSpS6DpLl0q+NKfhSjmsaWP54+VCT67uEkiixtGO4fP/kMTu0I6YTt8A8Fn7TRVlc7MPbSet1/PLKQzOz6/NiOks1C0ZRDSaVTqdP/FlDK/ptKwIG0jrc7mav0gUXwNdZTr6Wu8XvaRGndIZ8h5x5nL9TdfvdP9RNXrVuEAto2s6fDJkpBOV9/MWn34htwVcz86PUgp8b6v343DS6vjXkpjlH3+8pDO6YffrxAyOXTHvYD1wlqFdH5j+09g87z5trpCKbPnIpFtoEJCOl05fIbDp7dlgDBEZC748sfsHD4dd0inOcZuGl6ew5e7gUurcSa89Bw+/WjVON4XmtjVQjrnOiJz6NRGfMtC7obqDp/9OmchnYEOn1qHrzeeuka15qpKmlmVToe1oj4bo/R8G58waWbMWiOz26qQzmHmHv2Kpaz3uumudDGk0/yiRsrqkM2RGq9b5yWTz7fvO4w/vuQ2XHvPQ3jvay4Y93IaIeTfU35GCSFNQodvjVirtgyPPnUjTvE4fOrvx9Yt89lzuhAdti2DTicSmcAoVOl0OI1llUNtJ0wXJ4q88mZKQfDFeg5f/thltz+IZ257hPF4sShL+tpljpq1vkhz7eY7USYu1bhTNmmvcyTwuxc+EU99zMl45nmPwA0X/SRe/KQzAeRhsiE5fHrBjLJ9cb6BLt80VIZ0KofPc7LSXJSSedeCSQ3pDC3K4n1+cDucCK99iPv8dRy+kvPa4tYOq3YxSkgnc4qmD/Vv+JGl3phX0hxh0Qftr4MQsn6g4FsjJqHxeiIlPvy6Z+Nzb3p+9pxePyTM4Svm8NkoY05Yc7qcxrLcxmIOn/SuMRNBlknmcviu3nkQB4+t4j8842zj8SQxN5sdkTuU0iomYYvR+W6EOJFY7sW471Daz+2Ujbnw7kQCTzrrZHzuTc/HloUuTtk0l13fxvl6RVt8fQF19JYIVdU8y/rwKbHryxcMCk0a08YlRGiOY2lV66oqyx7UlsM79+ikRVjqxXRmPTSLTxm3ISGdiRHSSWadSPv7NSuEFG2Z5qudobeKkJmBIZ3rACUs+onEj3//GcZzacXIVGiEfPldlnNnny8ahEQq8lxC+/xu3Dl8JpmQ9oR06vfVWnYNBNkPnnOqsV57Iy609dv9wTLXdHA/FcI9/PrHvoMv3bofQFHw2ajzbRi8pnrRlrJwvrxNhHOIsa6qvmZpjp9/Q6WEni+Hr0x0jlJNsglCRMlahk1df+9DePrZpwb02UtvfcN84cshNHK9sp5rm4ZKl39+zM9r+XyjtGXIjxvqMDIGopIvpKaVsn+bZqEPH9syEDJ50OFrmd/+qSeMewneoiSAKUSGzeErzplbfPr8uYArhnm6sNe
ThnSWF20phnTm99Whx1f6AIAtC93BedLHY8u5UEVb1Lz6y9cRpkhSQliJPQA4aUO3MN6+HqBeW4bVfu7wuRqp28dX9TXTX6+ykM5RWgCM60//JG2Ybt57BK/4+2vwl1+6o/K1zESMLwxyzA6fRD2hJXWHz/6dHtzmrmW1exiP4PDlAnOCPhykFCHcX8hNMyHh+NPMLFwDIbMGBV/L/LcXPh673v4zY12DElWusDxdcJ11yobKuUIcPiXs0pDO4rlMh88v+Oz19pOkkPNnN1K2NZA+hzr/4kofkcirY+bFXMzzRcLsUadv1F0hnTYdh7tpks6n1rESULRlpR9nm4Wy3nj6t8RlRTCMkM6Soi3eHL6StebCpb2//l+46QFcvfNga/M3xb4jywCAO/YvBufoeV2xwe24Gq/rffXCxiP7DBSEYiGks3pjbxQrqnk93IdOH75/n6eZqjD79HZ6rzdrw8KUWUImBgq+dYD6g+mq6qhE0G//1BPwkdc/u3KuMIcvb7/g7MPnaNXgwv6j2ItlIQTU7sNnbwpcOXzHV/rYPN/N1pG9PlYOnxHSmZi6xS6E46o2WpWrqI7N+vDpDl9hdMpKPzFaJUgpsf3fbsR1ux7CnoeXCnNX5kTJ8rYMWUVQ3y484JvqNr+Z/y8f+TZe/b5rnc+FVcJrekVu1HvWiUT1uip08kghnU1V6awxjYTf7c2vIXeSqzb2+mex7mswijtKxkMeZTHedTRJeQ7f9H9Gp3jphMwszOFbBygt4gzp1Aqp2G0QXISMyeaEu2iLy/VzYW/m4iTBnO3wWWOLx+iCL709ttLH5gUt3NKTw9cRIn/ttGIpQO7oqUcW5hyCD8Vr18mPDW/LsNJPtMqZCRZX+vj4dbvx8et2AwBuePNP4pSNc0YrifJcv/xJl6jLHT732sodvuI51pKQDZO9NillK5Uc1evYEdWCr+r1GqktQwNvxTBhlL5CM5ncq+Hw6WHa9ddi3pLJJ7LC52eB8h6q088subGEzAp0+NYBr3zmuQCAp519SuG5MsE1LJ1O7py5whp1IVQW0mmHEfZjWViv2pznIZ22K5gUxh5f6WOT1iPPV6UyiszNhj715kFlTb0tg42uG1y1adT5VA6fUbTF8/dytZ9kwjORwNJKbDyv8hPVKyItoVpcQ75Q17CsSudIRVu8Q1ol5LT28ttaa9brMgqo0pkJ5YrnhyraUvsQxxz5Z+vA4go+9I1dlef0iX/7WnVx6EN//s8v/R7ef9U9NdYePJRMCGKdOXz5mOm9YLV0hnQSMjnQ4VsHvPCJZ3rzCEuKZNbiY7/yIzhtc1qVMsvhE+Y/+PrjijLBWRBvicScJazyxutuB8Hl8C0u9/HoUzdqj7uL2kRCGAUD9I3mRqu5fVVuY3nRlvTY1X6Cuw8cw9Hlvtfl0XP4AODICbM3ld2IXsryb8allKXNqNWxZU3evXNn89Y+tBGC+vBZ9xMp0SnUgh0d9Tp2oqhyXVUb23G3ZdA/W2/86Ldx7T0P4XmPPx2PP/MkzznzL0uKDp95LaoPX5nTqrvNDy/18LbP3YrX/eh5YWu3zksmn3WXw1fxhc80MM1rJ2RWocO3zlE5caP+LX3O952OJz7qZADuZuuA5vAFtmVwhXTaAtHO4SsP6UzHrvQTbJ53h3Tqm4qopA/fpszhS++7chsrc/gGtxu0xus/8ZdX4OfffXVhrGKllxjXZAs+e7Nwx/5FHDq+6p1Pf7ViKXHVnQdxdDmfs5k+fGMK6QwZU/IFQZPkIZ0hDp80bn2Mqy2D3iT98KAZds/jAKfn1ISd4zl9Xep1KltmWeTztu2X4LUf/Jb3+VHEMhkPsxjSWf5vwPRfp/ryVLTw5RkhZDgo+NY5bYR0dvWiLZrq6TpCOjudGg5fLAshoHYOn+1EuYq2AMBmR0infWwkzIIu+nIywTf44+xqSK9fp8utUJtclf+3qm+aPX/zV/qJsWm3BZ/Kb1Kvx50PHsPX7/RXsZQ
yv/5jy3285gPX4tPf2Zs9n+cL+kI6vVMXeqytNbI6JbLg9LS1qcwEXxQF5PCZtzYj5fDVP6Q4h/algu0ou0ik/3mpjdHnKXuN4iQpDQW//PYD3udmwT1Zr7T1Zcw4KO3D53HDp4kpXjohMwtDOtc5rlDDkeccuHYCZshoVrTF8ZgLu05IXBrSOTimLKRTO9Qo2qKFDOmHR1qVUTuk03b4fCGdX/wfP+ZtG5CFdA7E4uJyzzlOZ6UfG8L0qCX4eoMXbZjNQloQJnURFZU5fKUNhFPKcgjbJKzxunm/rU1lHtJZLSqrNnyjbASbyeHLb/Mc2rLPgYS38qDMxwBhYrafSHQ7orQtiZ/Z2YredeAY7ntoCS98wpnjXsqaMEN6r/yLsux2ei94msUqIbMKBd86J28v0Ny/0HquXlXRlnptGYohnXbRFltc6Pk+usO3SQvpVI9f9JlbjGOjyAzpNB2+7uDx9L6vaMsTHnUSnvAof24TkBZtOf/MLfjOfYe159ysDkSZwnb48pC4+u+nKhqjv+6Z4PPMN9EOn+PEV+88iO87Ywse5ek56SlGOjIqDDESImAzFOYADsfo70YWFqnNVf450DfrHkfV+sKmTBQniWrPUv/NytYxAxvSD159D758635c+/svHvdSWsX+MmAWqPp9qRoz6ah/+1m0hZDJgYJvndPNBF9zc+q5eq6QzuDG63Z4ZiyLbRmsoi0hOXwAsEUL6XQ3RTdDOl/13m/irgPHs+c2zpshnC6Hr+qPXZKJAOBHHnc6PvXtPdlzq333ZnZFq9IJAIdth2+gLOq8neolU6+VLiirmrxXhfKlt+Ny+Iq8+n3XYuuWeez4Xy9Jx9ifl9ZCOtP3pROJShexagmjvJ6NOHzaXCH7OSmldxNr66+QHDvl8NmEOMm5vpzi3fSAOJEzFeboIyTMd9qoKqQFTPd3EtO8dkJmFebwrXN8YsfmV55/Hl773G1BY3URGVUUbSnL4bM3cHFS3Xg9kcDrf/Q8vOpZ52THKPTz6iGdvpdAL9qiiz0gD8PMc/jK+/C5yDadA8F3fDVvsXAsa6+Q041EGtKpC74lsyBL1qOsxl9cNbSfCT6HwzfExlIXBlVcdefBwrWMis/lPHhsVRtjPtd60ZaAxuuhbRvSn9d+a6WfMuQb/NThcws5O9QzRJCl/w4UT7zcjx2jrfPN0FY0SabbBQol+7e9Jfd9HISI12l+b6d57YTMKhR865xMnFWM+4OfeTLe8rKnBM2pwi7jRDrdPOFw/VzYbksvSQrf7OtFW5RA3LKhi9/76ScBMJs0G0VbHCGdNlEkvKXhVSsFtTkfxuFTlycgcN7WzcZzrny++W6ElZ4Z0qmqJCpcoi0U9VrpIkK9B8MIoXyTX37s8ZU+/tP7r8XrPrSj9jnKz189xh7SloBSH8NOVB3SWSVK9DXWfVuauDqXmx7i9KbnNwdK6/GQHD5XLi8AnFgNEHwztBFN213M0AV5yFt3zM61huTwTbNPpj6XjOgkZHKg4FvndFoI6VQiLpFmo3QlrPQ/AuV9+Mz7fVeVThXSCa0whtbw3azSmR9nFG3xhnQKb1EZW+A5BZ/zyBz1mkcCOH3LvPHc0eWiw7fQjQaFVfwhnf0hQjqzYxOVw5c/VhUmV7oJCzQb1Xt0x77FqiXWotjku7iSgsPX0qZSvY4dUe3wVQpC7fm6QrzpkM7IUbTFfp3LirbYzl7iEJM2sSek80SvWvDNUlsGKadZEoSjR2/MCiE5fNPMDFwCITMHBd86p50qnbrDVyzQYjym/fzl3/gxY56nPPpk436au+MO6UykzDbrabGVfA32WMDdlqFwHcLfmD5rwzCKwzc4WAiB0zZZgu9E0eFb6HYGRVtKQjoD+pgVFyKNY42QzipxUvrc8G5jI1indW0YbVHYWkin9tmsDNmsmMvMsawp+BrYiuk5Rq62DPaS0gq3yI5xYTt7ZS01YlkM7QaA5V54zN8sbEglZkMchDJLbmZ
VVVtgut/bWXqvCJkVKPjWOZnD1+AWKHf43EVbfA3JddftD376Sfjjn3+qMW8v9vffklILkRT5OYwqndqxes6dT/RGVh9BHXW8+sPm6sNX5fHp67UF41FHSOfCXOTI4fMVbQl/P9XIfmw6fHqxDe+xZaF8SfUYfQFNbxHs+VxiriBOWqvSOXD4AnL4qkLX9Pe2bmuCZh0+d/BZ4RRSOj+PruN9xZd0vDl8A4ev7IuWsuv/m6/eic/ecL9/wISR/ps3+xvrWXT4Sq/Fcr2nkmleOyEzCgXfOqeNkM5OJxdD+r7M1ZbBl8/3jMeehg1zpoha7sUFh0+tP0mkEdKZO3/a+fW16O0iPDtEIfw5fKpZumKoKp3K9XGMc7kV850IN+09gjv3H8sesx0+V6XNUHJ3UBpzDUv2TXXFOOV+Nb15tUWDS0QURGFbOXxaSOeo1zlSSOdIZ07R3Tr10TXEWyGks9hYXZ9HX1hIZdd+XHT6gTyk01VAKT+n/7P2V1++A2/62He8x04adt/QWWUWc/hKq3Rmt9N7vdO8dp37D5/AL73/2qAeuYRMOhR865yyHLqh59TCKfX5XQ6fTuQYq7PcSzBnPa7G9bQS5Xp1TWN+R3ipfV7jOiJ/Bc+FrErn4L5jA1pF/icx7D1YmItw14HjuH1/nutm5/r1hqnSORgbWyGdIeKnNDRJcwrLyPu6NUshP88Z07k2IZ397LNZrwqn+/l8QN2m9k1smo18PKsPpv0zkL6/2WfBk+tXq2iLdDt8qmiL221X53SvcWm1mDM76cjsf7ONnd85C4Tk8E3z5WaRElPeiO+mvUfw9TsPYtfBpXEvhZCRoeBb5+Qhnc2ht2UQDifN9zdA38S5hOhKL3Y2Xk97m+XVK/UcPh2f4PNRVrRFuQhq02o7fkB40ZbQv4llG1mFCmGtF9KZjlViMXNZAsIbQ6ozVm1ckkQGjSueuyr00cQV/ugSJ23Q08Jlq9syhF9XW45kENL9GXcVZpEOpSWtMYD+WagI6Swp2rLB8bvoOqfO/YdPeI+ZVNaPw5cySyGdIV+8TPPlTvPaCZlVKPjWOZmgaXDjqBdtcT3uC6HsVAi+5X4xpBNIhWI/zisBRsItony5gz6UmHSRCb7B/XnHunzhoAq1XjXKFRaqo5/DF7YWDyme0mOVKCl3+PTXJESaVAmYLKSz5jah2imzQjoDcvhac/ji/LWtEtKVAlkvqjOOHD6H42IUbXG0XnA5a65CLyH5WlU5fHYouLl29+/HnodTwbd1y4L/xBNG+rrO/tY6JK9z2ii7klmQ8bPyuXRFJhAyrVDwrXNU4/NGHb7BnLZgqArp1CvvuYRWz9GWAQDmOhF6sRnSKRxhnb7qoD46kVlIRmfToI+fusSh2jKocYO1lOUeAWZjad+mVrl0dd5PdQ15H770vk9M6K9dE6FJoU6gTZXYKYg5Zw6fJQpbc/hy97SOg+d8XhtQu2hLA7/peghulH1f5BZ/arwrd84I7xzcxgGb+1Twuap0Vufw+UI69w4cvkefusF77KQh5Wy5Xj7UJc6IhgBQkcM35L+Hk4T6XE53QCegPn3T/F4QomhV8AkhLhRC3C6E2CmE2F4y7hVCCCmEuKDN9ZAibbZlsB2VaESHD4AzlKvbEehbIZ2u87gKyJTlqOeHAAAgAElEQVQRCYGN86awmu9E+K8v+D485/tOB5BvRlwCLLTxulrK217+VJy0wS0wAeCI1qrBt6nVnaS6ZG0ZrFsb87r85wnJx9LPU3fFdfvZhTh8bVXpzKqnypDKp+UD6vRJLM5da3jpHFLKShc7Haht2j1rsfvvlRZt8YV0huTweUog7h04fGdMkcOXSHf101lj3eXwZbfTfL3TvPYc3xdEhEwjrQk+IUQHwLsBvBTAkwG8SgjxZMe4kwD8dwDXtrUW4kfPt2t6TvsPdObwVRwH+IXonC+kU6vSqaYpCL6KojA2kRDYPG8KsDNOWsDvXPjEXJAOzrlp3iH
4Kr7fVOtV437+hx6Dm97yU9joce/03nw+h2+YPnzFoi2D+55JfIK9MG92W76YYcMoKx0+u8ee0+FD5RjF/qPLmYtUF+Wexokcqbdh+nw+YjxtGYoC3SfkANPhq1pXiNubeEI6Twwq25bm8Hnm3Xd0GUB1GPYkIbFenIf8d2dWKHf4pl9lzMrnMneXZ+SCyLqmTYfvWQB2SinvllKuAvg4gJc7xr0NwJ8BWG5xLcRDG334OoNwK/sPdNaWwdf3LsDhcz3ejSL047whuRKL9mmMVgyBgs8WcnYkmbpC4RhbtXc886TUTbA3qHpTeJ0jhuArd/iGoWe5gz73yMjhK/umOjA0KRlyg1MpnKynQ/rw+TaVR5d7ePFfXYF/vOLuWmtUuIq2eD8flQ5g/nNc05Js4rdcF2XqEvTPSiGHT+Yndrl6+sp8OXY6/SRx5vIqMV6WC+trAK+OnaqNnZxqTVCbaXprqghz+KaXLKRzer4/cTID2puQjDYF32MA7Nbu7xk8liGEeAaAc6SUl5RNJIR4gxBihxBix4EDB5pf6Tom1K2pg9qL2XtnnxBzz+Fx+FyCr5MWbclCOkV1SGdIKGskijl89pzqD4IASsMxXbzzP/4g3vELT8f5jzzJeHzTvHuerOUCyh0+tWn9xQvOxg+cfUrlOtSsdh++VY941F/Hsj+EZT3PXOPqhmxVhTPaz7q0UWgO37/u2IPF/8vee4fJcpTn4m91z+zu2ROUdQTKCBEESAgEJhkTBJhgY8AEy9cZ29jmcp2vudcGg3+2MTa6mGvwJRgMXLjGGNuAAQE2VkDhIImkgCSUUTjSOTo57M5Md/3+6K6qr6q+6q6esDuz2+/z7DOzPd1VX4eZ/t5+v7A0sEh3E/RJywu3WI+LJo3Zmyt8Y5H4yhepv8u5BHqDHP/nktvQH0hvdc6J5Yq2xPThy3KJLhPSGaO+mp5u9vLlvqpuOzuQ64TxrbeQTvNwZHb3d7bDUQ1CvxctWswiVq1oixAiAXAhgN+pW1dK+QEp5XlSyvOOO+64yRu3jhCjdDWFUvjcG3RaE9JJEQq55J7sd9ME/VxqAqCc0MqiLTEKX8IofMEm7cDmha63rApHLs7h1eed7C0/YkOXWRt44slH6vcLgTylfmZyxE48chFnHL+p2ggYx2LghHTuOthj17cVvvo7YR0nUbyy6T21vmiL/TmrCEYofHku8fGr7ireD3nn7w9MDl/I7KV+hu/dv69Z0Zas3h67UMro4JwgKSU+cvkdeMeXbsJHrrjDWj9ctAXe+xjyX/ThCzdej1GdXSwPhs99XS3k+dpxrKvgXhtrATH7Mst7u1aKtpjTNMtno0WLApMkfPcCoN7sSeUyhc0AHg/gYiHEnQCeBuBzbeGWlcUkc/iatmWgCDdD95ebPnyych7RlPAJ4RVHcU1XDpeA8BS+YfOBQoTvY7/0VPzo404AYBevoe0a6HEQIi5XUZ0nt+DLzgPLAIAjF2176HGNUvhq5o9VAl3UhXS6xIoN6XS3Yda57NaduGPnwUa2uRjkJKQzwPh+61Pfxov/+jIrV5MD3Twmr4mJnBwJNFRXXeOZlDiwXDQvP9TLvPW50Ci3sictaFPZliGrLtoyTEGM5UH9ttMGifoCQGsBMdfErKFa4Jv9HZ1ldZJC/16sjd1psc7RLAatGa4GcKYQ4nQURO91AC5QH0op9wI4Vv0vhLgYwO9KKa+ZoE0tHMQQn2HH9BS+QEjn2378cdi6ZZ5d1wUXytVJBPpMSKefw+fbWIWin59DGp11VJggp/ANiyMW+XG2LHTx6BM246Ibtluka9NCR6tx/cy4CwJGba2CJnyOwrdjf0H4tm5ewJ5DhoTEqsKxoVjDFmOoT19zFD42h69eBbzitp2YSxPMd5LhFT5VtIUofO61te2OXQCM2hRC06It4/ZVDJGXJodP0iJE4W2oMdL53FUMQygUPn8WddyqHOZQjuAsKnxyfUR0rhnyQLHW2zKsFej
fi1W2o0WLcWBihE9KORBCvBHAlwGkAD4spbxBCPF2ANdIKT83qblbxMMUbRkfTJVOe3moaMvPPeM03y6G2BVj8yGdgyzXDr0a3iV1VtGWqD58/jpVx8lT+Gpn4BFS+ABTkIKSl82E8A3y3OQVRip8utG6k8O380Ax5vFb5nHzA/v1+k378MUWIWl6DQ5btIWeencdjj8dXB5g80JnpKqIdlsGfhDTR7J6LLtoS4zCR8MoR/+mc0+9XcJGUbQP8Oe3CB5sJ7i28ToT2k0L4wRtDzjTKodvllBcj2vfFV2Le1h13tZCoZBZenASgzW2Oy3WKSap8EFK+UUAX3SWvSWw7nMmaUsLHpMI6UwDIZ0KUUVbAiuF+/AZRzoU0hnT58+2kyF8zi7Rf7d4IZ21U7CoJHylozsg8tYmUlhmkJveXEKIqP1UxMkN6dyxfxmb5ztem4jYHL7YYixq/qbXYOOiLYwC5a3DjHloOcPifIrDvUyP8boPXIljNs3jvRc8KcpWlWuX52FCovsR1hJkovBFVGW1wyhrV68fjzik6hqXMkwlVbhmlWFubmNtHz7muu5HJIOGrlcV0jlLjqqUayvMMYQZOiXRiAo7nuEdn2HTWczyuWjRQmHVira0mA5MomgLp8JZcw6proXG7iRFlU5FHExIp5vDR8aPrNLpwv3hl2TOrVsW7PmG1PiOrCB8KqR14Ch8CgNStCVa4Sv9ZC+k88Ayjts8z+RCmvdVt0FOCQKAWx88gDtJTtyk+vC5zrtR+MIKJTfmwd6g7Mco9D5ddfsufOG790fb2iNkOhT6qIl37X6Z91EhnY6SNipMWKTU13hBZNWFZ69P1T9XFXTXC31GkeWS/X1QYbOVIZ3Oq4IK6RxkEp//zn2NG9qvBtaLE7oWctpcVKvQ6jqeXayVS3MtqK0tWihMVOFrMf1QxGecN9W6tLEYGhQkfGwOX4JBnmviYtoyOHbRoi2BkFHLBoYUhm7UQgBv+JEzcPWdu3D5rQ/pZcNgS2VIZ6G2UWKyad6sP8hISCdE1H6qsUzj9eJ15/5lHLtp3juf1rmJYHyu837+hZcAAO58x0uL1Yb0DhqHdGpyTk2U7DoUh3oZFudSJGJ4R0YpspkMNyF3j38I1OaokM5AGOWw4Ig8JXXugw4JWpjHH0ctt3P4wvNnpcInnPOhjnHVtqaHoL2SInzb7tiFbXfswqHeAK99yinhgaYAVAmapYbxjbEGve3KkE7vzezBFA6b7euybcvQYi2hVfjWObTzPsYftLEofKGQzkAfvqJoi+3Qj9yHj5nLdcbpvwvdFL95/qP0/8Pe6tzefxQqh4+W46cKX98K6Yys0lnuhMkzK5bvPLCMYzfP+aGxY67SOXzRlmaET7ftIGcmtA7FweUBNs53SoIxnK3qfLnExprbyaUMwSY5TRW+0b/oVK1LdM9Nvy0KXd88Kafkk7yHTYSrDvMgl0gSXz9X/f+qC2Lwny07PfxUwaJpRkyT+rWAtbh7MyAgj4S1snvc71aLFrOKlvCtc+gcvjGOWZc3FvPQLxRqGurDNyDtCGIUvojilaydfg5fxZxDMr6Nc3yPPcCEdLpFWxQyGtKJuCqdeUDh23t4gCMXfcIXXaVTvdZcXHVK3bDbeepdRHOoKoVPQAztXNOQzhCh00przSR2W4ZmxUbGo/CZ0Ekd0inDvyE0P69K4YsN6ZRSIhXCUw/6SuGL2w0LbmXUWXDIpfO6VrGWCC3NeQ1hLZAMtX+zre+Ra292T0WLFhot4VvnmEwO3+iEr8nYaZnDp5w03eC9qg/fkCpjSOFTq9I5hs3hq1L4VF/AUNGWfp7re1MiREOFTxG+YvnyIMNCJ2WIs3lf7bjI2nWKz2tNZNFECQPMflpFWwJ5fhQqhy8RtpLVBANybGmfRAo1LB2fP3Zm2bhz+Jb6We3+qUsvFNLJz++rvfb2dpuNKsKXSz5KgKqo4W39dQZZ7h3HWSjeYpTW6bd1FMQQn9t3HMC
fffF7U38s1FVb96gKmG2iO8u2U7R8r8VaQkv41jl0W4Yx/kLXKnwjPPfjCEy3rNLpOtJ+WwY6Tv2lzzmVHuGr2GZYYrspIqTTVvhoDp99HNQxOH6z3eeQwvThs6t0Lg9yzHcTJjQ2NqSzfh13X5qgTtzywzWL1yr7OUdfVekUoijasjTIvHXq0LeKthTLQt+DzCI+/ud0v2OOnbVPFd/zA8sDPOaPLsK7vnpz5XjUCVKHsqoPH20QHgovlU5uY9Vu5VKyBZVo64ug7Yx60mMqnc6CwxobMj3riDkXF9+8Ax+49HarX+g0Qj0QXOt9+GbYdAvrJWy6xfpAS/jWOXQK3xh/0LjCKhSjKHxdJqSzkxR9+JQyEQrppASwSUjnTz31ZL3MC+l0nNxx6KVnPWwLfuGZp7GfdXVbBmPIJhrSmdtKiyLIbgVRCjekU/WK6w1yzHcSTwW22zKE90N9VKeWxKopH7/yTrzsf18WvZ2rDChCW9mHjyGRSuETpcJ3qDcM4SsVPtI+JAQapsnto9V4Pas/djLw3sW+w4Wz/Jlr7w2uc/HND7LEqro9B90PXm6UsPc1NJ4slUSuGIS2K2gJ/9kS04Nv2pUiihkydSjE7N6skd+Y381ZDukMRTHMGmb3DLRo4aMlfOscqsLjpoXxFWytU/hiirY0GdsUbbHX8dsJkJDOiFBHtc6fveIJeNerzwHgKw9/9epz8MxHHoPTjt3ozTnsXiaJwFt/7HHsZ6oPn6Xw0ZDOLNd3KdqHb+uWsMKnhtJtGXKT0zTXSSqJc2UJ/Mino7GE748+ewOuv3ef/r9plU43xxNg8vyYEM+lfo7FkvBBFopfU1D1VJ+6wAVCbeCOjZ3DV4SYfurqu00fuhL/8b0H8C/fuschZmEb676W1961Cz//katx64MHirEsm3yFnc7JTSudf+w+fLwNankihEfKBnn99capJ8uMYjsTOXyMWrkW0YR8TztRV1+NSSh8//CNu/HHn7thOMPGjCk/DfFYJ9+xFusDbVuGdY7XnHcSDvUG+Jmnnzq2Mevy40Z56Me3ZRDIckn6rMF6VaDEJYZ00n5+KpTSdSjOPeUofOL1T9P/02EnUZK622EUvkDj9UQYG2jYpwvTeL0kfFJqwjffSStDOqugDlVt4/VmdUc0mjZeV/PYOXzVYx7qDQAAG+eL45BLiUP9QWNbMyaHj8752g9c6dnJ2aeWdZIijHmQS3z62h/gv3/mOuw51Mev/sgZer1f+ug1AIDv/vELybYxiiC/zs4DPccO0w4gz8NOXqhoi6XowVY+Q9eMWu4K/b/xiW/igX1LlfbT7ekay5zCNwMOXlsy3sCQ39lAtQo9nFr5B/98HQDgj3+cf1i4kpiV81CH9jvWYi2hVfjWOTppgtf/8CMw3wlXhmw8Zk285NhDOssqncphTHVIZ5ioxBQzsdo4RFYzHYfCVwVO4VucN+eun+UkRwxIU56oUvg5fEb1mO8kHnG1i7aEbY11moctkDFs43V6YjxS6GyjwjcX5zooBb6hQjrVuG6uGgAc6A1w9Z27jd1kv0IhnSa0N9dEbHcgfym2aIvKKawuvmL/rw5lJmXwfEsQtTdkl6fwVRM+lU+p8IXr7tehmQ0Ll3oVOot5mo2xGlgLuV4utt3+ED77bTukOGb3ZsUxb1Klc5ZhUh1mO6Zz1h4ktGhRhVbhazF21DX7HkX54kIxu4kd0pkEQjo5AlcFmrum1q8jJ5TrTiJ/QVXppOF7Jx+1iE3zHcx1kjKHzzjFHW03cNRilyUFap8sha+vFL7EU1PskM4wlONdm8M3pHc9bEhnE4Xv4LKr8A0X0kmb26s5Q3mfdkinP5aURaGiw/3inKnzHbykHWIVQt316jqpErYDGypGo/Lu3DEs8gc3hy9kQ/FahHQG1onYB2oHH9I5/S6eUSun39ZYvPYDVwEAXv7EE83CiN2blfBWUT42inkoMQOXYBCzbDuF2o1pDxVu0SIGrcLXYuy
IUc8A4MjFcJhhaMwuox520qJoS+Y4vq4Da7VMaBDSSeevIyfU2Z3E080uo/CdfPQirn/bi/CorZsKAqDmF7Yy+Y3/eT4ufM053pjqXqZz+GhIZ7cmpLNS4bPHD2HYPnz1ThOv8NnnPqAClqAKH0Rx41dhnk1gwlu5kE573axO4ZPSqtaquH/oIYZVDbOp4RWgCl8uw0530aOPU/js/XT78O1b6uPbP9hjjaWOTdVPTF0BGdcOmq9qBgmPPy1YiwofhxgSJ703043KhxJRa0031hpBWlt702K9oiV8LcaOGPXsr1/3RHz+jc+KHvO4sq0Ad/NX+UxuUQ6/LUMzAjZcSCf5ZwIK3xzJ4Xv9s07Hw44w1Te7aVL04SMKUocok900iTo3bkine9y4oi37l/pY6ttKibrp11fprDWJxdAKX0VIqlv0Uit8c6lWlUYJ6bSKtpQYOIyP5mdKhtTm0oRND3JJHnQECJ+l8NWHktE1DiwP8Ddf+36hTLrrg+TwyXD1UQm+d5/73s7vA173/qvwE++9nA1xjbmOWVv0UwizTKnZC4TwzYLCx+zKmkTMqZiZKp3lZVtdtEWp0Cth0GQwCyHRMeB+L1YLH7z0djz1T/99tc1oMcNoCV+LsSOmqfnLn3giTj56MXrMU48p1j2w7KsrnVSgl+XYfbDIZUoCOXxM+l8l6PamX2H1NraK2Gw+F497+BZvGVV2/vBlZ+HKNz9ff6aK1yjXnFbppGGedZBU4eskTPEbvy3DL330GvzpF77njGO/hjBsSGfzoi0+MaoL6TxUktjFeZXDN1xbBtPygskt9BS/GoUPQLcjynFNO5KwwhcHrin5Oy+6CX/1lVvwhevuZ3P4zLY0ZJYJ6WQUPvpdlrD3VUqJG+8vKrIOrONRvFZdx5XONHM01MONDXMmF3YmHFb9/ZoFY4dHzO7NmtoZtU+TN2NimGXbKczv1urv0Z9+8Xt4cP/yapuBe/ccxuW37lxtM1oMgZbwrVO86fln4hHHbZzI2G7ftnHgL3/yHLzxuY/ED51+jPdZJ0kgJfDb//idYv5AH77mCt8QOXy0Smej2Xx88vVPwz//+jOsZVxIp7ExQT8jOWLCKEFu3lgV7By+1CPw3PnduX8ZOw/YN6PYQgpN1RTl4NYWbXE+58IB/fBD+391HObShCh8zUI6XQKn/tdNyxuGdOZS6tDmpX6u8zlDX7tYQsCtdWCp2NceU9iEhnTSPD2uLYN7qm64by9e/NeXWSvRdajJVAGtzVdE9fXGqZjq4cZCN/XWm2YMW81x1tBk/6bBMa8C/b6EwOW7zhpkkxvOFGPWHiSsBJ7/rovx0x/attpmtBgCbdGWdYrffsGj8NsveNSKzPU3F5zLOowcPv2Gp+OhA/5TrGM3zeN3X/Rodptu6hKS4tVVAZoWi6GKiUucQhilx6CLIxa7eNIpR1nL5jvhZzTdVGCQ0ZBOoVXNJo1wiz58ZUhnl2m8zqTwuTlYQHwhhTri5tknCxtqQzqD84QVPndMXfk1EV7j9dhcVXpcsrw+pLOuaAtkcR4TAfzNf96qF4cetNAhqg4Zp5jSI8aRY/Wd4vbLjCE9NerG+/Y569jHad+SKS7EKXxV37Oqq4ILM9YKX5cqfNPv4XGtLtYiolqJ6PM6aWtGg37IU/VQYspJawzWyjWpdmOt7M84sMS0sWkxG4gifEKIjQAOSylzIcSjADwGwJeklHwd8BYtCF529sOj133KaUezy6uISuoUcgkrfNFmeHMq4lRHMqy2DJPow1cRl1q0p6AhnYAiN6EKihzsPnx+SKeVw6fUNiY3LY90wpo617mUSCHqQzoDZK6q5ow7pgkhNGM2Demk1wwN6VTnwlX4qA2c8ychkQiBTpKgR6q1hkKprVw5Zrylfobbdxwk/Rzt0EoAJdn1x6YOrBqbK0rjzusStiKHz6xz8/b9+v2AJFbGFG2pDun019FqtqXwTb+Hp22cflNHQszuzZoqVkXq1sJ
S1R+Na9pbA8ljtFapXptgMthXScdW3zit99k7H8ab3//Vte/IcUQX8HXo0GEq2AKxWQZVPUlkHSgCL8uQNm2ZjK0DhGlLg/kHrU+7qS3L4DlP+sEQZ48Al8Mn989zgpoY0BiUUihK90NYVyieIGWbfe7S2SOzCm0Zi55g+Hg/Lzl9Cx5zxlb7PhVk8plfvkbI8PnvpTlBSqb6xHO2e+/buHRSS3mmLMPXFHRHTVts7qlj+PjM+2VnbsVrrr7Q1kzkxeljdfXqzCBGLQa2sRw+rd2stWZBDL+uTcDk2qD6e9tYEfJDq0LSSW51LRm+0KUzvv6cDfji7ffzLM7wRdaVsisrB4T83H2SVwyfzHnyGT8fQRFs7ULrVrc2a9h36TR/izKemxnrT91gWCn/PuKuyG1Yk9VxGZUsx5yMU6DjmJbhkxMOTQFSG6QCJt8cSNtJmtClU06y+IFRG6iW3wOlCPDfkzpWkvdxdtMWf3+DomDLGoLgKRBj+KDjtXFnRVsZ7nrkD24UrGvAp5R6kVLqFqXU7UqpN0SW/7RS6qtKqRuVUh9TSj1qPfvToUOHtQMvRO1kh4mAz1pSl41lGQBgvrWk06GdaYvrC/U1NpiJyRSdaUvF8FU/yIMEw1eUzqkxFczUIc/Mj3jRcG45pByL9quYS6fWIVsp3ToBExB/6LVX4bSt83bd2FHwQQYPqoIcvqAOnx9Apc7ROacs4ZZfeZF9v1STwxfUSmQGIU0D89jiPss9HUXc8yh4oBpYJuAzy2LMRCgVc+/bMHw6wkJpuEG9BuIMX+ZLOim427405z5b8SWdbU0LaK3QtCW+Pp3TVBDcr8w+bPu6+m5mknS61+Y7C2WjdcxCWHhds74039v8XHoMX/V3WMSLPE+aw6eUr7YY10g6YxiOyyBfcVJQ/+g+mZQIkjlja114nbfAX49jwUdkvYMrI9y535koTdqHpmAmyvC1CPjWovC6vCd9lYJYdw0CcX5fcEln6hTddO9BfOWegzPtq6nbs57DjYx1C/iUUjmA3wPwYgCXAfh+pdRlYrUvAbhCa/0EAH8D4DfWqz8dOnRYW9BAupdn1ur8tc+7KLruxadtBgBc85jTklI/Hq/xGkRAWh7oM3zNj7NcKSshtAxf5NeFBx8uiBBMn1Lo5yqacwGYHzBqehqGL1MKR4djK7+Lma5ISEMTCkIVfEmorb+XK7Fe2E/H4MT3yX8g/UBZfof+dq6+oPkeXvH0c+M7gM9uLkbcRu0+yGAHLoh0ks52DJ88hjxTGJclHjoyAOBLTPNKHqh1NQBSrp2opLNmlryu8DpvJ8jh09yB0w36+ABWQXmSS8fwmRy+5WHhmSbwv03jH2lW42SS8Q1lICCPtZ/7pi0a5vhi3588n+ncR/e5b9ri2CSewychJyO0dt9dm1ubH4/n0pnRpFF8hBvrTW3AB3/SZlw4SWfbcWzKlKgt6FgdwzfZAFpeP3w+btoYwwvyBKtH4Ix87B4j/OGn7sCvfvBrwedNsPdlQ4mLJkln8tlQrkEOn2h8NE4H22sRFvE2eRmjVFD8rb/7GXzb//jMVPuyks6GkyTvr1g5kxMV68nwPQ3A7VrrO7XWQwB/CeClfAWt9Se01svV288COHsd+9OhQ4c1hJXkZQrzvRx3ve1a/MhVF0TXfdTOTfjqm1+I733qOcn2VE3wlhyCsAVtJJ1KKcvwWZfOSJ5CVNIZaa+XZTaHT/ZZazCGr7Fr0T6sjkr8j0/cDqC5fAEAnLFtEf/fK5/K2qj+Zn5ZBslWSmaMgzM40eXs9DUlxHPQ9TPXy3DzW16EX3jxY2rXf9K5RtpZx5ba/lV/e5nCsNpPE8maGrj3MuP0SflOp29dsMtyFogUpZ/DxwctRWKwwd96Lp2JUZ1GPVOjGcXH90+mJ0Al36wcObdVLp0HK3bPLPf71lrSaYPO+u2CHD5xPL1cCXZU22OQEryA8bPnWXwuGD5qS5phtM3hK7XbSeqyeu8X9uD
ho8OgP/0Iw5cyRYl1p+7r4PUWAXMN1Ll0xhBz6ZwEtJ8ep50ngJRL+gzfdH1KuXTy9vg1JwMOvs2R1TH7fAKGz7bdFPCZ3xP+vPIYyhqGb1ZTG3nv1pVPWQsijAfz9D2bPOXZ267bF0dZanufmvX8A9t7cGXtO3OcsJ4B31kA9rD391SfpfAqAB9ax/506NBhDUE/im3liktzPS+ou/bxZ3jLeZDV2qXTC/jauXRSYNi2Dh9EcOQHpsq5dIrzUOjZcvjkNm3P89WXnhptayFSuJ1LPntZnEWhH3Yj6QyX+/b/qNoL+yU/4gOfhX7emGP37lc9HZ/6uatr1wnyLPP2DF9qsZGFajxQBXxnbHcBHzk+Aub7zlgAGBssyUuND9DGEfv50BkvlsOnWQFvN+jjdvacIStLHTB8B5ZZwCeYvuaBXXWNixy+pKSzJ01b/BWDsgww36dCeoBu+57YJx8kk6RTaymVC3MC+TZee+ybi103t+87gp//2xvxur+6IehrLIcv5tBJ+5Gol3RGcvgmkHQCaQfgtqDrZ2qXTmrHTlzwYG26KCMliSwT11PsHottMwnjmDJukiC2V05GNO1TT9ifGOQ9P1hnhs/P4eNy8LVo3UfqefaO627Fk9/yUTxYKTjkvu874BsbncjYEKYtSqkfAHAFgN9MLP8xpdT1Sqnr9+/fH1ulQ4cOxxgUMJ2yaW7ibb/8Sy/Ab33fE73P+Mx0UIcvMQapk3T+/WuejR+96nx/H5mys/WW4YuM8nhQJ6V+fDzUz7PasgzWpXOqgM/v9zTF2wlKKU8W6fIQw3y0EPXHwAcZMuDy++C/n9R6e9N8D+fuXKpdx9VMNO97mcvhaw744stNAekSDxwaYGkux9aFvl2WK3gDat+0JWSp5DHzADBm8hIzaInV4aMTPmYyYr5txqSmpXZF1rcvEsPnZrhlwNY0yJaS36ZB21zu+gGE56TfU4Gkk6SyvC8xxqou95HA8y5tkXo0MHyBaYtrMzYJcrQqBv9IxRzwdvsRl05ider6Tag1bYEf8I0Kl9/Jg8q6/KtZA75A0jllxBe7ZyZt6+hgjCt+5aP459sfDNo3L3nAwV4HpU8cJMvcFrRqE8NHOeUpVrLetGW2QInn0QH+M0m2PG0+JYfv0lkyw6n1qMMXfy599KsPAIAN+OTzaM8jyzhZsJ4B370AuH7r7OozD0qpawC8EcC3a60HsYa01n+otb5Ca33F7t2716WzHTp0mAzfdvmZeNO1j8Hrn3/xxNtuW+rXll5oX4fPQQ7KnnD2djxDWPBnStn1LMOXsJ62csiA4XPrmIAvJenUweBnEvDZ1Wnq+HEoxMs68PPcSwSVckAfLGdDgRgL6vrgf7YeTmwu4DQvfIavftvUchM0Gknn6VsXvO+f5IGAGcgpLumM1uFLMwdBkJNYPwgC4e6DonRBjCfpZMdXao3Dq2NkCti8YHI+H1kOJZ2O6UMtaDEPKE1f4+u7HL54+70s84M5XTF8SrAekcsnNYkQunQqT9JJAVyqzzII4vl+sVuTl6yR++eGVFLSGXtO/Md3fg5/9q93RY9FgnJOeT/pMuTmVHWum22UEnWwks6cGL7JAoOgNl4R3hdtcdu+I3jwyBC//uGbo214DB/bbxCUJVi9SfrjAtgG05aK4fNlnPH9c5T2Gp4+EKNNqYVYHnKsT9NCMnxN6oCZ9iWkwhL0tchb4+M371v7zhwnpO3OZscXAFyklDofJtD7PgAv5ysopZ4E4A8AvEhrffKc1Q4dvgmQZyqZszdte4S2s8x8sEVsRWo5vacBDZmgpAIPGhTKHD6PVewp+6MYuHRq59I5TR2+fYedlGSagBFwgVemIBg+85fnPaYYPvqBVIkcPq8sQ02+o9x2PX7UJQvr1eFrOIep/MN+rjAuSuw7tIpTt85733+eZS6Hryo5Qkv5oDEl5/IGPLp5fcPw+Z+VWtvzPi51lEkg5lEpVGUZxtg837MTCQeWYwyfGP0
l4CYEiNGJz6QTejaHD976hLleZp1vze7p+JQv6axh+OrkZ8SWv/uzd2PX5nm7XR3Dl0dy+Ox9EVnfOrpm/rHS8RHoe/uTf7kLgDEkOjJwOWIA8OnbHsSnb3sQP/jM86LHxiElnaaItbn+OcNXV1dv7UxbpnTprP5GJb8TNsYnQmJNeFLNSD6Z61OC1ZugO1ai2mjaUlR9iO+zLofPLG9ZKqShDcC/ZmQwOWu+IN8fQCkQ5vW6lGVI5FHbWpiIL//kLftxcGVk5e8nMtaN4dNajwG8BsBHAHwNwHu11v+ulHqzUurbq9V+E8BmAH+tlLpBKfX369WfDh06bGzwgbQcdNCip563A2//nsuj2/CacXY7MRwzkk5fzpjKn5GBXiyY4Rbm0jSGu3ROw9A9cMgJHmaVWfGyDLw/fDa/l2cJhk/bbaIBH/t9dOdqpu5ODckw5pliZRmaJJ3xz3tVmQBi+PyAj5u0aK/A+agoPaMU/pfg5eZEBnihC2U8CHQD2zLKsHGTnkJrDIsSc73MBgdk2rJ5vhf0tUkmVrLrg/pYt92cYL3kMc73smhZBtNVzgiE7cc+k+YuxmjHvCYZl9ZNpi01ks7IhUPXHLkN837xZxs1e88jxhhioS9VAmFfGiWdrD+ldoG/F/AlJKRAOHE1KehYyQ150rBAi+vXz+GbDDxv1bURD/7qGD6fYUuzXnVIybQliO31paPx/cf6OIus07JgEUlnKe6jtZiw8xm+0j1z1sW0pTo28bmrT1qtxw7ycWdtxbAo8Y2Hjq59h44D1pPhg9b6gwA+KD77Jfb6mvXcf4cOHU4cyJw1DgrcvuspZ+M7nxw3812oset3+1CBC90oMeNq5YFCyplyE5V9LplEZZp4bd8hxvBNOet+yWlbAACvfu6jvcEkHQPPJ2pi+LLM1W3zlrPT52qxhe3IgfFVF+1qcwgTQXa/l2etJZ0p9Ks8wIeODLFz87zXDjdpKcrSy+EbFxr9av/pgM+99hnB6m8b0xa4827Yg/Dk2wBYuXIEvK9k2rJtsc8CNtd+HeiQWpu2BC6d/vL5XhYwMkpVkk5u0d/StEVeiyTp9Lcz7EKqz/LevvPBI8nc5fsOrOBrew8BcMGTb2xk2rr09C3BhNTSXA+AY1tjgVmTpDMTDB8xSjzgW61h+Nq4HdfBTnIJxrf19oJlKrz7IsFulRpHhmMvvxZwz2yfPXfLvXIHXvCR7vP0kk7zN2YSxuECvvh+UsFQU/3LNpDBNpf+au33Q2vg6w8exXk7l6bOL/dNc9zr9Sm8bv7KS4i6HrveSBWzHmqU44F1Dfg6dOjQoS34QEXOMseCLaCFGUpU0ulkjoCrERhsylgR/pfvks/Wy0DSyKmqgG+KaGPHpjncd9AEfdPm8G3f1Mddb7sWgGMzADcYC3L4aurwZUrhrS97PJ5y7g68+f1ftcv5YKptN9/+PZfjxY87o3nFCeGuE/PXlGWo8qNadi4MGo0T6/KwwOZ532mWm+mYHD4XNI9KjX6mMEQ6PyTF8CVNWxAv3m4HtmVYqoAfk8mDM8wzL8kxLjXme6aeppQ+1Q3Y/+X2B+0gUNZ7a1+HTwfL/XxGDVX912TaEoNci+ddEgx7ka7DJxm+991wn30t23rW2z5uX8si84D5Lt7/n6/EOTuWPEOIax5zGk7bOo/3fO5u+9nRSA2wusFnJiSdmjN8RTuGr1XZkxoEOXwTDpa1/VvdA6ysSqqtD3xlL37h776C6990TXTij993aRnn5AzfJMdG+20r6fSdQeMMJUcTs94GYeH1tKTzC3c9jF//8M341Zc9Dq94+qOm21+K4av2NYtRWbCvxPOMfvPovHoGS5H790TGhnDp7NChQwdP0pkYdMhP53oZfu6Fl+Cjr39Oq/Vz5SSdWgN/+WPPwJ/8X0+r7ZcMIlLOoH3RZ16WYZqA7V2vfCoee6aRqU4TMAJ+X/lAKJbDlymFmJqLfiAzpbBtsY8fvvL86HLebuznkR/
Bo3YuYXGumZGdHMrbF5d0TuOUCpjJh0OV5HHTfB4w0c6kherwVZLOcRmWIKjJ4fMYvhLR9VOmLbyNWCDEJy3Kah1pRrRloW/dK3nfUgHGJ27Zh5f/8efwh5+607RTXTsTM3xigJVnyjsGYvjkgL9uEOYxImK9OMPn/sVQx7DX1Z6kiSsecCgFPO6sbdi21PcCqGdccEpgmLIs8vmAOLPJ2+YBW1G6HD4e5KXq/gEmjzDZfnKJA103vWlz+MT142qZqiSj9sChVRwZjLEsAuRxGV5jvD+8PX7NBbLphAx0kkCA1mxk+GxZhvg+m4yJZolNpOzRc3bVfqh5+74jAIAv7zkw/f68c+7ff2vNqqUknZLh45MA9ll1klB8XcDXoUOHDQEeFMkBlmMowiHHT159IS6qpIsSMUbwqeefAsCUk3jGBTuxIyHNooGc1PjzAf+cyIHjy8tyNpfO07Yu4CVVrcJpndf4XueZfJMCSD+HL+HSWf1NHQPvmk2Aj/SXNy1NMNYKrg5gdXxZZgcO004W93uZzXHbJBi+TDlXxMClsyyDQa8MxlI5RKn8Nq0TZRmoDXbNcVCX80xZNzy6Bij4WZzLrFGR6UPVftCawf4qx5QGflbOK45BYq5Hs+bw/lL/TB85o0HHoPwBYUvVV0zSKa8FsrRP9blfc73WXVcy4A+2ZXcoHTtHnOGrCfigRA5fXNJZx/Btmp9N+EX9sxNrE2be2cOje6YsrfogdeicLS5KjQ99Za+ZHKmCq1SQ5LN97rUMytaC4bOsWescPrZ/8L6ljiVcd1JI1l3m8E2bv5hCKocPWHtZp8tP9D+n53nsmdsXBlMnOrqAr0OHDhsCij2NAoZP+cFX6zbF+0wp/NwLLsFHXvccXLA7LuUM2hA/CHyQxiWdxJbRDLlhDCp2bEqGjgLKSWvWEVJspIp89riztuHSWODcEDC1lXTyRbOa0CT3IWS3uWAw65Aav/Qz5QK+uZ53DfJSFuToSteHyeHzrx05SErV/orlk9D2IcPnpE+pgI/LN42kk7nP0rlSVK7A7cvrjAAFJyQVlPlqjQxftYJnaJJlYcCnnUPsNJLOvQdX/H6rkOEjKWyqz3X3b91VZb9/r2G3Be9GxtQHhOWhYfj4ZE2taUuNpHPQModv03wNwxe5h370z67HeW/4gH0/ax0+KyW25UW0aUulgxl+f73rM1/Hq9/zRfz9l++zwVFZhveWfF3P8PF9TXY8crtRDbsKuGDcdyd1L1OB0FqUNJDOvENREiN27uoY7tb7Q3UPejm6Uzcb31eE7QXc3RjLm6b78WSRdHY5fB06dNgQkHb3MUzK0sj1s8wwcZecHmcE69qwz3zWZl/kwAHA4lyOo8PCuHRGWMFJQBbuTc5uKcjjX+hnePVzL8T7brjXax8Afu/lT462YRm+xMn3apwJp0a/L+2Dr2mhxF8us52WVOzlymP4eN8zLuksyQjFvB+VZTBDHDNhIcQYPjnoKXWkKLR2xztOBHy0AgVTpQ7rS2aZkfS6HD63zxhossNKZm3hdfqbYMtEDh9fL88MQ+Ufg7Z1+DympcU9sefhZTz3N//J+yzPQ3OioqHPdZLsWoYvUpZBBnn2daaCwHJ5YAb/POCrL8vgB7M8j7gtw2eMYxLtRz6jwtUEK+m0dfgmgzTXKArtJgESjfFt7thvGOcjgzF2bgpZm1Q+nM82hfdYdPtJJJ0U8LU0bfH36V6ncgAlOzcNpOxx5JVliD+7ZnmUS0k7/z5MYBuffChLPfEkaiGeawSe/wwIhq/XSTo7dOjQYc3BB1VyptsN5Cd7yMv1Jwk0ZM4e/RjxFmRZA8DlyvHB1rSmK3Jw3BaxfEMAuPktL8Zrr7nIBpBtHPlsDl/qB5b1jQeQqT4Bs9f6SiEsy5AFyyZFP89sbtCmudyXpirfpdNIOkOGjwYRMcbOvo6atoT1wKKF19mgpY7ho/w4X9LpH4tkI1OsCjF6NCi0wX5iYEUIc/j4MoV
eHmP4UEn60oPyGL7x0HLwWZzhC/OmvG1qRkp1xhI98f0D/vNDXkvyuXfTfQcB+Pm39ZJO/94qtbYBxrAo8QN//Dn8zb/dU1uHb3ONpLNNMBHm8E328KK1uXERyXBTLfFrluoYbp7vYVSGks4UW5cycJFb+WYqjYcTbNcU8FFAJ42LCKnt7f06AzMm2f2hJ+lMs6OEf7plH771dz9tpbRt90f7bCvZHk1B/6WuXfu8jlwrc52ks0OHDh3WHn6Ol4oum5nhm2LQb7ti2bpQxslfL0YCvqklnb3pJJ1NAaatlVU3kq1Ae04dAu9bXXt882kLyTfBXifV+94Eks7tS8bS/bXXXOR9zgfhkuELTVuc8c2oKssAuMGenJz3Zu7LcCApByl/8Mk7bZFut65z6RyXZZQNph6TaQvJT+kzOhaeJ2dNWxJjK/oOR+IaT/WdIGWu/PohQxU5QLemLaydNpLOlVHIZPUiOXw0Pk01WRfU1V1V1BwPEnhbfFvpsAkAv3XdbQCA+X47SWcmgtmidOd3MCrxmdsfxM/+9ZdrTVuWasyUzKC//rxbl86p6/D5LFOpNXrVdVGWGh/96gPY8/Ay3vi/voKL3/Qhs46V68FOzmye79ngKcXKpSSdtS6d3PFzgqNzeXH128Tk314/U/mIYvtp4Jx5zfuRLLzOy1hEGL6b7z+Mm+49hOXIfRcDn+Qal36OYN393eR0GoNj+PxtVbDcLaNruK18fKOjk3R26NBhQ4APdpIunRMGbHLtSWR9ckAcM//w6/CZ1+Q+WZSzuXQCYcD3hTde0yro/aMfvALv+uevJxm8iQK+SKDLwccfcy3aA6Y/H21BzfPrqCngW+jntoQFBy+3YVw6fRkez5/jLp3jsrQBpx2QisFausYYBUO1XbarZqwPsQGfZfFI0lm674CzwbliTIHmPQlhzWqEC2rM3pxDylz5AMzm8EUG6KFpS80grFpEOXAcWcSls4nhq712ahbF8hRTDF+WhQwfgeqB1fWR2vNz+BwjvMoG4as1A/I6hs+0GZ94Gxclenlmj3n6HD7/r2H4MiiU0ABe+5dfwg896zyvfAWXTBPDN9fLcLR6nWb4WIBRJ+lMbDPJsdG6wyaGLxJ48H2mAsZUnvAkqCvLYEqX8HXD7em8rQwLPP/tn8TbvvMJuPrSU5P74/f5fQdWWpu2TBXwJZ5nbrIsvFebTJdONHQMX4cOHTYEvLIMInBQ4m9rzMDwycLKNJD1+1nP8NFvx7SMljVtqX5wdm+Zx67N843bXX3pqfj/X/X0ZIBM7bUJ0GLHzcF/DOsknXyUuN4MH33xfsA3XZs8aF6a6yUlnS6Hr2K+xpEcvkQeDOAPYiTLVgcNP4cvNhiiPmXKBFMFYwWJmSN2LZB0JgY7smyDdafV/l8JyXry9WIunQBAVjiepLMNwxdxuexFAr7YIJuj7tqpk5nHBpF8115NRxXm8BH8HL66viBw6aR9H1od2c9rGb6GgC913lfHviTOPRsnGyzT88ay4oW25khamyBEGp/wSQoK8sy64TXmyRIjbQAxhi+8N+XrJtC6TXLHmLkIvcwzlQyEaJ1Z5Icyh29QU3idv7vvwAoePDKwfX/46BAPHBrgroeOttrft19+Jv7qC3uw52FnsFSn2hyMC+x5OJRr10Hr8LwC7v6l3Gh+fc/l08mSNyq6gK9Dhw4bAtL9kIMPWCfBLDl8uzbPedvEGD6/cLlk+NwP2rSEFlnYT+vSmYJl+HrNHeODjfhyNiPaUtK5/jl85j3P4Zu2iG8v4wyfCPiYpLModZVrZmAKr/szxHKw7BUe5oPKyKAvBc0ol6ayDJQfpyOSTmK+yhI4uDzCA4dXq/bj+5X7MXJCLguLb+gknVU7HsMXmrboqv8yh6vunqAlMUlnLKgq2PmO9XuSYuQLTH4Zk4l5AZ/Xr7SywXPprDlupfzSDqV2LM2hFcd21pm2bK5x6QTS551YQ143D5iB4ave2xw+mGCw1H5
QQ9czYL4/Cvh4sOu3r6OvZYkAb5vE9nzJDXsOWHYxBpsX1+DSWSQCE8DcOyl2q2mCpg0cu2pejMb+uYqfL4Vnve3juOJXrrPfS6y0RLTP1Qrf/7RzMSo07qhKvAD1DN87rrsVV/3GJ3DXg/UBJQedt+D0CIYv5tLZMiVxw6ML+Dp06LAh4M12JyWdk7bpv5+E5SEmzdXho+CNs3qu9h6t5zN8/uBnUszl61GcfLocvtS55z/qdSYwdTmaa4X6HL7p2uRtbJrzJZ28iDcxfFxeaZ0KafZdjIBSEsWJGD62SqrwOk18UDBVavcd2Bw+ZYKOUms8820fs7PtqTwlOSAzQVlYx09iThajAssLAQAAIABJREFU5wFfrpDHTFuAiuHhQXG8fbONWU8W4gbMIE4OpseMCYr1e5IcPl643DJ8bH98Eiqs6Ri/H7nioe6SkJJOz6WTjVrrTFvqXDqBdABnB/lV09a0pba1EE7qS+9ZWQbtP1fNfgtv8uBI5WxKbGDQ/8SxyHwypNaLMG9HBmN8x+/9M17z519sPL4mZrqIsJK0z7k8S5q2rAXDJ+XcI2na4q1r/vJbg46Ngv8md0urNIm4Udfd35+8ZT8AYO/B1dr2OVarSY5EvOdN+hBONklnl8PXoUOHDYe0S+dkkPkyk7A8u7eYgE8OQGJMFRleACzgK2d36ayVSM6ASQI++oVMBWlbF93PSK1LJy8yvQFz+FIglm6+l6GXZyLg4w6Z5NJJ751pi3Pp9NtOGUVQwNKmHIeGuyZ/8X/fFF3HlmCogh3KN+TLKHgdlaUXKLVl+BRUVefPvG/O4QsZiTzLogwfFEk6XTt1gzDa/GiEcSFZa+xYTJ3DcKRZK+kUy5bmenhk2cgno0wtW5+3a0pSxPfhSwprGD74DGZZhq6uQL2kkwesMSQlndUgn87f1AwfSTpZHb5e5iS9FPS5/ZasqLbGkUF17nX82FPFw717UZY+8bYP26Lr7N/vO5Q8rlSpFYkicl/Qy7leFp3E4O3PUnhdFifnAZ+GXycvxiTS9W4DvoYvn9an5zS/9+oYPjKKmuRYqfak7FLg0ulNYJ5cks4u4OvQocOGQ9qlc0JJp5p+wE8M3yPLQwA8l82tM2cZPifts5JOre0P2qwunWsNGuDOmsP3y992Gb7vaefa93Xt8c170xbFawD9LlNfe56pzpSSzmoUvqnKbcq8AbvP6PXzLMoApySdqYF8U9Dkt9HMfFuJq2X4eFkG95cXXo/1i0MOpmlChTMNsXw8WWqEB8G9yrhEmraoqn+8K3VMCfX58GoY8OWZCgZ9PIcvKollr+fyzGPLpGycSzppEMmZDm8SCv7zKU/cP0VkMiCGTASNpQ4HzpmqN21pek6mrgc5yHd1+CYbLDs23PwtytJOqMXkjqujwmOVaWBfJIJdz2WTX0+epLPdfUqvSKZZ9/xLPQMkrLSQrUYv+3mGcRGXja4Nw0d/zQsuPzUMX/iM4lcLHRtNKDQda6lJHVMFXUX8PEvY72eCYx2MiPkNn1u8TX6vzlFZlU7S2aFDhw7rg1SO1yzE0KSs0q6K4XvwyACAGyjEBvRc2uckndzQYkp2aZ1y3dbKpfOVzz7fqxHWr2X4HPJ1Oi76qbaSThadTXvt0DnaVOU2Sekxl3Qq5Rf17gtJpxxseIWHI4OddqYtupH55nl6RWnad4GeOxalwoFWqgexXCKZwxcbAMuyDHx/eVV8XGsRJCkEfauTi9GSw8yohNCLBKHcpbPpnG9e8OfJYwyfazccvHtlGcTkQSqHj5/qWkkn/LzVUocmPr08s/K2aBsNj4TUebeDfCvpnJLhs0yVwbgqvJ4pdz75AHwwLj35p+2njhukpBi+tg6R/HjoNeVEztc8/9wERzxP1PYjwgpbSWcvS9agcwzi9BGfPPfDGpfOOoYvVjw+hqKS69Il6wfd6e3a1vnjoAkJ2aU6l85efnJJOruAr0OHDhsOMpeFZsInlnSyDbj0sA1
2V6Yt+w8Tw0d9ceCBKTFIxPCVLJ+qZbWCAHUDiFlga2W1Kbxe/W1DkNWatnD52jpJOuUghLN60wbdNHDdNBdn+OiQTR2+uNtszJ4fiMvD+Os2gzfD8NUfmzOxqXKgSnccrki9MfyQ+0xLOsMcPunyyevHEWQxY76/fq6C2ldau3uOd6XOEImugxjDl0UCvjEbZDcFfPI+kLvnbH7UpZNvKyWdiZvMz+9M949LioE4Y5krheVhkXy2yB5sEnX5Gk1bSMae+UxuW8jJkaLUjuGzTKxrdHVUsLInvJ24Y22q/xRD8MBS9gmQOXy66kPF8NU8r/17Pbla9JqhXfaqyZC4GU3Y10khz6+XL6hF/mPN9m1z+MrSz3vmjGxtWQYr6WyPVRuE+ltxhQYgcvi6gK9Dhw4d1hfBTLfNN5pQ0ln9PX3rAv721c+aaNsLT90CALjotM0AgEtOM++/9fIz7Dr8B4G6RoyXqXemp+o3YUOYttigtfkYWtfhWyfTFvpdpiCm75m2zCarpcLsygvoZB0+5dV6rHOkJMSCQV2zvoRx3KxfxwZ1XNJpWT+zTkrSmWIjpFxObl+W8QkLOYjizRDDB3CLeGJO45LOfkQeTG3GAr6YaQvt64FDAzzxzR8NtvFZuQbJY0Qa6Bdej7ebqfREiF+vsW7vflH5Qoeyxl6mcHQwxo6lOcQgj08ymqn9kxGMlXRmPpPbFnLtQrscvlgwZCSd1DfB8E2Sw1cFGL08C01bIgXHAXefkhtsPcMXXhdB3yLPAOoB4O6dWDC0NnX4/H2PhOogJm3llwttPxCOrSlQMO+CLsYo1jJ8kx/jIMHw2TYjAd9cPt01vFHR5fB16NBhwyFdeH2ydmj9TfM5HrVz00TbXnjqZnz4dVfhgl0m4Dt35xJu/9UXe3lhdkBfuoBiieXwudnu2YKN9UK7HD4DHjC94cWX2uP02qsp88DzlVLf71rBMVru+Kbd5Xc95Wxsmu/hpU88M2gnrMPnH6csyxCb8b7/0CoW+7nd3kiBtW2zCRr1teAAN/FBpi2l1jaP0rp0ZmS6Ige7cQSmLcrshw88Y9evcy4NB1i9LGNW6D7Dp5Q/aKYBYZ4pQCgUqc1DEUlnptIMXwpekCaW1eU8WgkiD/i4SyfvV6YCxn3bYh9PP/8U3M1qjtXm8LHrj77LcVliaS63Zh95bhi+7Ut93H8odDmU94nMt00NfknWGNbhmwwycHEMn7t/Vrwi8qVdl5dFMHLWWGDkXnsTCNWq/UxZp8zYerHcOqr3WPe8TrGEHEUkoOL7pIByXGjIcom0zpHBGL/+4Zvx2udd5MntmxAL5obj0uasasHwxW4Z2m7VBv/1+yy0Rs5UEjzAlEEt7x8tmyQOW7UusvK5pbzPo5LOkySHrwv4OnTosOEgBwv0bnKSZrbA4tLTt3rvZUF4PuMqc/i0dvbhGy2Hz7Y/QR0+LlX78ec+Ot5eXQDJpZDrzPAR/By+6fZ5we7N+MmrL7Tv+XeZeXX4SijlMyxZNVB18rSw/We97ePIM4VnXHAK+nnm5yS1lHQ2zaS70gvpsgxkPFQXwJj9aXziln24Yc8B73MFOlbXLzmhoJQLIGKsjMyJpHYoh4/3xMqSI9cSNRll+PKQ4Ws6z15gJq6joC0NPP+y03BoZcQYKdaWkAQTzMDXbztWiL5e0umuv16eYTguMS40ti32bcDXyzIcHYxxyibD8M33MnzuvzzPMptNkwdp05Y4wze9pNP8HRc0OaHs+eSB3erYmbYMxn7O2SjyvcZMVwAuc0+XPjD90sFrCkDrJZ3NDF/qe6aX9vcmwnBRX37/k3fg/Tfuxc5Nc/iRqy4AYCY+7n1kBY85Y2uwXWzf9GpYlJjvmYAvlcPHrxdqg4L/VpLOzFdJ2GXBc4dtR9dIS1HnqCiDovIE6n3MLEe6LJ/o6CSdHTp02HAIcvgo32jCAM5ut045Y2duXwRgfiRI+kQDKTLHAGaXE64XWpVlqNAmRqs
vy7D+IGOVp5+/E0A8x3JWyFxEyfDJsg28DMDyMO6wV5Qm3yhw9WzF8DXnnVGXKH+NisTzZdRXOUh74NAAn7x1v33/ns/djR/+k+vx/hv3Bvswhis0sNKY6/kMQ48FyDFzin7ugp6ydO2Ykg+yDl89k7Q8HEcZLFn2AWjHpPLj5IhJYDPlitzTZ3b7RFuZUoGk00iElWB+avrG2qSAa1QFfIReZhi+LQs9/PyLLsH7XvNsbGfyTnl88n1TDh8FI64OX7rDf/DJO/DyP/qs95kWf4tSI8t8054jLJAfMJfOsTAZkXmmsv8xE6B+rkJJZ4Lho9erVtKZZtT4dqnrLSXd1ULSGTNukUEvD35f/kefxYt/+9PJvgX7q16PitLm4ZY64dIpJMR8341lGTSVI1He9kB9CRvZzyZ4rrRiGz5hZ/76zyOzn5Mj4OsYvg4dOmw4hHX4zPuJJZ3i71rjqot22deXn70N7/vJZ+OM7QsASNJplp3Ikk5CG6OVetOW9Q/5dm6ex3U//Rycc8oSAFmHb232IQuvKztg8A0IzP5NXTkaQ1CJjxhK7Qq1x9ivFLRuXo8zfOOyrIISn+GzgZbWXumB4bjED73r8/jyL70A25b6uPn+eK0xBTJtob4DS+L65fUqea4fX+5qcjmGD1Yu6tZ10sH4NfflPQejwUmehXmKjaj6/MwLdnryStM/yfBpW0S9KH2Zo2kr+hJZFj4netVgOJYXGEOmlH1WUlvDceEFfHmmcGQwxqa5Hn7iWxxz/eRzt+PSM7Y2PmNlrEGsrrTi7wmH2hhu33cEt9x/2PtMSn0LrTGX5SaHr4gwfKPSfp9ezlkZZ8JizBnArqcsC01bUjl81edW0lnz/PPaSHyHsVqcdCwAK1AeY/jke7b9Tfem6wPafSQknS6I1SLwZfen7ad5s9o2h0/7k2Q8YJeSzlny6LipTsDwCUVB3LRl6l1vKHQMX4cOHTYcUgHSLHX41gPcgl0phcvP2W4DI61nd+mMGVKsJSZh+Nqcy9qAr/WeZsOFp26xgxSef7RW10Io6XQDBumSmFUSu3+54yHce2DFFuSOgRdqr5OASmi0YwIBVxfPl3Tyvhq549J8yFR87usPAQBG4/i+6Ng9l04R8PWyzO5PswG9t1yYfVTxHqBUa0knAHxpzyPRz2NOpE1QAL78Sy/An/zwU4NlMUknlViIlmXgOXxC0inZSpJ0PrI8wm9fdxvGRVnLNpD0FXDnZViU1nAIMKzF8rAIvuO/+4ln460ve3ygRggZTX//9IySzoxt5HCF1oF80q5e/R0zYw86n37AV9hzwoOEUmsvAAzahzBSYYFqvUtn+HkbSacXXCbOie/GyraVDF8sN1H0OfbsqC0HEVk2qiSd1B8dOfZYoJgqch7rMz0jAT/glRMLsTy6tncxXZuL/dzr73lv+AA+fvM+ALB5m75rcHUNnyQRXxfwdejQYcMhYPispHMyWIZvHaON6990Da776efY99zmeVaXzvXKdSNMkiPYhqWsc6k7BgRfgNxz6VybNnkzPRbwlaWucqhCSeeX9xzAOz/9dTxydJg8R0Wp0bfBjvmMBrAX7E4bDmndPCDhLB5JOnmgR8uIjVyISNM+e+fDAOKDTYCCF8eeaR0OgLkLp1tPe8ulaQu0CyZj+VOp6/LGPQdx1vZFrxC6PQdTsAXblvqY7+XRAM97bwN/FWUNPAMYPjmQqeA5kVXXz8GVEd5x3a14/417G9kGV2aDggNf0qkBHB2MsUm6ftD2kc9+5/ufhOdcvBtAGKzQnIoteF4tn2sxWC7L0EWUghv6tChL49KpXNs84BuMXX4WZ77MfVEv6YyxUzGWjvfQuwarv5Pm8KUmHFKmLfSSTLFiklB5SccCuLprJyZ1HRXOeKnUOirP9XMTzV9ZoiOFcTXJRffwuMa0pc7huAmUU7jYz5M1UW2+rZCYx9Y9UdEFfB06dNhwkIM4mW/UFsciyNi1ed6WcADg2crPmsO
33qgrlC7RJmBqwxieW8ktjwV8SefafAcy74p2YV062fKcMVorozEeWR5h56a4HX6htf0+pFPhO3/oqXjUztR509HBXazPFOzwsgy8ZAMxdLHB1U33HgSAqBEGQJJO0x/qu8xp8nL4IixmL1fehIlpTdv2fQmeazOGgysj7Nw8F0iRpQlKG/AmmgbWpdY2cOUF3W1bvF322rCCISPKb6l7D6zUftdG0mlAg9Wi9AO+1VGBwbi0dSUlYkz4t19+Jr77KWcDiElYzV9p1FEnPyQUOpxAkBMB46Iy9oC7JngOHy/LMBZBS5NpiycRFlJUDx7D5086/O7HbsNvfPgWAPUTaPy0tcrhiwSmzrSlvqC86HK0/WB7r4ah+Tscl5i3JmRxxm4sSjcA7XP4ZFkG+f15/Yu01fY+psmIxbncPk/kttZRl52HTtLZoUOHDusMOfCZFq5g+7ELuHgu1KxlGQgXnbp55n7FMEkOXxu2se446VxcevqW5DprjZ5XlmGtAj7O4PmMnszhyzO3/mBc4sDyEKdsjgd8xo3QlzPyoCbVe5PDV99nzuYVpdnGMXvVsozq6MVNYFarAf1onGb4jOkL7DFINtPP4QvZrx4rTeAXK69MW9gwlrsqxjAYF+jnWXDd5plqlJoFx8bOfp17oFleGbAwaaB3LOJ64f1KuXQS9h8e1Ju2KD+4J/DA++CKkRXHyqqYPok2bZkH+l785a7Ytm96QQFfE8M3KrRgbqtl2rXfy/zC6zww4GUZeCBU6FRZBs7QhQFW7LfHX499roH//tFbo8vq9tvOpTPcv5N0Rhg++T7K8NUF32GwNSxKzOec4avvs3TpbLrPxtV3S6ecB7lyoiDG5rU1W6L+LDCGT05q1eXwnSySzs60pUOHDhsO09ZwkpiWGZwFuR3Quh+PWaSZn/m/r/Zm6NcSk7l0znYS73roKADg0hpr8LUGv47WIx2SWDG7D/E+Z86SDx8dYlxqnLJpPtoWyZsANugt3fWTOv+l1klpUz9XGBWayfxMm4XWXsF16isyM2gdFSVe+azzMBiX+IvP3w3ADTKTkk6IHL4ylLhxCWxMWsUD5lhZBj6gdgP0+HkZFiU2z/eC85arySWddZd+jIlQCiKHjwd88XajOXxiAmHf4VVsXUg/C5QCY/jcuefBHwVmm1OSzsTBUnOS4bIBXzWopu/N1oyriYLGzBlR1me0TEzFmNJ6EqvjwrlKioApWnidX0MeY0xBVXj8fg4fDxh91F1XqTa87RMMl2T4YudBmqjEJZ01wXdkf9yl03weYdl4ICty+JoCpaIsPYZvJHIw/XUnOx4Oy/D1nWmLvCxjbLyts3uSSDq7gK9Dhw4bDimXzhMBNF4yuVLmdRuHyxTO3rF+EshJmMdZjgEAdlTW78969M6Z2pkE/PjWQ1abZX4wbwITHmQqOwh94NAAAJKSziOrY+zaYpZxh0KgOveJ7mudHmjO5RlGRdFK0pllCrqSIRsDGYUhczMntmSYzOFDZfpC/dLBfczrFqZKT/REsXpNbUMwfA3s+WBUYsdSFmXNZpmxl1vGGL+s+r4sa8Dr8IG/9q+dIDgVDN++Q4NkoAY44x3alhALipdq2lHKDfrdpFkYdHGmaSDq8LVh+OhSGpcaN99/EB/59/tdWYaA4YsPvFdZWYaCXZta66ic1LtXvGDF/I0xxryVusCtLqcsdu0GfRMB6/7DA/z0e2/A+btMDi8F0fGyDPXv6/Yr+06vRsylU9bhI8ScRa2BT0OgNC58SafPFor+TXg8HNSfpble1CwKYDl8rE1ZJudERxfwdejQYcMhNYib9rl7LEoCEHgdMcfwHbPdrxtmPYWvfNZ5eNr5p+AJZ29fmw61AHc5Xeyna2RNi1wwb5Lh62XKMmL7qppwpyQCvkOrI5y+zZT0kDlJZN4Rg0Z64DPXy3B0WHiyPLouXVkGOhZAayPpNAOxDAou4qO+pBg+cttztvphnz2TG8tEuOVKKcsk0WB
da10VdfcHnE2SzmFRYi7PIq6Ts13IqRy2I4Mx3nfDvVgZFSaXkwWWfHDpMXys69ywhn/mM3wDnLcrbeCj4AJK3lYeYa02JSSd1I68opxywS3h7B0xfHR5OIavmVXaf3iAb/3dzwAAvveKcwC4Z729FlW8rdURM20ROXAxdtErd6DNRMYDhwdeHb76bXwmbMdS37rv1jJoHrMYX08GT7/2wa/h07c9iNseOOL1ra7wet0+6iSnhfb3DZiA3rp0Jrb1ZLSW7W2fw9fLXR2+OtOWtcjh8ySdiRy+mEvnSaLo7HL4OnTosPEg8yhozDPtc/dY8oNRSecGNW2ZBLMeQy/PjmmwB/gTB2fvWFyX9n1JZ1i2gQYYDx01NfhSAd/ysLA5lTZoIhar5txr7Q9S3viSx+CySjZLLEvI8PGyDE7SSQzdqCzRz/1ggwK9WP6QO35XOqFkslECr1v4o392PW574HAgd8xTDJ8S7EqDpHMwKs1gcg1GOXwPctyptclB+5bf/ATe+L9uwsNHh8aAJY9LOnlrvN1YDl8mGb7Dqyi1xtk7FvH1X3tJ2E8W8XF5aKy8S8ql07QTnlOXm+w+4wHKUAzy2+Q/0bKv7nV14ug6c6VJtHW7lW0t9nMMRkzS6RmIxAOjH3/3F+1rDY0//ddv4Nlv+zhuuu+g128O3ooMPPj9XHNreAxfKgiWJQ72HzGqgM0LPa9vsdxEycjGYq3aEhnCJEZrbXL4qmeITrUZYeUGtkRHcncAqORGZic+ZFkNjtkCvsqlcy5nks4Ew3cSSzq7gK9Dhw4bDrEZ6RMFVtLJ8ltmlUNuBMxqPHM8wGfrU0zQLJA5VrymFC2XIGlrDLJYNS8GnbIG1/BdNa+6eBe2iAGiZfEU1eHTtp/W0CUzwZGp9Wb2ybtPA/HYYBOI1eELAwdTh8999rX7RcAHlyvmFV5HRNJJDF9NDl8/z9bk3uNNhDI+w+49eGRoP6NALcYa+Dl8/NpJFF4X+XckGY0FZYpdj/y480zhX97wH/CdTz7LfpZy6aS+2DZFv/kg22Nkirhpyxv+9kb8yJ9eH92PM/lw1xRJoKllk8OXeYXX7THM51gdM0mnCBiaTD1+7xN34Lcq05X33XAfgPj1lFCBAvADvjpJZyxfkGPPw8v4n5+4HUBVK5IFrHRu56ykM8LeBdLicJ2v3HsQDx8dBp+bvvttUfBMOXyljrsBx/IOKfhvLLxeyXXpWuWPFs/9s9S44e4Dtfuuw6oty+Ceo7Jvtg4f+9gyfCcJxddJOjt06LDhkBrETVsP51jGW6qS9ZWlC/jWu57epPjUz12Nbzx8dKJtNtghtMJ6B6lScifzsGL731QVvJ7LsyAfridyRrgkOHnla3+gxGWMluEjSWfmio7HyjIUStnBWk8wfDQAGyZoDOvSWS3WWgfXDGf4AGBlOA7kUmFZhqrtTEOznELL8CUmhwYjw5iuxb3Hc+3k0ZdaewELUAVvitfh423567nXCrJKiry+ABN403aS9eSSTn6ie7nCmdsX8ahTnBxUFl7nUBFRJ13L/BnMg38KEGzAV13Lh1bHuO5rD0T3YwO+EZMOW4bPvB+ODdts2Ge/T0tzPcvk8j4AFLQ0UEwADrOafkBzDp/sA1ej1AUgniw00q0f/bPrcfP9h6s2K2l16bPq9WUZ6t8DwA+96/M495QlfOrnrw6WeZJOuAkeyuHTOu7SyYNq59jaModPmLakGL7/+U+34//5x1uD7duaLznTltxJyRMMXyfp7NChQ4djiLAOXzXYmLCd46XE4OYY9H4j4dydS7jqot0TbXMs8yDXGqduiTtjzgolGD35Pva9L1S5hCTT4pizEiLz3jMmEdfyy550Fs7ctgANf3A013NGJXMxho8knUEOnxl4URAqJ12cpDPt0qkUsDwY49DqyMsTJMggcnlYeH3ntehc4XVdBTL+gN+dm3QOHy/qvFaQA8VSuwEuIVMKeZbZdf0cPi7p9CcH4gyfv//
VUcHyL8PnpGRuTTumkX7PfVhn/hK71WPGGpxpouuC2OM2DsB0XjyGj7UDmONd6OdeHT7C0hwxfP62Zvvp7PSjtfRE3h+Hv8+6gM+9jgUqh6pyGYC5b7V2gR4FQnQ/S1l1vKxFvC93P7wc/VwaxtDEj5V06viEa6wOn83hazj/3JBH9oE/Zr5w1yPR7duWZaD7c4HlrQYMXxmykp2ks0OHDh3WGaFL53SwRZuPcaxCs9H0e7TB4r1vGlA5i//03Eev2z58SWczw0f1z2IDbulQaU1bIuzGO773iejlRqIkAz7qgq2DRwFCZurwxcoyZNXAiwaTko1rMm0xeXYKH7t5H57wy/9oJJ1iHZnzuDwsArkj9ctn+Co2i7VFm6XUAKNCo99TayLpvITVjpRDPx1l+EwOn5OlplhRsY0IXvMs7P/y0LmuyiOjwJjac+2YvzyXL1WHT/ZLieDSs+Fng30r+SWZnvheorb6EUnnmH3vdG4XehmgwjY2zfcqmavPLgKw5kOTgOeQctQxfKOixEWnbsZZ2xdblz0oGlw2+70MBTOdGVuGL2TCgPjE5qQxitd3xo6SSqDU8QnXWMBrc/ga+mBy+Ny1wr8vfo6OCBbW7q9lwDcYh2ylvJbGpa7ycd1n9nl8klB8XcDXoUOHDYdkvtWUz91jXdYhV8qXdJ7A7Njrr7n4eHdhauzcPI+b3/IivOrK89dtH3x8GBZeD793cguNBXx9O5vuy4t6mfIu/d/8ricAcEEQH7zM9/JA0smdOMvKZESyRLnyg4u+cLi0DF+i8Lp0KDVMT8hY8c+WhaRTIVKWQRN76BdMp+WpgA9VW7NKOr/wxmtw+TnMbCiQzumA4TOBQ7zwulyPkGUqMLnJlAr6v1LD8MkcUgIFMVz+ulSTwxd7XlI3+HfAywNwSZwJXv02DiyHuWOO4eOSTve924E6MXyBpDPH6qiw55czX1qb/k3y9cu6mryt2GvASJzP3L6IUzbN1Zc94Axfg9LUSjpF7Ut6PgRFyYXZi/ysDTzTFvAcWVZ4PdIkl6XLnMxWLp3Jsgzu9dFEwNeW4RuOjWMvPd90hP19/4178Q837vU+yzKXT3kyoAv4OnTosOEQMHw02Jgw4jueks77Dq7aH+aNJumcBK+95iLc9bZrj3c3psbCOpRj4KgryyDdEXuZspMZmyI5VH3hhGjr8GWO4fuv33YZvruyrlfVuqUX8DFJp8jhc6YtPIev6nvmBxe93B/2jwpdOfclghdxLqDDgETmpC0Pi2DgFZi2wASOCj5TZg1tagzYHgC3AAAgAElEQVSe5nqzm7bsFnJgefRG0umP4Cl4doXF2faeSY3ytpEMHze0IKwMXcAn4zJiQuUiep7StbfYz2ufSTHTlljx87HH8LngNiZPfSQW8NkcPneCXACh7ecL/RxKqSDQ2Tzfq+rwVf3hOWDVhBuxOm2QRXImTU/CgIowLkoTtGSqwaXTgY77J97zbzjvDR8I9tG3kk7HmtLn/L3rk3stJbFtIdlReX9p9n+O4Zifc3j9a1eHL7PX2zgygQAAh1dnY/jIMZi+Wn58HD/1F1/y3veqvrXNFdzo6ExbOnTosOEQ5PBN2Q49po81wXZkMMYHbtyLf8zvB3ByuHR2iMMbHCv/WpPBCOUiAfFJgJ7IGSlLbaWSNOaQxeS19l065xg7HuTwZRljnf1954Id4rk1BJKZLc3lWB76rBaUn59WRkxbemxwB5jgZQtjOlWV+0bHDnCGTwyaBQMRgzH7SC6eClKeedu+I4EpCbl0lhp452e+HtR+I/iSzjBPNs9Chm95WFg5Zjxm8wN56g/gJhRikw1eC5GTRm3wQbxlnnJXb7LQOhq8Pnx0BAlXty00bSm1+3yhXw28xSB9sZ9jdVQGEmjaflRozPczrAgGNgW6ziRizDJhVJRVbmq9qVislt8Hv3J/dN1+rjAYl/Z4KNC1tQ0FRejVRqzWXRmWODoY15bfSLWhtQve+tY5uAX
DJ1ZoKsvgcvhUUGeR9+focDaGryjNNUlfrUa7gNgEiaGc/kRFx/B16NBhwyEVIE373D1e4RbNem80l84OawffWt+XLEppshm4xuV4QOgKR4NnwF373nZVEMQHVhmTEoZ1+GDLMshrMs/8+06WUADMAGs0LqNyQAUp6Qyve8mgGNMW1oZyfeBlGZSi4Ja1z+SuKayLaYt4Bn1t7yH8v/90h/eZUq5fb3n/V71BMR/4y5qNsq8xpqxW0slyJDl7KBm+Ojkn9V+C9uUXXjevF/q5C9RKc23J0jqxcgAxho/nPVqzjV4OBeUFFJkyUs/B2NXhG4vgY1yU3gRIE2JGQ+E6/vtRodHLMsue17Vt+xZZj1/b/TzzJZ1laT8379P7IZbsb794Dx77Xz9Seywc1Kd+XpWEKMm8iZm2RLYbRSSd9n0Tw1eW9jrJBIPLg7kjKYav5YCgqK5JqxTSzbJawDH1J0m81wV8HTp02HiQA0UlBr1tMW0Zh7XEJAOOkwE/+MxH4dkX7jze3ThmqDNt6YvreL6XO8ldZFxJAzpnQuHuBfqMBwAKMGUZ5My69iVg1niD3GNLN7C1gaQIxnq5CmZKRkWJUaGjDJFkN3mtP9umYA2Xh4XvYAkVSActSw85aGb9TGA9Ar42svJM+QHPITZglTmLhFzFAz4Z0K4MC5vrJ4/MY6jYQmqX2Jom1kdF3sRqpZGxxtJc7iSdWkf7HZN0WpMPJgskiaCGk8oaSacv4cszhYV+5jN8Xg6fke3NyVoXNRiXIStt2uKv/e9/SAxfVh/wae3OfzTgY697eVYxlH6JilRZhlgQPim4ozQ3guJOlbEAazT2zzlH0+8vMXwAgoCZf9epY2prysOZRKCqXdriPJHJ1DRurxsRnaSzQ4cOGx5cijEJ7PrHUVIpc4BOdrz5pY873l04psi9gM9nuSTDN993Dpo8uFrs51gZFXbwQ+OkoizdZ5H9KaWig5ckw1eZCZUsuKAARhpWxBi+UaExLFIMn39MRRmyJTKHb2U0DgaRQcCnNQAFpXxJWREJgCX6eTwnaxa0Gftlyv+eDq86OaNXaN6TdIZBUqwO3/JwXFOWgbt0snaq9Yit2VTj0AnEFQk8/4lAQd5i30l8i5LKUjQzfDSQ90xbGLNLDJ+VMrKTnymFhYrho2tFSgJHZemcaltA6zjzzr9y+f2Pi9K6qUr3TI5SGyZwVJj+Srdbn+EjqbbfhnXpDExb3OtpgxPqTj/LPAbMSqx1fMK1TtLZ1Bdy6QRQSTo5W9iiz20ZvmoSgjN8bdhBV1v05Aj4vrmmnjt06HBC4kROgTt928Lx7kKHdYR3bQqGTw7ge5EBPOBq8snC60XpBuuuCLvP8MXGmDTQmrcundUseubqQ/LBDy3z5aihVyPlQsUCBilnJbbkH15zJV77vIuix788LLxBpFIs4BOBkQluHcqq/br6kMZpNLl4OrQK+PyA59AKZ/j4cXFJZxi8xli/Urvt5KHzc+sZwuQk6TR/lyZg+Og19cNjXgon6bQ5fNUkhZTlP3QkwvAJV0fTpjMdoc/JtCVg+HqGWbS5bl7hblM2Ym4C0xYgfj3F8u8IRtJpHFbr4ptSu/NfaI39hwdyL/ZVXzB8BJrAGZVphk/W6GsLW9cy9yeReg05fCPvu0sHoql90jMyEwwfscd1xiyx8hYxkKKB7omYS2cKWdZJOjt06NDhmGNSiSatfqzjxZc//Vz7ugv4Tm7wAEza4ku5IS/bwAfnJPuVhdd5rp0LzFx7SsVnuWkASO3SnigPjrNvPDeQBxcx05aVisWJBQwmz87/LFMKjz97G5563ilV330WcUW4dGY84GODOVX981w6adY+6InD+uTwtZN08mCfM3ypzWP5ejHTFgCgy0oGJ4ZlpWV8fV/SuXkK0xYb8LH+U+CxOJfbwX5Run7zrh9ciZi2VCdj6BVer3bAGL6FfhaUZciVkXQCsOzikMkLDcM3maQTSNdMfeavfQy/87H
bggDESDqzIGAJoN0EUFlq3H9oNblqPzdtyQBqzko6pXTSvZZyz7bgZU44A+ZMW+Jy5kHBv7t0IBqDYfjMMeWZ8oJV+q5TDp1AOxbQrCcYPujW7GAn6ezQoUOHYwpf1tYeZotjzRC+9WWPBwD8+efuxulbu4DvZAYfIMocPulUqFhAyLejQSkxfIdWR3jL+7+KlWERDLQli0ODkasv2Y3veNJZAEJJJ2f4gGqgxfJZaJmUo0o2kgK+GMNnTFukxNCXF4Z1+IqA8ZK5YmTaohCatsi6hxJzkWOYFW2eQSaHz333nMFKDYIzFbJiscLrtK75KxaoOPtnTVuydqYtsaCHPosVXl/s5zb4Iwt82h9J/g6ujHBwZYT9h1dx4ammkD3F9H7hdZ7DRwFfHrg4ZpmyJVdWKhfHIlaWYcIc6tgEgdbA3oOrePtHb/VcZYHKpTMjV9b01VFqbXPwCq2xTwR8UdMWwWC5AuVSDspY16klnRTwZV7ZgrypDl8sWK/Qtg4fUE1eRfL29h5aqdm+XcRXaHpWmPdaty/p0PS9nkjoAr4OHTpseEw7ZjteDB/gZmO/2XL4vtkg6/Dxa7UfMHycgQkZPmIEuevjrs10/YQ5a0q5WfWnnb8TL31iFfBV133o0sn7WrVKgaQIxvoR9ozs0aMBgwrvM7s7u39fYrk8LIKBV26NLWjgr6FgzqssMC1zJiV6eciaAcBHXvcc/MLf3Ygv3n0gvXECbVQGMoePIzXOjBVZN6xffF3+V34OCEmndek0f5ty+Pg3qcS+fDONiGkLm0zIMwVU6XmHVkb47t//F9z6wBFb19OWZRiFhddLrbFqJZ0hg0amLYBj+GRZhklNW/hxctTV4dMarV06bcBXauw75CSd8prq5xnKUgcBFDHH0qWTv5VB4iv++LPJPnHwnFjN3pPxVJuQJ2D4GuIxyn+k/XrfX/X61geOpPvc0L7WGn/06Tux//CgUgPQ5Fk70xaAyjK0WnXDo5N0dujQ4YTBxC6d1d+6PJ/1Av3wTmIa0OHEA7+0KMmfIE1bzDLHznz4dVfhD/7jU+ygNOboKo1cZCDhZuLdZzRYClw6VTiQp3tE5ovxnD7ajBi+mCRQqXQAQn9lDt/KcBy4ViYZPmVq3r3pf3/FLpd5hxIpSeclp2/Bt11+ZnK7D/zUlcllrRi+iEslIcUWROsysvO1xII0OmR56CqxzLl0VqYtDTl8sa7TNRpjdBb6uSn3URqpHAWu/BwcXBkFg3dn2hIWXvdNW/KAASLTFsBdlzzgoEG9nHRpQuxy4tdobPDfy1VjYKDhGxLd84hjrorSF0v28zCoo74t9HN7vK5PLAgXQeI/3/5QulO8fyTpzJWX48bzipt+f2XA1ySbDHP4QtOW2x44XLN9fcR3072H8NYP3oxP3/agkHS2N3zJVHs2cKOjG4l06NBhw+M7KubisjO2TrX98WD4ti32AQA7luaOw947HCvIsgwxKR1f7j5SuPT0rXjhY0+3g9JYkOAknWbQIXMGyUhFOmQCjOGzbYXt0kgzz/yBfi93jqI0abFcl8OHmImI6ydg2DvPtGUkyjKwoJM+13CmLQDw7s/ebc9HU3peX0g6z96xyPqW3vjRuzcnl7UZJ7K4PrJ9StIZ+Yzl9S32c7YuBeL+RnLCgZCLAKyxLAOfxKj+UnDlO2qWXt9GZYnDq2NsqhhgXpoilsNnyzJEGD4NbT+fr3L4/GNy1+VRK+l051Zr07+1YPhkICnRq5jYusDAM20pNe55ZNkuK7T2rotennmBNe/bGdsWcN8BX+bIuzS9aYv5S2UZQtOW5pIkct9NbPi41KwOn8/QUjB3y/2HcQbLg+cTI01B2zIr2M7vLyPpDNePTQ7kqpN0dujQocMxw7VPOAPXPuHaibc7ns/p11x9EXZumq9lEjqc+JCSTq8On2T4Mm7a4j6XOXz+NuYvZ+IITz1vB/70X79hPmcNStMWGXjxz2gQF/bdSaDmezlWR6U
NLpf6cZfOVA6flZQqBcUOUWtglbEVZ+1YtEHJ2DI9laRT7I+MGGpz+HouYHrnD12B5z3mtOD4Y5jV6CVXypMpcqRIiRhT2WMB31zPyGFL7a4J2U0eZ/L2JMO31CDpjFnhUHA1iLgyLlbtjQuN+w6s4MIqYOat8ICPirNTUDGM5fBpvw6fnE3IlcJ8dR1SnMADjlJrlGV4DzYh9tXzwvCx3xT6nngAMhyX5h4iJl1r9Fk+HGf4ytJnjmX9To6zdyxizyMy4AtltpPC5fAprBbaqQRs4fUpGL4GZkwyfOOIactt+47gsWduxd6DJueRDG3atM+vB08NoFNmV2EbSqnWbOBGR8fwdejQ4aTFo3YuAYA1sziWWJzL8cNXnr/mLoEdNhZqTVvEjDFnwWKBoTROAVhZBivddCu8/vkXu/X4zLcoy0ADHS+HT0pFhTyS94UCUjLHSLp0ZuFn5lh5m/4BHh6McdrWebzrlVfgFU87F/1qX3/+ubvx1F+9zvRPhQGRrK0VA68lKAeHdVLQVP5dW2RKWTZU4szti9HPo+1kvqvrfFViwDJ8IjDz2V/3eZDDN4Wkk/bt5dtV59QyfEWJex9ZscdIZ3zTXG4nCwDn7hkry0CDdK0dm7jQy8P80MyUZeDwyzLoNWP4hk0MX8Uk0/EcWB7i4jd9CH/wqTvtOpoxfGPB8I3LMjBtiUEp4JxTlrxtTZ/c62JKhs+5dGaei2WPOQc3MV2hS2d6Xa2159IZlmVw5/I0ZnzGZe+NAR+7Hnj+sIaOsrGx9rLs+E4cryW6gK9Dhw4bBh967VX48x95+pq1d9rWBdzx1pfgFaxMQocOawkeOChhItIXERBnwfi4kgKzXIU5aTYwE+8BYDuTC3uSTmL4ej7D55u2+O0aR0jWdybppEHW0bocvggPJ6WHvUhAe3QwRq4U/sOlpyHLlJVV3bbvCPYfHmAwLr1yA4ThuMR8L69l6kwOn3ktB6t1zGCsFMIkUApYHvh28s+/7DR8/Geei8vObC9L53mVJn9L5mT66/dzhedeshsA8LTzT7GfkzvnqVsWsGW+hwtPTUtWefu0X2pbKT84czmdJoDcf3iAlVGBsyrpLJ3yUzb7snZicmKF1wkaGqujEply++bgpi2yXcAM3kvdLoearxObCBg2uKzSRAYtoxp7f339Hm87YvAPLI9waHVsHZwlKRdj+gFzzZ6zYwmHV8c4uMwY07V06ZQ5fGxiqKnlSVw6qZvUvmRIaf/jQnsBMK+r2HSsY4/hc4xzqf3g7soLdwXuq3a7k0jS2QV8HTp02DB4zBlb8awLd61pm03GDh06zAI+6FbKZ11idfhiDB8FZjGGr8m0hQIC3h7NXlt2I8bwif1Il0h+38wLN8SYSycZq/htVsuozTxk+I4OxsId1B+WrIwKU85C7G9lVJjcrgZJp2P44n1bD8QYvl6mcEFNbmAM3LTFZ/jMcnnsvSzDVRftxq2/8mI85dwd9nMaL5+yaQ5f+W8vxJPZshhip1Qpw6gNxiUeOTrEweWRlWlSQHfXQ4Z5OosYvuqi3bnJdyq2NftsDl8oQyTTFiq6LruUM9MW2y4bxFNNvjaSTi5xjV0XfsAXLu/nvqRTGiLRdiTV3POwOU/nVgoUmcM3l8fvU8PwmXO7h7F8fsC3tnX4JjFtSbUZA/WTT2jEyjKMhfEOPzdNZiqcceSTaVr7dfgee9ZWvOhxp0fbaKyveAKhy+Hr0KFDhw4dpkRQloGNL4OAL2ODdMGm+cvdACNjgxRah2PTXI5Dq2PPkMUyfLkfIOSir7xdOTHSz7hpi1/vbFOicHcbl04ZTBwejAOpaS9zFu0rw6KSwvobrg4LLPZDqR9HP89sECvzcNa6Pp/fNvCkc3cA+PpM+8uzLMrwxVhiwJlOzPUyEbxPNref6up8P8NgVOB1f3UDNs/3sHvLPLYs9Oz18Y2HjgJgAV+13S7B8NVJOgkawOq4sEFdcG1lYcDHB/gUUMScbyW
W5np4pGLMmiSdMeQZSTqrvpF0kF1yupIgA8C9lekKnac6Sed8L/eMmc7eYYLEPQ8v43FnbQv2M71pC5N0au1MXLznxGRt1502ySDy855nTh47LkuP8eQS3SaGj18PmXDp5MFirlTwrCYo1Uk6O3To0KFDh296SNYsE0ETlwrxunGxHL5RETpPUvtWeikGpCSn8xk+89e5dFaDqhpJpyz83WOmLTSr3sTwBSYiIuDLs7AQ+tHBOCJNdEMTw/CFppcroyrgqwmk+nmGMyuHPynZWk/SP8sUrn3CGfj8G5+Hy8/eZj+bFHnm2DlehiB2DQF+UOzl8E14sKl6fvO9DKujEg8cWsW+w6s4tDLCtsW+DTS/UTF8Z26vcq6qi6uJ4YsFVEWp8enbHsSCqCXJjykl6exlyg72+y0knYuc4ZviezIMnwtiKJbgDJfW7romAxJynyxL/xzwAIcf41ye4fxdmwAAdz541Gt7VnBJp5E8OgYuU+arnLgsUi3D5+ck8+doL1MotNl+VGjPxIY/G5qklpyZ9Rk+fwIoz+L1OmlZJ+lsAaXUi5RStyilbldKvSGy/DlKqS8qpcZKqe9az7506NChQ4cOaw0+EJWFwHu5wuffeA1+/weeYpfb4CvC8A3HZTCIpxnwUvsDJAIZqMRMW/q5v69czKIDvmmL1/eoaUtlolEFHi9mMqiYS6eTHvI2ZcBXBJ9Je3SFcDZ/dVRgcS6vDdz6ucLPvvASvON7L8e3VLlthPU0U6LjOXXLgjVIaVsO7he/9TJcfJqRfvIAWanQhKfu2D0Dnklr0SU+n+/lGIwLrI4KrIwKHFodYetC316/9x9aRabCUjQyh29UlCjLZongNx5axn1VcBQzqEmZtvRyZSWdbRg+v+RF4+oBZA4f9cML+KAti7S3YvjOqBi+QmuPjeKyRc5i9nsZNs33cMa2Bdyx39U0XIuAhD9fNBjDl1GNwaaiDOk2YyhYcA7413I/z1CUpQ2cOUPdn8C0hTPHWeauIF52wuxb2TxXAPiOJ56JN7z4UrNdJ+lshlIqB/B7AJ4P4B4AX1BK/b3W+qtstbsBvBLAz65XPzp06NChQ4f1gixmLpm7hX6OJ5+7HQDw8qefG+S1AW4gHwv4iE2jsZNkIDbFAr5q5R1Lc5jLM+zYNBes4xzrUC2TDqOhpJMYvn6ucNN/eyEUgA/ddL89nsBwRgQmsRzFI4MxTmd1tgBEnRWlscfKqMTOzc2mLfO9HC970tnBsvWVdLq2d26en2h/r7ryfOw9sIJbHzhSMXzVOQRsGYIUw8fhsbUTRjEx0xbAsE2DcYnVUYleXuKgZfjM97X/8ADbFvuB0dBmwa6OSz251X3AfCPM4auCiH6WOUnnpAzfFNdFPzMSWhvwMadRQqmdec7RYYEdS30baA7HpZcb2PMknTzYMX27YPcm3LHfMXxrEfBRgEc5fIUNAM31prU7nrYyx7pAyTJ85FAs1AVF6YJgLrfk32dTIMYdZXMlJJ2c4VM+w/cdTzoL33LJqQBQBfK1uzlhsJ4M39MA3K61vlNrPQTwlwBeylfQWt+ltb4RwHRZph06dOjQocNxRJvC66duXcBdb7sWL3n8GYHMEXCDmFFRBqwN5culTFvIMTNm2rJryzw++fPfgudX9ed4sOhqUjmjCa/vXh2+yrSlGkDN5Rk2z/e8wZfcns4H71svYaBUJ+mktmWelzX0EJEAl27WGXasZbx3/q5NuO6nn2vf8+OhPK3aZEMBYsR2bpq333emlDUXSeXwccjc0kmQzOGrTFtWRoblo4CPBuT7Dw88do8kfWFgVrZmTX77+55o+iQ+z5UKHDjHTJZIcr52OXyzM3w5Y4KoH/sPD/DPtz8IwAQYnLk+beuClevyAuHUfwI/d3Qsj969GXfuO2LP71rEI64MQ2ZKMFTHkCli+Nj32Yvn8ErUBYV1OXyG4dP2PPqmLdMxfHnmnmfGhdRflwd8fk7x2gTUGwHrGfCdBWA
Pe39P9dnEUEr9mFLqeqXU9fv371+TznXo0KFDhw7T4PNvfB6++IvPB+APELlkE4jLBq1nizfAMa9jDB8xeCSoCiSdc2EOHw3e+rnCGdsWbaAXlXTSezHLzQ1WZB2+XmRWXiEclO+q2C3O8MUQSjpFwAc/H8f0pcBiPwsG6I9mJQfqAr61ZPiefO4Or9QBb/vsqkTBQ0eGrdv70asuwNu/53K87Eln2e8uGvDVtBGbeGiLtKQzs3JOHvCRs+r+IwNsW+rb9enaWgzMVXSrQfRCP8NLn2iGjcFkQqaqOoXus7FlhDLLDrVh+JrKMjShRy6dtpSA2fdgXOIVf/w587n2r/9Tty5YqSJnogD/vprnAV/Vzwt2bcLhwRgPVtdUXa5cW5ReAOYkj3klhdRaW6ZL5k6mUMfixlw6Cf3qXNJ55JLOSUxb+Hnl14qsw1cyQx0gnCzpAr5jCK31H2qtr9BaX7F79+7mDTp06NChQ4d1wqlbFnBKJZOURhm+jXqMzQoHOOSmOSrKIIAJJJ0J0xY+uLE5fMKdkcc/0klQFkWf7zlDFBpkHaicDIl19BnD4FBxzilLXp/5oOqiRIAEhDl8UKGT4+o4btrC2Zo6dmctAz4ZUPGmqSbd3oMrrdvr5xm+88lnI8v8Onx0LVD7kl3m8JZNGPClzs18P8PysMCwknUeXBlh62LPm7DwGT7zd3FOMnEhwxf7rvwJBRVdxuWPI3vdKwxJ3tmC4eNtTBocm20yT+YoA5GVUWGDCmr+tC2OvV0VZSn8+zDMX6Og+khV63EtJIfOtMUwfBSsUW6u8eiMM7Yp1ObwUUAZeTb0exkKjTjDx84Hb/+91+/BW97PM8b8ZyKfwOLHB5hgtlcb8NUc5AmE9Qz47gVwDnt/dvVZhw4dOnTocNKAs3ZNgYSTObrPXvx4Y37yosedETJ8VQCTMm2h4OsIK/RdMmmbv+9wUEODOF4jkPZDzAENxsm6XuZkAbCyL45zdlDAZ97ToOrDr7sKf/PqZ7nARYxEQoZPhTl8wwILEdMWPw8x/V20GddvXWhncyADKo/hqySdew+stmoraNtOEIQMn3es4iTyt5Pn8MU/X+jlOLRqroGjwzFWR2Ul6XQ7277IGT5zQUiGb1zqIOALgnwIV9ksvow7ONogIleWHWrD8PE22hRql+jllaRT5PARVoYFSm2YU7p/d26et69XRM1GP2/SnTv6HmmCaDgu8eGb7reun7PAr8PnGLC8CpTKiqUE2hezL0vgQ1/Zi/devydYPhbPKP9aVijKkrmuhnmMgH+ef/5vbsQ7P/N1bx8rnOFTXNLpX38a/nN1rudPpDTV+ztRsJ51+L4A4CKl1Pkwgd73AXj5Ou6vQ4cOHTp0OOYwIqh48fFwXeX9BYCLT9uCu952rflcMnxW0mkgCQuSfB5leUA2Hydg+OKz2LRMBpM0CU6F1w+uDJGpcADv1vcHRlsXaYhBs/imnUtP3wrAtLM8LIK8RDlIV0oUvy41BuOyqsMXsmuPrkwt6nP46r+oj//Mc7GNBS91kAEVP4/E8B0e+HlabUFtZYrLd80yfggyIIrJd9uCn1N+nub7GQ5WQT991dsW+x47tz3C8MXMVWTAJ+svyn4HLp00iZBnAMzAnjPbtixDC4dSLhlsy15x9DJf0jkqQ/mxhkam3PFtWei5gE9IOvm9uRCRm1LAtTIq8OPv/reJ+xsDz6krGbtGzros3mt1juZ7GUqt8er3fBEA8D1XnOMt55JRwA/ue1lWY9ri9t0kteTMaZ7BapXf8dHb8PizttplWvv3MAXUQCfpbAWt9RjAawB8BMDXALz3/7R35mFynPWd//6qr7lnNIeu0X1Yp23ZsnVYtnzIYPnCkGCwOWwuGwIEyC4QrmBCYCGbPGHZhMebBMgSNpBkgRCTOLAsJMCGK44Jh8HGJ8aHkKzDkmakmenud/+oeqveeuut7urpbo1m9P08jx9NV9fxdlV1+/3W93cope4Tkfe
LyPMAQEQuFJEnANwI4E9F5L52jYcQQghpJ7rAQS10Y/a0Xtj29trhSw3pDETAmCEoXnPxKgBJ4eRyTKKQzmjfXZarGIWcKnQX86mf0Z4YRQVq/Ne2MNLHsffnyuEzQzpPBm5fZyFZpVNE8L9fdxE+9aptNd2dehpo1UhPWGGzHragMj+OFmnXnbMo075s9K7MHD6tiUTik+T4GJoQfCmrl/I5HB6P5yL2GUVbAINAZNcAACAASURBVGDAkcPnLNpi3Su5ICfPxLxf7DFpAegSdPmchA3Is4TumscpZcxPix8vXqWzYjt8UxVUq/410R+7q5gLH97Ygi8XG4/+jkTv6/t6bJoPEVxERVsCh0+HdAYOn0LURiOTw1fIxUT9RLmCcqUajrlstWUwL30h74tnu7ALEA/9bSSHzwxZ//y9T+D/PXQwfM8Pt3XnCZrXdbbTTocPSqm7AdxtLXuv8fe/wQ/1JIQQQmYlEiTwZJlX61XSRFMih88Kn7Qn7y84fxSf+s5jeOHW6An6W69ah7detS6xb9P1CUM6DSGpd62dLT2fMifVPTXCHNPmX67mykBUDj9ZpTMpoEyHz+wHOGWV2xMAg91FXHpW7Xz/ZnP4vv/uPXjfXffh7h/vS3wue9/3/97eTLlkLvQp9R0+/3zpiWy8YFB8O/N1o43X0+5N37WJLzPbMgDAPEPwIbx/PMsBU6jaVRLFd/hMOVnrGum3bKGrl41NTAV/Z3D4zKqYGStQxo/nf3f0uSlbH258sgylVEy0dRfzqUVbzCFrh89V1beVgs8M4VSI59gJgrYMwQUtZThHfoGf6Dz84uA4vvTDp/DF/3gS33r7FYbDl/x82uGL+iqaYixar16opXl8T+IesVkZVSF+n5i/P35IZ83DzBpmRdEWQggh5HQnS4W/ehUWk0VZcjXfXzKvC/e85zlYOdxd99hphQmAqMEyYAo+f0IlIuEkqNuRv6fR669f2Iu/es32cPmywS687ap1uGL9/Nj6XYVkD0HAncNnCj7dD9BVtCWrmZXmsGalo5ALJ6u2qLDPrb/u9ARm6C6IhG7hyeBcvP6yNXjeuYv9MVjnrLmiLdHf5pauUL6+zkJsgtxvhnQicmjMc1SuVBOiyCxQo4mFdKZ8t1x5mr7DF68CWYu8JxjsLuLKDfOn5/B58bYMU44cPoX4ee0q5UIhnszhMwRocM5zDsE3bm3XDFEfPi9oWxA5fNrlskO8a9FRyMVCvB85MIZ/+NHT2BfkG4ZiTj8MMj5fIezDFxXh0dRz+Mxjmnm/Ohcxeq9qbBP/jpgOX44OHyGEEEJMsrhGroIbtfahJ/ma6QoHID6psdsyeCLhpKuvwxd8evKU83wRNlWp1BR8el50y84V2LVmOFwuInjD5WsS63emhHTaFRtF4hM0na/oKtqStaz+dMrvm+QMRzRZtKWpXceIHNioQM/JYKJ/7TmLcGKqgrt++FTivmnGwUwP6UxO9Gs5fHrs+ZygkPPCazhVSXf4TMzPYA9Jv2U/HNAPLhoRfDlPwjYr3w765jVCPheFo1YdBWl0lU7z83QX8+FDh2eOT8TWj+XwBeLKfEChr8PYZAsdPqMolBnSqUMhlZHDl9XhqyiFxf0deOrZk/jn+/fj0Wf8ZvHmOXK1Zch7Hk6oShT2aVxj83q7+vCVq1G/w7TG60BcDNpVOs3fH+bwEUIIIQSAUSwiQ4GIqKJn7fc13S0UfPGiLf6/KnTxoqfufVZIp2e4S70ZHL6s0Ys6RNEOOUw2Xo+HdB4/6U90nUVbsh3aEN7x5Vl1kpkTlBAqLVR8+vp4ImGxnJNls/qgHk98u2YczHjRlmh5muAzXTZdkAeIBEIh58XWKVfdOXw5a9DmNmnXJSkS/f+0O5TV4dOUplG0Jed54T1cUSoRZjyuq3SaDl8xF4ajPmVVcPVi5zxemdVfVjukc9uKQXzmtu3O99KoBiGnOjRVfwQd0uk7fLotQ9YqnSp0y/7uB1GR/sl
KNVYUBohfp3xOUKkmXUAg7r65BF8s9Nvqw2fe1+Z6ySqd8UI5lbmh9yj4CCGEkFbQm6GMv563pTkwdgXMLiuksymHz3RMrOMLBEdP+BNIV0jngj6/gEl3KX1CHAm+bFOLsM2A3ZYhISwEk8YkWle8TCvakoWokEyyQEy27ZNuqeu9ZgkLtCAKpzVDANNy2ZoZQ9ot5hJDZuN1ABjpjQrdaIHgh3QaoXiOKp0ijmqnjpzTcP3gSrlCWT3D4XPl+AHAX9++A++8er1/HGOdabVlMArOVJVy9uFTKn5vdpfyoUh/8sgJKx/T4fCZIZ1BEaWxCXdIZyEvGOrOVnBIU6mqwLUWvzF5WLQlKDaD9Kqrrh6KpXwOVRUJK/P7O1mpJqt0xj6f57dlcLSXiQk+h/MWK+5kVulMOHxmSKeycviMkE5pTWP70wEKPkIIIaQF9HbUL+MftmVImVTbRVESDl8TE3lTmOj9GCliePaEX+giFHw6j0eAhX0d/nhqOnz+v1n7vnWGbQZsh8+a3Fu70w5fR8GLCQzXumm4Gj43gtmn0L4mrQzp1JPdga5CGAIbd/iCCbPt8DUjOmsUbTHpKuYS7p2LQs6LXdOpSjURJudqC2I6pekhnZZIDAqoTNYJ6RREYtG8X6fVliEXub3VqqPx+mQFyirq1FXMhds8eeQEho2KsObp1+Mxt41y+NwOnydS95rYVJTyz3fo8EWtXbxA9ESN1+P3gUskdxT8tgyTlSrmB99R/UBsspwUc7GQzjCHLyna64V0TsYEnxHSmYtXUJ4wxGBVpTt8npGbOdthDh8hhBDSAlwNyW30vMIORUzbh+2oNRMuGHNMdFsGRCGd2k1cOeIXgDFDOucHgq/WZ1Qq/tS+HnoCWDeHz9pOh7J1FHJ4wXmjKOVz+NIPn8KX79uXWWzpY9ritBGHUE8Dc9bkupUO33lLB/Duazbg17cuwYFjfq7Xiclk2cCkwzf9Y5rbLgn6CAJJh0/nenYX87h680K8ZPuy2Pv6/ORzYoV0qkRz8pwIYGkt837NHtIZ5ZwBNQSfkTMYa4MwLYfPC926ilJh03eNzuEzv/PdpXzo0B04NoFzlvTjM7dtx8Hjk3ji8InEeFxjPJ7i8HkiMde1FmMTZXSX8qgGDp/Av7HLxsMeEV/IRm0Z4heqVPBwLJ6GiGLer7Q5Ua7iunMWA1AY6Crizn95GJPlKipVLciTny+f8+JtGXLu61NP8Jkung5Njd4zc/jix4i3A5HU6sOzDTp8hBBCSAvIEtKZljuWto9W5vCZc0C7LYNAcNOFy/DRm7bgpdv8iXsY1iUIn9LXckAadfg2LPLzvZ4+ciK23JXDd8OWxeHr4zqks+hX6bz2nEWh6EoT0jZhSKftJmbaOj4RtB2+Fuo9iAhu270Kg93FMAR2Ysrh8E0ztNV5TPitLe586fn4wxvPDZfbYkg7wZ4nuPNlW3HJ2ngrDFN0FWIhnZHDp90Uz5PEeYzf6+7PkwzprB1ua66n32va4TPyOSsOMTvuqtJZzMW+jwv6OrBmfi+2rxqKF2gpJHP46jl8OS+bw/fth57Bpju+gu88fBCVauSOKihUqyoQe1GYZxTSaTt8yXNWyueglO/m9nbk8aFfOwdr5/cACBy+RB8+I6TSk0A4J/vwuRw+M+RyshJ9Nyam0qt0mhVO7YI68R6WDOkkhBBCiEGmkE6J/2tjCzwdxqdpVUhnKPiMceU8wQ1bRo18JP2ehNUX0yaZ/vpRKfcsnLd0AADw4P7jseW24Fs13IM/etEWvP+GTQAQNv823UZ9xKzFSvQYazVNr0c1xdFspcNnou8FsxhFWg5fs7mengBXn70odk/rsF6NFnz1KHjxsM8pI4dP95nLSb22DCn7doV0Gqci7eGDmTNoOrTTa7wuxgOUZA7fyakKqlUVExJdxXzsmukcWcDO4UtWso368KU7fFkE353feBgA8PihsUD0BG6e8p3KsIIm/GVpVTpd56yj4Ff
pnCxXQ8dej9udwxdtm895qFYRhnTGcuqMFfU+JlJcPTNvUDu/rvX8HD73dWdIJyGEEEJi9GUq2pJ8om2ic/jWL+zFFevnJ1yVZqovmpMac4IKuP0Ts0Kkzt3T+XMuGnX41i3sdS4v5OPb37xtGXJeVKXy0WfGUMx7GInlPU3P4bMf3mfd3t/Y/ydRbKSVSXwG+mGAKSiiHD5bdE7/OP4ukzu4YMW82Ou+jIIvn7OKthhVOjsKORw9WfarKFrXonYOnxbs7qItmvRrIW6Hb1qN171wX77D52q8Hhetfs5i9HqkJxLTroqcOUvEitRy+JAppPNbDz4THCPnF23xdDisdviC75RIzSqdaQ6fFkpa6GnhF8vhsxw+T3zxX04p2mKeGy0IzXy8mOAzQzq9dPfertJponsQzgUo+AghhJAWkMXhAxCGSjn3EQirLUsH8Pa96xPvN+PwmY5YlMPn49qt2WZhOAjp7GphDl8h5+G6cxZh82h/bLmeGL7m4pW4ZeeK0NnSE79HDoxh6bxOpyDIenqa7cMHpDt8bTL4nOXw9aHsQonNhXSK8zPYPSEzO3yJoi2Rw6fdIX/8lnA2XtrjCR1da7nniRWSV8PhyyVFY5bWKjalvBfeix/92oP4y+/8Ivb+iclqENKZ7gR3GU5+LLxRj9EKOSzmPIylNF63Hb69mxbiy/ftC18rFa+SOlGuhI6edvO0AASC34oaDp/rvjQfVIWCL/h3opxepdMTv+JpxSh+E8+pi45x9GQZSqlYESMt8vywYeOceO572j8f6Q+pzHzQ2Q5DOgkhhJAWkKU/FeBP2FJDOgNBNVlJFuYAmnOPzAIw4aQynMwk9xu1BBBcdtYI3nf9Rrzj6qQIjdaPKvtl5U9ecj5ed+nq2DIdwrV0sAvLhrrC5Xpi/siBMSwf6o5ts3LYf/3wgbFMx7VDWjWN6KQoR611LRFqISLYsnQA//WF58SWAa2tFCqSvv3vPm8TnrNxAQCgrzObZ5C3KnCabRm0ePBDOq0+fA5H2iaR9xeEo5rHduEZx7OLdDRKZzEXjsMWewBwYqochkyamJ/PDIvU6/WU8jGXzaSU9zCe0ofP8yQWBvk/Xr419n6lqnDSCoPUjp4+TkWp8DMJtMPnr5+lSqf5eQp2SGfM4YsXbdGhtpVqNXRK856HDzx/M9Yv7I2dw0pV4ejJcszhc7WBAPR94r62VaUShZc0nrjbP8xGKPgIIYSQFpB1srhn/fwwf81Gh3SaoUnTOYYLMz9QT7D+4MZzsXPVEJYbwkoT9eHzj/uKXSvDyowuqrpZc5MhjXqCaOfy6Yn5ZKWKZYPx8b7h8jV4wXmjeOVFKzIdIwrpTPaDq8Xm0aixeCRw052bVvPFN+zCiy5YGr7Wh2plHqFIemjrrRetwJ/fcgGu3rwQF68ZrrmfC4MQULNJPQBUqtWw5YcWC2bVzPAzGJc/4fCJXicpdOPVaFMcPribfk+HUt6rKbD9xusq8d01P59ZmVavN9JbSgiicP18LlZ4xCTnOJcmFaVivRxPTlVCR08PsVyJRJCuSKuMMFx7LDamC6iFXsnI4dPCLGe1ZZAgl9MsfpPPCV62Yzm+/JbdibY2R8YnnQ6fKQKBqCCNzeL+Drxy18rU85UThnQSQgghZBr82S0XpL6nwy4nUwRfM5iTX/3n1uXz8NnbdzjXr1fa3iYtxLFR7BAwjblfW6AW8x4+8uItmY+R6vDVyeH7m9t3hv0Kw7YVhjuhVGv78NUjCum0nK4WFG2pxZ0v21p7BQCffMWFePLIiYSjPVVVsRw+Pd6aRVtSrkvC2bRCOrMUbcmac5qGiNQsVOT34UuK1lirBYfDN9xTDNexz00p72GsRpXOWte/Wo33qJuY8nMqPZHwPJerkcPniVhFW2o7fLajW9IOX9AwfrJcxQ8eP4y+jjwWBKHisRy+QPBNVZN9+DTzuoo4NDaJw+NTsXtVO3u2w2e6l5rhniK+/c49AID9x04mjgE
E+Yut/ymeESj4CCGEkNME3ZbBnJC1gyxOYdSWIduEuFGBmEYxeOpv51OZE/NF/fGKkY2S9pHqfdTuUt5oPu92+FqRH5gVfW1amUeoHd1m6e0oYP1C3xE2BVvZqNKowwNdrlTsvrMdvuBfl0g0F6U3Xhcjh6/5z1ort3b/sYngQYBg2WCX01nUYghAGG450ltyNiYH/Acc+4+lN16vdf0qSsX60OmQTvPcPbDvaCRiBbGQTrsfo+34FXJe7LPpIkxRDl8F3/j5Aew+ayRsq2Hm8uU8vy2Dqw+fvvADnYVA8E3GcpP157IfmOW95CODWHGflPPlCejwEUIIISTJK3etwNlWIZKsdBbcIZ3nLOnHj554tumxabIUfzH78GUhLcSxUcKcn1y6w9dZbG76YvchnA4Jhw+1K/61A30Zk43XmwjpbKRSafadhkw5cvg8r7ZLaY8orHbrEInmZ0/LJxWJci+ztDCoR9o1Hx3oxKPP+HmlAuCbb7882sZRjRMADh73u5gP95QiQe/I4ZuquG/eerdfVSmcjFW2rKCi4r3q7n38SNiGwxMJirbEw3BdYwd0m4rotRazWvA9sO8Y9h+bwG6jb2MY0gl/HH5bBv94ZsVRvduBoE3MkfHJ2O9NGNJp/X66irbEm72nOMis0kkIIYQQF3dcv2na2+rQLvsJ9Wdu24Fnjk00NS6TLILA7MOXhVaFdKbn8EWvOxyFIhrBbkuhaWTktsCVIKazkY//xTfsQnex8VYAmvRctiYEnzTX/sO5T+PvcrVqCL7I4UkIPqsypWt/iSqd1tjTPocZ0mkXi5kOaad7z4b5YSEX+5qkFW0ZC4qxLOjrcDYmB5Lhzib1vn/VqoqHdIZFW4BfHjqR2I8AVtEWqw+fMRYR/0GNq1G8/vfJw/4xlgx2RscywkfDtgy6aIujLYOuEHt4bAo9pSiv+NjJMu78l4exa81Q4pzUqpKadg+I+BVD5wIUfIQQQkgTfOwl52Oop9iSfenJkx3S2VPKx0KXmiXLHFdNM6SzWcekkCGHz550NkpYtMVa3kgoY9UKYY1ESPZ9bEkp3pOV8Jw7ipdMFzFyuVqFeUrKFRWKZX2/e46QztoOX7COXaXTy+jwpfThmy5pImvVcDfmdRVweHwqIQrTirbcetEKHBqbwqt2rcTDB44H61qCz+7DEdtv7c9TqVoO31Q1LNqinTNzfF5QuCQth8/8LvotOOKCT4dm6zHvDx5cDXRGv5lRNVLd+y5qyxC7D4I/ezsK8MR3+EZ6o36c33rwGXzj5wfwtqvW+cfMe5gsV/37pJbDl1a0xUs+FJqtsEonIYQQ0gTXnrMIO1YN1V8xA3ry0qr9pZFFlGgXptGQzmZ6BQJmDl8yVEzT2YQrBhiT4kTj9ewo6/NGlQabGlpDlKtuUd5MDp4n7S0844d0+n+HbRnqFG1JG0+9kM5ablcjVTrnddXuOZiaA+ZJ2L/Qvibxoi3R/dzbUcB7r9+IzmIuFPT2EGs6fA53zaSiFE7EHL5KWLTlt55zFi5bNxLbjy5GpFT9kM6CJyjkrZBO6wGOLpDiFJdeJPx1WGYhF3cQ9b/9nQUcGp+MhW/qgkpPBC6iLhjjOXL4zPOf2nh9DlXppOAjhBBCThPm93bgG2+7DO++dkNb9q8bPGcRfDpUKqu4sh2v6RL17UqfIHc4SsE3QlqVzkYUX1ikRhfWgDv8rp1UQxekdfsUtL7wjDndvu+pZ/Hev/8JALtoi5WHGHN2Eh5fuF1sG8lWtKWqVCg2a4knzVBPyblcu1cuV62rmMOVGxaE4Zr2KvGiLe4xVFLCpF297+z9fumNF+Obb7s88f62D34N+476okvEF1YqaLzeUchh/cK+2H4kqNKpERHcvnsV1s7vCcYSfRc7CrlE0RY9Vv3vr44GDp8p+MyQzlDwVYJcS9PpDb63yq/UeXhsCv/HaCp/9KQv+J46Egi+8P5K3kPmKa3Vr9FsUj+bYUgnIYQQchphNxV
vJV1Fv39XFoHw1qvOwoK+Eq47Z3GmfU+n8bqLC1cO4hUXrcCmxfHCN+akLGuT+zRa4WClOZqnsmhLmiBoBk+SBS6axdzf/fuOhX/rcEDPk4SIieXwpezXFlrJkE73lkoBmxb34fd//WzsWh3vJ/ilN16Mr9+/Hx/5vz8Plw33FPHQ/uR+9IMHl8j/6fv3AogEke0xxYq2pNzPUS9MK6SzhuDTq569JL1w1PcfPQTAd8l0+Lj+DPq7FRbGEQBQYV5xMefhXddswOGxSTy4/3hs7H/8kvMwPlEJBaW/fi7cDvBduGLOQ6fhaoYhnYju5YmpqqMCrv+vAtDXWcC3H34Gh8enUMp7mChXceykn/8YCj7DQa7l8On7KJkTKk0VdjqdoMNHCCGEnCFEbl39GX1XMY/XXro6s5gIw8+anFn0dRTwvudtSuTpxRy+ZkM6W1C0JdGGQvS+mxpaQ1RCh6910zlHulNL9ukiyuFDLBcLsMSatb2knGvPcnJqOXyeJ3jxhcsS4unsJf24eG08pHqo2+3wlcI+gs63AUTirJbDl+bYhQ5uIqQz/f7PElKtG68PdBYwUa6ioiLRo793umiK7sOnw0D1b4h+2NDbkceb96zFP77pYly0ehhXblzgbMtghmv2dxWc10mCoi2A7/DZD4/0NlWl0NuRx+Fx39H79Ku3o7uYw9EgpPPJQPCF595RtMWUgGnhvWzLQAghhJBZR3dRt31ofZ+/Rvv2NUq8Smdzgs90CuLLs4/d/rx6y1PZh6+SIgiao3Yft2nu0bm8YPRhswVfLKQT9mQd4Xb2NrGQzpTPUWlwEp9WlMksOpOGmUdmYp7jNMdOh5JeuGIwtrxW0ZYsD2i0q9cfCD6/Oqb/nq6Aq9siSCB6xgORqMPCtRgteB5+6zlnxfYfb8sQjbWY91CerGCgM54TaQp4fZ4mpqqJAlDhKxX9lgFAX2cexbyHsUAA6rGGVWAdrvWxIPwTiPcBjH0OTxq+V05XKPgIIYSQMwQ9WdNP+Jvln958SVhmPQo/a8muE8RcgyYVTlofvkbGrre1i7Wcyhy+djh87SjaYp+S33v+ZvzOF38SChpPBPODvm+aeFsG937tc50zcsBEkiJLU7/yYny7NIcvzEGsccLSwjVj66Q8wFg53I2vvGU3Vo/Ew7xr7bNelU4AOBa0fujvKuKbPz8AALhwxTwAkcM3GTh8EoQ1jgfb6Ictug1gIZ88nqstg/57fLISy98z1zertU6Uq4nCTVrAmQVtAL+KsUs0m20/7FEeGp8M/9YPk5IOXzx/cTbDkE5CCCHkDOEDzz8bO1cNYfM0G8PbbFjUhys3LgBghHS2zeEzBUBrBF8z6EL1kcPnzgNqJ5U2FG3p7yyEfc7awWB3ES/fsRyPfugaDHb7x8l5gvk1Hb44YVsGxwRd3xu12i3Um8Tbt0e6w1e/CFImF7BGTt66hb3IWxe4q0ZbkiwhnYfHJpHzBD2lZB6dFnxTYUhn5PB1FnLhdalUgz55jocNMcFnjF1/zv7O+Pl0ifSJciVxffdsWID/9Jyz8DvXbYyNPU3wFQ3BZ99EZmuKVIdPIidztkOHjxBCCDlD2Li4D5+9fUdb9n0qHb5miboyNNN4PdhmJh0+1XqH7z3XbQwLdLQKLcI+fssF2H3WSLgsDOkUwQLL4TMFW9o5TYZ0Rte21v1SbxJvbzmvKxIob9+7DjkRfOif7o+qQNY4VloOX2ydBhV7t6Mnp26fkMXhOzQ2iY68F3MW9cMD7VpOBfeAwN/v+FQljBAw13cJ61gFUsvhA5Bw+MzvTszhcxTledOetQCAno7oHHSX8s4wb7NoS63vZST4ksvLVQWl1CkN1W4HdPgIIYQQ0jS6qmazTdHTaLahu4mevCkVn+Q1ItaixvTx5Vkm3K2iXQ6fnU/XLGbOnSkA8kZ+m+3wuRpuR/sT53JzYl/L6aqXl6Xvj+VDXfjC6y+KhRBftHo4dMi1yOhxCDBNWKWzxm3
R6D3jOp7eQ5ZdHZ8oo6OQizmLB8f8EMdS6PBFDraCwonJSqxFi+6l6Bq7+Vlj1zt4MGHn8Jk9/8KiLVPVhLNpokVvKe83e3eJYP35RCTT+bGFux3eOpuh4COEEEJI03zspefj87+xE30d7QkHbIfDBwDffecefOa27QAacycjh0+HdCb33W7KldY0u58ptIOTE0lM2E3xnfbxavXhq910vfb0V2/Z31nA+cvmxR42eBIVPdFu2GC3O+QTiNy7Vjq/LnGzqL8TQPZ7wRZ8+4NWClrERjl8QLUKjE+WYw5fWlsSwM63jY7x6DNjAIAtywZi67v68J0sV2qG5WrR2xs4fd2l5IMmLTZNh851rfR13Lt5obXc3+fJydkv+BjSSQghhJCm6SnlsXX5YP0Vp0mz/f1MzMbr8/s6jBi+xh0+vYUYk9ZTxblLfadp15rhOmvOLFFV1Lizlk+pYAnYrmX8/Uw5fCnu0DuvXh8WKKk7XkdeqieCiSDcUbt3ZsinTdFwmVqFS9xsWNQXtiPIQkfBQ9kIbR2bjItYjd94PcjhMypjRu5y8nPV64V4ydoRa/3o31gfvhquvq7SqcWvWbXT3pf/OfR4kvdFVzGP779rD+ZZYlCfi5PlCvrRvrzWUwEdPkIIIYSc9rS6uTiAqC/DNOoy6E2SbRmaGVljbF0+iB+977l47qaFife6ijncvG3pqRtMDfQpsSMp9YReX9pHP3QN3r53HYB4XmIipDMlXzLnRQ5R2v3y2ktXZxZfWqDmrbHYDl9njb6Q2kVrZfEPV0jnxkW9AIDHDo5n2kdHIYejQaNye7mJJ/69fmKygm6Hw+cS63pRbykfO9frFvQ6iwKFjc89Cf8+6ejDZ6Jz+PS50MJPh98W817s2PWc3/l9HYmqoNrt1Nd7NkOHjxBCCCGnPbXCuxpFgnmdnrQO9ZSwbeUg3nTF2sz7SBSpmYGiLQBSQ2h/+v69p3QctTBzJk2KRh8+vV6lksxLTDujrkbZ4cS+ieugM770eM3jeCI4b5nvEP7a+Uvq7ksXdmll70tXSOeGRX0AgMcOjmXaR0chalRuLzcRCJSqJtop6HDiWt9LPSbNP77pYmf+pP7OLOzrCPd36PgkRpd0pu5bC71I8PnjXtTficcPjaOUz1kPGrSoBL75tssz5QRrkBPdtQAAEKlJREFUIX+Cgo8QQgghpP200uGzxUDOE/zta3c2tI+qLlqRyOGbnfl07SSceFvLI4cvOmcVR25YWtGWRMEcMYq2NHG/uIrBmMdYOdyNxz58rXPb1126Gs/dtCB8Xcz5omGihZVPXQ7flmUDeOHWJXjFRSsS7+1ZPx9ff2B/THB3FLxY8/G+wDGzQzo9D1AVX/TEQjpVVNTF5heBy7ghcB01+ZznFB769C4e6Az3d2yiXDM3stsWfMHYRnpL2Hf0ZODwReuL8SBg2VBX6n5NwpDOKebwEUIIIYS0nVY6fGYO33TR20ZtGeLCj0ToEL6C5aq4Gl7r0EfPElku7HDCnGfm8E3/Smgn69ylA4nx1bsNb7xgCVaP9ISvtcPXylYXXY4Q0kLOwx/eeK5z/U+84kJ85+GDuPnPvxsuG+wuYe3KHtz7+BHc9cZdWDrPF0F2ewNPBBWl/KIthvtXrZHDp3NKb96+LNPn0YJqdKAz9j0f7kmvFqv78OnQzq5A8OU9wYi1nYKKHsg08DsSFm2hw0cIIYQQ0n5a6fBFRTmmL/nevGcNXve/7sXyoe74Ppsd3BzkjudtwpoFPdhtFevQAtC8tto5Mif++q9Vw9148sgJvOaSlf52tap0NuG0LpnXhX/4zYuxdkFPYnz18v/sJuo6bLVdDl9fR96Zi2djf39WDnXhjZevwct2LI85aXZIZ29HHk8ePhEUbTHaMqS0JQGAzaP9qQ6oi6eCYjOj8zpj4xyq4fD1lHxR3m2FdFaqCsM9RRyzzkmWdh02+lzMhZBOFm0hhBBCyGlPK6sctiLscu/mRXjsw9e
Gk+8oX4iSz6a/s4DXX7Ym4a7kHS0LXP3d9LUf7i3hgQ9cHebQrRjuju2vkKtftCUrm0f7wyqcMfGZslstXkuWQxbl8LVO8Jk5fDdvWxYcv/aU3n575Ug3PE8SYZO2YB0d6MSTR3zBF2vLUMPha5QnAsG3ZKATfUZBl6EaDp8WeL1W0ZZyVWF0nr8fc2RplV1rod3OCQo+QgghhJDZhdcGN277yiEAUeNqUp+wD58xG63V321eV7xAzVWbFuLzv7ETl6z1QwhXj/SE4tCe2L/7mg34u9dfNK1xmoJ0UX+Hc52lg35IpB22qgVgu0I6f3vvetz7O89JVL60sR9yrBjqdq9nnbfFA52YKFdRqarYccNcyxYIvtUj/lhWjfRgjREOO9RTI4evmMf83lIo+rXgq1QV7rh+Ez5605ZwXaWivM9GxqsdTebwEUIIIYScIn7v+Zvr9lDLQpjD10LF95EXb8FbDo05C2oQN9qVMsVd2EzeMTFf7hApW5cP4qH9xwEAGxf34UdPPAsAWGm5f7ftXjXtcZr5gF1F9/X99Ku34+s/+xUGrJ58ug+fq0rn+2/YlFg/C7F2Aw6XLgv2+Ulj8UBUKdMs2vLa3avxm5/9QcJlnQ53XL8JL9uxHAstMT1cQ/B5nuBbv305CkEeqG4ZUa4qLOjz92NWh52OQNVFW+ZCSCd/lQghhBAyK3j5juUt2Y+IH7r2rms2tGR/gO8GrF/YV39FEqKFlClgXnDeKD75r4/iivXzw2UHjk8AiFw0m6efPQnAF3x/e88vAQCXrRtxrjsddJNxu82AyehAJ16+c0Viea0cvlsc67cL22HMKjRHDcFnOnzXn7sY15+7uCVj6yjksGlxf2L5UHd6SCcQD5+NHL7oc5rSTl/DRsK52YePEEIIIWSWIiJ44ANXz/QwzngKuWSVzrOXJAt+/PKQX+Z/eYrg0yzs68BgIBIuWze/5rqNsHZ+D161ayVun4ZL2I4qndNhMkiO3LZyEB+/9YKa6/7py7eGVTtNwWe6fe0k7wnKVYXBGg6fjW7LUHY0uFcwQoUZ0kkIIYQQQsipoZT38Gvnj2Ln6qGa62nBtyxF8H35LZfgV0cnICJ4z7UbcOtFy8OwvlaQz3l47/Ubp7XteUsHUMp7eO2l0w8pdbGgr5R6PlxowdldzKGvo3a+31WbFoZ/m83Wdwe5ku3mi2/YhX/88dNhQZYsdBlVOjVvvGIN7t93DFdumI/79x0D0Jjg0wVsGNJJCCGEEELINBAR/NGLttRdr6uYx+HxKYzOcztM6xf2YX2gUbpL+Vho7R/ffF4Y5jkTDHQV2+Imf+9dVza0/pagp+BrL13d0HYigs//xk4sndfV0kq5tdg82o/No8kQz1roXD6dAwr4RWDufvMlACIh2EhbBhFBR8GbE1U6KfgIIYQQQshpy2du2477njpat/WAi1bmms1mhnpKDfXGM9m6fLDFo2k9OvzzpguXOt/Xgi+fa0y0dhRyzOEjhBBCCCGknSwf6nZW6CRE01PK48EPXh3rmWgSNYpvUPDlcwzpJIQQQgghhJCZppYDXA2rdDa2z85ibk4UbWHjdUIIIYQQQsicZfmQX+Cm0eqtpbzHkE5CCCGEEEIIOZ1ZM78X97znSgw12KS+s8iQTkIIIYQQQgg57Rnuqd3I3UVHPoeJORDSScFHCCGEEEIIIRbrF/VifIIOHyGEEEIIIYTMOe64ftNMD6ElsGgLIYQQQgghhMxR2ir4RGSviDwgIg+JyDsc75dE5G+C978nIivaOR5CCCGEEEIIOZNom+ATkRyAjwG4GsBGADeLyEZrtVcDOKyUWgPgIwB+v13jIYQQQgghhJAzjXY6fNsAPKSUekQpNQngrwHcYK1zA4BPBX9/DsAeEWmwJSIhhBBCCCGEEBftFHyjAH5pvH4iWOZcRylVBvAsgKE2jokQQgghhBBCzhhmRdEWEbldRO4RkXsOHDgw08MhhBBCCCGEkFlBOwXfkwC
WGq+XBMuc64hIHkA/gIP2jpRSf6aUukApdcHIyEibhksIIYQQQgghc4t2Cr5/A7BWRFaKSBHATQDusta5C8Ctwd8vBPB1pZRq45gIIYQQQggh5IyhbY3XlVJlEXkjgK8AyAH4pFLqPhF5P4B7lFJ3AfgEgE+LyEMADsEXhYQQQgghhBBCWkDbBB8AKKXuBnC3tey9xt8nAdzYzjEQQgghhBBCyJnKrCjaQgghhBBCCCGkcSj4CCGEEEIIIWSOQsFHCCGEEEIIIXMUCj5CCCGEEEIImaNQ8BFCCCGEEELIHIWCjxBCCCGEEELmKBR8hBBCCCGEEDJHoeAjhBBCCCGEkDmKKKVmegwNISIHAPxipsfhYBjAMzM9CHLGwfuOzAS878hMwPuOzAS878hMkOW+W66UGsmys1kn+E5XROQepdQFMz0OcmbB+47MBLzvyEzA+47MBLzvyEzQ6vuOIZ2EEEIIIYQQMkeh4COEEEIIIYSQOQoFX+v4s5keADkj4X1HZgLed2Qm4H1HZgLed2QmaOl9xxw+QgghhBBCCJmj0OEjhBBCCCGEkDkKBV8LEJG9IvKAiDwkIu+Y6fGQuYGILBWRfxaRn4rIfSLy5mD5oIh8VUQeDP6dFywXEfnvwX34IxE5f2Y/AZnNiEhORH4gIv8QvF4pIt8L7q+/EZFisLwUvH4oeH/FTI6bzG5EZEBEPici94vIz0RkJ3/zSDsRkd8K/h/7ExH5rIh08PeOtAMR+aSI7BeRnxjLGv59E5Fbg/UfFJFbsxybgq9JRCQH4GMArgawEcDNIrJxZkdF5ghlAP9ZKbURwA4AbwjurXcA+JpSai2ArwWvAf8eXBv8dzuAO0/9kMkc4s0Afma8/n0AH1FKrQFwGMCrg+WvBnA4WP6RYD1CpstHAXxZKbUewLnw70H+5pG2ICKjAN4E4AKl1GYAOQA3gb93pD38TwB7rWUN/b6JyCCAOwBsB7ANwB1aJNaCgq95tgF4SCn1iFJqEsBfA7hhhsdE5gBKqaeVUvcGfx+DP/EZhX9/fSpY7VMAnh/8fQOAv1Q+3wUwICKLTvGwyRxARJYAuBbAx4PXAuAKAJ8LVrHvO30/fg7AnmB9QhpCRPoB7AbwCQBQSk0qpY6Av3mkveQBdIpIHkAXgKfB3zvSBpRS3wRwyFrc6O/bVQC+qpQ6pJQ6DOCrSIrIBBR8zTMK4JfG6yeCZYS0jCBs5DwA3wOwQCn1dPDWPgALgr95L5JW8d8AvB1ANXg9BOCIUqocvDbvrfC+C95/NlifkEZZCeAAgL8Iwok/LiLd4G8eaRNKqScB/CGAx+ELvWcB/Dv4e0dOHY3+vk3rd4+Cj5DTHBHpAfB5AG9RSh0131N+mV2W2iUtQ0SuA7BfKfXvMz0WcsaRB3A+gDuVUucBGEMU3gSAv3mktQShcDfAf9iwGEA3MrglhLSDdv6+UfA1z5MAlhqvlwTLCGkaESnAF3t/pZT6QrD4VzpsKfh3f7Cc9yJpBbsAPE9EHoMfon4F/LyqgSDkCYjfW+F9F7zfD+DgqRwwmTM8AeAJpdT3gtefgy8A+ZtH2sWVAB5VSh1QSk0B+AL830D+3pFTRaO/b9P63aPga55/A7A2qOhUhJ/se9cMj4nMAYK8gE8A+JlS6o+Mt+4CoKsy3Qrg743ltwSVnXYAeNYIEyAkE0qpdyqlliilVsD/Pfu6UuqlAP4ZwAuD1ez7Tt+PLwzWpwNDGkYptQ/AL0VkXbBoD4Cfgr95pH08DmCHiHQF/8/V9xx/78ipotHft68AeK6IzAsc6ucGy2rCxustQESugZ/zkgPwSaXUB2d4SGQOICIXA/gWgB8jyqV6F/w8vr8FsAzALwC8SCl1KPif1Z/AD0cZB/BKpdQ9p3zgZM4gIpcBeKtS6joRWQXf8RsE8AMAL1NKTYhIB4BPw88xPQTgJqXUIzM1ZjK7EZEt8IsFFQE8AuCV8B9O8zePtAUR+V0AL4ZfGfsHAF4DPye
Kv3ekpYjIZwFcBmAYwK/gV9v8Ihr8fRORV8GfDwLAB5VSf1H32BR8hBBCCCGEEDI3YUgnIYQQQgghhMxRKPgIIYQQQgghZI5CwUcIIYQQQgghcxQKPkIIIYQQQgiZo1DwEUIIIYQQQsgchYKPEELIGYuIVETkP0TkhyJyr4hcVGf9ARF5fYb9/ouIXNC6kRJCCCHTg4KPEELImcwJpdQWpdS5AN4J4EN11h8AUFfwEUIIIacLFHyEEEKITx+AwwAgIj0i8rXA9fuxiNwQrPNhAKsDV/APgnV/O1jnhyLyYWN/N4rI90Xk5yJyyan9KIQQQohPfqYHQAghhMwgnSLyHwA6ACwCcEWw/CSAFyiljorIMIDvishdAN4BYLNSagsAiMjVAG4AsF0pNS4ig8a+80qpbSJyDYA7AFx5ij4TIYQQEkLBRwgh5EzmhCHedgL4SxHZDEAA/BcR2Q2gCmAUwALH9lcC+Aul1DgAKKUOGe99Ifj33wGsaM/wCSGEkNpQ8BFCCCEAlFLfCdy8EQDXBP9uVUpNichj8F3ARpgI/q2A/78lhBAyQzCHjxBCCAEgIusB5AAcBNAPYH8g9i4HsDxY7RiAXmOzrwJ4pYh0BfswQzoJIYSQGYdPHAkhhJzJ6Bw+wA/jvFUpVRGRvwLwJRH5MYB7ANwPAEqpgyLyryLyEwD/pJR6m4hsAXCPiEwCuBvAu2bgcxBCCCFORCk102MghBBCCCGEENIGGNJJCCGEEEIIIXMUCj5CCCGEEEIImaNQ8BFCCCGEEELIHIWCjxBCCCGEEELmKBR8hBBCCCGEEDJHoeAjhBBCCCGEkDkKBR8hhBBCCCGEzFEo+AghhBBCCCFkjvL/AZZFODwBRJk4AAAAAElFTkSuQmCC\\n\",\n            \"text/plain\": [\n              \"<Figure size 1080x576 with 1 Axes>\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": [],\n            \"needs_background\": \"light\"\n          }\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"mAN0LZBOOPVh\"\n      },\n      \"source\": [\n        \"#@title Predicting and Evaluating Using the Hold-out Dataset \\n\",\n        \"df = pd.read_csv(\\\"out_of_domain_dev.tsv\\\", delimiter='\\\\t', header=None, names=['sentence_source', 'label', 'label_notes', 'sentence'])\\n\",\n        \"\\n\",\n        \"# Create sentence and label lists\\n\",\n        \"sentences = df.sentence.values\\n\",\n        \"\\n\",\n        \"# We need to add special tokens at the beginning and end of each sentence for BERT to work properly\\n\",\n        \"sentences = [\\\"[CLS] \\\" + sentence + \\\" [SEP]\\\" for sentence in sentences]\\n\",\n        \"labels = df.label.values\\n\",\n        \"\\n\",\n        \"tokenized_texts = [tokenizer.tokenize(sent) for sent in 
sentences]\\n\",\n        \"\\n\",\n        \"\\n\",\n        \"MAX_LEN = 128\\n\",\n        \"\\n\",\n        \"# Use the BERT tokenizer to convert the tokens to their index numbers in the BERT vocabulary\\n\",\n        \"input_ids = [tokenizer.convert_tokens_to_ids(x) for x in tokenized_texts]\\n\",\n        \"# Pad our input tokens\\n\",\n        \"input_ids = pad_sequences(input_ids, maxlen=MAX_LEN, dtype=\\\"long\\\", truncating=\\\"post\\\", padding=\\\"post\\\")\\n\",\n        \"# Create attention masks\\n\",\n        \"attention_masks = []\\n\",\n        \"\\n\",\n        \"# Create a mask of 1s for each token followed by 0s for padding\\n\",\n        \"for seq in input_ids:\\n\",\n        \"  seq_mask = [float(i>0) for i in seq]\\n\",\n        \"  attention_masks.append(seq_mask) \\n\",\n        \"\\n\",\n        \"prediction_inputs = torch.tensor(input_ids)\\n\",\n        \"prediction_masks = torch.tensor(attention_masks)\\n\",\n        \"prediction_labels = torch.tensor(labels)\\n\",\n        \"  \\n\",\n        \"batch_size = 32  \\n\",\n        \"\\n\",\n        \"\\n\",\n        \"prediction_data = TensorDataset(prediction_inputs, prediction_masks, prediction_labels)\\n\",\n        \"prediction_sampler = SequentialSampler(prediction_data)\\n\",\n        \"prediction_dataloader = DataLoader(prediction_data, sampler=prediction_sampler, batch_size=batch_size)\"\n      ],\n      \"execution_count\": 46,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"Hba10sXR7Xi6\"\n      },\n      \"source\": [\n        \"# Prediction on test set\\n\",\n        \"\\n\",\n        \"# Put model in evaluation mode\\n\",\n        \"model.eval()\\n\",\n        \"\\n\",\n        \"# Tracking variables \\n\",\n        \"predictions , true_labels = [], []\\n\",\n        \"\\n\",\n        \"# Predict \\n\",\n        \"for batch in prediction_dataloader:\\n\",\n        \"  # Add batch to GPU\\n\",\n        \"  batch = 
tuple(t.to(device) for t in batch)\\n\",\n        \"  # Unpack the inputs from our dataloader\\n\",\n        \"  b_input_ids, b_input_mask, b_labels = batch\\n\",\n        \"  # Telling the model not to compute or store gradients, saving memory and speeding up prediction\\n\",\n        \"  with torch.no_grad():\\n\",\n        \"    # Forward pass, calculate logit predictions\\n\",\n        \"    logits = model(b_input_ids, token_type_ids=None, attention_mask=b_input_mask)\\n\",\n        \"\\n\",\n        \"  # Move logits and labels to CPU\\n\",\n        \"  logits = logits['logits'].detach().cpu().numpy()\\n\",\n        \"  label_ids = b_labels.to('cpu').numpy()\\n\",\n        \"  \\n\",\n        \"  # Store predictions and true labels\\n\",\n        \"  predictions.append(logits)\\n\",\n        \"  true_labels.append(label_ids)\"\n      ],\n      \"execution_count\": 47,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"cRaZQ4XC7kLs\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"819f0e2f-168e-49ee-8d45-36cbc8452711\"\n      },\n      \"source\": [\n        \"#@title Evaluating Using Matthew's Correlation Coefficient\\n\",\n        \"# Import and evaluate each test batch using Matthew's correlation coefficient\\n\",\n        \"from sklearn.metrics import matthews_corrcoef\\n\",\n        \"matthews_set = []\\n\",\n        \"\\n\",\n        \"for i in range(len(true_labels)):\\n\",\n        \"  matthews = matthews_corrcoef(true_labels[i],\\n\",\n        \"                 np.argmax(predictions[i], axis=1).flatten())\\n\",\n        \"  matthews_set.append(matthews)\"\n      ],\n      \"execution_count\": 48,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"/usr/local/lib/python3.6/dist-packages/sklearn/metrics/_classification.py:900: RuntimeWarning: invalid value encountered in 
double_scalars\\n\",\n            \"  mcc = cov_ytyp / np.sqrt(cov_ytyt * cov_ypyp)\\n\"\n          ],\n          \"name\": \"stderr\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"IUM0UA1qJaVB\"\n      },\n      \"source\": [\n        \"The final score will be based on the entire test set, but let's take a look at the scores on the individual batches to get a sense of the variability in the metric between batches.\\n\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"xytAr_C48wnu\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"07097f15-0ae7-41af-f114-e5d9002dced9\"\n      },\n      \"source\": [\n        \"#@title Score of Individual Batches\\n\",\n        \"matthews_set\"\n      ],\n      \"execution_count\": 49,\n      \"outputs\": [\n        {\n          \"output_type\": \"execute_result\",\n          \"data\": {\n            \"text/plain\": [\n              \"[0.049286405809014416,\\n\",\n              \" -0.17407765595569785,\\n\",\n              \" 0.4732058754737091,\\n\",\n              \" 0.34151450937027694,\\n\",\n              \" 0.5945883900105632,\\n\",\n              \" 0.7410010097502685,\\n\",\n              \" 0.4472135954999579,\\n\",\n              \" 0.29277002188455997,\\n\",\n              \" 0.9165151389911681,\\n\",\n              \" 0.8246211251235321,\\n\",\n              \" 0.8459051693633014,\\n\",\n              \" 0.7419408268023742,\\n\",\n              \" 0.6979824404521128,\\n\",\n              \" 0.7141684885491869,\\n\",\n              \" 0.2773500981126145,\\n\",\n              \" 0.5056936741642399,\\n\",\n              \" 0.0]\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          },\n          \"execution_count\": 49\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": 
{\n        \"id\": \"oCYZa1lQ8Jn8\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"c604d08f-efeb-48bc-b5eb-bf3b0811bf6b\"\n      },\n      \"source\": [\n        \"#@title Matthew's Evaluation on the Whole Dataset\\n\",\n        \"# Flatten the predictions and true values for aggregate Matthew's evaluation on the whole dataset\\n\",\n        \"flat_predictions = [item for sublist in predictions for item in sublist]\\n\",\n        \"flat_predictions = np.argmax(flat_predictions, axis=1).flatten()\\n\",\n        \"flat_true_labels = [item for sublist in true_labels for item in sublist]\\n\",\n        \"matthews_corrcoef(flat_true_labels, flat_predictions)\"\n      ],\n      \"execution_count\": 50,\n      \"outputs\": [\n        {\n          \"output_type\": \"execute_result\",\n          \"data\": {\n            \"text/plain\": [\n              \"0.5453476037943634\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          },\n          \"execution_count\": 50\n        }\n      ]\n    }\n  ]\n}"
  },
  {
    "path": "Chapter02/in_domain_train.tsv",
    "content": "gj04\t1\t\tour friends wo n't buy this analysis , let alone the next one we propose .\ngj04\t1\t\tone more pseudo generalization and i 'm giving up .\ngj04\t1\t\tone more pseudo generalization or i 'm giving up .\ngj04\t1\t\tthe more we study verbs , the crazier they get .\ngj04\t1\t\tday by day the facts are getting murkier .\ngj04\t1\t\ti 'll fix you a drink .\ngj04\t1\t\tfred watered the plants flat .\ngj04\t1\t\tbill coughed his way out of the restaurant .\ngj04\t1\t\twe 're dancing the night away .\ngj04\t1\t\therman hammered the metal flat .\ngj04\t1\t\tthe critics laughed the play off the stage .\ngj04\t1\t\tthe pond froze solid .\ngj04\t1\t\tbill rolled out of the room .\ngj04\t1\t\tthe gardener watered the flowers flat .\ngj04\t1\t\tthe gardener watered the flowers .\ngj04\t1\t\tbill broke the bathtub into pieces .\ngj04\t1\t\tbill broke the bathtub .\ngj04\t1\t\tthey drank the pub dry .\ngj04\t0\t*\tthey drank the pub .\ngj04\t1\t\tthe professor talked us into a stupor .\ngj04\t0\t*\tthe professor talked us .\ngj04\t1\t\twe yelled ourselves hoarse .\ngj04\t0\t*\twe yelled ourselves .\ngj04\t0\t*\twe yelled harry hoarse .\ngj04\t1\t\tharry coughed himself into a fit .\ngj04\t0\t*\tharry coughed himself .\ngj04\t0\t*\tharry coughed us into a fit .\ngj04\t1\t\tbill followed the road into the forest .\ngj04\t1\t\twe drove highway 5 from sd to sf .\ngj04\t1\t\tfred tracked the leak to its source .\ngj04\t1\t\tjohn danced waltzes across the room .\ngj04\t1\t\tbill urinated out the window .\ngj04\t1\t\tbill coughed out the window .\ngj04\t1\t\tbill bled on the floor .\ngj04\t1\t\tthe toilet leaked through the floor into the kitchen below .\ngj04\t1\t\tbill ate off the floor .\ngj04\t1\t\tbill drank from the hose .\ngj04\t1\t\tthis metal hammers flat easily .\ngj04\t1\t\tthey made him president .\ngj04\t1\t\tthey made him angry .\ngj04\t0\t*\tthey caused him to become angry by making him .\ngj04\t0\t*\tthey caused him to become president by making 
him .\ngj04\t0\t*\tthey made him to exhaustion .\ngj04\t1\t\tthey made him into a monster .\ngj04\t1\t\tthe trolley rumbled through the tunnel .\ngj04\t1\t\tthe wagon rumbled down the road .\ngj04\t1\t\tthe bullets whistled past the house .\ngj04\t1\t\tthe knee replacement candidate groaned up the stairs .\ngj04\t0\t*\tthe car honked down the road .\ngj04\t0\t*\tthe dog barked out of the room .\ngj04\t1\t\tthe dog barked its way out of the room .\ngj04\t1\t\tbill whistled his way past the house .\ngj04\t1\t\tthe witch vanished into the forest .\ngj04\t1\t\tbill disappeared down the road .\ngj04\t0\t*\tthe witch went into the forest by vanishing .\ngj04\t1\t\tthe witch went into the forest and thereby vanished .\ngj04\t1\t\tthe building is tall and wide .\ngj04\t0\t*\tthe building is tall and tall .\ngj04\t1\t\tthis building is taller and wider than that one .\ngj04\t1\t\tthis building got taller and wider than that one .\ngj04\t1\t\tthis building got taller and taller .\ngj04\t0\t*\tthis building is taller and taller .\ngj04\t0\t*\tthis building got than that one .\ngj04\t0\t*\tthis building is than that one .\ngj04\t1\t\tbill floated into the cave .\ngj04\t0\t*?\tbill floated into the cave for hours .\ngj04\t0\t*?\tbill pushed harry off the sofa for hours .\ngj04\t1\t\tbill floated down the river for hours .\ngj04\t1\t\tbill floated down the river .\ngj04\t1\t\tbill pushed harry along the trail for hours .\ngj04\t1\t\tbill pushed harry along the trail .\ngj04\t1\t\tthe road zigzagged down the hill .\ngj04\t1\t\tthe rope stretched over the pulley .\ngj04\t1\t\tthe weights stretched the rope over the pulley .\ngj04\t1\t\tthe weights kept the rope stretched over the pulley .\ngj04\t1\t\tsam cut himself free .\ngj04\t1\t\tsam got free by cutting his finger .\ngj04\t1\t\tbill cried himself to sleep .\ngj04\t0\t*\tbill cried sue to sleep .\ngj04\t1\t\tbill squeezed himself through the hole .\ngj04\t1\t\tbill sang himself to sleep .\ngj04\t1\t\tbill squeezed the puppet 
through the hole .\ngj04\t1\t\tbill sang sue to sleep .\ngj04\t0\t*\tthe elevator rumbled itself to the ground .\ngj04\t1\t\tif the telephone rang , it could ring itself silly .\ngj04\t0\t*\tshe yelled hoarse .\ngj04\t0\t*\tted cried to sleep .\ngj04\t1\t\tthe tiger bled to death .\ngj04\t1\t\the coughed awake and we were all overjoyed , especially sierra .\ngj04\t1\t\tjohn coughed awake , rubbing his nose and cursing under his breath .\ngj04\t1\t\tjohn coughed himself awake on the bank of the lake where he and bill had their play .\ngj04\t1\t\tron yawned himself awake .\ngj04\t1\t\tshe coughed herself awake as the leaf landed on her nose .\ngj04\t1\t\tthe worm wriggled onto the carpet .\ngj04\t1\t\tthe chocolate melted onto the carpet .\ngj04\t0\t*\tthe ball wriggled itself loose .\ngj04\t1\t\tbill wriggled himself loose .\ngj04\t1\t\taliza wriggled her tooth loose .\ngj04\t1\t\tthe off center spinning flywheel shook itself loose .\ncj99\t1\t\tthe more you eat , the less you want .\ncj99\t1\t\tif you eat more , you want correspondingly less .\ncj99\t1\t\twhen you eat more , you want correspondingly less .\ncj99\t1\t\tas you eat more , you want correspondingly less .\ncj99\t0\t*\tthe most you want , the least you eat .\ncj99\t1\t\tthe angrier sue gets , the more fred admires her .\ncj99\t1\t\tthe more that you eat , the less that you want .\ncj99\t1\t\tthe angrier that sue gets , the more that fred admires her .\ncj99\t1\t\ti think that the more you eat , the less you want .\ncj99\t1\t\ti 'm not shocked by the idea that the more you eat , the less you want .\ncj99\t1\t\tit is obvious that the more you eat , the less you want .\ncj99\t1\t\tit is not entirely clear if the more you eat , the less you want .\ncj99\t1\t\ti want to explain exactly why the more you eat , the less you want .\ncj99\t1\t\ti demand that the more john eats , the more he pays .\ncj99\t0\t*\ti demand that the more john eat , the more he pay .\ncj99\t1\t\ti demand that john pay more , the more he 
eats .\ncj99\t0\t*\ti demand that john pays more , the more he eat .\ncj99\t1\t\tyou get angrier , the more we eat , do n't you .\ncj99\t0\t*\tyou get angrier , the more we eat , do n't we .\ncj99\t0\t*\tthe harder it has rained , how much faster a flow that appears in the river ?\ncj99\t1\t\tthe harder it has rained , how much faster a flow appears in the river ?\ncj99\t0\t*\tthe harder it rains , how much faster that do you run ?\ncj99\t1\t\tthe harder it rains , how much faster do you run ?\ncj99\t1\t\tthe harder it rains , how much faster a flow do you see in the river ?\ncj99\t0\t*\tthe harder it rains , how much faster a flow that do you see in the river ?\ncj99\t1\t\twhen it rains harder , how much faster a flow appears in the river ?\ncj99\t1\t\tas it rains harder , how much faster a flow appears in the river ?\ncj99\t0\t*\tas it rains harder , how much faster a flow that appears in the river ?\ncj99\t0\t*\twhen it rains harder , how much faster a flow that appears in the river ?\ncj99\t1\t\thow much harder has it rained , the faster a flow you see in the river ?\ncj99\t1\t\thow much harder has it rained , when you see a faster flow in the river ?\ncj99\t0\t*\tthe more john eats , the tighter keep your mouth shut about it .\ncj99\t0\t*\tthe more everyone eat , the more john keeps his big mouth shut about it , ok ?\ncj99\t1\t\twhen john eats more , keep your mouth shut tighter , ok ?\ncj99\t1\t\tas john eats more , keep your mouth shut tighter , ok ?\ncj99\t1\t\tkeep your mouth shut tighter , the more john eats , ok ?\ncj99\t1\t\teveryone keep your mouth shut tighter , the more john eats , ok ?\ncj99\t0\t??\ti can well imagine the more him eating , the fatter him getting .\ncj99\t1\t\tbill can well imagine getting fat .\ncj99\t0\t*\tbill can well imagine the more he eats , the fatter getting .\ncj99\t1\t\tfred can well imagine joe getting fatter , the more he eats .\ncj99\t0\t*\tit is important the more you eat , the more careful to be .\ncj99\t0\t*\tit is 
important for the more you eat , the more careful to be .\ncj99\t0\t*\tit is important the more you to eat , the more careful to be .\ncj99\t0\t*\tit is important the more you eat , the more careful you to be .\ncj99\t0\t*\tit is important the more you eat , the more careful for you to be .\ncj99\t0\t*\tit is important for the more you to eat , the more careful to be .\ncj99\t0\t*\tit is important for the more you to eat , the more careful for you to be .\ncj99\t0\t*\tit is important the more you to eat , the more careful for you to be .\ncj99\t0\t*\tit is important for the more you eat , the more careful you to be .\ncj99\t1\t\tit is important for you to be more careful , the more you eat .\ncj99\t1\t\tit is important to be more careful , the more you eat .\ncj99\t0\t*\ti can well imagine quickly mary answering the question .\ncj99\t0\t?*\ti can well imagine with a hatchet mary destroying the jeep .\ncj99\t0\t?*\ti can well imagine if he eats more , him getting fat .\ncj99\t0\t*\tit is not entirely obvious if , mary listens to the grateful dead , she gets depressed .\ncj99\t0\t*\tit is not entirely obvious whether , mary listens to the grateful dead , she gets depressed .\ncj99\t1\t\tmary listens to the grateful dead and she gets depressed .\ncj99\t1\t\tif mary listens to the grateful dead , she gets depressed .\ncj99\t1\t\twhen mary listens to the grateful dead , she gets depressed .\ncj99\t1\t\tmary gets depressed if she listens to the grateful dead .\ncj99\t1\t\tmary gets depressed when she listens to the grateful dead .\ncj99\t1\t\tthe more she looked at pictures , the angrier mary got .\ncj99\t0\t*\tthe more pictures mary looked at , she got angrier and angrier .\ncj99\t1\t\tmary gets depressed and she listens to the grateful dead .\ncj99\t1\t\tthe higher the stakes are , the lower his expectations are .\ncj99\t1\t\tthe higher the stakes , the lower his expectations .\ncj99\t1\t\this expectations are lower , the higher the stakes .\ncj99\t1\t\this 
expectations are lower , the higher the stakes are .\ncj99\t0\t*\this expectations lower , the higher the stakes .\ncj99\t0\t*\this expectations lower , the higher the stakes are .\ncj99\t1\t\tthe more obnoxious fred is , the less attention you should pay to him .\ncj99\t0\t*\tthe more obnoxious fred , the less attention you should pay to him .\ncj99\t1\t\tthe more fred is obnoxious , the less you should pay attention to him .\ncj99\t0\t*\tthe more obnoxious fred , the less you should pay attention to him .\ncj99\t1\t\this expectations are always lower than mine .\ncj99\t1\t\tjohn was lots more obnoxious than fred was .\ncj99\t1\t\tyou should always lock your door , no matter how fancy the hotel might be .\ncj99\t1\t\tyou should always lock your door , no matter how fancy the hotel .\ncj99\t1\t\ti do n't plan to lock the door , no matter how fancy this hotel is .\ncj99\t0\t*\ti do n't plan to lock the door , no matter how fancy this hotel .\ncj99\t1\t\ti 'm going out , whatever the weather .\ncj99\t1\t\ti 'm going out , wherever that hurricane might be .\ncj99\t0\t*\ti 'm going out , wherever that hurricane .\ncj99\t1\t\tthe more examples mary says that bill has helped fred to discover the less i believe her .\ncj99\t0\t*\tthe more food mary knows a man that eats the poorer she gets .\ncj99\t0\t*\tthe fatter he goes to a doctor when he gets the more he eats .\ncj99\t0\t*\tthe fatter that that he gets bothers him , the more he eats .\ncj99\t0\t*\tthe more books i ask to whom he will give , the more he reads .\ncj99\t0\t*\tthe more people i ask what he will give to the more he reads .\ncj99\t1\t\tthe more carefully he words the letter the safer he 'll be .\ncj99\t0\t*\tthe more carefully he knows a man that worded the letter the safer he 'll be .\ncj99\t1\t\tthe more geniuses john meets , the angrier he gets .\ncj99\t0\t*\tthe more john meets geniuses , the angrier he gets .\ncj99\t1\t\tthe more people you say will buy tickets , the happier i 'll be 
.\ncj99\t0\t*\tthe more people you say that will buy tickets , the happier i 'll be .\ncj99\t1\t\tthe more people you say that right after the show opens will buy tickets , the happier i 'll be .\ncj99\t1\t\tthe more i talk to joe , the less about linguistics i am inclined to think sally has taught him to appreciate .\ncj99\t0\t*\tthe more he eats , the poorer he knows a woman that gets .\ncj99\t0\t*\tthe more he eats , the fatter he goes to a doctor when he gets .\ncj99\t0\t*\tthe more he eats , the fatter that that he gets really bothers me .\ncj99\t0\t*\tthe more he reads , the more books i wonder to whom he will give .\ncj99\t0\t*\tthe more he reads , the more people i wonder what he will give to .\ncj99\t0\t*\tthe sooner you call , the more carefully i know a man that will word the letter .\ncj99\t1\t\tthe richer john gets , the more geniuses john meets .\ncj99\t0\t*\tthe richer he gets , the more john meets geniuses .\ncj99\t1\t\tthe more articles he reads , the fewer people he thinks will go into linguistics .\ncj99\t0\t*\tthe more articles he reads , the fewer people he thinks that will go into linguistics .\ncj99\t1\t\tthe more articles he reads , the fewer people he thinks that under the current circumstances will go into linguistics .\ncj99\t1\t\tthe more articles he reads , the fewer people he thinks under the current circumstances will go into linguistics .\ncj99\t1\t\tthe more people that arrive , the louder that it gets .\ncj99\t1\t\tthe more people that arrive , the louder it gets .\ncj99\t1\t\tthe more people you give beer to , the more people that get sick .\ncj99\t1\t\tthe more people that you give beer to , the more people that get sick .\ncj99\t1\t\tthe more people arrive , the louder that it gets .\ncj99\t1\t\tthe more people arrive , the louder it gets .\ncj99\t1\t\tthe more people that you give beer to , the more people get sick .\ncj99\t0\t*\tthe more pictures of john that he buys the more arrogant he becomes .\ncj99\t1\t\tthe more pictures 
of himself that john buys the more arrogant he becomes .\ncj99\t1\t\tthe man that arrived on the train was my brother .\ncj99\t0\t*\tthe man arrived on the train was my brother .\ncj99\t0\t*\tthe more people everyone who likes pays attention to , the happier we all are .\ncj99\t0\t*\tthe later it gets , the more people everyone who likes pays attention to .\ncj99\t1\t\twhenever bill smokes , susan hates him all the more .\ncj99\t1\t\twhenever bill smokes , susan hates him much more .\ncj99\t1\t\twhenever bill smokes , susan hates him far more .\ncj99\t1\t\twhenever bill smokes , susan hates him a lot more .\ncj99\t1\t\tonce janet left , fred became all the crazier .\ncj99\t1\t\tonce janet left , fred became much crazier .\ncj99\t1\t\tonce janet left , fred became far crazier .\ncj99\t1\t\tfred became all the crazier , the more often janet left .\ncj99\t1\t\twhen bill smokes , all the more does susan hate him .\ncj99\t0\t*\twhen bill smokes , much more does susan hate him .\ncj99\t0\t*\twhen bill smokes , all the more susan hates him .\ncj99\t1\t\tso much did you eat that everyone gasped .\ncj99\t1\t\tso fast did you run that everyone gasped .\ncj99\t1\t\tso intelligent a dog did you buy that everyone gasped .\ncj99\t1\t\ti know how much you ate .\ncj99\t1\t\ti know how fast you ran .\ncj99\t1\t\ti know how intelligent a dog you bought .\ncj99\t1\t\the ate so much that he got sick .\ncj99\t1\t\tso much did he eat that he got sick .\ncj99\t1\t\tthe more you eat , the more you want .\ncj99\t0\t*\tyou eat the more , the more you want .\ncj99\t0\t*\tthe more you eat , you want the more .\ncj99\t0\t*\ti wonder you ate how much .\ncj99\t1\t\ti wonder to how many people bill talks .\ncj99\t1\t\tthe longer he has to wait , the angrier john gets .\ncj99\t1\t\tif he has to wait , john gets angry .\ncj99\t0\t*\the gets angry , the longer john has to wait .\ncj99\t0\t*\the gets angry if john has to wait .\ncj99\t1\t\tthe more that pictures of him appear in the news , the more 
embarrassed john becomes .\ncj99\t1\t\tthe more pictures of himself that appear in the news , the more embarrassed john becomes .\ncj99\t1\t\tthe more that pictures of himself appear in the news , the more embarrassed john becomes .\ncj99\t1\t\tthe more pictures of him appear in the news , the more likely john is to get arrested .\ncj99\t0\t*\tthe more pictures of himself appear in the news , the more likely john is to get arrested .\ncj99\t1\t\tthe more that pictures of him appear in the news , the more likely john is to get arrested .\ncj99\t0\t*\tthe more that pictures of himself appear in the news , the more likely john is to get arrested .\ncj99\t1\t\tthe more that john gets upset by them , the more that stories about him seem to show up in the news .\ncj99\t0\t*\tthe more that john gets upset by them , the more that stories about himself seem to show up in the news .\ncj99\t1\t\tjohn is more embarrassed , the more pictures of him appear in the news .\ncj99\t1\t\tjohn is more embarrassed , the more pictures of him that appear in the news .\ncj99\t1\t\tjohn is more embarrassed , the more pictures of himself appear in the news .\ncj99\t1\t\tjohn is more embarrassed , the more pictures of himself that appear in the news .\ncj99\t1\t\tstories about him seem to show up more on the evening news , the more that john gets upset by them .\ncj99\t0\t*\tstories about himself seem to show up more on the evening news , the more that john gets upset by them .\ncj99\t1\t\tif you give him enough opportunity , every senator will succumb to corruption .\ncj99\t1\t\tyou give him enough opportunity and every senator will succumb to corruption .\ncj99\t0\t*\twe gave him enough opportunity and , sure enough , every senator succumbed to corruption .\ncj99\t1\t\tif you give any senator enough opportunity , he will succumb to corruption .\ncj99\t1\t\tyou give any senator enough opportunity and he will succumb to corruption .\ncj99\t0\t*\tyou give every senator enough opportunity and 
he will succumb to corruption .\ncj99\t0\t*\twe gave any senator enough opportunity and , sure enough , he succumbed to corruption .\ncj99\t0\t*\twe gave every senator enough opportunity and , sure enough , he succumbed to corruption .\ncj99\t1\t\tthe more lobbyists he talks to , the more corrupt every senator seems to become .\ncj99\t1\t\tthe more lobbyists wine and dine him , the more every senator is susceptible to corruption .\ncj99\t0\t*\tthe more time that every senator spends with lobbyists , the more likely he succumbs to corruption .\ncj99\t1\t\tevery senator becomes more corrupt , the more lobbyists he talks to .\ncj99\t1\t\tany senator becomes more corrupt , the more lobbyists he talks to .\ncj99\t0\t*\the seems to become more corrupt , the more lobbyists any senator talks to .\ncj99\t0\t*\the seems to become more corrupt , the more lobbyists every senator talks to .\ncj99\t1\t\tevery senator seems to become more corrupt , if he talks to more lobbyists .\ncj99\t1\t\tany senator seems to become more corrupt , if he talks to more lobbyists .\ncj99\t1\t\tany senator seems to become more corrupt , as he talks to more lobbyists .\ncj99\t0\t*\the seems to become more corrupt , if any senator talks to more lobbyists .\ncj99\t0\t*\the seems to become more corrupt , if every senator talks to more lobbyists .\ncj99\t0\t*\the seems to become more corrupt , as every senator talks to more lobbyists .\ncj99\t0\t*\the seems to become more corrupt , as any senator talks to more lobbyists .\ncj99\t1\t\tthe sooner you solve this problem , the more easily you 'll satisfy the folks up at corporate headquarters .\ncj99\t1\t\tthis is the sort of problem which the sooner you solve the more easily you 'll satisfy the folks up at corporate headquarters .\ncj99\t1\t\tthe folks up at corporate headquarters are the sort of people who the sooner you solve this problem , the more easily you 'll satisfy .\ncj99\t1\t\tthis problem , the sooner you solve the more easily you 'll satisfy 
the folks up at corporate headquarters .\ncj99\t1\t\twho did you give pictures of to friends of ?\ncj99\t1\t\tit is this problem that the sooner you solve the more easily you 'll satisfy the folks up at corporate headquarters .\ncj99\t0\t?*\tit is the folks up at corporate headquarters who the sooner you solve this problem , the more easily you 'll satisfy .\ncj99\t0\t*\twhich problem the sooner you solve , will the more easily you satisfy the folks up at corporate headquarters ?\ncj99\t0\t*\twhich problem does the sooner that you solve , the more easily you 'll satisfy the folks up at corporate headquarters ?\ncj99\t0\t*\twhich problem the sooner that you solve , will the more easily you satisfy the folks up at corporate headquarters ?\ncj99\t0\t*\tthe harder it rains , the faster who runs ?\ncj99\t0\t*\tthe louder who talks , the angrier you get ?\ncj99\t1\t\tthe harder that it rains , how much faster a flow do you see in the river ?\ncj99\t1\t\tthey failed to tell me which problem the sooner i solve , the quicker the folks up at corporate headquarters .\ncj99\t0\t??\ti finally worked up enough courage to ask which people up at corporate headquarters the sooner i solve this problem , the quicker i 'll get free of .\ncj99\t0\t??\twhich folks up at corporate headquarters do you think that the sooner you solve this problem , the quicker you 'll be able to tell t to buzz off ?\ncj99\t0\t??\tthis is a problem that you 'll be able to tell the folks up at corporate headquarters to buzz off if you solve .\ncj99\t1\t\tthis is a problem that you 'll be able to tell the folks up at corporate headquarters to buzz off if you solve it .\ncj99\t0\t??\tthis is a problem that you solve it and you 'll be able to tell the folks up at corporate headquarters to buzz off .\ncj99\t0\t??\tthose are the folks that you just solve this problem and you 'll be able to put them on ice .\ncj99\t0\t??\tthey failed to tell me which problem i 'll beat the competition more easily , the sooner i 
solve .\ncj99\t0\t??\tthis is the problem that you 'll beat the competition more easily , the sooner you solve .\nbc01\t1\t\tjohn saw the man in the room .\nbc01\t1\t\twhich room did john see the man in ?\nbc01\t1\t\twho did john think that bill claimed that mary suspected that everybody liked ?\nbc01\t1\t\tjohn could not visit sally .\nbc01\t1\t\twhat john could do is not visit sally .\nbc01\t1\t\tjohn could n't visit sally .\nbc01\t1\t\twhy did john leave ?\nbc01\t1\t\ti hit the ball .\nbc01\t1\t\tyou hit the ball .\nbc01\t0\t*\the hit the ball .\nbc01\t0\t*\tshe hit the ball .\nbc01\t1\t\tthey hit the ball .\nbc01\t0\t*\tam not i going ?\nbc01\t1\t\ti am not going .\nbc01\t1\t\tare n't i going ?\nbc01\t0\t*\ti are n't going .\nbc01\t1\t\tlouise is unhappy , is n't she ?\nbc01\t1\t\tlouise likes not being happy , does n't she ?\nbc01\t1\t\tnot many books survived the fire , did they ?\nbc01\t1\t\tno books survived the fire , did they ?\nbc01\t1\t\the has n't often paid taxes , has he ?\nbc01\t1\t\the ca n't pay taxes , can he ?\nbc01\t1\t\tshe does not see him .\nbc01\t1\t\tshe kept not seeing him .\nbc01\t1\t\tshe could not have been working .\nbc01\t0\t*\tmarianne not left .\nbc01\t0\t*\tmarianne left not .\nbc01\t1\t\the could not have been working .\nbc01\t1\t\the can not have been working .\nbc01\t1\t\the can simply not have been working .\nbc01\t1\t\tyou must not simply not work .\nbc01\t1\t\the may not just not have been working .\nbc01\t1\t\the ca n't have been working .\nbc01\t1\t\tca n't he have been working ?\nbc01\t1\t\tcan he not have been working ?\nbc01\t0\t*\tcan he not have been working ?\nbc01\t1\t\tjohn wrote books .\nbc01\t0\t*\tjohn write books .\nbc01\t0\t*\tjohn wrote books .\nbc01\t1\t\tjohn did not write books .\nbc01\t0\t*\tjohn seems that is nice .\nbc01\t1\t\t`` i am so happy '' , thought john .\nbc01\t1\t\tdown the hill rolled john .\nbc01\t0\t*\tjohn kisses often mary .\nbc01\t1\t\tjohn often kisses mary .\nbc01\t1\t\twho do you 
think mary said john likes ?\nbc01\t0\t?*\twho did you ask whether mary knows why john likes ?\nbc01\t1\t\twho do you think that mary said that john likes ?\nbc01\t0\t*\thow do you wonder whether mary solved the problem ?\nbc01\t1\t\thow do you think that mary solved the problem ?\nbc01\t0\t*\thow do you wonder whether john said that mary solved the problem ?\nbc01\t0\t*\thow do you wonder whether john said mary solved the problem ?\nbc01\t0\t??\twhich problem do you wonder whether john said that mary solved ?\nbc01\t1\t\thow did you think that mary solved the problem ?\nbc01\t1\t\tmary hired someone .\nbc01\t1\t\ti heard that mary hired someone .\nbc01\t1\t\ti resigned because mary hired someone .\nbc01\t1\t\tmary wondered which picture of himself bill saw ?\nbc01\t1\t\twhich picture of himself does mary think that john said that susan likes ?\nbc01\t0\t*\tmary thinks that john said that susan likes pictures of himself ?\nbc01\t1\t\tmary thinks that john said that pictures of himself , susan likes ?\nbc01\t1\t\tif you do n't believe me , you will the weatherman ?\nbc01\t1\t\ti rolled up a newspaper , and lynn did a magazine ?\nbc01\t1\t\tkathy likes astronomy , but she does n't meteorology ?\nbc01\t1\t\tthe da proved jones guilty and the assistant da will prove smith .\nbc01\t1\t\tmary will believe susan , and you will bob .\nbc01\t1\t\tyou might not believe me but you will bob .\nbc01\t0\t*\tyou will bob believe .\nbc01\t1\t\thow did you solve the problem ?\nbc01\t1\t\ti wonder who could solve the problem in this way .\nbc01\t0\t*\thow do you wonder who could solve this problem .\nbc01\t1\t\tno candidate can predict how many people will vote for him .\nbc01\t1\t\tevery politician is worried when the press starts attacking him .\nbc01\t1\t\twhich politician appointed the journalist who supported him ?\nbc01\t0\t*\tthe fact that no candidate was elected shows that he was inadequate .\nbc01\t1\t\tjohn sells books , mary buys records and bill v newspapers 
.\nbc01\t1\t\tthe question of whether john met mary worries the people who support .\nbc01\t1\t\tthey have left .\nbc01\t1\t\thave they left ?\nbc01\t1\t\tcould they have left ?\nbc01\t1\t\the has often seen mary .\nbc01\t1\t\the i often sees mary .\nbc01\t0\t*\the sees often mary .\nbc01\t0\t*\tsees he i often mary ?\nbc01\t1\t\tit seems that it is likely that john will win .\nbc01\t1\t\tit seems that john is likely to win .\nbc01\t1\t\tjohn seems to be likely to win .\nbc01\t0\t*\tjohn seems that it is likely to win .\nbc01\t0\t*\tjohn seems will win .\nbc01\t0\t*\thow do you wonder which problem to solve ?\nbc01\t1\t\thow intelligent do you consider john ?\nbc01\t0\t??\thow many people do you wonder whether i consider intelligent ?\nbc01\t0\t*\thow intelligent do you wonder whether i consider john ?\nbc01\t0\t*\twhat the hell do you wonder how to say ?\nbc01\t1\t\the has left .\nbc01\t1\t\this book is nice .\nbc01\t1\t\tbill saw him .\nbc01\t1\t\tbill works with him .\nbc01\t1\t\tjohn believes him to be a nice guy .\nbc01\t1\t\tjohn considers him a nice guy .\nbc01\t1\t\tfor him to do that would be a mistake .\nbc01\t1\t\twith him sick , the team is in trouble .\nbc01\t0\t*\ta man to be in the garden is unlikely .\nbc01\t0\t*\ta man to come is unlikely .\nbc01\t0\t*\tjohn to call would be unlikely .\nbc01\t0\t*\tthis conclusion to be arrived at is surprising .\nbc01\t1\t\tjohn believes that he is sick .\nbc01\t0\t*\tjohn believes that him is sick .\nbc01\t0\t*\tjohn tries him to win .\nbc01\t0\t*\tjohn wonders where him to go .\nbc01\t1\t\twho do you think that bill likes ?\nbc01\t1\t\twho do you think that bill believes to be innocent ?\nbc01\t0\t*\twho do you think that believes john to be innocent ?\nbc01\t0\t*\twho would you prefer for to win the race ?\nbc01\t1\t\tsomeone stole my car .\nbc01\t1\t\tmy car was stolen .\nbc01\t0\t*\tthe children eat all chocolate .\nbc01\t1\t\tjohn has often kissed mary .\nbc01\t1\t\tthe kids have all eaten the chocolate 
.\nbc01\t1\t\tin general , he understands what 's going on .\nbc01\t1\t\tit 's probable that in general he understands what 's going on .\nbc01\t0\t*\tit 's probable in general that he understands what 's going on .\nbc01\t0\t*\tin general that he understands what 's going on is surprising .\nbc01\t1\t\ti explained how to fix the sink .\nbc01\t1\t\ti explained how we should fix the sink .\nbc01\t1\t\ti explained that we should fix the sink .\nbc01\t0\t*\ti explained to fix the sink .\nbc01\t1\t\tmickey looked up the reference .\nbc01\t1\t\tmickey looked the reference up .\nbc01\t1\t\tmickey looked up them .\nbc01\t1\t\tmickey teamed up with the women .\nbc01\t0\t*\tmickey teamed with the women up .\nbc01\t1\t\tmickey pointed out that gary had left .\nbc01\t0\t*\tmickey pointed that gary had left out .\nbc01\t1\t\tmickey slips up all the time .\nbc01\t0\t*\tmickey slips all the time up .\nbc01\t1\t\twhat does john think mary bought ?\nbc01\t0\t*\tjohn thinks what mary bought .\nbc01\t1\t\tjohn wonders what mary bought .\nbc01\t0\t*\twhat does john wonder mary bought ?\nbc01\t0\t??\twho is he reading a book that criticizes ?\nbc01\t0\t??\twhat do you remember where we bought ?\nbc01\t1\t\twho bought what ?\nbc01\t1\t\twho is reading a book that criticizes who ?\nbc01\t1\t\twho remembers where we bought what ?\nbc01\t0\t*\ti wonder who what bought ?\nbc01\t0\t*\ti wonder what who bought ?\nbc01\t1\t\tthere are n't many linguistics students here .\nbc01\t1\t\ti have n't met many linguistics students .\nbc01\t1\t\twhat does every student buy ?\nbc01\t1\t\ti need sally to be there .\nbc01\t0\t*\tthe boat sank to collect the insurance .\nbc01\t1\t\tthe boat was sunk to collect the insurance .\nbc01\t1\t\tjohn wants to win .\nbc01\t1\t\tthe bed was unmade .\nbc01\t0\t*\theadway was unmade .\nbc01\t1\t\tjohn was unknown .\nbc01\t0\t*\tjohn was unknown to be the murderer .\nbc01\t1\t\twe knew john to be the murderer .\nbc01\t1\t\the fed the children .\nbc01\t1\t\tthe 
children were uneducated .\nbc01\t1\t\tthe children were undisciplined .\nbc01\t1\t\ti believed these students all to like john .\nbc01\t1\t\tthey tried to all like john .\nbc01\t1\t\ti believed these students to all like john .\nbc01\t0\t?*\tdid he try ever to talk to the student ?\nbc01\t1\t\tdid you believe him ever to have made an effort to talk to the student ?\nbc01\t1\t\tdid he try to ever be attentive to the needs of students ?\nbc01\t1\t\tdid you believe him to ever have made an effort to talk to the student ?\nbc01\t1\t\twork out an analysis that is typical of this view of understood subjects .\nbc01\t1\t\tthey were believed all to be quite diligent .\nbc01\t1\t\twas he believed ever to fail students ?\nbc01\t1\t\tthere is tending to be more and more discussion of these issues .\nbc01\t1\t\tjohn seemed to be a great linguist .\nbc01\t1\t\tthere promises to be a storm tonight .\nbc01\t1\t\tjohn strived to be successful .\nbc01\t1\t\tjohn wanted to improve his lot in life .\nbc01\t1\t\tjohn expected to win .\nbc01\t1\t\tthis book is too dense to be read in one sitting .\nbc01\t0\t*\tthere is too likely to be a riot to be a serious discussion of the issues .\nbc01\t1\t\tjohn tried .\nbc01\t1\t\tjohn remembered .\nbc01\t1\t\tjohn is refused .\nbc01\t1\t\tjohn forgot .\nbc01\t0\t*\tbill seems to be obnoxious , but i do n't think that sam happens .\nbc01\t0\t*\tbill seems to be obnoxious , but i do n't think that sam turns out .\nbc01\t0\t*\tbill seems to be obnoxious , but i do n't think that sam tends .\nbc01\t0\t*\tthey tried all to like john .\nbc01\t1\t\tthey seemed all to like john .\nbc01\t1\t\tjohn believes sally to be polite .\nbc01\t1\t\ti believe john with all my heart to be a fine person .\nbc01\t0\t*\tjohn is wanted to win .\nbc01\t0\t*\tjohn would be liked to win .\nbc01\t1\t\twe would like john to win .\nbc01\t0\t*\tjohn would be hated to win .\nbc01\t0\t*\tjohn would be preferred to be the candidate .\nbc01\t1\t\twe would prefer john to be the 
candidate .\nbc01\t1\t\ti would like for john to win .\nbc01\t1\t\ti would hate for john to win .\nbc01\t1\t\ti would prefer for john to be the candidate .\nbc01\t1\t\tjohn destroyed the house .\nbc01\t1\t\tthe electrode emitted ions into the medium .\nbc01\t1\t\tions struck the electrode .\nbc01\t1\t\tthe medium contains ions .\nbc01\t0\t*\tthe house destroyed john .\nbc01\t1\t\tions left the electrode .\nbc01\t0\t*\tthe electrode was left by ions .\nbc01\t1\t\tthe electrode was struck by ions .\nbc01\t1\t\tthe ball lies in the box .\nbc01\t1\t\tthe ball rolled from the bush to the tree .\nbc01\t1\t\tthe box contains the ball .\nbc01\t1\t\tthe tree dropped fruit to the ground .\nbc01\t1\t\tfruit hit the ground from the tree .\nbc01\t1\t\tthe stone knocked against the pole into the road .\nbc01\t1\t\tthe stone knocked the pole into the road .\nbc01\t1\t\tthe box contained the ball .\nbc01\t0\t*\tthe box gradually contained the ball .\nbc01\t0\t*\tthe box at once contained the ball .\nbc01\t0\t*\tthe box contained the ball to the ground .\nbc01\t1\t\tthe tree gradually dropped its fruit to the ground .\nbc01\t1\t\tthe tree dropped its fruit to the ground .\nbc01\t1\t\tfruit hit the roof .\nbc01\t1\t\tfruit hit the roof from the tree .\nbc01\t1\t\tfruit at once hit the roof from the tree .\nbc01\t0\t*\tfruit hit the roof against the ground .\nbc01\t0\t*\tfruit at once hit the roof against the ground .\nbc01\t1\t\tfruit dropped from the tree .\nbc01\t0\t*\tfruit dropped from the tree from the clouds .\nbc01\t0\t*\tfruit fell against the house .\nbc01\t0\t*\tfruit fell against the house against the ground .\nbc01\t1\t\tthe tree changed into an oak .\nbc01\t1\t\tthe tree changed from a maple into an oak .\nbc01\t0\t*\tthe maple changed into an oak from a cedar .\nbc01\t1\t\tthe maple changed into an oak from a cedar .\nbc01\t1\t\tthe maple changed into an oak .\nbc01\t1\t\tthe oak developed out of a maple .\nbc01\t1\t\tthe train reached the station .\nbc01\t1\t\tthe 
branches knocked against the wall .\nbc01\t1\t\tthe child became a man .\nbc01\t1\t\tthe party lasted till midnight .\nbc01\t1\t\tthe dog went crazy .\nbc01\t1\t\tit struck john that it was so .\nbc01\t1\t\tit came to john that it was so .\nbc01\t1\t\tthe snake saw into the nest .\nbc01\t1\t\thard work resulted in high grades .\nbc01\t1\t\tthe farm passed to john .\nbc01\t1\t\tjohn is touching the wall .\nbc01\t1\t\tthe wall is being touched by john .\nbc01\t1\t\ta bear occupies the cave .\nbc01\t1\t\ta bear inhabits the cave .\nbc01\t1\t\twater fills the tub .\nbc01\t1\t\tthe electric main joins the house circuit in the basement .\nbc01\t1\t\tthe house circuit is joined by the electric main in the basement .\nbc01\t1\t\tthe fence straddles the sidewalk .\nbc01\t1\t\tthe sidewalk is straddled by the fence .\nbc01\t1\t\tthe man with a book .\nbc01\t1\t\tgas escaped the tube .\nbc01\t1\t\tthe terrorist escaped the prison cell .\nbc01\t1\t\tthe prison cell was escaped by the terrorist .\nbc01\t1\t\tthe rolling stone avoided the river .\nbc01\t1\t\tthe river was avoided by the rolling stone .\nbc01\t1\t\tthe agents caught the terrorist .\nbc01\t1\t\tthe sponge soaked up the water .\nbc01\t1\t\tthe tub filled with water .\nbc01\t1\t\tjohn received a book .\nbc01\t1\t\tjohn learned a lesson .\nbc01\t1\t\tthe parcel reached john .\nbc01\t1\t\tjohn received the parcel .\nbc01\t1\t\tthe farm finally got to john after much litigation .\nbc01\t0\t*\tthe farm finally reached john after much litigation .\nbc01\t1\t\twater filled the cup high .\nbc01\t1\t\twater filled the cup .\nbc01\t0\t*\twater emptied the cup .\nbc01\t0\t*\tthe cup filled the water high .\nbc01\t0\t*\tthe cup filled of water .\nbc01\t1\t\tthe cup filled with water .\nbc01\t0\t*\tthe cup emptied with water .\nbc01\t1\t\tthe barge piled high with logs .\nbc01\t0\t*\tthe road blocked with a stone .\nbc01\t0\t*\tthe branch dropped bare of its apple .\nbc01\t0\t*\tthe logs piled the barge high .\nbc01\t1\t\ta 
stone blocked the road .\nbc01\t0\t*\tthe bottle drained the liquid free .\nbc01\t1\t\tthe branch dropped its apple free .\nbc01\t1\t\tsome branches broke off of the tree .\nbc01\t0\t*\tthe tree broke off some branches .\nbc01\t1\t\tthe tree dropped some branches .\nbc01\t1\t\tthe tree lost some branches .\nbc01\t1\t\twater bubbled out of the kettle .\nbc01\t0\t*\tthe kettle bubbled water up .\nbc01\t1\t\tthe kettle bubbled water .\nbc01\t0\t*\tthe cup filled water .\nbc01\t0\t*\tthe stone knocked the pole into the road .\nbc01\t1\t\tthe tub leaked empty of water .\nbc01\t0\t*\tthe stone knocked against the pole into the road .\nbc01\t1\t\thail stones broke the window .\nbc01\t1\t\tthe force of the wind broke the window .\nbc01\t0\t*\tthe window broke from hail stones .\nbc01\t1\t\tthe window broke from the force of the wind .\nbc01\t1\t\twhat the force of the wind did to the window was break it .\nbc01\t1\t\tjohn hit the stone against the wall .\nbc01\t1\t\tjohn hit the wall with the stone .\nbc01\t1\t\tjohn tapped some wine from a barrel .\nbc01\t1\t\tjohn tapped a barrel of some wine .\nbc01\t1\t\tjohn laid the book on the table .\nbc01\t1\t\tjohn included his name in the list .\nbc01\t1\t\tjohn loaded the bricks onto the truck .\nbc01\t1\t\tjohn loaded the truck with bricks .\nbc01\t1\t\tjohn fed rice to the baby .\nbc01\t1\t\tjohn fed the baby rice .\nbc01\t1\t\tjohn fed the baby up with rice .\nbc01\t0\t*\tjohn fed the baby rice up .\nbc01\t1\t\tthe ball lies completely in the box .\nbc01\t1\t\tthe box completely contains the ball .\nbc01\t1\t\tthe train got to the station fully .\nbc01\t1\t\tthe train reached the station fully .\nbc01\t1\t\tpress the stamp against the pad completely .\nbc01\t1\t\tpress the pad with the stamp completely .\nbc01\t1\t\tspray the paint onto the wall completely .\nbc01\t1\t\tspray all the paint onto the wall completely .\nbc01\t0\t*\tspray the wall with all the paint .\nbc01\t1\t\tspray the whole wall with the paint 
.\nbc01\t1\t\twhat john did to the wall was paint it .\nbc01\t1\t\twhat john did to the whole wall was paint it .\nbc01\t1\t\twhat john did to the wall was hit it .\nbc01\t0\t*\twhat the stone did to the wall was hit it .\nbc01\t0\t*\twhat the stone did to the whole wall was hit it .\nbc01\t1\t\tjohn took bill to be a fool .\nbc01\t0\t*\tjohn concluded bill to be a fool .\nbc01\t1\t\tgive the bottle to the baby full .\nbc01\t0\t*\tgive the bottle to the baby awake .\nbc01\t1\t\tgive the baby the bottle full .\nbc01\t0\t*\tgive the baby the bottle awake .\nbc01\t1\t\trub the cloth on the baby torn .\nbc01\t0\t*\trub the cloth on the baby asleep .\nbc01\t1\t\trub the baby with the cloth torn .\nbc01\t0\t*\trub the baby with the cloth asleep .\nbc01\t1\t\tdry the baby with the cloth asleep .\nbc01\t0\t*\tdry the baby with the cloth torn .\nbc01\t0\t*\tthe cup knocked the stone apart .\nbc01\t1\t\tthe stone knocked the cup apart .\nbc01\t1\t\tthe cup smashed apart against the stone .\nbc01\t1\t\tthe stone smashed the cup apart .\nbc01\t1\t\tthe tank filled with petrol out of the pump .\nbc01\t1\t\tthe cup emptied of water onto the ground .\nbc01\t1\t\tjohn included her name in the list .\nbc01\t1\t\tjohn rolled the ball from the tree to the bush .\nbc01\t1\t\tjohn tapped the bottle of some water .\nbc01\t1\t\tjohn gave bill the book .\nbc01\t1\t\tjohn got the book from bill .\nbc01\t0\t*\tjohn gave bill of the book .\nbc01\t1\t\twe have someone in the living room .\nbc01\t1\t\tjohn is very fond of mary .\nbc01\t1\t\tmary laughed at john .\nbc01\t1\t\tthe ship sank beneath the waves .\nbc01\t1\t\tmary considers john a fool and bill a wimp .\nbc01\t1\t\tjohn regards professors as strange and politicians as creepy .\nbc01\t1\t\tsue put the books on the table and the records on the chair .\nbc01\t1\t\tharriet gave a mug to john and a scarf to vivien .\nbc01\t1\t\ti expect john to win and harry to lose .\nbc01\t1\t\tyou eat the fish raw and the beef cooked 
.\nbc01\t1\t\tthey told sue who to talk to and virginia when to leave .\nbc01\t1\t\tsmith loaned , and his widow later donated , a valuable collection of manuscripts to the library .\nbc01\t1\t\tsue moved , and mary also transferred , her business to a different location .\nbc01\t1\t\ti succeeded in convincing , even though john had failed to persuade , mary not to leave .\nbc01\t1\t\twe did n't particularly like , but nevertheless ate , the fish raw .\nbc01\t1\t\tflo desperately wants , though she does n't really expect , the miami dolphins to be in the play-offs .\nbc01\t1\t\tjohn learned french perfectly .\nbc01\t1\t\tbill recited his lines poorly .\nbc01\t1\t\tmary plays the violin beautifully .\nbc01\t0\t*\tjohn perfectly learned french .\nbc01\t0\t*\tbill poorly recited his lines .\nbc01\t1\t\tjohn learned french immediately .\nbc01\t1\t\tbill recited his lines slowly .\nbc01\t1\t\tmary will play the violin soon .\nbc01\t1\t\tjohn immediately learned french .\nbc01\t1\t\tbill slowly recited his lines .\nbc01\t1\t\tmary will soon play the violin .\nbc01\t1\t\tjohn immediately learned french perfectly .\nbc01\t1\t\tjohn learned french perfectly almost immediately .\nbc01\t1\t\tjohn learned french perfectly immediately .\nbc01\t0\t*\tjohn perfectly learned french immediately .\nbc01\t0\t*\tjohn learned french immediately perfectly .\nbc01\t0\t*\tclearly , john immediately will probably learn french perfectly .\nbc01\t0\t*\timmediately , john probably will clearly learn french perfectly .\nbc01\t0\t*\tclearly , john perfectly will immediately learn french probably .\nbc01\t0\t*\tjohn perfectly rolled the ball down the hill .\nbc01\t1\t\tjohn rolled the ball perfectly down the hill .\nbc01\t1\t\tjohn rolled the ball down the hill perfectly .\nbc01\t0\t*\tjohn perfectly shot the ball .\nbc01\t1\t\tjohn shot the ball perfectly .\nbc01\t0\t*\tjohn intimately spoke to mary .\nbc01\t1\t\tjohn spoke intimately to mary .\nbc01\t1\t\tjohn spoke to mary intimately 
.\nbc01\t1\t\tjohn spoke french intimately to mary .\nbc01\t1\t\tjohn spoke french to mary intimately .\nbc01\t1\t\tmary jumped the horse perfectly over the last fence .\nbc01\t1\t\tmary jumped the horse over the last fence perfectly .\nbc01\t0\t*\tjohn spoke intimately french to mary .\nbc01\t0\t*\tjohn spoke to mary french .\nbc01\t0\t*\tmary persuaded to leave john .\nbc01\t0\t*\tthe lions ate raw the meat .\nbc01\t0\t*\tmary persuaded that he should rest bill .\nbc01\t1\t\twe consider the men all fools .\nbc01\t1\t\twe consider the men all totally crazy .\nbc01\t0\t*\ti saw the men all .\nbc01\t0\t*\tthe men were arrested all .\nbc01\t0\t*\tthe men arrived all .\nbc01\t1\t\tthe teacher ordered the two boys both to pay close attention .\nbc01\t1\t\tthey returned the books all to their owners .\nbc01\t1\t\twe painted the chairs all red .\nbc01\t1\t\tthe trainer fed the steaks all to the lions .\nbc01\t0\t*\tbill proud of himself john does n't consider .\nbc01\t0\t*\thome was gone by john .\nbc01\t1\t\tmary left the room angry .\nbc01\t0\t*\tthe room was left angry by mary .\nbc01\t0\t*\tthe room was left angry .\nbc01\t1\t\tjohn resembles bill .\nbc01\t0\t*\tbill is resembled by john .\nbc01\t1\t\tthe package weighed 10 lb .\nbc01\t0\t*\t10 lb was weighed by the package .\nbc01\t1\t\tthis book cost $ 10 .\nbc01\t0\t*\t$ 10 was cost by this book .\nbc01\t1\t\tthe book cost john $ 10 .\nbc01\t0\t*\tjohn was cost $ 10 by the book .\nbc01\t0\t*\tjohn is impressed by bill as pompous .\nbc01\t0\t*\tthe boys were made a good mother .\nbc01\t0\t*\tthe boys were made a good mother by aunt mary .\nbc01\t0\t*\tthe kids were failed by max as a father .\nbc01\t0\t*\tthe kids were failed as a father .\nbc01\t0\t*\tthe men were struck by the idea as nonsense .\nbc01\t0\t*\tthe men were promised to leave .\nbc01\t0\t*\the impresses his friends all as pompous .\nbc01\t0\t*\taunt mary made the boys all a good mother .\nbc01\t0\t*\tmax failed the kids all as a father 
.\nbc01\t0\t*\tfrank promised the men all to leave .\nbc01\t0\t*\twe proclaimed to the public john to be a hero .\nbc01\t1\t\twe proclaimed john to the public to be a hero .\nbc01\t0\t*\twe proclaimed sincerely john to be a hero .\nbc01\t1\t\twe proclaimed john sincerely to be a hero .\nbc01\t0\t*\twe proclaimed sincerely to the public john to be a hero .\nbc01\t1\t\twe proclaimed john sincerely to the public to be a hero .\nbc01\t0\t*\tthey represented to the dean mary as a genuine linguist .\nbc01\t1\t\tthey represented mary to the dean as a genuine linguist .\nbc01\t0\t*\tthey represented seriously mary as a genuine linguist .\nbc01\t1\t\tthey represented mary seriously as a genuine linguist .\nbc01\t1\t\tthey represented mary seriously to the dean as a genuine linguist .\nbc01\t0\t*\twe proved to the authorities smith to be the thief .\nbc01\t0\t*\twe proved conclusively smith to be the thief .\nbc01\t1\t\twe proved smith conclusively to be the thief .\nbc01\t0\t*\twe proved conclusively to the authorities smith to be the thief .\nbc01\t1\t\twe proved smith conclusively to the authorities to be the thief .\nbc01\t1\t\tthe gardener watered the tulips flat .\nbc01\t1\t\tthe grocer ground the coffee beans to a fine powder .\nbc01\t1\t\tthey painted their house a hideous shade of green .\nbc01\t1\t\tthe joggers ran their nikes threadbare .\nbc01\t1\t\tthe kids laughed themselves into a frenzy .\nbc01\t1\t\the coughed his handkerchief completely soggy .\nbc01\t1\t\tthey fed the meat to the lions raw .\nbc01\t0\t*\tthe lions ate at the meat raw .\nbc01\t1\t\twe love them .\nbc01\t0\t*\twe love they .\nbc01\t0\t*\twe love their .\nbc01\t0\t*\tus love their .\nbc01\t1\t\tour love they .\nbc01\t1\t\tour love them .\nbc01\t1\t\tour love their .\nbc01\t0\t*\the belief that mary kissed bill is mistaken .\nbc01\t0\t*\thim belief that mary kissed bill is mistaken .\nbc01\t1\t\this belief that mary kissed bill is mistaken .\nbc01\t1\t\tmary loves him .\nbc01\t1\t\tmary is 
fond of him .\nbc01\t0\t*\tmary is fond him .\nbc01\t1\t\tmary criticized him .\nbc01\t0\t*\tmary 's criticism him was cruel .\nbc01\t1\t\tmary 's criticism of him was cruel .\nbc01\t1\t\tthat john loves mary is doubtful .\nbc01\t0\t*\tjohn to love mary would be doubtful .\nbc01\t1\t\tfor john to love mary would be doubtful .\nbc01\t1\t\tto go abroad would be nice .\nbc01\t1\t\tjohn 's plan to go abroad is nice .\nbc01\t1\t\tmary believed john to have loved her .\nbc01\t1\t\tmary considered john to have loved her .\nbc01\t1\t\tmary reported john to have loved her .\nbc01\t0\t*\tmary considered to have loved her .\nbc01\t1\t\tmary tried to go abroad .\nbc01\t1\t\tmary intended to go abroad .\nbc01\t1\t\tmary managed to go abroad .\nbc01\t1\t\tmary desired to go abroad .\nbc01\t0\t*\tmary tried john to go abroad .\nbc01\t0\t*\tmary managed john to go abroad .\nbc01\t0\t*\tmary desired john to go abroad .\nbc01\t1\t\tmary believed him to have loved her .\nbc01\t1\t\tmary considered him to have loved her .\nbc01\t0\t*\tmary believed he to have loved her .\nbc01\t0\t*\tmary considered he to have loved her .\nbc01\t0\t*\tmary reported he to have loved her .\nbc01\t0\t*\tmary believed his to have loved her .\nbc01\t0\t*\tmary considered his to have loved her .\nbc01\t0\t*\tmary reported his to have loved her .\nbc01\t1\t\tit is certain that john has loved mary .\nbc01\t1\t\tit is likely that john has loved mary .\nbc01\t1\t\tthere are strangers in that garden .\nbc01\t0\t*\tthere is strangers in that garden .\nbc01\t0\t*\tthere is arriving three men at that station .\nbc01\t1\t\tthere are arriving three men at that station .\nbc01\t1\t\ti consider there to be a man in that garden .\nbc01\t0\t*\ti consider there a man in that garden .\nbc01\t1\t\tthey alleged there to have been many strangers in that garden .\nbc01\t0\t*\tthey alleged many strangers to have been in that garden .\nbc01\t1\t\tjohn wagered there to have been a stranger in that haunted house 
.\nbc01\t0\t*\tjohn wagered a stranger to have been in that haunted house .\nbc01\t1\t\tjohn tried to kiss mary .\nbc01\t1\t\tjohn persuaded mary to kiss him .\nbc01\t1\t\tjohn told mary to kiss him .\nbc01\t1\t\tit is illegal to park here .\nbc01\t1\t\ti remembered him having kissed mary .\nbc01\t1\t\ti reported him having kissed mary .\nbc01\t1\t\ti reported having kissed mary .\nbc01\t1\t\ti enjoy taking a bath .\nbc01\t1\t\ti detest taking a bath .\nbc01\t0\t*\ti enjoy him taking a bath .\nbc01\t0\t*\ti detest him taking a bath .\nbc01\t1\t\ti saw him kissing mary .\nbc01\t1\t\ti noticed him kissing mary .\nbc01\t0\t*\ti noticed kissing mary .\nbc01\t0\t*\tthere was known to everyone .\nbc01\t1\t\tjohn 's refusing the offer is shocking .\nbc01\t1\t\tthe enemy 's destroying the city was horrific .\nbc01\t1\t\tjohn 's refusal of the offer was shocking .\nbc01\t1\t\tthe enemy 's destruction of the city was horrific .\nbc01\t0\t*\tjohn wanted to leave the room happy and leave the room he did happy .\nbc01\t1\t\ti often send mary home drunk , and she gets there just fine .\nbc01\t0\t*\ti raw eat fish drunk .\nbc01\t0\t*\ti only eat fish drunk raw .\nbc01\t1\t\ti do n't think fred will , either .\nbc01\t1\t\tjosé likes cabbage , and holly does too .\nbc01\t1\t\tjosé ate cabbage , and holly has too .\nbc01\t1\t\tjosé is eating cabbage , and holly is too .\nbc01\t1\t\tjohn is leaving but mary 's not .\nbc01\t1\t\ti consider bill intelligent and i consider sally not .\nbc01\t0\t*\tsally started running down the street , but only after josé started .\nbc01\t0\t*\tsally made bill laugh , and then josé made .\nbc01\t0\t*\tmary came to read fred 's story , and i also came to .\nbc01\t1\t\tjohn wants to go on vacation , but he does n't know when to .\nbc01\t0\t*\tmary was told to bring something to the party , so she asked sue what to .\nbc01\t0\t*\twe might go on vacation if we can ever figure out when to .\nbc01\t0\t*\tron wanted to wear a tuxedo to the party , but caspar 
could n't decide whether to .\nbc01\t0\t*\tyou should n't play with rifles because to is dangerous .\nbc01\t0\t*\tjohn is being discussed and sally is being too .\nbc01\t0\t*\ti remember john being discussed , but you recall sally being .\nbc01\t1\t\tsally might have eaten cabbage , but holly should n't .\nbc01\t1\t\tjosé asks that we go to the meeting , and sally will tell us when .\nbc01\t0\t*\tit 's we go to the meeting , that sally will tell us when .\nbc01\t1\t\tit 's to mary that joe said holly can talk .\nbc01\t1\t\tmary claimed that eaten cabbage , holly has n't .\nbc01\t1\t\tmary claimed that eating cabbage , holly 's not .\nbc01\t1\t\tmary claimed that eat cabbage , holly wants to .\nbc01\t0\t*\tmary claimed that would eat cabbage , holly .\nbc01\t0\t*\tmary claimed that has n't eaten cabbage , holly .\nbc01\t0\t*\tmary claimed that eating cabbage , holly started .\nbc01\t0\t*\tmary claimed that eat cabbage , holly made me .\nbc01\t0\t*\tmary claimed that have eaten cabbage , holly should .\nbc01\t0\t*\tmary claimed that intelligent , i consider holly not .\nbc01\t0\t*\tlilly recounted a story to remember because holly had also recounted a story to .\nbc01\t0\t*?\ti reviewed joe 's attempt to find holly while you reviewed josé 's attempt to .\nbc01\t0\t*?\tmary questioned joe 's desire to eat cabbage , but only after i had questioned sally 's desire to .\nbc01\t0\t*?\tsally explained the attempt to arrest holly , but only after i had denied the decision to .\nbc01\t1\t\tjohn did n't hit a home run , but i know a woman who did .\nbc01\t1\t\tthat betsy won the batting crown is not surprising , but that peter did n't know she did is surprising .\nbc01\t0\t*\tyou should n't have played with rifles because to have is dangerous .\nbc01\t0\t??\tron wanted to be wearing a tuxedo to the party , but caspar did n't know whether to be .\nbc01\t0\t*\tlilly recounted a story to be remembered because holly had recounted a story to be .\nbc01\t1\t\tlilly decided that 
eating cabbage , she should be .\nbc01\t0\t*\tlilly decided eating cabbage , to be .\nbc01\t1\t\tread fred 's story , i also want to .\nbc01\t0\t*\tyou should n't play with rifles because play with rifles to is dangerous .\nbc01\t0\t??\tron wanted to wear a tuxedo to the party , but wear a tuxedo to the party caspar could n't decide whether to .\nbc01\t0\t*\tlucy barnes recounted a story to remember because remember holly had recounted a story to .\nbc01\t1\t\tmag wildwood came to introduce the bartender but i came not to .\nbc01\t1\t\tmag wildwood came to introduce the bartender but i came precisely not to .\nbc01\t1\t\tyou should unload rifles because not to s is dangerous .\nbc01\t1\t\tif ron knows whether to wear a tuxedo , and caspar knows whether not to , do they know different things ?\nbc01\t1\t\tlucy recounted a story to remember because holly had recounted as story not to .\nbc01\t0\t*\ti will , if i can work on it .\nbc01\t1\t\tdid harry leave ?\nbc01\t1\t\tdoes joe sing ?\nbc01\t0\t*\ta proof that god exist does .\nbc01\t0\t*\ta proof that god does exists .\nbc01\t0\t*\ti visited every town in every country i had to .\nbc01\t1\t\tevery man who said he would buy some salmon did .\nbc01\t1\t\ti visited every town i had to .\nbc01\t1\t\tevery town in every country i had to i visited .\nbc01\t1\t\tevery man who said he would buy some salmon did buy some salmon .\nbc01\t1\t\tlilly should buy salmon and mary should too .\nbc01\t1\t\tlilly should buy salmon and mary should buy salmon too .\nbc01\t1\t\tjoe 's neuroses bother his patrons , and sally 's neuroses do too .\nbc01\t1\t\tjoe likes his bar , and sally 's patrons do too .\nbc01\t1\t\tevery picture of itself arrived .\nbc01\t1\t\tmy uncle does n't have a spouse but your aunt does and he is lying on the floor .\nbc01\t0\t*\tmy uncle did n't buy anything for christmas , but my aunt did it for him and it was bright red .\nbc01\t1\t\ti know which book max read , and which book oscar did n't 
.\nbc01\t1\t\tthis is the book of which bill approves , and this is the one of which he does n't .\nbc01\t0\t?*\ti know which book mag read , and which book bob asked why you had n't .\nbc01\t0\t?*\ti know which book mag read , and which book bob discussed after i had .\nbc01\t1\t\tdulles suspected everyone who angleton did .\nbc01\t1\t\twhile bob read fred , he did n't dickens .\nbc01\t1\t\tsally suspected joe , but he did n't holly .\nbc01\t0\t*\talthough mag does n't eggplants , sally eats cabbage .\nbc01\t0\t?*\talthough i do n't know which book sam did , i do know which book sally read .\nbc01\t0\t?*\tnear everyone angleton did , dulles stood .\nbc01\t0\t*\tsally will stand near mag , but he wo n't holly .\nbc01\t0\t*\twhile holly did n't discuss a report about every boy , she did every girl .\nbc01\t1\t\tsally will stand near every woman that you will .\nbc01\t1\t\ti know which woman holly will discuss a report about , but i do n't know which woman you will .\nbc01\t0\t*\tsam stood near yesterday every one of the women we 'd been discussing .\nbc01\t0\t*\ttruman visited yesterday you .\nbc01\t0\t*\ttruman told the story bob .\nbc01\t1\t\twhile truman did n't visit me , he did you .\nbc01\t1\t\twhile truman did n't tell me a story , he did rusty .\nbc01\t1\t\twhile josé wo n't talk about mag , he might about holly .\nbc01\t1\t\talthough doc might tell it to you , he wo n't to me .\nbc01\t1\t\ti think you need to show yourself more than you do anyone else .\nbc01\t1\t\twhile truman does n't want to visit every city , he does barcelona .\nbc01\t0\t*\twhile rusty might leave in order to please mag , he wo n't his father .\nbc01\t0\t*\twhile doc might claim that bob had read his book , he wo n't the paper .\nbc01\t0\t*\ti 'll turn the radio down , but i wo n't up .\nbc01\t1\t\tfred likes eggplants , although he likes cabbage too .\nbc01\t1\t\talthough he likes cabbage too , fred likes eggplants .\nbc01\t1\t\tfred gave flowers to his sweetie because frank had 
.\nbc01\t1\t\tchina is a country that joe wants to visit , and he will too , if he gets enough money .\nbc01\t1\t\tjerry would n't read a book by babel , but meryl has done so and it was pretty good .\nbc01\t0\t*\ti know which book max read , and which book oscar has n't done so .\nbc01\t1\t\tjoe might wish he had , but this is n't a country he has visited .\nbc01\t1\t\twhile i might want to , this is the kind of thing that harris has already suggested .\nbc01\t1\t\twe like our friends and they do too .\nbc01\t1\t\twe like our friends and they like our friends too .\nbc01\t1\t\twe like our friends and they like their friends , too .\nbc01\t1\t\trusty talked about himself only after holly did .\nbc01\t0\t*\trusty talked about himself only after mary did talk about himself .\nbc01\t1\t\ti could find no solution , but holly might .\nbc01\t1\t\tfred talked about everything before rusty did .\nbc01\t1\t\tjoe will go to the store , even though fred already has .\nbc01\t1\t\ttoday there is little or no official harassment of lesbians and gays by the national government , although autonomous governments might .\nbc01\t1\t\tthe candidate was dogged by charges of infidelity and avoiding the draft , or at least trying to .\nbc01\t0\t*\tdavid is a great artist , and when he does , his eyes squint at you .\nbc01\t0\t*\tthe candidate was dogged by charges of infidelity , or at least trying to .\nbc01\t1\t\tthis information could have been released by gorbachev , but he chose not to .\nbc01\t1\t\ta lot of this material can be presented in a fairly informal and accessible fashion , and often i do .\nbc01\t0\t*\tjohn likes not mary .\nbc01\t1\t\tjohn does not like mary .\nbc01\t0\t*\tjohn meets often mary .\nbc01\t1\t\tjohn tries to often meet mary .\nbc01\t0\t*\tjohn tries to meet often mary .\nbc01\t1\t\tjohn tries not to meet mary .\nbc01\t0\t*\tjohn tries to meet not mary .\nbc01\t1\t\tis mary running the marathon ?\nbc01\t0\t*\truns mary the marathon ?\nbc01\t1\t\tmary is 
often running the marathon .\nbc01\t0\t*\tmary runs often the marathon .\nbc01\t1\t\tmary is not running the marathon .\nbc01\t1\t\ti did n't , as bill had thought , go to the store .\nbc01\t1\t\ti did , as bill had thought , go to the store .\nbc01\t0\t*\ti did not , as bill had thought , go to the store .\nbc01\t1\t\tthe writers could so believe the boy .\nbc01\t0\t*\tthe writers so believed the boy .\nbc01\t1\t\tthe writers did so believe the boy .\nbc01\t0\t*\tthe writers did n't so believe the boy .\nbc01\t1\t\trome destroyed carthage .\nbc01\t1\t\trome 's destruction of carthage was horrific .\nbc01\t1\t\tjohn bought the picture of himself that bill saw .\nbc01\t1\t\tthe perception of the problem is quite thorough .\nbc01\t1\t\tthe knowledge of the problem is quite thorough .\nbc01\t0\t*\tthe problem 's perception is quite thorough .\nbc01\t0\t*\tthe problem 's knowledge is quite thorough .\nbc01\t0\t*\tthe problem knows easily .\nbc01\t0\t*\tthe ship sank to collect the insurance .\nbc01\t1\t\tthe sinking of the ship was very devious .\nbc01\t1\t\tthe sinking of the ship to collect the insurance was very devious .\nbc01\t1\t\tthe ship 's sinking was very devious .\nbc01\t0\t*\tthe ship 's sinking to collect the insurance was very devious .\nbc01\t1\t\tthe testing of such drugs on oneself is too risky .\nbc01\t0\t*\tthis drug 's testing on oneself is too risky .\nbc01\t1\t\tthe ship was sunk to collect the insurance .\nbc01\t1\t\tthis drug must first be tested on oneself .\nbc01\t1\t\tthe president 's moral destruction is complete .\nbc01\t1\t\tthe moral destruction of the president was certainly not helpful .\nbc01\t1\t\tmary wants to wear nice blue german dress .\nbc01\t1\t\ttomatoes were introduced in europe after 1492 .\nbc01\t1\t\twe rich have impeccable taste .\nbc01\t0\t*\trich we have impeccable taste .\nbc01\t0\t*\ti read three his books .\nbc01\t0\t*\ti read every his book .\nbc01\t1\t\ti read his every book .\nbc01\t1\t\tevery boy named a planet 
.\nbc01\t1\t\ti showed every boy a planet .\nbc01\t1\t\tfew boys read any of the books .\nbc01\t1\t\ti showed few boys any of the books .\nbc01\t0\t*\tthat few boys came upset any of the teachers .\nbc01\t1\t\ti was not reading a book when you came in .\nbc01\t1\t\ta boy did not laugh .\nbc01\t1\t\tmost boys did not laugh .\nbc01\t1\t\tevery boy named mercury and venus .\nbc01\t1\t\tevery boy named every planet .\nbc01\t1\t\teach student speaks two languages .\nbc01\t1\t\ttwo students speak each language .\nbc01\t1\t\tsome tourists visited all the museums .\nbc01\t1\t\tfond of some boy every girl is .\nbc01\t0\t*\tguinevere has a single bone that is in every corner of the house .\nbc01\t1\t\ta critic thinks that every book is readable .\nbc01\t1\t\twho does he admire ?\nbc01\t1\t\the admires every man .\nbc01\t0\t*\twhat does who admire ?\nbc01\t1\t\twho admires what ?\nbc01\t1\t\tsomeone from every city hates it .\nbc01\t1\t\tsome professor admires every student .\nbc01\t1\t\tsome professor admires every student and hates the dean .\nbc01\t0\t*\tyou filed every paper without inspecting .\nbc01\t1\t\teveryone reported that max and some lady disappeared .\nbc01\t1\t\tmost guests will be offended if we do n't invite some philosopher .\nbc01\t1\t\tall students believe anything that many teachers say .\nbc01\t1\t\twho will be offended if we do n't invite which philosopher ?\nbc01\t1\t\twho believes anything that who says ?\nbc01\t1\t\texactly two boys kissed some girl .\nbc01\t1\t\tmary dates exactly two of the men who know a producer i like .\nbc01\t1\t\tevery student has to come up with three arguments that show that some condition proposed by bill is wrong .\nbc01\t1\t\tif we invite some philosopher , max will be offended .\nbc01\t1\t\tthree relatives of mine inherited a house .\nbc01\t1\t\tif three relatives of mine die , i will inherit a house .\nbc01\t1\t\teveryone attended some seminar .\nbc01\t1\t\texactly half of the students attended some seminar 
.\nbc01\t1\t\tmore than three students attended every seminar .\nbc01\t1\t\tevery student attended more than three seminars .\nbc01\t0\t*\tevery man surrounded the fort .\nbc01\t1\t\tevery man lifted the table .\nbc01\t0\t*\tevery man lifted the table together .\nbc01\t1\t\tthe men surrounded the fort .\nbc01\t1\t\tall the men surrounded the fort .\nbc01\t1\t\tthe men lifted the table together .\nbc01\t1\t\ta hundred men lifted the table together .\nbc01\t1\t\tall the men lifted the table together .\nbc01\t1\t\tevery man lifted a table .\nbc01\t1\t\teach man lifted a table .\nbc01\t1\t\tsomeone attended every seminar .\nbc01\t1\t\tmore than two students attended every seminar .\nbc01\t1\t\tyou married no one .\nbc01\t1\t\ti will force you to marry no one .\nbc01\t0\t*\twe voted for me .\nbc01\t1\t\teveryone had been worrying himself stiff .\nbc01\t1\t\teveryone who had been worrying himself stiff said that he was relieved .\nbc01\t1\t\tthere were five tourists in the room apart from myself .\nbc01\t1\t\tphysicists like yourself are a godsend .\nbc01\t1\t\tmax boasted that the queen invited lucie and himself for a drink .\nbc01\t1\t\twhich pictures of him did earl see ?\nbc01\t1\t\twhich pictures of earl did he see ?\nbc01\t1\t\tbill seems to himself to be handsome .\nbc01\t1\t\tbill seems to him to be handsome .\nbc01\t1\t\tjohn will see which picture of himself ?\nbc01\t1\t\teach other 's houses seem to the women to be garish .\nbc01\t1\t\teach other 's houses appear to the women to be garish .\nbc01\t0\t*\told pictures of themselves convinced the children to pretend to be adults .\nbc01\t0\t*\teach other 's houses proved to the women that they had bad taste .\nbc01\t1\t\tthese stories about himself worry john more than anything else .\nbc01\t0\t*\tthese stories about himself describe john better than any official biography .\nbc01\t1\t\twhich picture that john took at the party did he decide to display in his house ?\nbc01\t1\t\twhich report that john revised did 
he submit ?\nbc01\t1\t\tmary always prefers lemons to limes .\nbc01\t1\t\tmary always has preferred lemons to limes .\nbc01\t1\t\tthe dog that the rat bit chased the cat .\nbc01\t0\t*\tthe cat that the dog that the rat bit chased died .\nbc01\t1\t\tjean never reads this newspaper .\nbc01\t0\t*\tjean reads never this newspaper .\nr-67\t1\t\ta gun which i had cleaned went off .\nr-67\t1\t\ti gave a gun which i had cleaned to my brother .\nr-67\t1\t\ti gave a gun to my brother which i had cleaned .\nr-67\t1\t\the let the cats out which were whining .\nr-67\t0\t*\twhat did bill buy potatoes and ?\nr-67\t0\t*\twhat dl , d john fall asleep and bill wear ?\nr-67\t1\t\twho did mary see walking toward the railroad station ?\nr-67\t1\t\twhom did mary see walking toward the railroad station ?\nr-67\t1\t\tdo you know the boy who mary saw ?\nr-67\t1\t\tdo you know the boy whom mary saw ?\nr-67\t1\t\tthe government prescribes the height of the lettering on the covers of the reports .\nr-67\t0\t*\there is the snowball which i chased the boy who threw at our teacher .\nr-67\t0\t*\twhere 's the bikini which tom mentioned the fact that sue had worn ?\nr-67\t0\t*\twho did he expect who i was acquainted with to show up ?\nr-67\t1\t\twho did he expect to show up who i was acquainted with ?\nr-67\t1\t\twhose book did you find ?\nr-67\t1\t\the will put the chair between some table and some sofa .\nr-67\t0\t*\twhat table will he put the chair between some table and ?\nr-67\t0\t*\twhat table will he put the chair between and some sofa ?\nr-67\t1\t\ti know who is mad at john .\nr-67\t1\t\ti know a boy mad at john .\nr-67\t1\t\tjohn is taller than dill .\nr-67\t1\t\tjohn is taller than bill is .\nr-67\t1\t\ti want to go .\nr-67\t1\t\tshaving myself is difficult for me .\nr-67\t1\t\tthe shock touched off the explosion .\nr-67\t1\t\tthe shock touched the explosion off .\nr-67\t1\t\ti called almost all of the men from boston up .\nr-67\t0\t*\ti ran a man who was old down .\nr-67\t1\t\ti ran an 
old man down .\nr-67\t0\t*\ti 'm going to call somebody who is .\nr-67\t0\t*\ti polished the vase which was from india up .\nr-67\t1\t\the attributed the fire to a short circuit .\nr-67\t0\t*\the attributed to a short circuit the fire .\nr-67\t1\t\the attributed to a short circuit the fire which .\nr-67\t1\t\the threw the letter into the wastebasket .\nr-67\t0\t*\the threw into the wastebasket the letter .\nr-67\t1\t\tthey dismissed the proposal as too costly .\nr-67\t0\t*\tthey dismissed as to costly the proposal .\nr-67\t1\t\tthey dismissed as too costly the proposal for the state to build a sidewalk from dartmouth to smith .\nr-67\t1\t\ti found to be delicious some fruit which i picked up on the way home .\nr-67\t1\t\ti found delicious some fruit which i picked up on my way home .\nr-67\t0\t*\ti consider to be a fool the senator who made the opening speech .\nr-67\t0\t*\tdid that john showed up please you ?\nr-67\t1\t\tdid the fact that john showed up please you ?\nr-67\t0\t?*\tthat that john showed up pleased her was obvious .\nr-67\t1\t\ti want the fact that bill left to remain a secret .\nr-67\t1\t\ti want it to remain a secret that bill left .\nr-67\t0\t*\twhat what i ate cost almost broke me .\nr-67\t1\t\twhat the thing which i ate cost almost broke me .\nr-67\t1\t\twhat the thing cost which i ate almost broke me .\nr-67\t0\t*\ti went out with a girl who that john showed up pleased .\nr-67\t1\t\ti went out with a girl who it pleased that john showed up .\nr-67\t1\t\ti loaned a man who was watching the race my binoculars .\nr-67\t0\t*\ti loaned my binoculars a man who was watching the race .\nr-67\t1\t\tshe asked a man who was near the window whether it looked like rain .\nr-67\t0\t*\twe called my father , who had just turned 60 , up .\nr-67\t0\t?*\twe elected my father , who had just turned 60 , president .\nr-67\t0\t*\tthey gave my father , who had just turned 60 , it .\nr-67\t1\t\the figured it out .\nr-67\t0\t*\the figured out it .\nr-67\t0\t*\the 
figured out that .\nr-67\t1\t\the figured ann out .\nr-67\t0\t?*\the figured out ann .\nr-67\t1\t\the figured something out .\nr-67\t1\t\the figured the answer out .\nr-67\t1\t\the figured out the answer .\nr-67\t0\t*\ti sent him it .\nr-67\t1\t\ti sent him that .\nr-67\t1\t\ti sent him something .\nr-67\t0\t?*\twe elected the man who he had brought with him president .\nr-67\t1\t\tthey gave the reports which he had brought with him to me .\nr-67\t0\t*\the kept company some girls who had been injured in the wreck .\nr-67\t0\t?*\the kept some girls who had been injured in the wreck company .\nr-67\t0\t*\ti insist on seeing through all the students who had started out the term in my class .\nr-67\t0\t?*\ti insist in seeing all the students who started out the term in my class through .\nr-67\t1\t\ti insist on seeing all the student ' through who started out the term in my class .\nr-67\t0\t*\tthe doctor brought to the passengers who had passed cut from the fumes .\nr-67\t0\t*\the tries to put on everyone who he does n't like .\nr-67\t0\t?*\the tries to put everyone who he does n't like on .\nr-67\t0\t*\ti watched the indians who the man who had been my advisor in my freshman year had advised me to study when i got to utah talk .\nr-67\t1\t\ttom drives as that man does .\nr-67\t1\t\ttom drives like that man .\nr-67\t0\t*\ti know a man who tom drives as does .\nr-67\t1\t\ti know a man who tom drives like .\nr-67\t1\t\ttom drives the way that that man drives .\nr-67\t1\t\ttoms drives the way that that man does .\nr-67\t1\t\tjohn is taller than that man is .\nr-67\t1\t\tis taller than that man .\nr-67\t0\t*\ti know a man who john is taller than is .\nr-67\t1\t\tjohn is as tall as that man .\nr-67\t0\t*\ti know a man who john is as tall as is .\nr-67\t1\t\ti know a man who john is as tall as .\nr-67\t1\t\tmary has never kissed a man who is taller than john is .\nr-67\t1\t\tmary has never kissed a man who is taller than john .\nr-67\t1\t\tmary has never kissed a man taller 
than john .\nr-67\t0\t*\tmary has never kissed a man taller than john is .\nr-67\t0\t?*\tmary has never kissed as tall a man as john is .\nr-67\t1\t\tmary has never kissed as tall a man as john .\nr-67\t1\t\tthe brave are not afraid to die .\nr-67\t1\t\tdrowning cats are hard to rescue .\nr-67\t1\t\tdrowning cats is against the law .\nr-67\t1\t\ti know a taller man than john .\nr-67\t1\t\tthe shooting of the prisoners shocked me .\nr-67\t1\t\the told peter that i know a taller man than john , but peter did n't believe it .\nr-67\t1\t\ti divulged when bill promised to call me , but i did so reluctantly .\nr-67\t1\t\ti 'll talk to john on friday about the report that the shooting of the prisoners shocked me , and to his wife on saturday .\nr-67\t1\t\ti read a statement which was about that man .\nr-67\t1\t\ti read a statement about that man .\nr-67\t0\t*\tthe man who i read a statement which was about is sick .\nr-67\t1\t\tthe man who i read a statement about is sick .\nr-67\t1\t\ti read that bill had seen me .\nr-67\t0\t*\ti read that bill had seen myself .\nr-67\t1\t\tevidence that he was drunk will be presented .\nr-67\t1\t\tevidence will be presented that he was drunk .\nr-67\t1\t\tthat the defendant had been rude was stoutly denied by his lawyer .\nr-67\t1\t\tbill told me something awful : that ice wo n't sink .\nr-67\t1\t\tthis is a hat which i 'm going to see to it that my wife buys .\nr-67\t1\t\tthis is a hat which i 'm going to see that my wife buys .\nr-67\t1\t\tphineas knows a girl who is jealous of maxime .\nr-67\t1\t\tphineas knows a girl who is behind maxime .\nr-67\t1\t\tphineas knows a girl who is working with maxime .\nr-67\t0\t*\twho does phineas know a girl who is jealous of ? 
.\nr-67\t0\t*\twho does phineas know a girl who is behind ?\nr-67\t0\t*\twho does phineas know a girl who is working with ?\nr-67\t0\t*\twho does phineas know a girl jealous of ?\nr-67\t0\t*\twho does phineas know a girl behind ?\nr-67\t0\t*\twho does phineas know a girl working with ?\nr-67\t1\t\ti believed the claim that otto was wearing this hat .\nr-67\t1\t\ti believed that otto was wearing this hat .\nr-67\t0\t*\tthe hat which i believed the claim that otto was wearing is red .\nr-67\t1\t\tthe hat which i believed that otto was wearing is red .\nr-67\t1\t\trutherford understands himself .\nr-67\t0\t*\trutherford is understood . by himself .\nr-67\t1\t\tthe man who ordered ice cream said the pudding would be tasty .\nr-67\t1\t\tthe pudding which the man who ordered ice cream said would be tasty was a horror show .\nr-67\t1\t\tthe man who ordered it said the pudding would be tasty .\nr-67\t0\t*\tthe pudding which the man who ordered it said would be tasty was a horror show .\nr-67\t1\t\tthe sheriff denied that gangsters had bribed him .\nr-67\t1\t\tthat gangsters had bribed him was denied by the sheriff .\nr-67\t0\t*\tthe money which i am discussing the claim that the company squandered amounts to $ 400,000 .\nr-67\t0\t*\tthe money which i am discussing sarah 's claim that the company squandered amounts to $ 400,000 .\nr-67\t1\t\tthe money which i have hopes that the company will squander amounts to $ 400,000 .\nr-67\t1\t\tthe money which i will have a chance to squander amounts to $ 400,000 .\nr-67\t1\t\tthe money which i will make a proposal for us to squander amounts to $ 400,000 .\nr-67\t1\t\tthe money which i will make a proposal that we squander amounts to $ 400,000 .\nr-67\t1\t\ti yawned .\nr-67\t1\t\tsam progressed .\nr-67\t1\t\tbill gave me $ 40 .\nr-67\t1\t\ti took a snooze .\nr-67\t1\t\tsam made progress .\nr-67\t1\t\tbill made a gift to me of $ 40 .\nr-67\t1\t\tmax gave the car a shove .\nr-67\t1\t\ti have a feeling that arch will show up 
.\nr-67\t1\t\tbob proved that this set is recursive .\nr-67\t1\t\tbob proved this set is recursive .\nr-67\t1\t\tthe proof that this set is recursive is difficult .\nr-67\t1\t\ti have hopes the company will squander the money .\nr-67\t1\t\ti have a feeling the company will squander the money .\nr-67\t0\t*\ti made a proposal we squander the money .\nr-67\t1\t\tdick 's claim that semantics is generative is preposterous .\nr-67\t1\t\twe are discussing their claim that flying saucers are real .\nr-67\t0\t*\tmyron is making suzie 's claim that dead is better than red .\nr-67\t1\t\tmyron is making the claim that dead is better than red .\nr-67\t0\t*\ti have tom 's feeling that the company will squander the money .\nr-67\t0\t*\tmyra took betty 's snooze .\nr-67\t0\t*\tbill made sarah 's gal to me of $ 40 .\nr-67\t0\t*\tmax gave the car levi 's shove .\nr-67\t1\t\tthe man who was arrested by officer bob went mad .\nr-67\t1\t\tjack is claiming that you wo n't need it .\nr-67\t1\t\tjack is claiming you wo n't need it .\nr-67\t0\t*\tthe claim you wo n't need it is being made by jack .\nr-67\t1\t\tyou 're going to hurt yourself one of these days .\nr-67\t1\t\ti spoke to bill about himself .\nr-67\t0\t*\the said that himself was hungry .\nr-67\t1\t\ti know a man who hates me .\nr-67\t0\t*\ti know a man who hates myself .\nr-67\t1\t\ti read him two statements about himself .\nr-67\t0\t*\ti read him judy 's statement about himself .\nr-67\t1\t\ti know two men who are behind me .\nr-67\t1\t\ti know two men behind me .\nr-67\t0\t*\ti know two men behind myself .\nr-67\t1\t\tyou are too flip with people who are jealous of you .\nr-67\t1\t\ti screamed at some children who were watching me .\nr-67\t1\t\ti screamed at some children watching me .\nr-67\t0\t*\twhat sofa will he put the chair between some table and ?\nr-67\t0\t*\twhat table will he put the chair between and some sofa .\nr-67\t0\t*\tthe lute which henry plays and sings madrigals is warped .\nr-67\t0\t*\tthe nurse who 
polished her trombone and the plumber computed my tax was a blond .\nr-67\t0\t*\twhich trombone did the nurse polish and ?\nr-67\t0\t*\tthe plumber who the nurse polished her trombone and computed my tax was a hefty fellow .\nr-67\t0\t*\twhose tax did the nurse polish her trombone and the plumber compute ?\nr-67\t1\t\tirma washed the dishes , and sally dried , and floyd idled .\nr-67\t1\t\ti went to the store and bought some whisky .\nr-67\t1\t\ti went to the store and nike bought some whisky .\nr-67\t1\t\there 's the whisky which i went to the store and bought .\nr-67\t1\t\there 's the whisky which i went to the store and mike bought .\nr-67\t1\t\ttony has a fiat and yearns for a tall nurse .\nr-67\t0\t*\tthe tall nurse who tony has a fiat and yearns for is cruel to him .\nr-67\t0\t*\tthe shirts which i went to the movies and did n't pick up will cost us a lot of money .\nr-67\t1\t\ti went to the store and have bought some excellent whisky .\nr-67\t0\t*\tthe excellent whisky which i went to the store and have bought was very costly .\nr-67\t1\t\ti went to the store to buy some whisky .\nr-67\t0\t*\ttony has a fiat to yearn for a tall nurse .\nr-67\t0\t*\ti went to the movies not to pick the shirts up .\nr-67\t0\t*\ti went to the movies to not pick the shirts up .\nr-67\t0\t*\ti went to the store to have bought some whisky .\nr-67\t1\t\tshe 's gone and ruined her dress now .\nr-67\t1\t\ti 've got to try and find that screw .\nr-67\t1\t\taunt hattie wants you to be nice and kiss your granny .\nr-67\t1\t\tthe screw which i 've got to try and find holds the door to the frame .\nr-67\t1\t\twhich granny does aunt hattie want me to be nice and kiss ?\nr-67\t1\t\tthe boy works in a skyscraper and the girl in a quonset hut .\nr-67\t0\t*\twhich boy works in a skyscraper and the girl in a quonset hut ?\nr-67\t0\t*\tthe skyscraper which the boy works in and the girl in a quonset hut belongs to uncle sam .\nr-67\t0\t*\tthe girl who the by works in a skyscraper and in a quonset 
but has a dimple on her nose .\nr-67\t0\t*\twhich quonset hut does the boy work in a skyscraper and the girl in ?\nr-67\t1\t\tthe luscious chick who billy went to the movies with will wed me ere the morn .\nr-67\t0\t*\tthe luscious chick who billy and went to the movies will wed me ere the morn .\nr-67\t0\t*\tthe ferrari which pietro bought from me and sofia adores him cost him a bundle .\nr-67\t1\t\tthe ferrari which pietro , who sofia adores , bought from me cost him a bundle .\nr-67\t1\t\tsally might be pregnant , and everyone believes sheila definitely is pregnant .\nr-67\t1\t\tsally might be , and everyone believes sheila definitely is , pregnant .\nr-67\t1\t\ttom picked these grapes , and i washed these grapes , and suzie will prepare these grapes .\nr-67\t1\t\ttom picked , and i washed , and suzie will prepare , these grapes .\nr-67\t0\t*\ttom picked , and i washed some turnips , and suzie will prepare , these grapes .\nr-67\t1\t\tstudents who fail the final exam or who do not do the reading will be executed .\nr-67\t1\t\tstudents who fail the final exam will be executed or students who do not do the reading will be executed .\nr-67\t1\t\tjohn has been captured by the cops and i 'm afraid he 'll talk .\nr-67\t1\t\ti heated up the coffee and sally wiped the table off .\nr-67\t1\t\talthough bob may not be a nut , many people have claimed it and i think so too .\nr-67\t0\t*\talthough bob may not be a nut , many people have claimed and i think so too .\nr-67\t1\t\twhen did you get back and what did you bring me ?\nr-67\t1\t\tmake yourself comfortable .\nr-67\t1\t\tdid merv show up and did you play chess ?\nr-67\t0\t*\tsally 's sick and what did you bring me ?\nr-67\t0\t*\tmake yourself comfortable and i got sick .\nr-67\t0\t*\twhat are you eating or did you play chess ?\nr-67\t0\t*\twhich boy and the girl embraced ?\nr-67\t0\t*\ti 'm hungry and did you play chess ?\nr-67\t1\t\twho ate what ?\nr-67\t1\t\twhat exploded when ?\nr-67\t1\t\twho gave what to whom 
?\nr-67\t1\t\thow long did this fit of generosity last and who gave what to whom ?\nr-67\t0\t*\ti saw you there and who ate what ?\nr-67\t0\t*\twhat exploded when and i warned you it would ?\nr-67\t1\t\tplease make yourself comfortable and i 'll wash the dishes .\nr-67\t1\t\tyou please make yourself comfortable and i 'll wash the dishes .\nr-67\t1\t\tharry will be in the marines next year and herman was drafted last night .\nr-67\t1\t\tsasha is gobbling down dumplings faster than i can reheat them .\nr-67\t1\t\ti want to peruse that contract before filing it away .\nr-67\t1\t\tfred tore the curtain in roiling it up .\nr-67\t0\t??\tthe dumplings which sasha is gobbling down faster than i can reheat are extremely tasty , if i do say so .\nr-67\t1\t\tthe curtain which fred tore in rolling up was the kind gift of my maternal aunt priscilla .\nr-67\t1\t\ti want to peruse that contract before damaging it while filing it away .\nr-67\t1\t\tsasha is gobbling down dumplings faster than i can reheat the meatballs .\nr-67\t1\t\ti want to peruse that contract before filing away the deed .\nr-67\t1\t\tfred tore the curtain in rolling up the wallpaper .\nr-67\t0\t*\ti think anita may have poisoned the meatballs which sasha is gobbling down dumplings faster than i can reheat .\nr-67\t0\t*\tthe deed which i want to peruse that contract before filing away is probably a forgery .\nr-67\t1\t\tthe dumplings which sasha is gobbling down faster than i can reheat the meatballs are extremely tasty , if i do say so .\nr-67\t1\t\ti suspect that the contract which i want to peruse before filing away the deed may some loopholes .\nr-67\t1\t\tthe curtain which fred tore in rolling the wallpaper up was the kind gift of my maternal aunt priscilla .\nr-67\t1\t\tthe dumplings which sasha is gobbling down faster than i can reheat them are extremely tasty , if i do say so .\nr-67\t1\t\treports which the government prescribes the height of the lettering on the covers of are invariably boring 
.\nr-67\t1\t\treports the covers of which the government prescribes the height of the lettering on almost always put me to sleep .\nr-67\t1\t\treports the lettering on the covers of which the government prescribes the height of are a shocking waste of public funds .\nr-67\t1\t\treports the height of the lettering on the covers of which the government prescribes should be abolished .\nr-67\t0\t*\treports of which the government prescribes the height of the lettering on the covers are invariably boring .\nr-67\t0\t*\treports on the covers of which the government prescribes the height of the lettering almost always put me to sleep .\nr-67\t0\t*\treports of the lettering on the covers of which the government prescribes the height are shocking waste of public funds .\nr-67\t1\t\the has books by several greek authors .\nr-67\t1\t\twhich greek authors does he have books by ?\nr-67\t0\t?*\tby which greek authors does he have books ?\nr-67\t0\t*\tthe boy who i watched bill and was vain .\nr-67\t0\t*\tthe boy bill and who i watched was vain .\nr-67\t1\t\tthey will give me a hat which i know that i wo n't like .\nr-67\t0\t*\tthey will give me a hat that i wo n't like which i know .\nr-67\t1\t\tthe boy whose guardian 's employee we elected president betrayed us .\nr-67\t0\t*\tthe boy whose guardian 's we elected employer president betrayed us .\nr-67\t0\t*\tthe boy whose we elected guardian 's guardian 's employer president betrayed us .\nr-67\t1\t\ti 'm going to ask bill to make the old geezer take up these points later .\nr-67\t1\t\tthese points i 'm going to ask bill to make the old geezer take up later .\nr-67\t1\t\tthe boy 's guardians ' employer we elected president .\nr-67\t0\t*\tthe boy 's guardian 's we elected employer president .\nr-67\t0\t*\tthe boy 's we elected guardian 's employer president .\nr-67\t1\t\twe elected president the boy 's guardian 's employer .\nr-67\t0\t*\twe elected employer president the boy 's guardian 's .\nr-67\t0\t*\twe elected guardian 's 
employer president the boy .\nr-67\t1\t\twhich boy 's guardian 's employer did we elect president ? .\nr-67\t0\t*\twhich boy 's guardian 's did we elect employer president ?\nr-67\t0\t*\thow have you picked up tnt carelessly ?\nr-67\t1\t\thow carelessly have you picked up tnt ?\nr-67\t1\t\tsheila married that tall a man .\nr-67\t1\t\thow tall a man did sheila marry ?\nr-67\t0\t*\thow tall did sheila marry a man ?\nr-67\t0\t*\thow did sheila marry tall a man ?\nr-67\t1\t\ton which bed does tom sleep ?\nr-67\t1\t\tthe bed on which tom slept was hard .\nr-67\t1\t\twhich bed did tom sleep on ?\nr-67\t1\t\tthe bed which tom slept on was hard .\nr-67\t1\t\tmy sister arrived at a time when no buses were running , and my brother arrived at a time when no buses were running too .\nr-67\t1\t\tjack disappeared in a mysterious manner and the hudson disappeared in a mysterious manner too .\nr-67\t0\t*\tjack disappeared in a mysterious manner and marion disappeared in one too .\nr-67\t0\t*\twhat time did you arrive at ?\nr-67\t0\t*\tthe manner which jack disappeared in was creepy .\nr-67\t0\t*\tthe place which i live at is the place where route 150 crosses the hudson river .\nr-67\t1\t\tthe only relatives who i 'd like to do away with are my aunts .\nr-67\t1\t\tthat meeting i 'd like to sit in on .\nr-67\t0\t*\tthe only relatives with whom i 'd like to do away are my aunts .\nr-67\t0\t*\tto whom is she trying to make up now ?\nr-67\t0\t*\ton that meeting i 'd like to sit in .\nr-67\t1\t\tfor whose rights do you expect me to speak up ?\nr-67\t1\t\tone plan which i got wind of was calculated to keep us in suspense .\nr-67\t1\t\tdid you notice which difficulties she made .\nr-67\t1\t\twho are you trying to get hold of ?\nr-67\t0\t*\tone plan of which i got wind was calculated to keep us in suspense .\nr-67\t0\t?*\tdid you notice of which difficulties she made light ?\nr-67\t0\t*\tof whom are you trying to get hold ?\nr-67\t1\t\tthe only offer of which i plan to take advantage will 
give me an eleven month paid vacation .\nr-67\t1\t\tthe scenes to which the censors took objection had to do with the mixed marriage of a woman and a giant panda .\nr-67\t1\t\tadvantage will be taken of his offer .\nr-67\t1\t\this offer will be taken advantage of .\nr-67\t1\t\tin this experiment , fourteen variables must be kept track of simultaneously .\nr-67\t1\t\tobjection was taken to the length of our skirts .\nr-67\t1\t\ta plan to negotiate an honorable end to the war in vietnam was gotten wind of .\nr-67\t0\t*\tlight was made of her indiscretions .\nr-67\t1\t\ther indiscretions were made light of .\nr-67\t0\t*\thold has been gotten of some rare old manuscripts .\nr-67\t1\t\tsome rare old manuscripts have been gotten hold of .\nr-67\t1\t\tuse was made of gauss 's polynomial lemma .\nr-67\t1\t\ttabs were kept on all persons entering the station .\nr-67\t0\t??\tthe persons on whom we kept tabs all proved to be innocent .\nr-67\t0\t*\tfaith was had in all kinds of people .\nr-67\t1\t\tmy friends mike talked to about politics yesterday .\nr-67\t1\t\tto my friends mike talked about politics yesterday .\nr-67\t0\t*\tmike talked to about politics yesterday my friends .\nr-67\t0\t??\tshe made light , not too surprisingly , of the difficulties we might have at the border .\nr-67\t1\t\ti gave to the officer in charge the blackjack . 
which i had found in the cookie jar .\nr-67\t1\t\ti am confident of , and my boss depends on , a successful outing at the track .\nr-67\t0\t*\tthe university 's students are intelligent and faculty is committed to freedom .\nr-67\t1\t\tthe boy 's uncle and aunt were kissing .\nr-67\t0\t*\tthe boy whose uncle and tom 's aunt 's grandmother were kissing was furious .\nr-67\t1\t\twho are you gawking at ?\nr-67\t1\t\twhich hat do you believe that she never wore ?\nr-67\t1\t\tthe reporters expected that the principal would fire some teacher .\nr-67\t1\t\tthat the principal would fire some teacher was expected by the reporters .\nr-67\t1\t\tthe teacher who the reporters expected that the principal would fire is a crusty old jerk .\nr-67\t0\t*\tthe teacher who that the principal would fire was expected by the reporters is a crusty old jerk .\nr-67\t1\t\tthe teacher who it was expected by the reporters that the principal would fire is a crusty old jerk .\nr-67\t0\t*\tthe hat which that i brought seemed strange to the nurse was a fedora .\nr-67\t1\t\ti disliked the boy 's playing the piano loudly .\nr-67\t1\t\tthe boy whose loud playing of the piano i disliked was a student .\nr-67\t1\t\tthe piano which i disliked the boy 's playing loudly was badly out of tune .\nr-67\t1\t\tthe boy 's loud playing of the piano drove everyone crazy .\nr-67\t1\t\tthe boy 's playing the piano loudly drove everyone crazy .\nr-67\t1\t\tthat piano , the boy 's loud playing of which drove everyone crazy , was badly out of tune .\nr-67\t0\t*\tthat piano , the boy 's playing which loudly drove everyone crazy , was badly out of tune .\nr-67\t0\t*\tthat piano , which the boy 's playing loudly drove everyone crazy , was badly out of tune .\nr-67\t0\t*\tdid that he played the piano surprise you ?\nr-67\t0\t*\twould for him to have played the piano have surprised you ?\nr-67\t0\t*\tis whether he played the piano known ?\nr-67\t1\t\tdid his having played the piano surprise you ?\nr-67\t1\t\tmike quipped 
that she never wore this hat .\nr-67\t0\t*\tmike quipped she never wore this hat .\nr-67\t0\t*\ti dislike it for him to tickle myself .\nr-67\t0\t*\ti dislike his tickling myself .\nr-67\t1\t\tfor anna to tickle him drives frank crazy .\nr-67\t1\t\tanna 's tickling him drove frank crazy .\nr-67\t1\t\tthey are investigating all people owning parakeets .\nr-67\t0\t*\tthe cages which we donated wire for the convicts to build with are strong .\nr-67\t0\t*\twhat kind cf parakeets are they investigating all people owning ?\nr-67\t1\t\tit appears to be true that harry likes girls .\nr-67\t1\t\tthis is the dog that chased the cat that caught the rat that ate the cheese .\nr-67\t0\t*\ther efficient looking of the answer up pleased the boss .\nr-67\t1\t\ther efficient looking up of the answer pleased the boss .\nr-67\t1\t\tshe did away with her father .\nr-67\t0\t*\tshe did with her father away .\nr-67\t1\t\tit is not true that that bob was lying was obvious .\nr-67\t1\t\ta proof was given that the claim that john had lied had been made .\nr-67\t0\t*\ta proof that the claim had been made was given that john had lied .\nr-67\t0\t*\tthat sam did n't pick those packages up is possible which are to be mailed tomorrow .\nr-67\t0\t?*\tthat sam did n't pick those packages which are to be mailed tomorrow up is possible .\nr-67\t0\t*\tit is possible that sam did n't pick those packages which are to be mailed tomorrow up .\nr-67\t1\t\tthat sam did n't pick those packages up which are to be mailed tomorrow is possible .\nr-67\t1\t\twhich packages is it possible that sam did n't pick up which are to be mailed tomorrow ?\nr-67\t1\t\tsam did n't pick those packages up which are to be mailed tomorrow until it had stopped raining .\nr-67\t1\t\tsam picked those packages up which are to be mailed tomorrow rest might , but he did n't want to do so until it had stopped raining .\nr-67\t0\t?*\tsam did n't pick those packages up until it had stopped raining which are to be mailed tomorrow 
.\nr-67\t1\t\twhich packages which are to be mailed tomorrow is it possible that sam did n't pick up until it had stopped raining ?\nr-67\t0\t*\twhich packages is it possible that sam did n't pick up which are to be mailed tomorrow until it had stopped raining ?\nr-67\t0\t*\twhich packages did n't sam pick up which are to be mailed tomorrow until it had stopped raining ?\nr-67\t1\t\ta girl came in who had worn this coat .\nr-67\t0\t*\tthe coat which a girl came in who had worn was torn .\nr-67\t1\t\tthat that for herschel to throw a fit would confuse the guards was obvious is not true .\nr-67\t1\t\tit is not true that that for herschel to throw a fit would confuse the guards was obvious .\nr-67\t1\t\tit is not true that it was obvious that for herschel to throw a fit would confuse the guards .\nr-67\t1\t\tthat that it would confuse the guards for herschel to throw a fit was obvious is not true .\nr-67\t1\t\tit is not true that that it would confuse the guards for herschel to throw a fit was obvious .\nr-67\t1\t\tthat it was obvious that it would confuse the guards for herschel to throw a fit is not true .\nr-67\t1\t\tit is not true that it was obvious that it would confuse the guards for herschel to throw a fit .\nr-67\t1\t\ta review of this article came out yesterday .\nr-67\t1\t\ta review came out yesterday of this article .\nr-67\t1\t\ta review seems to have come out yesterday of this article .\nr-67\t1\t\twhy do n't you pick some review up of this article ?\nr-67\t1\t\tann is going to send a picture of chairman mao to her teacher , as soon as she gets home .\nr-67\t1\t\tann is going to send a picture to her teacher of chairman mao , as soon as she gets home .\nr-67\t0\t*\twhich picture is ann going to send to her teacher of chairman mao as soon as she gets home ?\nr-67\t0\t*\twho is ann going to send a picture to her teacher of , as soon as she gets home ?\nr-67\t1\t\tthat a review came out yesterday of this article is catastrophic .\nr-67\t0\t*\tthat a review 
came out yesterday is catastrophic of this article .\nr-67\t1\t\ti 'll give some to my good friend from akron .\nr-67\t0\t*\ti 'll give to my good friend from akron some .\nr-67\t1\t\taround midnight i promised that he would be there .\nr-67\t1\t\ti promised that he would be there tomorrow .\nr-67\t1\t\ttomorrow i promised chat he would be there .\nr-67\t1\t\ti promised that tomorrow he would be there .\nr-67\t1\t\ti promised that around midnight he would be there .\nr-67\t1\t\tbeans i do n't like .\nr-67\t1\t\tproud of him i 've never been .\nr-67\t1\t\tbeans i do n't think you 'll be able to convince me harry has ever tasted in his life .\nr-67\t1\t\tit was tomorrow that i promised that he would be there .\nr-67\t1\t\tit is beans that i do n't like .\nr-67\t1\t\tdo you think that he sometimes went there alone ?\nr-67\t1\t\ti wo n't ask you to believe that he tried to force me to give her some money .\nr-67\t1\t\tthat he sometimes went there alone is certain .\nr-67\t1\t\tdo you believe that somebody was looking for something ?\nr-67\t1\t\ti never met that man who somebody tried to kill .\nr-67\t1\t\ti wo n't have any money .\nr-67\t0\t*\ti will ask you to believe that he tried to force me to give her any money .\nr-67\t1\t\tdo you think that he ever went there alone ?\nr-67\t1\t\tthat he ever went there alone is odd .\nr-67\t0\t\tthat he ever went there alone is certain .\nr-67\t0\t*\ti never met that man who anybody tried to kill .\nr-67\t1\t\ttom told somebody that he was n't sick .\nr-67\t1\t\tbuffy could n't do 100 push ups and somebody laughed .\nr-67\t0\t*\ttom told anybody that he was n't sick .\nr-67\t0\t*\tbuffy could n't do 100 push ups and anybody laughed .\nr-67\t1\t\ti believe that the sun was out .\nr-67\t1\t\ti believed that the sun was out .\nr-67\t1\t\ti believed that the sun is out .\nr-67\t0\t*\tthat the sun is out was obvious .\nr-67\t1\t\tthat i believed that the sun was out is obvious .\nr-67\t1\t\tthat i believed that the sun is out is 
obvious .\nr-67\t1\t\tthat jack sometimes slept is impossible .\nr-67\t1\t\tthat jack ever slept is impossible .\nr-67\t0\t*\tthat jack ever slept is possible .\nr-67\t0\t*\ti talked to winston about winston .\nr-67\t0\t*\ti talked to winston about him .\nr-67\t1\t\tthat the sun was out was obvious .\nr-67\t1\t\tjohn scratched his arm and mary did too .\nr-67\t1\t\tmary scratched her arm too .\nr-67\t1\t\tmary scratched john 's arm too .\nr-67\t1\t\tjohn scratched his arm and the boy who knew mary scratched her arm .\nr-67\t1\t\tjohn scratched his arm and the boy who mary knew did so too .\nr-67\t1\t\tthat the fuzz wanted him worried john but it did n't worry mary .\nr-67\t1\t\tthat the fuzz wanted him worried john , but that the fuzz wanted john did n't worry mary .\nr-67\t1\t\tthat the police wanted him worried johns but it did n't worry the boy who mary knew .\nr-67\t0\t*\tjohn is prouder of having gone than nobody expected me to believe he would be .\nr-67\t0\t*\tjohn is prouder of having gone than john did n't expect me to believe he would be .\nr-67\t0\t*\tjohn is prouder of having gone than john expected nobody to believe he would be .\nr-67\t0\t*\tjohn is prouder of having gone than john expected me not to believe he would be .\nr-67\t0\t*\tjohn is prouder of having gone than than john expected me to believe not all .\nr-67\t0\t*\tjohn is prouder of having gone than john expected me to believe that he was n't .\nr-67\t1\t\tjohn is prouder of having gone than people who do n't know him would expect me to believe he would be .\nr-67\t1\t\tjohn is prouder of having gone than sally expected joan to believe that the man who did n't shave would be .\nr-67\t1\t\tjohn is prouder of having gone than i expected you to believe he would be of not having fallen asleep .\nr-67\t1\t\ttom knows it and dick knows it and harry knows it .\nr-67\t1\t\ttom washed the car , and dick waxed the car .\nr-67\t1\t\ttom ordered bacon , and dick ordered lettuce , and harry ordered 
tomatoes .\nr-67\t1\t\ttom , dick , and harry know it .\nr-67\t1\t\ttom washed , and dick waxed , and harry polished the car .\nr-67\t1\t\ttom , dick , and harry ate , drank , and sang .\nr-67\t1\t\ttom ordered bacon , and dick lettuce , and harry tomatoes .\nr-67\t1\t\ttom ordered bacon , and dick ordered lettuce , and i think that harry ordered tomatoes .\nr-67\t0\t*\ttom ordered bacon , and dick lettuce , and i think that harry tomatoes .\nr-67\t1\t\tjoe is taller than mary is .\nr-67\t1\t\tjoe is taller than mary .\nr-67\t1\t\tjoe is taller than i think mary is .\nr-67\t0\t*\tjoe is taller than i think mary .\nr-67\t1\t\tmike will sing if you will sing .\nr-67\t1\t\tmike will sing if you will .\nr-67\t1\t\tjim will go if he feels good .\nr-67\t1\t\tif jim feels good , he will go .\nr-67\t1\t\ti gave the book to harvey because he asked me to .\nr-67\t1\t\tit never occurred to harvey that i might want to leave because he is insensitive to other people 's desires .\nr-67\t0\t*\tit never occurred to harvey because he is insensitive to other people 's desires that , i might want to leave .\nr-67\t1\t\t. 
i figured it out that she was lying .\nr-67\t1\t\ti explained it to bill that she was lying .\nr-67\t1\t\ti took it for granted that she was lying .\nr-67\t1\t\ti regret it exceedingly that she was lying .\nr-67\t1\t\the 'll bring me a hot dog if he sees one .\nr-67\t0\t*\the 'll bring me one if he sees a hot dog .\nr-67\t1\t\tif he sees a hot dog , he 'll bring me one .\nr-67\t1\t\tif he sees one ; he 'll bring me a hot dog .\nr-67\t0\t*\tseven more soldiers came in after ten ones had left .\nr-67\t0\t*\tseven more ones came in after ten soldiers had left .\nr-67\t0\t*\tafter ten soldiers had left , seven more ones came in .\nr-67\t0\t*\tafter ten ones had left , seven more soldiers came in .\nr-67\t1\t\tseven more soldiers came in after ten had left .\nr-67\t0\t*\tseven more came in after ten soldiers had left .\nr-67\t1\t\tafter ten had left , seven more soldiers came in .\nr-67\t1\t\tharry believes that sally is innocent , although no one else believes it .\nr-67\t1\t\talthough no one else believes that sally is innocent , harry believes it .\nr-67\t1\t\talthough no one else believes it , harry believes that sally is innocent .\nr-67\t1\t\twebster touched a sword after henry had done it .\nr-67\t1\t\tafter henry had touched a sword , webster did it .\nr-67\t1\t\tafter henry had done it , webster touched a sword .\nr-67\t1\t\tif so , i 've lost $ 500 .\nr-67\t0\t*\tif it , i 've lost $ 500 .\nr-67\t1\t\tharry thinks that sally is innocent , although no one else thinks so .\nr-67\t0\t*\tharry thinks so , although no one else thinks sally is innocent .\nr-67\t1\t\talthough no one else thinks that sally is innocent , harry thinks so .\nr-67\t1\t\talthough no one else thinks so , harry thinks that sally is innocent .\nr-67\t1\t\twebster touched a sword after henry had done so .\nr-67\t1\t\tafter henry had touched a sword , webster did so .\nr-67\t1\t\tafter henry had done so , webster touched a sword .\nr-67\t1\t\ti 'll work on it if i can work on it .\nr-67\t1\t\ti 
'll work on it if no one else has worked on it .\nr-67\t1\t\ti 'll work on it if you do .\nr-67\t1\t\ti 'll work on it if no one else had .\nr-67\t1\t\ti 'll work on it if same will be too .\nr-67\t0\t*\ti will if i can work on it .\nr-67\t1\t\tif i can work on it , i will .\nr-67\t1\t\tif i can , i will work on it .\nr-67\t1\t\tthe boy who mary loves hates her .\nr-67\t1\t\tthe man who ordered a hot dog got one .\nr-67\t1\t\ttom says that it 's going to rain but i do n't believe it .\nr-67\t1\t\the said he would leave and now he 's done it .\nr-67\t1\t\ti think that mort 's a swell guy , and lenny thinks so too .\nr-67\t1\t\twhy ca n't the man who usually cuts the grass do so today ?\nr-67\t1\t\tmickey and roger have signed , and whitey will tomorrow .\nr-67\t1\t\tronald scoffs at the belief that he would run if nominated .\nr-67\t1\t\tromeo conceded that he and juliet were going steady .\nr-67\t1\t\ti lost a japanese slide rule , and the fact that peter now has one i regard with suspicion .\nr-67\t1\t\tthe earth is flat , but will all those who do n't believe it please raise their hands ?\nr-67\t1\t\tpilots who can fly barrel rolls say that for me to try to do it in a glider would be hazardous .\nr-67\t1\t\tthe passengers who had known that the train was not on fire said that those who had thought so had barricaded themselves in the bathrooms .\nr-67\t1\t\tplaying with matches is ; lots of fun , but doing , so and emptying gasoline from one can to another at the same time is a sport best reserved for arsons .\nr-67\t1\t\tswimming is fun , and i believe that people who ca n't should be taught to .\nr-67\t1\t\thow brave he is !\nr-67\t1\t\thow surprisingly well he dances !\nr-67\t0\t*\twhether he left !\nr-67\t0\t*\twhy he knows the answer !\nr-67\t0\t*\twhich boy is tall !\nr-67\t1\t\thow brave everybody must think you expect me to believe he is !\nr-67\t0\t*\thow brave they must believe the claim that you are !\nr-67\t1\t\thow brave they must believe that you are 
!\nr-67\t0\t*\thow brave he is tall and !\nr-67\t0\t*\thow brave mike is cowardly and sam is !\nr-67\t1\t\thow he is brave !\nr-67\t1\t\tbill left when everyone will believe that the police have forced me to confess that i shot sandra .\nr-67\t0\t*\tbill left when i am looking at a girl who vomited .\nr-67\t1\t\tbill left when i believe the bomb had just exploded .\nr-67\t0\t*\tbill left when i believe the claim that the bomb had just exploded .\nr-67\t1\t\twhen i am awake and susan is asleep , bill will leave .\nr-67\t0\t*\twhen i am awake at that time and susan is asleep , bill will leave .\nr-67\t0\t*\tbill left when that no one else was awake is certain .\nr-67\t1\t\tbill left when it is certain that no one else was awake .\nr-67\t1\t\there 's a knife for you to cut up the onions with .\nr-67\t0\t*\ti brought a razor to shave himself with .\nr-67\t0\t*\ti brought a razor to shave myself with .\nr-67\t1\t\ti brought a razor with which to shave myself .\nr-67\t0\t*\ti brought a razor with which to shave himself .\nr-67\t1\t\ti brought john a razor to shave himself with .\nr-67\t0\t*\ti brought john a razor to shave myself with .\nr-67\t1\t\ti brought john a razor with which to shave himself .\nr-67\t0\t*\ti brought john a razor with which to shave myself .\nr-67\t0\t*\there 's a knife which for you to cut up the onions with .\nr-67\t1\t\there 's a plate for you to make bob try to begin to force his sister to leave the cookies on .\nr-67\t0\t*\there 's a knife for you to say was on the table .\nr-67\t0\t*\there 's a pole for you to kiss the girl who tied the string around .\nr-67\t0\t*\there 's a razor for you to chop up these nuts with this cleaver and .\nr-67\t0\t*\there 's a razor for that you will be shaved with to be announced .\nr-67\t0\t??\there 's a razor for it to be announced that you will be shaved with .\nr-67\t0\t*\ti loaned maggie a swiss army knife whose to open the padlock with corkscrew .\nr-67\t1\t\tfluffy is sick , which few people realize 
.\nr-67\t1\t\tfluffy is sick , which i 'm not sure you know sarah expects me to believe joan realizes .\nr-67\t0\t*\tfluffy is sick , which i slapped a boy who would n't acknowledge .\nr-67\t1\t\tfluffy is sick , which i believe that few people realize .\nr-67\t0\t*\tfluffy is sick , which i fell asleep and tom suddenly realized .\nr-67\t0\t*\tfluffy is sick , which that no one here realizes is certain .\nr-67\t1\t\tfluffy is sick , which it is certain that no one here realizes .\nr-67\t1\t\tfluffy is sick , which nobody knows .\nr-67\t0\t*\tfluffy is sick , as nobody knows .\nr-67\t1\t\tfluffy is sick , as not everybody knows .\nr-67\t0\t*\tfluffy is sick , as surprises me .\nr-67\t1\t\tit was this hat that tom said al thought you wanted me to make jack put on .\nr-67\t1\t\twhat tom said al thought you wanted me to make jack put on was this hat .\nr-67\t1\t\tthis hat tom said al thought you wanted me to make jack put on .\nr-67\t0\t*\tit is this hat that i know the boy who is wearing .\nr-67\t1\t\tit is this hat that i believe that he was wearing .\nr-67\t0\t*\twhat i know the boy who was wearing is this hat .\nr-67\t1\t\twhat i believe that he was wearing is this hat .\nr-67\t0\t*\tthis hat i know the boy who was wearing .\nr-67\t1\t\tthis hat i believe that he was wearing .\nr-67\t0\t*\twhat the gloves and were on the table was this hat .\nr-67\t0\t*\tthis hat the gloves and were on the table .\nr-67\t0\t*\tit is this hat that that he was wearing is certain .\nr-67\t1\t\tit is this hat that it is certain that he was wearing .\nr-67\t0\t*\twhat that he was wearing is certain is this hat .\nr-67\t1\t\twhat it is certain that he was wearing is this hat .\nr-67\t1\t\tthis hat it is certain that he was wearing .\nr-67\t0\t*\tit was john 's that i stole bike .\nr-67\t0\t*\tthe one whose i stole bike was john 's .\nr-67\t0\t*\tjohn 's i stole bike .\nr-67\t1\t\tmaxwell is n't the doctor that his father was .\nr-67\t1\t\tmaxwell is n't half the doctor that his father 
was .\nr-67\t1\t\tmaxwell is the man who won the nobel prize for astrology .\nr-67\t0\t*\tmaxwell is n't half the doctor .\nr-67\t1\t\tmaxwell is quite the doctor .\nr-67\t1\t\tmaxwell is n't much of a doctor .\nr-67\t1\t\tmaxwell is more of a doctor than his son is .\nr-67\t0\t*\tmaxwell is n't half the doctor that was here .\nr-67\t0\t*\tmaxwell is n't half the doctor that polished off the vodka .\nr-67\t0\t*\thalf the doctor that maxwell 's father was sat down .\nr-67\t1\t\tmaxwell is n't half the doctor that feared marge would realize tom had confessed that he knew bill expected him to be .\nr-67\t0\t*\tmaxwell is n't half the doctor that i know an african chief who is .\nr-67\t1\t\tmaxwell is n't half the doctor that people around here believe that his father was .\nr-67\t0\t*\tmaxwell is n't half the doctor that his sister is a psychologist and his father was .\nr-67\t0\t*\tmaxwell is n't half the doctor that that he would be if he studied is certain .\nr-67\t1\t\tmaxwell is n't half the doctor that i 'm certain that he would be if he studied .\nr-67\t1\t\the 's the happiest that i 've ever seen him .\nr-67\t1\t\tthe hardest that it ever snowed was last january 12th .\nr-67\t1\t\tthe hardest that i think i remember him ever telling me that he had heard of it snowing around here was last january 12th .\nr-67\t0\t*\the 's the happiest that we ever talked to the boy who had seen him .\nr-67\t1\t\the 's the happiest that i believe that he 's ever been .\nr-67\t0\t*\tthe hardest that i ever knew a man who said that it had snowed was last january 12th .\nr-67\t1\t\tthe hardest that i believe that it ever snowed was last january 12th .\nr-67\t0\t*\the 's the happiest that i 've ever seen him drunk and .\nr-67\t0\t*\tthe hardest that all the power lines were down and it snowed was last january 12th .\nr-67\t0\t*\the is the happiest that that he has ever been is believed .\nr-67\t1\t\the is the happiest that it is believed that he has ever been .\nr-67\t0\t*\tthe 
hardest that that it has snowed here is believed was last january 12th .\nr-67\t1\t\tthe hardest that it is believed that it has ever snowed here was last january 12th .\nr-67\t0\t*\ta friend of mine and a girl who was from his home town met in vienna who was working in europe .\nr-67\t0\t*\ta friend of mine who was working in europe and a girl met in vienna who was from his home town .\nr-67\t0\t*\tit and that he loved another was painfully evident that she loved him .\nr-67\t0\t*\tthat she loved him and it was painfully evident that he loved another .\nr-67\t1\t\tmary and an old friend who comes from miami kissed .\nr-67\t0\t*\tmary and kissed an old friend who comes from miami .\nr-67\t1\t\ti gave a picture of a covered bridge and a hundred hikers from hoboken to my sister .\nr-67\t0\t*\ti gave a picture of a covered bridge and to my sister a hundred hikers from hoboken .\nr-67\t0\t*\tjoan plays and sings folk songs a wonderful old guitar from spain .\nr-67\t1\t\tsally might be pregnant , and i know a girl who definitely is pregnant .\nr-67\t0\t?*\tsally might be , and i know a girl who definitely is pregnant .\nr-67\t1\t\tsally might be pregnant , and i believe the claim that sheila definitely is pregnant .\nr-67\t1\t\tsally might be pregnant , and i believe that sheila definitely is pregnant .\nr-67\t0\t?*\tsally is tall , and may be , and sheila is short , and definitely is , blond .\nr-67\t1\t\thank plays the guitar and finds arrangements for all the old folk songs which are still sung in these hills , and ernie writes down all the old folk songs which are still sung in these hills .\nr-67\t0\t??\thank plays the guitar and finds arrangements for , and ernie writes down , all the old folk songs which are still sung in these hills .\nr-67\t1\t\tthey said that tom would pay up and he will pay up .\nr-67\t1\t\tthey said that tom was working , and he is working .\nr-67\t1\t\tthey said that tom would pay up , and pay up he did .\nr-67\t1\t\tthey said that tom 
would pay up , and pay up he will .\nr-67\t1\t\tthey said that tom was working , and working he is .\nr-67\t1\t\tthey said tom would pay up , and pay up i 'm sure everybody will tell you that his lawyers expect me to believe he did .\nr-67\t1\t\tthey said nobody would pay up , but i know a boy who did pay up .\nr-67\t0\t*\tthey said nobody would pay up , but pay up i know a boy who did .\nr-67\t1\t\tthey said that tom would pay up , and pay up i believe that he did .\nr-67\t1\t\tthey said that tom would n't pay up , but he did go to the bank , and he did pay up .\nr-67\t1\t\tthey said that tom would n't pay up , but pay up he did go to the bank and he did .\nr-67\t0\t*\tthey said that tom would pay up , and pay up that he did is well-known .\nr-67\t1\t\tthey said that tom would pay up , and pay up it is well-known that he did .\nr-67\t1\t\talthough dick is handsome , i 'm still going marry herman .\nr-67\t1\t\thandsome though dick is , i 'm still going to marry herman .\nr-67\t1\t\thandsome though everyone expects me to try to force bill to make mom agree that dick is , i 'm still going to marry herman .\nr-67\t0\t*\thandsome though i know several boys who are , i 'm still going to marry herman .\nr-67\t1\t\thandsome though i believe that dick is , i 'm still going to marry herman .\nr-67\t0\t*\thandsome though dick is fair , nordic , strong and , i 'm still going to marry herman .\nr-67\t0\t*\thandsome though that dick will be is likely , i 'm still going to marry herman .\nr-67\t1\t\tthe more contented we pretended to be , the more we grew angry at the doctors .\nr-67\t0\t*\tthe more contented i laughed at the nurse who thought that we were becoming , the more angry we grew at the doctors .\nr-67\t0\t??\tthe more contented the nurses began to believe that we were going to pretend to be , the more angry we grew at the doctors .\nr-67\t0\t*\tthe more contented we pretended to be better fed and , the more angry we grew at the doctors .\nr-67\t0\t*\tthe more 
contented for us to pretend to be became possible , the more angry we grew at the doctors .\nr-67\t1\t\ti have some papers to grade .\nr-67\t0\t?*\ti have some papers to announce that i 've got to grade .\nr-67\t1\t\ti have some papers to try to finish grading .\nr-67\t1\t\ti have getting into college to consider .\nr-67\t0\t*\ti have some papers to grade these exams and .\nr-67\t0\t*\ti have some voice exercises to play the guitar and sing .\nr-67\t0\t*\ti have john 's to grade paper .\nr-67\t1\t\twilt is taller than i imagine anybody would ever guess that people had begun expecting red to announce that he was .\nr-67\t1\t\tthe sofa was longer than the desk was .\nr-67\t1\t\tthe sofa was longer than the desk was long .\nr-67\t0\t*\twilly is taller than i know a boy who is .\nr-67\t1\t\twilt is taller than i believe that bill is .\nr-67\t0\t*\twilly is taller than bill is strong and .\nr-67\t0\t*\tdean drank more booze than frank ate wheaties and sammy drank .\nr-67\t0\t*\twilly is taller than that bill is is generally believed .\nr-67\t1\t\twilt is taller than it is generally believed that bill is .\nr-67\t1\t\twilly is taller than bill by 7 millimeters .\nr-67\t1\t\tthe raise which scrooge generously gave tom 's father increased his yearly salary by five cents .\nr-67\t1\t\tthe hare outran the tortoise by so much that he forgot the latter was even in the race any more .\nr-67\t1\t\twho knew mickey would overthrow home plate by that much ?\nr-67\t1\t\twilly is taller than bill by that much .\nr-67\t1\t\tjohn is taller than bill by that much .\nr-67\t1\t\twilly is taller than bill by as much as joe is taller than the dan .\nr-67\t1\t\twilly is taller than bill by more than joe is taller than dan .\nr-67\t1\t\twilly is taller than bill by as much as everybody seems to expect me to admit to having publicly proclaimed that i believed .\nr-67\t0\t*\twilly is taller than bill by as much as i know a boy who thinks that bill is taller than dan .\nr-67\t1\t\twilly is 
taller than bill by as much a peter believes that billy is taller than dan .\nr-67\t0\t*\twilly is taller than bill by as much as i watch all the games and i know billy is taller than dan .\nr-67\t0\t*\twilly is taller than bill by as much as that bill is taller than dan is believed .\nr-67\t1\t\twilly is taller than bill by as much as it is believed that joe is taller than dan .\nr-67\t1\t\tthe rock was too heavy for me to pick up .\nr-67\t1\t\tthis rock is too heavy for me to begin to decide about helping bob to try to pick it up .\nr-67\t0\t??\tthis rock is too heavy for me to begin to decide about helping bob to try to pick up .\nr-67\t0\t*\tthis rock is too heavy for us to try to claim that we picked up .\nr-67\t1\t\tsodium is a little too peppy for me to want to try mixing it and water in a teacup .\nr-67\t0\t*\tsodium is a little too peppy for me to want to try mixing and water in a teacup .\nr-67\t0\t*\tthat piece of ice is too big for for him to be able to pick up with a teaspoon to be likely .\nr-67\t0\t??\tthat piece of ice is too big for it to be likely for him to be able to pick up with a teaspoon .\nr-67\t1\t\tbob is too thin for me to be able to squeeze into his jacket .\nr-67\t0\t*\tbob is too thin for me to be able to squeeze into jacket .\nr-67\t1\t\tthis rock is light enough for marcia to pick it up .\nr-67\t1\t\tthis rock is light enough for marcia to pick up .\nr-67\t1\t\tthe socks are ready for you to put on .\nr-67\t1\t\tthe socks are ready for you to go about beginning to put them on .\nr-67\t1\t\tthe socks are ready for you to announce that you will put them on .\nr-67\t0\t*\tthe socks are ready for you to announce that you will put on .\nr-67\t1\t\tthe socks are ready for you to try them and the shoes on .\nr-67\t0\t*\tthe socks are ready for you to try and the shoes on .\nr-67\t1\t\tjohn is ready for you to inspect his bunk .\nr-67\t0\t*\tjohn is ready for you to inspect bunk .\nr-67\t0\t*\tthe socks are ready for it to be planned for you 
to put on .\nr-67\t1\t\tit is tough to play sonatas on this violin .\nr-67\t1\t\tsonatas are difficult to play on this violin .\nr-67\t1\t\tsonatas are easy to play on this violin .\nr-67\t1\t\tsonatas are tough to play on this violin .\nr-67\t1\t\tthis violin is easy to play sonatas on .\nr-67\t1\t\tthis violin is tough to play sonatas on .\nr-67\t1\t\ti made john easy to get along with .\nr-67\t1\t\ti made it easy to get along with john .\nr-67\t1\t\tjohn tries to be easy to get along with .\nr-67\t0\t*\tjohn tried bill to play checkers .\nr-67\t0\t*\tjohn tried for bill to play checkers .\nr-67\t0\t*\tbill would be easy for for you to chat with in moscow to become expensive .\nr-67\t0\t*\tbill would be easy for it to become expensive for you to chat with in moscow .\nr-67\t1\t\tmy father , he 's armenian , and my mother , she 's greek .\nr-67\t0\t*\tif my father , he comes home late , my mother always grills him .\nr-67\t0\t*\tit started to rain after jackie and me , we had finally gotten to our seats .\nr-67\t0\t?*\ti acknowledged that my father , he was tight as an owl .\nr-67\t1\t\ti said that my father , he was tight as an owl .\nr-67\t0\t*\tthat beans he likes is now obvious .\nr-67\t0\t*\ti 'm going to write to the game warden if more than one deer my neighbor brings back .\nr-67\t0\t*\ti do n't know the boy who the flowers mary gave to .\nr-67\t0\t*\ti do n't know the boy the flowers who mary gave to .\nr-67\t0\t*\tthat informers they never use is claimed by the cia .\nr-67\t1\t\tmy father , i hardly ever see him and my mother when they 're not glaring at each other .\nr-67\t1\t\tthis guitar , i 've sung folk songs and accompanied myself on it all my life .\nr-67\t1\t\tmy father , that he 's lived here all his life is well-known to the cops .\nr-67\t1\t\tmy wife , somebody stole her handbag last night .\nr-67\t1\t\tthey spoke to the janitor about that robbery yesterday , the cops .\nr-67\t1\t\tthe cops spoke to him about that robbery yesterday , the 
janitor .\nr-67\t1\t\tthe cops spoke to the janitor about it yesterday , that robbery .\nr-67\t1\t\tthat they spoke to the janitor about that robbery yesterday , the cops , is terrible .\nr-67\t1\t\tthat the cops spoke to the janitor about it yesterday , that robbery , is terrible .\nr-67\t0\t?*\tthat they spoke to the janitor about that robbery yesterday is terrible , the cops .\nr-67\t0\t*\tthey let him go yesterday , he .\nr-67\t0\t*\tthey let him go yesterday , him .\nr-67\t0\t*\ti like beer , i .\nr-67\t0\t?*\ti like beer , me .\nr-67\t0\t*\twe 'll go together , us .\nr-67\t0\t*\tthey ca n't stand each other , they .\nr-67\t0\t*\tthey ca n't stand each other , them .\nr-67\t1\t\twe 'll do it together , you and i .\nr-67\t1\t\twe 'll do it together , you and me .\nr-67\t1\t\tthey ca n't stand each other , he and she .\nr-67\t1\t\tthey ca n't stand each other , him and her .\nr-67\t0\t*\the , they let him go yesterday .\nr-67\t1\t\thim , they let him go yesterday .\nr-67\t0\t*\ti , i like beer .\nr-67\t1\t\tme , i like beer .\nr-67\t0\t*\twe , we 'll go together .\nr-67\t1\t\tus , we 'll go together .\nr-67\t0\t*\tthey , they ca n't stand each other .\nr-67\t1\t\tthem , they ca n't stand each other .\nr-67\t0\t*\ti saw mary and downtown yesterday your friend from boston .\nr-67\t1\t\ti saw mary and him downtown yesterday , your friend from boston .\nr-67\t0\t*\ti noticed car in the driveway last night your friend from boston .\nr-67\t1\t\ti noticed his car in the driveway last night , your friend from boston .\nr-67\t0\t*\ti spoke to about the war yesterday that guy who 's always following us .\nr-67\t1\t\ti spoke to him about the war yesterday , that guy who 's always following us .\nr-67\t1\t\ti just saw that girl who long john 's claim that he was a martian made all the headlines .\nr-67\t1\t\tall the students who the papers which they submitted were lousy i 'm not going to allow to register next term .\nr-67\t1\t\tdid n't that guy who the game warden and him 
had seen a flying saucer crack up ?\nr-67\t1\t\tpalmer is a guy who for for him to stay in school would be stupid .\nr-67\t1\t\tking kong is a movie which you 'll laugh yourself sick if you see it .\nr-67\t1\t\tenrico , who is the smartest of us all , got the answer in seven seconds .\nr-67\t1\t\tenrico , and he is the smartest of us all , got the answer in seven seconds .\nr-67\t0\t*\tany student , who wears socks , is a swinger .\nr-67\t0\t*\tno student , who wears socks , is a swinger .\nr-67\t0\t*\tevery student , who wears socks , is a swinger .\nr-67\t0\t*\tany student , and he wears socks , is a swinger .\nr-67\t0\t*\tno student , and he wears socks , is a swinger .\nr-67\t1\t\tis even clarence , who is wearing mauve socks , a swinger ?\nr-67\t1\t\tseven pine trees are behind that barn .\nr-67\t1\t\tthere are seven pine trees behind that barn .\nr-67\t1\t\tthat barn has seven pine trees behind it .\nr-67\t1\t\tthere will be a hole in jack 's pocket .\nr-67\t0\t*\tthere will be the hole in jack 's pocket .\nr-67\t1\t\tjack will have a hole in his pocket .\nr-67\t0\t*\tthat barn has seven pine trees behind itself .\nr-67\t0\t*\tthat barn has seven pine trees behind the cow .\nr-67\t1\t\tjack 's pocket will have a hole in it .\nr-67\t0\t??\tthere is a hole in john 's quilt 's upper right-hand corner .\nr-67\t0\t??\tjohn 's quilt 's upper right-hand corner has a hole .\nr-67\t1\t\tjohn 's quilt has a hole in its upper right-hand corner .\nr-67\t0\t??\tjohn has a hole in his quilt 's upper right-hand corner .\nr-67\t1\t\tjohn has a hole in the upper right-hand corner of his quilt .\nr-67\t1\t\tthere are seven holes in the door and window .\nr-67\t0\t*\tthe door has seven holes in it and the window .\nr-67\t1\t\tthere is a blemish on the end of jerry 's sister 's nose .\nr-67\t0\t*\tjerry has a blemish on the end of his sister 's nose .\nr-67\t1\t\tjerry 's sister has a blemish on the end of her nose .\nr-67\t1\t\tthere is a hole in the rug which toby bought in 
boston .\nr-67\t1\t\tthere was an error in the proof johns presented .\nr-67\t1\t\tthere was a snake behind the car fred was sitting in .\nr-67\t1\t\tjohn had an error in the proof he presented .\nr-67\t0\t*\tjohn had an error in the proof sarah presented .\nr-67\t1\t\tfred had a snake behind the car joe was sitting in .\nr-67\t1\t\tfred had a snake behind the car he was sitting in .\nr-67\t1\t\tthere was a yellow collar on the dog which the car injured .\nr-67\t1\t\tthere was a snake behind the car the time bomb was sitting in .\nr-67\t0\t*\tthe car had a yellow collar on the dog which it injured .\nr-67\t0\t*\tthat stone has a hole in the tarpaulin which it is holding down .\nr-67\t0\t*\tthe time bomb had a snake behind the car which it was sitting in .\nr-67\t1\t\tthere were several hundred people yelling for me to put down gently .\nr-67\t1\t\tthe hot potato which there were several hundred people yelling for me to put down gently turned out to have been filled with tnt .\nr-67\t0\t*\tthe hot potato had several hundred people yelling for me to put it down gently .\nr-67\t1\t\tbartlett and toni danced .\nr-67\t1\t\tbartlett danced with toni .\nr-67\t0\t*\tbartlett and danced toni .\nr-67\t0\t*\tand toni danced bartlett .\nr-67\t1\t\tit bothers me for her to wear that old fedora .\nr-67\t0\t*\tthe only girl for whom it bothers me to wear that old fedora is annabelle .\nr-67\t0\t*\tthe only girl who it bothers me to wear that old fedora is annabelle .\nr-67\t1\t\ti would prefer it for there to be no talking .\nr-67\t1\t\the gave my binoculars to that girl .\nr-67\t1\t\the gave that girl my binoculars .\nr-67\t1\t\twhich girl did he give my binoculars to ?\nr-67\t0\t*\twhich girl did he give my binoculars ?\nr-67\t1\t\tmy binoculars were given to that girl by him .\nr-67\t1\t\tbill confirmed that roger has eaten the cake .\nr-67\t1\t\tbill alleged that roger had eaten the cake .\nr-67\t1\t\tbill alleged that roger has eaten the cake .\nr-67\t0\t??\twhat did bill 
confirm that roger had eaten ?\nr-67\t1\t\twhat did bill allege that roger had eaten ?\nr-67\t0\t?*\tbill did n't confirm that roger had eaten anything .\nr-67\t0\t*\twaldo did n't report the possibility that anyone had left .\nr-67\t1\t\twaldo did n't report that anyone had left .\nr-67\t1\t\tanybody who ever swears at me better watch his step .\nr-67\t1\t\ti want all the students who have ever tried to pat fido to show me their scars .\nr-67\t0\t*\tonly the travelers who anybody has ever robbed do n't carry machetes .\nr-67\t1\t\tthe only travelers who anybody has ever robbed do n't carry machetes .\nr-67\t0\t*\ti ca n't remember the name of somebody who had any misgivings .\nr-67\t1\t\ti ca n't remember the name of anybody who had any misgivings .\nr-67\t1\t\teverybody who has ever , worked in any office which contained any typewriter which had ever been used to type any letters which had to be signed by any administrator who ever worked in any department like mine will know what i mean .\nr-67\t1\t\tno student who ever goes to europe ever has enough money .\nr-67\t0\t*\tevery student who ever goes to europe ever has enough money .\nr-67\t0\t*\ti did n't eat the ice cream and any cake .\nr-67\t1\t\ti realized that it had rained and some crops had been destroyed .\nr-67\t1\t\ti did n't realize that it had rained and some crops had been destroyed .\nr-67\t1\t\ti did n't eat any ice cream or any cake .\nr-67\t0\t*\ti did n't eat any ice cream and any cake .\nr-67\t0\t?*\ti did n't eat the cake or any ice cream .\nr-67\t0\t*\ti did n't eat any ice cream or mary 's cake .\nr-67\t0\t*\ti did n't eat any ice cream or the cake .\nr-67\t1\t\tjohn and mary met in vienna .\nr-67\t1\t\tjohn met mary in vienna .\nr-67\t0\t*\tfew writers and any playwrights meet in vienna .\nr-67\t1\t\tfew writers meet any playwrights in vienna .\nr-67\t0\t*\tmy brother and few americans meet in vienna .\nr-67\t1\t\tmy brother meets few americans in vienna .\nr-67\t1\t\tno writer , and no 
playwright , speaks clearly .\nr-67\t1\t\tno writer , nor any playwright , speaks clearly .\nr-67\t0\t*\tbill understands mary and himself .\nr-67\t0\t?*\tbill understands himself and mary .\nr-67\t0\t*\tbill and mary washed himself .\nr-67\t0\t*\tandy pinched sarah and tickled herself .\nr-67\t0\t*\tthe gun and a description of itself lay on the bureau .\nr-67\t1\t\tbill believes that anna and he are similar .\nr-67\t1\t\tbill believes anna and him to be similar .\nr-67\t0\t*\tbill believes anna and himself to be similar .\nr-67\t0\t*\ti deny that that bob has any money is certain .\nr-67\t1\t\ti deny that it is certain that bob has any money .\nr-67\t0\t??\ti deny that that bob has some money is certain .\nr-67\t1\t\ttom will not force you to marry any student .\nr-67\t1\t\ttom will force you to marry no student .\nr-67\t0\t*\tthe writers of any of the reports did n't know the answer .\nr-67\t1\t\tthe writers of none of the reports .\nr-67\t1\t\ttom will force you to marry no student , and neither will i .\nr-67\t1\t\tit is not certain that you 'll marry any student .\nr-67\t1\t\tit is not certain that you 'll marry any particular student .\nr-67\t1\t\tit is certain that you 'll marry no student .\nr-67\t1\t\tthat you will marry any particular student is not certain .\nr-67\t1\t\tthe man who i gave john a picture of was bald .\nr-67\t0\t??\tthe man who i gave john this picture of was bald .\nr-67\t0\t*\tthe man who i gave john ed 's picture of was bald .\nr-67\t1\t\ti gave jack a picture of myself .\nr-67\t0\t*\ti gave jack ed 's picture of myself .\nr-67\t1\t\ti did n't give jack a picture of anybody .\nr-67\t0\t*\ti did n't give jack this picture of anybody .\nr-67\t1\t\ti hope i 'm not treading on anyone 's toes .\nr-67\t1\t\tabernathy admitted that the poison pen letter had been written by my sister and him .\nr-67\t1\t\tabernathy admitted that the poison pen letter had been written by my sister and himself .\nr-67\t1\t\tthat the sun was out is obvious 
.\nr-67\t1\t\tthat anybody ever left at all is not known .\nr-67\t1\t\tthat anybody ever left at all is not certain .\nr-67\t1\t\tthat anybody ever left at all is impossible .\nr-67\t1\t\tthat anybody ever left at all is surprises me .\nr-67\t1\t\ttonight , what bob cooked yesterday still tastes good .\nr-67\t1\t\ttonight , what bob cooked yesterday still tastes good , so tonight , what bob cooked yesterday will be eaten up .\nr-67\t1\t\ttonight , what bob cooked yesterday still tastes good , so tonight it will be eaten up .\nrhl07\t1\t\tmartha gave myrna an apple .\nrhl07\t1\t\tleigh threw the ball to lane .\nrhl07\t1\t\tleigh threw lane the ball .\nrhl07\t1\t\tthe noise gave terry a headache .\nrhl07\t0\t*\tthe noise gave a headache to terry .\nrhl07\t1\t\tjill threw the ball from home plate to third base .\nrhl07\t1\t\tjill kicked the ball from home plate to third base .\nrhl07\t1\t\ti sent the bicycle from my house at the beach to my house in the mountains .\nrhl07\t1\t\ti shipped the bicycle from my house at the beach to my house in the mountains .\nrhl07\t1\t\tfred threw the ball under the porch .\nrhl07\t1\t\tfred threw the ball behind the tree .\nrhl07\t1\t\tfred threw the ball over the fence .\nrhl07\t1\t\tfred kicked the ball under the porch .\nrhl07\t1\t\tfred kicked the ball behind the tree .\nrhl07\t1\t\tfred kicked the ball over the fence .\nrhl07\t1\t\tfelicia threw the ball off the bench .\nrhl07\t1\t\tfelicia threw the ball out the window .\nrhl07\t1\t\tfelicia kicked the ball out the window .\nrhl07\t0\t*\tfelicia sent the box off the shelf .\nrhl07\t0\t*\tfelicia sent the box out of the storeroom .\nrhl07\t0\t*\tfelicia shipped the box off the shelf .\nrhl07\t0\t*\tfelicia shipped the box out of the storeroom .\nrhl07\t0\t*\tjake sent the box at carson .\nrhl07\t0\t*\tjake sent the box towards carson .\nrhl07\t0\t*\tjake shipped the box at carson .\nrhl07\t0\t*\tjake shipped the box towards carson .\nrhl07\t1\t\tanne is curious as to why her 
father sent her a telegram to america to return home at once .\nrhl07\t0\t*\twhere did you give the ball ?\nrhl07\t1\t\twhere did you throw the ball ? to third base .\nrhl07\t1\t\twhere did you send the bicycle ? to rome .\nrhl07\t1\t\ti gave the package to maria .\nrhl07\t0\t*\ti gave the package to london .\nrhl07\t1\t\ti sent the package to london .\nrhl07\t1\t\ti threw the ball to maria .\nrhl07\t1\t\ti threw the ball to the other side of the field .\nrhl07\t0\t*\tsusan gave the ball halfway to bill .\nrhl07\t0\t*\tsusan gave the ball all the way to bill .\nrhl07\t1\t\tjake threw the ball all the way to bill .\nrhl07\t1\t\tjake threw the ball halfway to bill .\nrhl07\t1\t\tjake kicked the ball all the way to bill .\nrhl07\t1\t\tjake kicked the ball halfway to bill .\nrhl07\t1\t\ti sent the package all the way around the world .\nrhl07\t1\t\ti sent the package to the antarctic .\nrhl07\t1\t\ti shipped the package halfway around the world .\nrhl07\t1\t\ti shipped the package all the way around the world .\nrhl07\t1\t\ti shipped the package halfway to the antarctic .\nrhl07\t0\t*\tfred gave the ball under molly .\nrhl07\t0\t*\tfred gave the ball behind molly .\nrhl07\t0\t*\tfred gave the ball over molly .\nrhl07\t0\t*\tfred offered the ball under molly .\nrhl07\t0\t*\tfred offered the ball over molly .\nrhl07\t0\t*\tsam gave the ball off the shelf .\nrhl07\t0\t*\tsam offered the ball off the shelf .\nrhl07\t0\t*\tjill gave the ball at bob .\nrhl07\t0\t*\tjill gave the ball towards bob .\nrhl07\t0\t*\tjill offered the ball at bob .\nrhl07\t0\t*\tjill offered the ball towards bob .\nrhl07\t1\t\tgive a fresh coat of paint to the front door .\nrhl07\t1\t\tone of the jewish children is a spunky girl , who gave a black eye to the kid with the german roots before the start of the war .\nrhl07\t1\t\tthe door has a fresh coat of paint .\nrhl07\t1\t\tthe spunky girl has a black eye .\nrhl07\t1\t\ti promise a good time to all who come .\nrhl07\t1\t\tall who come will have a 
good time .\nrhl07\t1\t\the died from exhaustion .\nrhl07\t1\t\tthe water melted into ice .\nrhl07\t1\t\tthe water melted to ice .\nrhl07\t1\t\ta hefty sum of money came to him from his grandfather .\nrhl07\t1\t\tthe close brush with the law put the fear of god in him .\nrhl07\t1\t\tshe fell in love .\nrhl07\t1\t\tshe fell into a sulk .\nrhl07\t1\t\tshe fell into a funk .\nrhl07\t1\t\tto whom did you give the ball ?\nrhl07\t1\t\tto whom did you throw the ball ?\nrhl07\t1\t\twhere did you throw the ball ?\nrhl07\t1\t\tto whom did you send the package ?\nrhl07\t1\t\twhere did you send the package ?\nrhl07\t1\t\tsmith threw the ball to the first baseman .\nrhl07\t1\t\tsmith threw the first baseman the ball .\nrhl07\t0\t*\tsmith threw the first base the ball .\nrhl07\t1\t\tsmith envied jones his good fortune .\nrhl07\t0\t*\tsmith envied his good fortune to jones .\nrhl07\t1\t\tno one can forgive you that comment .\nrhl07\t1\t\tthe recession cost my grandfather a raise .\nrhl07\t1\t\tmary taught john linguistics .\nrhl07\t1\t\tmary taught linguistics to john .\nrhl07\t1\t\ti threw the ball to julian , but it fell short of him .\nrhl07\t1\t\tmax offered the victims help , but they refused his offer .\nrhl07\t1\t\tmax offered help to the victims , but they refused his offer .\nrhl07\t1\t\tsarah promised her old car to catherine , but then gave it to her son instead .\nrhl07\t1\t\ti taught them english for an entire year , but they do n't seem to have learned .\nrhl07\t1\t\ti read him the figures , but when i looked up , he was gone .\nrhl07\t1\t\ti throw you a lifeline and you giggle .\nrhl07\t1\t\ti kicked him the ball , but the wind blew it astray .\nrhl07\t1\t\ti threw mary the ball , but she was looking at the birds flying overhead and did n't even notice .\nrhl07\t1\t\ti threw the ball to mary , but she was looking at the birds flying overhead and did n't even notice .\nrhl07\t1\t\tlewis shipped sam a bicycle , but it never arrived .\nrhl07\t1\t\tlewis sent sam a 
bicycle , but it never arrived .\nrhl07\t1\t\tthe police read the detainees ' rights to them , but not a single one was paying attention .\nrhl07\t1\t\ti wrote a letter to blair , but i tore it up before i sent it .\nrhl07\t1\t\tthe police read the detainees their rights , but not a single one was paying attention .\nrhl07\t1\t\ti wrote blair a letter , but i tore it up before i sent it .\nrhl07\t1\t\tann copied the manuscript , but she did n't finish it .\nrhl07\t1\t\talex read the newspaper for an hour .\nrhl07\t1\t\talex read the newspaper in an hour .\nrhl07\t0\t*\ti lent the book halfway to tony .\nrhl07\t0\t*\ti lent the book all the way to tony .\nrhl07\t0\t*\ti lent the book most of the way to tony .\nrhl07\t0\t*\ti lent tony the book partway .\nrhl07\t0\t*\ti lent tony the book halfway .\nrhl07\t0\t*\ti lent tony the book all the way .\nrhl07\t0\t*\ti lent tony the book most of the way .\nrhl07\t0\t*\trobin arrived partway at the station .\nrhl07\t0\t*\trobin arrived all the way at the station .\nrhl07\t0\t*\trobin arrived most of the way at the station .\nrhl07\t0\t*\tthe old dog died partway .\nrhl07\t0\t*\tthe old dog died halfway .\nrhl07\t0\t*\tthe old dog died all the way .\nrhl07\t1\t\tsandy taught the children the alphabet , but only got as far as the letter `` r '' .\nrhl07\t1\t\tmaxine read the children goodnight moon , but they fell asleep before she got to the end .\nrhl07\t1\t\tinterviewing richard nixon gave norman mailer a book .\nrhl07\t1\t\tnixon 's behavior gave mailer an idea for a book .\nrhl07\t1\t\tnixon 's behavior gave an idea for a book to every journalist living in new york city in the 1970s .\nrhl07\t1\t\twe gave a fresh coat of paint to the house .\nrhl07\t1\t\tthe five `` soundscape '' pieces gave a festive air to park square .\nrhl07\t1\t\tgordie gillespie still can give a piece of his mind to the umps .\nrhl07\t1\t\ti sent the salesman to the devil .\nrhl07\t0\t*\ti sent the devil the salesman .\nrhl07\t1\t\tnixon 's behavior 
gave an idea for a book to every journalist living in new york .\nrhl07\t1\t\tthe music lent a festive air to the party .\nrhl07\t1\t\tit is very difficult to get an idea for a book simply from an interview .\nrhl07\t1\t\tit is unreadable , guaranteed to give a headache to anyone who looks hard at the small print .\nrhl07\t1\t\t`` doing my taxes '' gives a headache to 22 percent of americans surveyed for pfizer , which makes tylenol pain relief medicine .\nrhl07\t1\t\tlopez says that he has done more than simply give a fresh coat of paint to the site .\nrhl07\t1\t\ti think it 's time you give your lovely illness to someone else !\nl-93\t1\t\tsharon sprayed the plants with water .\nl-93\t1\t\tthe farmer loaded apples into the cart .\nl-93\t0\t*\tmonica covered a blanket over the baby .\nl-93\t1\t\tmonica covered the baby with a blanket .\nl-93\t1\t\tcarla poured lemonade into the pitcher .\nl-93\t0\t*\tcarla poured the pitcher with lemonade .\nl-93\t1\t\tthe farmer dumped apples into the cart .\nl-93\t1\t\tthe window broke .\nl-93\t1\t\tthe little boy broke the window .\nl-93\t1\t\ta rabbit appeared out of the magician 's hat .\nl-93\t0\t*\tthe magician appeared a rabbit out of his hat .\nl-93\t1\t\tmartha carved a toy out of wood for the baby .\nl-93\t1\t\tmartha carved some wood into a toy for the baby .\nl-93\t0\t*\tmartha carved the baby some wood into a toy .\nl-93\t1\t\tmargaret cut the bread .\nl-93\t1\t\tjanet broke the vase .\nl-93\t1\t\tterry touched the cat .\nl-93\t1\t\tcarla hit the door .\nl-93\t1\t\tcrystal vases break easily .\nl-93\t0\t*\tcats touch easily .\nl-93\t0\t*\tdoor frames hit easily .\nl-93\t1\t\tmargaret cut at the bread .\nl-93\t0\t*\tjanet broke at the vase .\nl-93\t0\t*\tterry touched at the cat .\nl-93\t1\t\tcarla hit at the door .\nl-93\t1\t\tmargaret cut bill 's arm .\nl-93\t1\t\tmargaret cut bill on the arm .\nl-93\t1\t\tjanet broke bill 's finger .\nl-93\t1\t\tterry touched bill 's shoulder .\nl-93\t1\t\tterry touched bill on the 
shoulder .
l-93	1		carla hit bill 's back .
l-93	1		carla hit bill on the back .
l-93	1		jean moved the table .
l-93	0	*	jean moved at the table .
l-93	1		margaret cut the string .
l-93	0	*	the string cut .
l-93	0	*	the cat touched .
l-93	0	*	the door hit .
l-93	1		a the butcher cuts the meat .
l-93	1		the meat cuts easily .
l-93	1		janet broke the crystal .
l-93	1		crystal breaks at the slightest touch .
l-93	1		kelly adores french fabrics .
l-93	0	*	french fabrics adore easily .
l-93	1		joan knew the answer .
l-93	0	*	the answer knows easily .
l-93	1		bill pounded the metal .
l-93	1		bill pounded the metal fiat .
l-93	1		this metal wo n't pound flat .
l-93	1		the cup broke .
l-93	1		they gave the bicycle to me .
l-93	0	*	the bicycle gave to me .
l-93	0	*	the bread cut .
l-93	0	*	the magician appeared a dove from his sleeve .
l-93	1		sylvia jumped the horse over the fence .
l-93	1		the horse jumped over the fence .
l-93	1		the scientist ran the rats through the maze .
l-93	1		the rats ran through the maze .
l-93	1		the bell rang .
l-93	1		they stood the statue on the pedestal .
l-93	1		the statue stood on the pedestal .
l-93	1		the army lodged the soldiers in the schoolhouse .
l-93	1		heat radiates from the sun .
l-93	1		the sun radiates heat .
l-93	1		the departing passenger waved at the crowd .
l-93	0	*	jennifer craned .
l-93	1		i shaved my face .
l-93	1		i shaved .
l-93	1		celia braided her hair .
l-93	0	*	celia braided .
l-93	0	*	tessa sprained .
l-93	1		jill dressed hurriedly .
l-93	0	*	i shaved myself .
l-93	0	*	celia brushed .
l-93	1		tessa cut herself .
l-93	0	*	tessa cut .
l-93	1		we loaded ourselves onto the bus .
l-93	1		we loaded onto the bus .
l-93	1		we pulled ourselves free .
l-93	1		anne met cathy .
l-93	1		anne and cathy met .
l-93	0	*	brenda chatted molly .
l-93	1		brenda and molly chatted .
l-93	1		the drunk hugged the lamppost .
l-93	0	*	the drunk and the lamppost hugged .
l-93	1		italy touches france .
l-93	1		italy and france touch .
l-93	0	*	ellen argued helen .
l-93	1		ellen and helen argued .
l-93	1		the sign warned us against skating on the pond .
l-93	1		the sign warned against skating on the pond .
l-93	1		for discussion of the same phenomenon in russian .
l-93	1		that dog bites people .
l-93	1		that dog bites .
l-93	1		i cut the bread with this knife .
l-93	1		this knife cut the bread .
l-93	1		this knife does n't cut .
l-93	1		these shears clip well .
l-93	1		this machine records well .
l-93	1		this oven cooks well .
l-93	1		this lotion softens , soothes , and protects .
l-93	1		this polish cleans , protects , and shines .
l-93	0	*	this key wo n't open .
l-93	1		this key wo n't open the jock .
l-93	0	*	this hammer wo n't break .
l-93	1		this hammer wo n't break the window .
l-93	1		they pushed their way through the crowd .
l-93	1		they pushed through the crowd .
l-93	1		bake for 30 minutes .
l-93	0	*	like the ice cream .
l-93	0	*	like after tasting .
l-93	1		paula hit the fence .
l-93	1		paula hit at the fence .
l-93	1		faustina sprayed the lilies .
l-93	0	*	janet broke at the bread .
l-93	1		i pushed the table .
l-93	1		i pushed at the table .
l-93	1		i pushed on the table .
l-93	1		i pushed against the table .
l-93	1		the mouse nibbled the cheese .
l-93	1		the mouse nibbled at the cheese .
l-93	1		the mouse nibbled on the cheese .
l-93	0	*	monica moved at the cat .
l-93	1		martha climbed up the mountain .
l-93	1		martha climbed the mountain .
l-93	1		they skated along the canals .
l-93	1		they skated the canals .
l-93	1		the spaceship revolves around the earth .
l-93	0	*	the spaceship revolves the earth .
l-93	1		martha slowly descended the stairs .
l-93	1		jill met sarah .
l-93	1		jill embraced sarah .
l-93	1		bill sold tom a car .
l-93	1		bill sent a package london .
l-93	1		bill sent tom a package .
l-93	0	*	bill sent london a package .
l-93	1		martha carved a toy for the baby .
l-93	1		martha carved the baby a toy .
l-93	1		the architect selected a house for the couple .
l-93	1		jack sprayed paint on the wall .
l-93	1		jack sprayed the wall with paint .
l-93	0	*	june covered the blanket over the baby .
l-93	1		june covered the baby with a blanket .
l-93	0	*	tamara poured the bowl with water .
l-93	1		henry cleared dishes from the table .
l-93	1		henry cleared the table of dishes .
l-93	1		the thief stole the painting from the museum .
l-93	0	*	the thief stole the museum of the painting .
l-93	1		the doctor cured pat of pneumonia .
l-93	1		helen wiped the wall .
l-93	0	*	helen wiped the wall of fingerprints .
l-93	1		bees are swarming in the garden .
l-93	1		the garden is swarming with bees .
l-93	0	*	people are seething in the square .
l-93	0	*	the pasture is herding with cattle .
l-93	1		clouds cleared from the sky .
l-93	1		the sky cleared .
l-93	1		martha carved the piece of wood into a toy .
l-93	1		david constructed a house out of .
l-93	1		david constructed a house from bricks .
l-93	0	*	david constructed the bricks into a house .
l-93	1		i whipped the eggs into a froth .
l-93	1		the witch turned him into a frog .
l-93	0	*	the witch turned him from a prince .
l-93	1		an oak tree will grow from that acorn .
l-93	1		the witch turned him from a prince into a frog .
l-93	0	*	martha carved the piece of wood from a branch into a toy .
l-93	0	*	i whipped the eggs from a puddle into a froth .
l-93	1		he turned from a prince into a frog .
l-93	0	*	that acorn will grow from a seed into an oak tree .
l-93	1		the car collided with the fence .
l-93	1		i separated the yolk from the white .
l-93	1		i separated the yolk and the white .
l-93	1		i mixed the sugar and the butter .
l-93	1		i confused maria with anna .
l-93	1		i confused maria and anna .
l-93	1		linda taped the label to the cover .
l-93	0	*	linda taped the label and the cover .
l-93	1		harriet alternated folk songs with pop songs .
l-93	0	*	harriet alternated folk songs and pop songs together .
l-93	1		i broke the twig and the branch apart .
l-93	0	*	i detached the handle and the box apart .
l-93	1		brenda agreed with molly .
l-93	1		brenda and molly agreed .
l-93	1		the oil separated from the vinegar .
l-93	1		the oil and vinegar separated .
l-93	0	*	bill married with kathy .
l-93	1		bill and kathy married .
l-93	1		the twig broke off of the branch .
l-93	0	*	the twig and the branch broke .
l-93	1		the eggs and the cream mixed together .
l-93	0	*	plays and ballets alternate together .
l-93	1		the twig and the branch broke apart .
l-93	0	*	the yolk and the white separated apart .
l-93	1		the judge presented the winner with a prize .
l-93	1		the judge offered a prize to the winner .
l-93	0	*	the judge offered the winner with a prize .
l-93	0	*	the judge saddled a prize to the winner .
l-93	1		the judge saddled the winner with a prize .
l-93	1		the jeweller inscribed the name on the ring .
l-93	1		the jeweller inscribed the ring with the name .
l-93	1		the jeweller copied the name on the ring .
l-93	0	*	the jeweller copied the ring with the name .
l-93	0	*	the jeweller decorated the name on the ring .
l-93	1		the jeweller decorated the ring with the name .
l-93	1		brian hit the fence with the stick .
l-93	1		don swatted the mosquito with the newspaper .
l-93	1		alison pierced the cloth with a needle .
l-93	1		paula hit the fence with the stick .
l-93	1		mira blamed the accident on terry .
l-93	1		mira blamed terry for the accident .
l-93	0	*	mira condemned the accident on terry .
l-93	1		ida hunted the woods for deer .
l-93	1		ida hunted for deer in the woods .
l-93	1		ida hunted deer in the woods .
l-93	1		melissa searched the papers for a clue .
l-93	1		melissa searched for a clue in the papers .
l-93	0	*	melissa searched a clue in the papers .
l-93	1		i stalked the woods for game .
l-93	0	*	i stalked for game in the woods .
l-93	1		i stalked game in the woods .
l-93	0	*	we investigated for bombs in the area .
l-93	0	*	we investigated bombs in the area .
l-93	0	*	we rummaged the desk for papers .
l-93	1		we rummaged through the desk for papers .
l-93	0	*	we rummaged papers through the desk .
l-93	0	*	i sought the woods for game .
l-93	0	*	i sought for game in the woods .
l-93	1		selina touched the horse on the back .
l-93	1		selina touched the horse 's back .
l-93	1		the horse kicked penny in the shin .
l-93	1		the horse kicked penny 's shin .
l-93	1		alison poked daisy in the ribs .
l-93	1		alison poked daisy 's ribs .
l-93	0	*	the horse broke penny in the shin .
l-93	1		the horse broke penny 's shin .
l-93	0	*	the glass cut rachel in the toe .
l-93	1		the glass cut rachel 's toe .
l-93	1		they praised the volunteers ' dedication .
l-93	1		they praised the volunteers for their dedication .
l-93	1		i admired him for his courage .
l-93	1		the inspector analyzed the building 's soundness .
l-93	1		the inspector analyzed the building for its soundness .
l-93	0	*	i sensed him for his eagerness .
l-93	1		i admired his honesty .
l-93	1		i admired the honesty in him .
l-93	1		i admired him for his honesty .
l-93	1		i sensed the eagerness in him .
l-93	0	*	they praise the dedication in the volunteers .
l-93	1		mark terrified me with his single mindedness .
l-93	1		mark 's single mindedness terrified me .
l-93	1		the clown amused the children with his antics .
l-93	1		the clown 's antics amused the children .
l-93	1		meat fell in price .
l-93	1		the price of meat fell .
l-93	1		the president appointed smith press secretary .
l-93	1		the president appointed smith as press secretary .
l-93	0	*	angela characterized shelly a lifesaver .
l-93	1		angela characterized shelly as a lifesaver .
l-93	0	*	the captain named the ship as seafarer .
l-93	1		the world saw the beginning of a new era in 1492 .
l-93	1		1492 saw the beginning of a new era .
l-93	1		i dried the clothes in the sun .
l-93	1		the sun dried the clothes .
l-93	1		david broke the window with a hammer .
l-93	0	*	the spoon ate the ice cream .
l-93	1		the crane loaded the truck .
l-93	0	*	the pitchfork loaded the truck .
l-93	1		he established his innocence with the letter .
l-93	1		the letter established his innocence .
l-93	1		i filled the pail with water .
l-93	1		water filled the pail .
l-93	1		we sleep five people in each room .
l-93	1		each room sleeps five people .
l-93	1		i incorporated the new results into the paper .
l-93	1		the paper incorporates the new results .
l-93	1		that whole wheat flour bakes wonderful bread .
l-93	0	*	those new bricks constructed a house .
l-93	1		i bought you a ticket for $ 5 .
l-93	1		$ 5 will buy a ticket .
l-93	1		$ 5 will buy you a ticket .
l-93	1		the contractor will build a house for $ 100,000 .
l-93	1		the contractor will build you a house for $ 100,000 .
l-93	1		$ 100,000 will build you a house .
l-93	1		$ 100,000 will build a house .
l-93	1		the middle class will benefit from the new tax laws .
l-93	1		the new tax laws will benefit the middle class .
l-93	1		the middle class will gain from the new tax laws .
l-93	0	*	the new tax jaws will gain the middle class .
l-93	1		the butcher cuts the meat .
l-93	1		the butler polished the silver .
l-93	1		this silver polishes itself .
l-93	1		the audience watched the movie .
l-93	0	*	this movie just watches itself .
l-93	1		this window just opens itself .
l-93	1		the heat melted the ice cream .
l-93	0	*	this ice cream just melts itself .
l-93	1		this book just sells itself .
l-93	1		i presented a solution to the problem yesterday .
l-93	1		a solution to the problem presented itself yesterday .
l-93	1		the cook sliced the mushrooms .
l-93	1		the mushrooms were sliced by the cook .
l-93	1		columbus believed the earth to be round .
l-93	1		columbus believed that the earth was round .
l-93	1		it was believed that the earth was round .
l-93	1		the police kept tabs on the suspect .
l-93	1		tabs were kept on the suspect .
l-93	1		the lax supervision was taken advantage of .
l-93	1		this bed was slept in by george washington .
l-93	0	*	tuesday was slept on by george washington .
l-93	0	*	the horizon was appeared on by a pirate ship .
l-93	1		the pillow remained stuffed with feathers .
l-93	1		a flowering plant is on the windowsill .
l-93	1		there is a flowering plant on the windowsill .
l-93	1		a problem developed .
l-93	1		there developed a problem .
l-93	1		a ship appeared on the horizon .
l-93	1		there appeared a ship on the horizon .
l-93	0	*	there appeared the ship on the horizon .
l-93	1		a little boy darted into the room .
l-93	1		there darted into the room a little boy .
l-93	1		a little boy ran in the yard .
l-93	0	*	there ran a little boy in the yard .
l-93	1		an ancient treasure trove was found in this cave .
l-93	1		there was found in this cave an ancient treasure trove .
l-93	1		suddenly an ugly old man entered the hall .
l-93	1		suddenly there entered the hall an ugly old man .
l-93	1		a lot of snow melted on the streets of chicago .
l-93	0	*	there melted a lot of snow on the streets of chicago .
l-93	1		on the windowsill is a flowering plant .
l-93	1		in the woods lives an old woman .
l-93	1		a cat jumped onto the table .
l-93	1		a cat jumped on the table .
l-93	0	*	on the table jumped a cat .
l-93	1		a choir sang in the church .
l-93	1		in the church sang a choir .
l-93	1		in this cave was found an ancient treasure trove .
l-93	1		a violent demonstration took place in the main square .
l-93	0	*	on the streets of chicago melted a lot of snow .
l-93	1		sarah smiled .
l-93	1		sarah sang .
l-93	1		sarah sang a song .
l-93	1		sarah sang a ballad .
l-93	1		sarah sang an aria .
l-93	1		sarah sang a hymn .
l-93	1		sarah sang the anthem .
l-93	1		heather snorted .
l-93	1		kelly buttered the bread .
l-93	0	*	kelly buttered the bread with butter .
l-93	1		kelly buttered the bread with unsalted butter .
l-93	1		linda taped the box with two-sided tape .
l-93	1		the men were able to mine more gold .
l-93	0	*	lydia pocketed the change in her pocket .
l-93	0	*	the cook boned the fish of bones .
l-93	0	*	the cook boned the fish of its backbone .
l-93	1		pauline smiled her thanks .
l-93	1		sandra beamed .
l-93	0	*	a cheerful welcome was beamed by sandra .
l-93	1		she mumbled .
l-93	1		she mumbled her adoration .
l-93	1		they shopped their way around new york .
l-93	1		he worked his way through the book .
l-93	1		she stipulated her way out of the problem .
l-93	1		the boy pushed his way through the crowd .
l-93	1		the explorers cut their way through the jungle .
l-93	0	*	the children came their way to the party .
l-93	0	*	the flower bloomed its way to a prize .
l-93	0	*	they disappeared their way off the stage .
l-93	1		the silversmith pounded the metal flat .
l-93	0	*	the silversmith pounded on the metal flat .
l-93	1		pauline hammered the metal flat .
l-93	1		jasmine pushed the door open .
l-93	1		the guests drank the teapot dry .
l-93	1		amanda burned the stove black .
l-93	1		belinda walked the soles off her shoes .
l-93	1		philippa cried herself to sleep .
l-93	1		the river froze solid .
l-93	1		the door slid shut .
l-93	1		the metal was hammered flat .
l-93	1		the door was pushed open .
l-93	1		philippa cried her eyes dry .
l-93	0	*	the dog smelled the flower bed bare .
l-93	0	*	the teacher hated the pupils angry .
l-93	0	*	willa arrived breathless .
l-93	0	*	sharon brought willa breathless .
l-93	0	*	this list includes my name on itself .
l-93	1		fanny pulled the blanket over herself .
l-93	1		fanny pulled the blanket over her .
l-93	1		the truck rumbled .
l-93	1		the truck rumbled into the driveway .
l-93	1		audrey tiptoed to the door .
l-93	1		the couple waltzed to the window .
l-93	1		the clown wobbled down the hall .
l-93	1		leona pushed the cart to the market .
l-93	1		it is rumored that he left town .
l-93	0	*	they rumor that he left town .
l-93	0	*	the politician perjured his aide .
l-93	1		jennifer craned her neck .
l-93	0	*	jennifer craned his neck .
l-93	0	?*	jennifer craned her arm .
l-93	1		they 've got it made .
l-93	1		the teacher meant well .
l-93	0	*	the teacher meant .
l-93	1		the horse would n't budge .
l-93	1		would the horse budge if you pushed ?
l-93	0	*	the horse budged .
l-93	0	*	i put the book to sally .
l-93	0	*	i put the book from edna .
l-93	0	*	i put the book from edna to sally .
l-93	1		i put books on the table .
l-93	0	*	i put the table with the books .
l-93	0	*	i put the table with books .
l-93	1		i put the books on the table .
l-93	0	*	the books put on the table easily .
l-93	0	*	the books put on the table .
l-93	0	*	i put on the table .
l-93	1		cheryl stood the books next to the magazines .
l-93	1		cheryl stood the books on the shelf .
l-93	0	*	cheryl stood the books from edna .
l-93	0	*	cheryl stood the books from edna to sarah .
l-93	0	*	cheryl stood the shelf with books .
l-93	0	*	cheryl stood the shelf with the books .
l-93	1		cheryl stood the tall books on the table .
l-93	0	*	tall books stand on tables easily .
l-93	1		cheryl stood the books on the table .
l-93	1		the books stood on the table .
l-93	0	*	cheryl stood on the table .
l-93	1		i funneled the mixture into the bottle .
l-93	0	*	i funneled the mixture to rina .
l-93	0	*	i funneled the mixture from edna to rina .
l-93	0	*	i funneled the bottle with the mixture .
l-93	0	*	the mixture funnels easily .
l-93	0	*	the mixture funnels .
l-93	0	*	i funneled the mixture .
l-93	0	*	i funneled into the bottle .
l-93	1		i lifted the books .
l-93	1		i lifted the book onto the table .
l-93	1		i lifted the book onto the out of the box .
l-93	1		i lifted the books from the floor to the table .
l-93	1		i lifted the books onto the table .
l-93	0	*	i lifted the table with the books .
l-93	1		i lifted the books to him .
l-93	1		i lifted the books up to him .
l-93	0	*	i lifted him up the books .
l-93	0	*	i lifted onto the table .
l-93	1		tamara poured water into the bowl .
l-93	1		tamara poured water over the flowers .
l-93	1		tamara poured water out of the pitcher .
l-93	0	*	tamara poured at water into the bowl .
l-93	1		tamara poured water onto the plants .
l-93	0	*	water pours easily onto the plants .
l-93	1		water poured onto the plants .
l-93	1		cora coiled the rope around the post .
l-93	0	*	cora coiled the post with the rope .
l-93	0	*	cora coiled at the rope around the post .
l-93	1		the rope coiled around the post .
l-93	1		that kind of rope coils easily around the post .
l-93	0	*	cora coiled around the post .
l-93	1		jessica loaded boxes onto the wagon .
l-93	1		jessica loaded boxes into the wagon .
l-93	1		jessica sprayed paint onto the table .
l-93	1		jessica sprayed paint under the table .
l-93	1		jessica sprayed paint over the table .
l-93	1		jessica sprayed paint on the wall .
l-93	1		paint sprayed on the wall .
l-93	1		jessica sprayed the wall with paint .
l-93	0	*	the wall sprayed with paint .
l-93	1		jessica squirted water at me .
l-93	1		jessica sprayed water at me .
l-93	1		jessica splashed water at me .
l-93	0	*	jessica loaded boxes at the truck .
l-93	0	*	jessica stuffed boxes at the truck .
l-93	1		leslie staffed the store with employees .
l-93	0	*	leslie staffed employees in the store .
l-93	0	*	the store staffed with employees .
l-93	1		the employees staffed the store .
l-93	1		leigh swaddled the baby with blankets .
l-93	1		lora buttered the toast .
l-93	0	*	lora buttered unsalted butter on the toast .
l-93	1		lora buttered the toast with unsalted butter .
l-93	0	*	lora buttered at the toast with unsalted butter .
l-93	0	*	the toast buttered with unsalted butter .
l-93	0	*	the toast buttered .
l-93	1		lydia pocketed the change .
l-93	0	*	lydia pocketed her pocket with the change .
l-93	0	*	the change pocketed .
l-93	1		doug removed the scratches from the tabletop .
l-93	1		doug removed the scratches from around the sink .
l-93	0	*	doug removed the scratches out of the drawer .
l-93	0	*	doug removed the scratches to nowhere .
l-93	0	*	doug removed the tabletop of scratches .
l-93	0	*	doug removed at the scratches from the tabletop .
l-93	0	*	the scratches removed from the tabletop .
l-93	1		the king banished the general from the army .
l-93	1		the king banished the general to a mountain fortress .
l-93	0	*	the king banished the general from the palace to a mountain fortress .
l-93	0	*	the king banished at the general from the army .
l-93	0	*	the general banished from the army .
l-93	1		doug cleared the dishes from under the rack .
l-93	1		doug cleared the table .
l-93	0	*	doug cleared at the table of dishes .
l-93	0	*	doug cleared at the table .
l-93	1		the strong winds cleared the skies .
l-93	1		the strong winds slowly cleared the clouds from the sky .
l-93	1		brian wiped the fingerprints from the counter .
l-93	1		brian wiped the fingerprints from inside the cupboard .
l-93	1		brian wiped the fingerprints from under the cupboard .
l-93	1		brian wiped the fingerprints from outside the cupboard .
l-93	0	*	brian wiped the counter of fingerprints .
l-93	1		brian wiped the counter .
l-93	1		paula trimmed the bush .
l-93	1		brian was wiping the counter .
l-93	1		brian was wiping .
l-93	1		brian was wiping the wall behind the stove .
l-93	1		carla shoveled the snow from the walk .
l-93	1		carla shoveled the snow from under the bushes .
l-93	1		carla shoveled the snow from among the bushes .
l-93	1		carla shoveled the snow from near the bushes .
l-93	0	*	carla shoveled the walk of snow .
l-93	0	*	carla shoveled at the walk .
l-93	1		carla was shoveling the walk .
l-93	1		carla was shoveling .
l-93	1		carla mopped the floor under the furniture .
l-93	1		carla mopped under the furniture .
l-93	1		the thief stole the painting for mr. smith .
l-93	0	*	the thief stole mr. smith the painting .
l-93	0	*	the thief stole at the painting from the museum .
l-93	0	*	the painting stole from the museum .
l-93	0	*	the doctor cured pneumonia from pat .
l-93	0	*	pat cured of pneumonia .
l-93	1		the swindler cheated pat of her fortune .
l-93	1		the cook boned the fish .
l-93	0	*	the fish boned .
l-93	0	*	the fish scrubbed .
l-93	1		the men mined the gold .
l-93	0	*	the gold mined .
l-93	1		nora sent the book from paris .
l-93	1		nora sent the book to london .
l-93	1		nora sent the book from paris to london .
l-93	1		nora sent the book to peter .
l-93	0	*	nora sent at the book to peter .
l-93	0	*	the book sent to peter .
l-93	1		nora sent books to children .
l-93	0	*	books send easily to children .
l-93	1		carla slid the books across the table .
l-93	1		carla slid the book to dale .
l-93	1		carla slid dale the book .
l-93	0	*	carla slid at the book to dale .
l-93	1		the books slid across the table .
l-93	1		carla slid those books across the table .
l-93	1		those books slide across the table easily .
l-93	1		nora brought the book to the meeting .
l-93	1		nora brought the book to pamela .
l-93	1		nora brought the book from horne .
l-93	1		nora brought pamela the book .
l-93	0	*	nora brought at the book to the meeting .
l-93	0	*	the book brought to the meeting .
l-93	0	*	the book brings easily to the meeting .
l-93	1		amanda carried the package .
l-93	1		amanda carried the package from boston .
l-93	1		amanda carried the package to new york .
l-93	1		amanda carried the package from boston to new york .
l-93	0	*	amanda carried at the package to new york .
l-93	0	*	the package carried to new york .
l-93	0	*	the package carried .
l-93	1		amanda carried packages to new york .
l-93	1		amanda carried packages .
l-93	0	*	packages carry easily to new york .
l-93	1		amanda drove the package from boston to new york .
l-93	1		amanda drove the package to new york .
l-93	1		amanda drove the package from boston .
l-93	1		amanda drove the package .
l-93	1		amanda drove the package to pamela .
l-93	0	*	amanda drove at the package to new york .
l-93	0	*	amanda drove at the package .
l-93	0	*	the package drove to new york .
l-93	0	*	the package drove .
l-93	1		amanda drove packages to new york .
l-93	1		amanda drove packages .
l-93	0	*	packages drive easily .
l-93	1		nora pushed the chair .
l-93	1		nora pushed at the chair .
l-93	1		nora pushed on the chair .
l-93	1		nora pushed against the chair .
l-93	1		nora pushed through the crowd .
l-93	1		nora pushed her way through the crowd .
l-93	1		nora pushed the chair against the wall .
l-93	1		they lent a bicycle to me .
l-93	1		they lent me a bicycle .
l-93	0	*	they lent me with a bicycle .
l-93	0	*	a bicycle lent .
l-93	0	*	a bicycle lent to me .
l-93	1		we contributed our paycheck to her .
l-93	0	*	we contributed her our paycheck .
l-93	0	*	we contributed her with our paycheck .
l-93	0	*	our paycheck contributed .
l-93	0	*	we offered a job behind her .
l-93	1		we offered her a job .
l-93	0	*	we offered her with a job .
l-93	0	*	a job offered to her .
l-93	1		brown presented jones with a plaque .
l-93	1		the presentation of a plaque was a proud moment .
l-93	1		brown equipped jones with a camera .
l-93	0	*	brown equipped a camera near jones .
l-93	0	*	brown equipped a camera next to jones .
l-93	0	*	brown equipped a camera at jones .
l-93	0	*	brown equipped a camera to jones .
l-93	0	*	brown equipped jones a camera .
l-93	1		carmen bought a dress .
l-93	1		carmen bought a dress at bloomingdale 's .
l-93	1		carmen bought a dress for mary .
l-93	0	*	carmen bought a dress to mary .
l-93	1		carmen bought a dress from diana .
l-93	0	*	carmen bought diana of a dress .
l-93	1		carmen bought a dress at bloomingdale 's for $ 50 .
l-93	1		$ 50 wo n't even buy a dress at bloomingdale 's .
l-93	1		carmen obtained the spare part .
l-93	0	*	carmen obtained mary a spare part .
l-93	0	*	carmen obtained a spare part to mary .
l-93	1		carmen obtained a spare part from diana .
l-93	0	*	carmen obtained diana of a spare part .
l-93	1		carmen purchased a dress at bloomingdale 's for $ 50 .
l-93	1		$ 50 wo n't even purchase a dress at bloomingdale 's .
l-93	1		gwen exchanged the dress for a shirt .
l-93	0	*	gwen exchanged the dress to mary .
l-93	0	*	gwen exchanged mary the dress .
l-93	1		gwen exchanged the dress for mary .
l-93	1		the children like to berry in the summer .
l-93	0	*	she held at the rail .
l-93	0	*	the rail holds easily .
l-93	1		she held his arm .
l-93	1		she held him by the arm .
l-93	0	*	she held the paper from him .
l-93	1		michelle kept the papers in the desk .
l-93	1		michelle kept the papers behind the desk .
l-93	1		michelle kept the papers over the desk .
l-93	1		michelle kept the papers under the desk .
l-93	1		frances hid the presents from sally .
l-93	1		frances hid the presents behind the books .
l-93	0	*	frances hid sally of the presents .
l-93	1		steve tossed the ball .
l-93	1		steve tossed the ball into the garden .
l-93	1		steve tossed the ball over the fence .
l-93	1		steve tossed the ball from the tree to the gate .
l-93	1		steve tossed the ball at anna .
l-93	0	*	steve tossed anna with the ball .
l-93	1		steve tossed the ball to anna .
l-93	1		steve tossed the ball against the wall .
l-93	0	*	steve tossed the wall with the ball .
l-93	0	*	steve tossed at the ball .
l-93	0	*	the ball tossed .
l-93	1		steve tossed the softball .
l-93	0	*	baseballs toss easily .
l-93	1		steve pelted anna with acorns .
l-93	0	*	steve pelted acorns at anna .
l-93	1		steve pelted anna .
l-93	0	*	steve pelted at anna .
l-93	0	*	steve pelted at anna with acorns .
l-93	0	*	steve pelted acorns against anna .
l-93	0	*	steve pelted acorns to anna .
l-93	0	*	steve pelted anna acorns .
l-93	1		steve pelted the squirrels with acorns .
l-93	0	*	squirrels pelt easily with acorns .
l-93	1		paula hit the stick on the fence .
l-93	1		paula hit the stick against the fence .
l-93	0	*	paula hit the stick into the fence .
l-93	1		paula hit at the fence with the stick .
l-93	1		paula hit deirdre on the back .
l-93	1		paula hit deirdre 's back .
l-93	1		paula hit the sticks together .
l-93	0	*	paula hit the sticks .
l-93	0	*	the fence hit with a stick .
l-93	0	*	the fence hit .
l-93	0	*	the fence hits easily .
l-93	1		the stick hit the fence .
l-93	0	*	paula swatted the cloth on the fly .
l-93	0	*	paula swatted the cloth against the fly .
l-93	1		paula swatted the fly with the cloth .
l-93	0	*	paula swatted the cloth through the fly .
l-93	0	*	paula swatted the cloth into the fly .
l-93	1		paula swatted the fly .
l-93	1		paula swatted at the fly .
l-93	1		paula swatted deirdre on the back .
l-93	1		paula swatted deirdre 's back .
l-93	0	*	the fly swatted .
l-93	1		paula swatted flies .
l-93	0	*	flies swat easily .
l-93	1		paula swatted the fly with a cloth .
l-93	0	*	the cloth swatted the fly .
l-93	0	*	paula spanked her right hand against the naughty child .
l-93	1		paula spanked the naughty child with her right hand .
l-93	0	*	paula spanked her right hand into the naughty child .
l-93	0	*	paula spanked her right hand through the naughty child .
l-93	1		paula spanked the naughty child on the back .
l-93	1		paula spanked the naughty child 's back .
l-93	1		paula spanked the naughty child .
l-93	0	*	the naughty child spanked .
l-93	0	*	naughty children spank easily .
l-93	0	*	paula 's right hand spanked the naughty child .
l-93	0	*	the wall banged with the grocery cart .
l-93	1		the old cart banged against the new cart .
l-93	0	*	the old and new carts banged .
l-93	1		the old and new carts banged together .
l-93	1		alison poked the needle through the cloth .
l-93	1		alison poked the needle into the cloth .
l-93	1		alison poked the cloth with a needle .
l-93	1		alison poked the cloth .
l-93	1		alison poked the needle through the denim .
l-93	1		carrie touched the cat .
l-93	0	*	carrie touched the stick against the cat .
l-93	1		carrie touched the cat with the stick .
l-93	0	*	carrie touched the stick into the cat .
l-93	0	*	carrie touched the stick through .
l-93	0	*	carrie touched at the cat .
l-93	1		carrie touched him on the shoulder .
l-93	0	*	that cat touches easily .
l-93	1		carrie touched the fence with a stick .
l-93	0	*	the stick touched the fence .
l-93	1		carol cut the bread with a knife .
l-93	1		carol cut the bread .
l-93	1		carol cut at the bread .
l-93	1		carol cut herself on the thumb .
l-93	1		carol cut her thumb .
l-93	1		carol cut the whole wheat bread .
l-93	1		whole wheat bread cuts easily .
l-93	1		the knife cut the bread .
l-93	1		this knife cuts well .
l-93	1		carol carved the stone with a chisel .
l-93	1		carol carved the stone .
l-93	0	*	carol carved at the stone .
l-93	0	*	carol carved the tree on the branch .
l-93	1		carol carved the tree 's branch .
l-93	0	*	the stone carved .
l-93	1		carol carved the marble .
l-93	1		marble carves easily .
l-93	1		carol carved the marble with a chisel .
l-93	1		the chisel carved the marble .
l-93	1		that chisel carved the statue .
l-93	1		that chisel carves well .
l-93	1		herman mixed the eggs with the cream .
l-93	1		herman mixed the eggs and the cream .
l-93	1		the eggs mixed with the cream .
l-93	1		the eggs and the cream mixed .
l-93	1		herman mixed the eggs and the cream together .
l-93	1		i mixed the soap into the water .
l-93	1		i mixed the soap and the water .
l-93	1		i mixed the eggs with cream .
l-93	1		i mixed the eggs and cream .
l-93	1		i mixed the eggs and cream together .
l-93	1		harriet alternated folk songs and pop songs .
l-93	1		plays alternate with ballets .
l-93	1		plays and ballets alternate .
l-93	1		harriet interconnected the pieces .
l-93	1		herman whipped the cream .
l-93	0	*	linda taped the wall with the picture .
l-93	1		linda taped the label and the cover together .
l-93	1		the child clung to her mother .
l-93	0	*	the child and her mother clung .
l-93	0	*	the war clung the child to her mother .
l-93	1		the yolk separated from the white .
l-93	1		the yolk and the white separated .
l-93	1		i separated the cream from the milk .
l-93	1		i separated the egg yolk and the egg white .
l-93	1		i separated the egg yolks and the egg whites .
l-93	0	*	i separated the milk of the cream .
l-93	1		i broke the twig off the branch .
l-93	1		i broke the twig off of the branch .
l-93	0	*	i broke the twig and the branch .
l-93	1		i broke twigs off those branches .
l-93	1		i broke twigs off of those branches .
l-93	1		i broke those twigs and branches apart .
l-93	1		i detached the handle .
l-93	1		i detached the handle from the box .
l-93	0	*	i detached the handle and the box .
l-93	0	*	the handle detached from the box .
l-93	1		i detached that new handle .
l-93	1		i detached that new handle from the box .
l-93	1		that new handle detaches easily .
l-93	0	*	that new handle detaches from the box easily .
l-93	1		the winter schedule differed from the spring schedule .
l-93	1		this flyer differs from that flyer .
l-93	0	*	i differed this flyer from that flyer .
l-93	1		phyllis dyed the dress .
l-93	1		smith inscribed his name over the door .
l-93	1		smith inscribed his name under the picture .
l-93	1		smith inscribed the ring with his name .
l-93	1		smith was annealing the rings .
l-93	1		smith was annealing .
l-93	1		the jeweller printed the name on the ring .
l-93	1		the jeweller printed the name over the door .
l-93	1		the jeweller printed the name under the picture .
l-93	1		the jeweller printed the name onto the cup .
l-93	1		the jeweller scribbled his name on the contract .
l-93	1		smith was scribbling his notes .
l-93	1		smith was scribbling .
l-93	1		the jeweller decorated the ring .
l-93	1		the secretary transcribed the speech .
l-93	1		the secretary transcribed the speech into the record .
l-93	0	*	the secretary transcribed the record with the speech .
l-93	1		martha carved a toy out of the piece of wood .
l-93	1		martha carves .
l-93	1		martha carved a toy out of a piece of wood for the baby .
l-93	1		martha carved a piece of wood for the baby .
l-93	1		martha carved a piece of wood into a toy for the baby .
l-93	1		martha carved beautiful toys out of this wood .
l-93	1		this wood carves beautiful toys .
l-93	1		$ 100,000 will build you a house .
l-93	1		$ 100,000 will build a house .
l-93	1		the gardener grew an oak tree from that acorn .
l-93	1		donna fixed a sandwich .
l-93	0	*	donna fixed last night 's leftovers into a sandwich .
l-93	1		donna fixed a sandwich for me .
l-93	1		donna fixed me a sandwich .
l-93	1		david constructed a house .
l-93	1		david constructed a house out of bricks .
l-93	0	*	david constructed me a house .
l-93	1		david constructed the house .
l-93	0	*	the house constructed .
l-93	0	*	david constructed the mansion from bricks into a house .
l-93	1		i shaped the dough into a loaf .
l-93	1		i shaped the dough .
l-93	0	*	i shaped a loaf from the dough .
l-93	1		i twirled the dough into a pretzel .
l-93	0	*	i shaped a good loaf from this dough .
l-93	0	*	this dough shapes a good loaf .
l-93	0	*	i shaped the dough from a lump into a loaf .
l-93	0	*	he turned from a prince .
l-93	1		sandy sang a song to me .
l-93	1		sandy sang me a song .
l-93	1		sandy sang a song for me .
l-93	1		sandy sang a song .
l-93	1		sandy sang .
l-93	0	*	the song sang .
l-93	1		racial inequality engenders conflict .
l-93	0	*	conflict engenders .
l-93	0	*	the president appointed press secretary to smith .
l-93	1		the captain named the ship seafarer .
l-93	0	*	the captain named seafarer to the ship .
l-93	1		the president declared smith press secretary .
l-93	0	*	the president declared smith as press secretary .
l-93	0	*	the president declared smith to press secretary .
l-93	0	*	the press conjectured smith the appointee .
l-93	1		the press conjectured that smith would be the appointee .
l-93	1		dina posed as a lawyer .
l-93	0	*	dina posed a lawyer .
l-93	1		miriam tutored her brother .
l-93	1		her cousin clerked for judge davis .
l-93	1		i see someone running down the street .
l-93	1		i saw jane run down the street .
l-93	1		i saw the mona lisa .
l-93	0	*	the mona lisa sees easily .
l-93	0	*	we spotted that they were running .
l-93	0	*	we spotted them run .
l-93	0	*	runaway cats spot easily .
l-93	1		we peered at the baby .
l-93	1		we peered around the room .
l-93	1		we peered through the screen .
l-93	1		we peered into the closet .
l-93	1		that pea soup tasted delicious to me .
l-93	1		the clown amused the children .
l-93	0	*	the children amused at the clown .
l-93	1		the clown amused the little children .
l-93	1		little children amuse easily .
l-93	1		that joke never fails to amuse little children .
l-93	1		that joke never
fails to amuse .\nl-93\t1\t\tthat the clown had a red nose amused the children .\nl-93\t1\t\tto win the prize : would thrill me .\nl-93\t1\t\tthe clown was amusing to the children .\nl-93\t1\t\ttourists admire paintings .\nl-93\t0\t*\tpaintings admire easily .\nl-93\t1\t\ti admired him as a teacher .\nl-93\t0\t*\ti admired him a teacher .\nl-93\t1\t\tmegan marveled at the beauty of the grand canyon .\nl-93\t1\t\tdorothy needs new shoes .\nl-93\t0\t*\tdorothy is needing new shoes .\nl-93\t1\t\tdorothy needs her skills .\nl-93\t1\t\tdorothy needs her for her skills .\nl-93\t0\t*\tdorothy needs the skills in her .\nl-93\t1\t\tdorothy needs that dress as a costume .\nl-93\t0\t*\tdorothy needs that dress a costume .\nl-93\t1\t\tdana longs for a sunny day .\nl-93\t1\t\tdana is longing for a sunny day .\nl-93\t1\t\tthey praised the volunteers .\nl-93\t1\t\tthe director praised the volunteers .\nl-93\t0\t*\tvolunteers praise easily .\nl-93\t1\t\tthey praised them as volunteers .\nl-93\t0\t*\tthe inspector analyzed the soundness in the building .\nl-93\t1\t\ti hunted game in the woods .\nl-93\t1\t\ti was hunting game .\nl-93\t1\t\ti was hunting game in the woods .\nl-93\t1\t\ti was hunting in the woods .\nl-93\t1\t\ti was hunting .\nl-93\t1\t\ti searched for treasure in the cave .\nl-93\t0\t*\ti searched treasure in the cave .\nl-93\t0\t*\twe rummaged the drawer for important documents .\nl-93\t1\t\twe rummaged in the drawer for important documents .\nl-93\t0\t*\twe rummaged important documents in the drawer .\nl-93\t0\t*\ti hunted the woods for game .\nl-93\t0\t*\ti hunted for game in the woods .\nl-93\t1\t\ti hunted the secret out of him .\nl-93\t1\t\tbrenda haggled with molly .\nl-93\t1\t\tbrenda and molly haggled .\nl-93\t0\t*\tbrenda haggled molly .\nl-93\t1\t\tbrenda and molly haggled about the party .\nl-93\t1\t\tbill married kathy .\nl-93\t0\t*\tbrenda met .\nl-93\t1\t\tbrenda and molly met .\nl-93\t1\t\tanne met with cathy .\nl-93\t1\t\twanda taught the students 
.\nl-93\t1\t\twanda taught french to the students .\nl-93\t1\t\twanda taught the students french .\nl-93\t1\t\twanda taught the students that the earth was round .\nl-93\t1\t\tellen told a story .\nl-93\t1\t\tellen told a story to helen .\nl-93\t1\t\tellen told helen a story .\nl-93\t1\t\tellen told helen .\nl-93\t1\t\tellen told helen about the situation .\nl-93\t0\t*\tellen told a story at helen .\nl-93\t0\t*\tellen told for helen to come .\nl-93\t1\t\tsusan whispered .\nl-93\t1\t\tsusan whispered to rachel .\nl-93\t1\t\tsusan whispered a few words .\nl-93\t1\t\tsusan whispered the news to rachel .\nl-93\t0\t*\tsusan whispered rachel the news .\nl-93\t1\t\tsusan whispered for me to come .\nl-93\t1\t\tsusan whispered `` shut up '' .\nl-93\t1\t\tsusan whispered `` shut up '' at them .\nl-93\t1\t\tthey whispered that the winner would be announced tonight .\nl-93\t1\t\theather cabled the news .\nl-93\t1\t\theather cabled sara .\nl-93\t1\t\theather cabled the news to sara .\nl-93\t1\t\theather cabled sara the news .\nl-93\t0\t*\theather cabled the news at sara .\nl-93\t1\t\theather cabled sara about the situation .\nl-93\t1\t\theather cabled for sara to come .\nl-93\t1\t\tellen talked .\nl-93\t0\t*\tellen talked for helen to come .\nl-93\t1\t\tellen talked with helen about the problem .\nl-93\t1\t\tellen talked with helen .\nl-93\t1\t\tellen and helen talked .\nl-93\t1\t\tellen and helen talked together .\nl-93\t0\t*\tellen talked helen .\nl-93\t1\t\tellen was conferring .\nl-93\t1\t\tellen conferred with helen .\nl-93\t1\t\tellen conferred with helen about the problem .\nl-93\t0\t*\tellen conferred to helen .\nl-93\t0\t*\tellen conferred for helen to come .\nl-93\t1\t\tellen and helen conferred .\nl-93\t0\t*\tellen and helen conferred together .\nl-93\t0\t*\tellen conferred helen .\nl-93\t1\t\tellen said to helen that melons were selling well .\nl-93\t1\t\tellen said something .\nl-93\t1\t\tellen said something to helen .\nl-93\t0\t*\tellen said to helen 
.\nl-93\t1\t\tellen complained to helen .\nl-93\t1\t\tellen complained about the situation .\nl-93\t1\t\tellen complained about the situation to helen .\nl-93\t1\t\tellen warned helen .\nl-93\t0\t*\tellen warned to helen .\nl-93\t1\t\tellen warned against skating on thin ice .\nl-93\t1\t\tellen warned helen that melons were selling .\nl-93\t1\t\tellen warned that melons were selling .\nl-93\t0\t*\tellen warned for helen to come .\nl-93\t1\t\tellen warned helen about the traffic jam .\nl-93\t1\t\tthe dog barked .\nl-93\t1\t\tthe dog barked at the cat .\nl-93\t1\t\tcynthia ate the peach .\nl-93\t1\t\tcynthia ate .\nl-93\t1\t\tcynthia ate at the peach .\nl-93\t0\t*\tcynthia ate on the peach .\nl-93\t1\t\tcynthia ate the peach with a fork .\nl-93\t1\t\tcynthia nibbled the carrot .\nl-93\t1\t\tcynthia nibbled .\nl-93\t1\t\tcynthia nibbled at the carrot .\nl-93\t1\t\tcynthia gobbled the pizza .\nl-93\t1\t\tcynthia gobbled the pizza down .\nl-93\t0\t*\tcynthia gobbled .\nl-93\t0\t*\tcynthia gobbled at the pizza .\nl-93\t0\t*\tcynthia gobbled on the pizza .\nl-93\t1\t\tcynthia devoured the pizza .\nl-93\t0\t*\tcynthia devoured .\nl-93\t0\t*\tcynthia devoured at the pizza .\nl-93\t0\t*\tcynthia devoured on the pizza .\nl-93\t1\t\tcynthia lunched .\nl-93\t1\t\tcynthia lunched on peaches .\nl-93\t0\t*\tcynthia lunched peaches .\nl-93\t0\t*\tcynthia lunched at peaches .\nl-93\t0\t*\tcynthia munched .\nl-93\t1\t\tcynthia munched on peaches .\nl-93\t0\t*\tcynthia munched peaches .\nl-93\t0\t*\tcynthia munched at peaches .\nl-93\t1\t\tteresa bottle fed the baby .\nl-93\t1\t\tteresa bottle fed soy milk to the baby .\nl-93\t1\t\tteresa bottle fed the baby soy milk .\nl-93\t0\t*\tteresa bottle fed soy milk .\nl-93\t1\t\tpaul yawned .\nl-93\t0\t*\tpaul yawned on mary .\nl-93\t0\t*\tpaul yawned at mary .\nl-93\t1\t\tpaul breathed .\nl-93\t0\t*\tpaul breathed at mary .\nl-93\t1\t\tpaul exhaled .\nl-93\t0\t*\tpaul exhaled at mary .\nl-93\t0\t*\tpaul exhaled on mary .\nl-93\t1\t\tpaul 
laughed .\nl-93\t1\t\tshe laughed from embarrassment .\nl-93\t1\t\tlinda winked her eye .\nl-93\t0\t*\tlinda winked her nose .\nl-93\t0\t*\tlinda winked his eye .\nl-93\t1\t\tlinda winked .\nl-93\t1\t\tlinda winked at the audience .\nl-93\t1\t\tlinda winked in agreement .\nl-93\t0\t*\tjennifer craned her arm .\nl-93\t1\t\tjennifer shook her finger at the naughty child .\nl-93\t1\t\tthe princess bowed .\nl-93\t1\t\tthe princess bowed to the queen .\nl-93\t0\t*\tthe heavy meal dozed gloria .\nl-93\t1\t\tgloria dozed .\nl-93\t1\t\tsharon flinched .\nl-93\t1\t\tsharon flinched at the sight of the accident .\nl-93\t0\t*\tthe shock flinched sharon .\nl-93\t1\t\tsharon shivered .\nl-93\t1\t\tsharon shivered from fear .\nl-93\t1\t\tsharon shivered at the thought of the cold sea .\nl-93\t0\t*\tthe fear shivered sharon .\nl-93\t1\t\tthe pirates drowned the sailor .\nl-93\t1\t\tthe sailor drowned .\nl-93\t1\t\tthe sea monster drowned the sailors .\nl-93\t1\t\tmy eyes are itching .\nl-93\t1\t\tmy eyes are itching me .\nl-93\t0\t*\tmy eyes are itching my brother .\nl-93\t1\t\tmy eyes are itching from the smoke .\nl-93\t1\t\tmy heart is pounding .\nl-93\t0\t*\tmy heart is pounding my brother .\nl-93\t1\t\tmy heart is pounding from fear .\nl-93\t1\t\ttessa sprained her ankle .\nl-93\t0\t*\ttessa sprained mary 's ankle .\nl-93\t1\t\tsharon fainted .\nl-93\t1\t\tsharon fainted at the sight of the accident .\nl-93\t0\t*\thunger fainted sharon .\nl-93\t1\t\tthe baby dressed .\nl-93\t1\t\tmarlene dressed the baby .\nl-93\t1\t\tmarlene dressed herself .\nl-93\t0\t*\tmarlene dressed her body .\nl-93\t0\t*\tthe horse groomed itself .\nl-93\t1\t\tthe barber shaved my chin .\nl-93\t1\t\ti shaved my chin .\nl-93\t1\t\tcelia brushed the baby 's hair .\nl-93\t1\t\tcelia brushed her hair .\nl-93\t0\t*\tcelia brushed herself .\nl-93\t1\t\tshe always wore purple dresses .\nl-93\t0\t*\tshe always wore herself .\nl-93\t0\t*\tshe always wore herself in purple .\nl-93\t0\t*\tshe always wore 
.\nl-93\t1\t\tshe spruced herself up before the job interview .\nl-93\t1\t\tshe spruced up before the job interview .\nl-93\t1\t\tshe was always clad in black .\nl-93\t0\t*\ther stepmother always clad her in black .\nl-93\t0\t*\tshe always clad herself in black .\nl-93\t0\t*\tshe always clad in black .\nl-93\t1\t\tbrutus murdered julius caesar .\nl-93\t0\t*\tjulius caesar murdered .\nl-93\t1\t\tthe bandits murdered innocent victims .\nl-93\t0\t*\tinnocent victims murder easily .\nl-93\t1\t\tbrutus murdered julius caesar with a dagger .\nl-93\t1\t\tthe exterminator killed the insects with ddt .\nl-93\t1\t\tthe witch poisoned snow white .\nl-93\t0\t*\tchildren poison easily .\nl-93\t1\t\tthe jewel sparkled .\nl-93\t1\t\tjewels sparkled on the crown .\nl-93\t1\t\tthe crown sparkled with jewels .\nl-93\t1\t\ta magnificent diamond sparkled on his finger .\nl-93\t1\t\ton his finger sparkled a magnificent diamond .\nl-93\t1\t\ton his finger there sparkled a magnificent diamond .\nl-93\t0\t*\tthe director sparkled the lights .\nl-93\t1\t\tthe door hinges squeaked .\nl-93\t1\t\tbirds sang in the trees .\nl-93\t1\t\tthe trees sang with birds .\nl-93\t1\t\tin the hallway ticked a grandfather clock .\nl-93\t1\t\tin the hallway there ticked a grandfather clock .\nl-93\t1\t\ti buzzed the bell .\nl-93\t1\t\tthe bell chimed the hour .\nl-93\t1\t\ta squeaking door announced john 's presence .\nl-93\t1\t\tthe onions reeked .\nl-93\t1\t\tthe room reeked of onions .\nl-93\t1\t\tthe room reeked .\nl-93\t0\t*\tkelly reeked the onions .\nl-93\t1\t\tthe well gushed oil .\nl-93\t0\t*\ti gushed the fountain .\nl-93\t1\t\ti bled him .\nl-93\t1\t\toil gushed from the well .\nl-93\t1\t\tthe streets gushed with water .\nl-93\t1\t\ta fragrant stew bubbled over the fire .\nl-93\t1\t\tcaesar put a gushing fountain by his palace .\nl-93\t1\t\tthe romans destroyed the city .\nl-93\t0\t*\tthe city destroyed .\nl-93\t0\t*\tcities destroy easily .\nl-93\t0\t*\tthe romans destroyed the city into ruins 
.\nl-93\t0\t*\tthe romans destroyed ruins from the city .\nl-93\t0\t*\tthe romans destroyed the city into a ruin .\nl-93\t0\t*\tthe romans destroyed the city from a capital into a ruin .\nl-93\t1\t\tthe builders destroyed the warehouse with explosives .\nl-93\t1\t\tthe explosives destroyed the warehouse .\nl-93\t1\t\tthe builders destroyed the warehouse .\nl-93\t0\t*\tthe builders destroyed at the warehouse .\nl-93\t1\t\ttony broke the window .\nl-93\t1\t\ttony broke the crystal vase .\nl-93\t1\t\ttony broke the window with a hammer .\nl-93\t1\t\ttony broke the cup against the wall .\nl-93\t0\t*\ttony broke the wall with the cup .\nl-93\t0\t*\ttony broke at the window .\nl-93\t0\t*\ttony broke herself on the ann .\nl-93\t1\t\ttony broke her arm .\nl-93\t1\t\ttony bent the rod with pliers .\nl-93\t1\t\tthe rod bent .\nl-93\t1\t\ttony bent the copper rod .\nl-93\t1\t\tcopper rods bend easily .\nl-93\t1\t\tthe pliers bent the rod .\nl-93\t1\t\ttony bent the rod against the table .\nl-93\t0\t*\ttony bent the table with the rod .\nl-93\t0\t*\ttony bent at the rod .\nl-93\t0\t*\ttony bent mary in the arm .\nl-93\t1\t\ttony bent mary 's arm .\nl-93\t1\t\tthe potatoes baked .\nl-93\t1\t\tjennifer baked idaho potatoes .\nl-93\t1\t\tidaho potatoes bake beautifully .\nl-93\t1\t\tjennifer baked the potatoes in the oven .\nl-93\t1\t\tthis oven bakes potatoes well .\nl-93\t0\t*\tjennifer baked at the potatoes .\nl-93\t1\t\tbill dried the clothes .\nl-93\t1\t\tthe clothes dried .\nl-93\t1\t\tbill dried the cotton clothes .\nl-93\t1\t\tcotton clothes dry easily .\nl-93\t1\t\tbill dried the clothes with a hair dryer .\nl-93\t1\t\tthe hair dryer dried the clothes .\nl-93\t0\t*\tbill dried at the clothes .\nl-93\t1\t\ta lot of clothes are drying on the line .\nl-93\t0\t*\tthe line is drying with a lot of clothes .\nl-93\t1\t\tbill is drying a lot of clothes on the line .\nl-93\t0\t*\tbill is drying the line with a lot of clothes .\nl-93\t0\t*\ton the line are drying a lot of clothes 
.\nl-93\t0\t*\ton the line there are drying a lot of clothes .\nl-93\t1\t\tthe roses bloomed .\nl-93\t0\t*\tthe sun bloomed the roses .\nl-93\t1\t\tthe temperature soared .\nl-93\t0\t*\tthe heat soared the temperature .\nl-93\t0\t*\tthere soared oil in price .\nl-93\t0\t*\tin price soared oil .\nl-93\t1\t\tcornelia lodged with the smiths .\nl-93\t1\t\tcornelia lodged at mrs. parker 's .\nl-93\t1\t\tan old woman lodged at mrs. parker 's .\nl-93\t0\t*\tthere lodged an old woman at mrs. parker 's .\nl-93\t0\t*\tat mrs. parker 's lodged an old woman .\nl-93\t1\t\tsquatters lodged in these abandoned buildings .\nl-93\t0\t*\tthese abandoned buildings lodged with squatters .\nl-93\t1\t\tthe soldiers lodged in the schoolhouse .\nl-93\t1\t\tan old woman lived in the forest .\nl-93\t1\t\tunicorns do n't exist .\nl-93\t1\t\tthere exists a solution to this problem .\nl-93\t1\t\tin the forest languished an old woman .\nl-93\t1\t\ta crowd of people remained in the square .\nl-93\t0\t*\tthe square remained with a crowd of people .\nl-93\t0\t*\tthe famous mathematician existed a solution to the problem .\nl-93\t1\t\tthe beer bubbled .\nl-93\t1\t\ta fire raged in the mountains .\nl-93\t1\t\tin the mountains there raged a fire .\nl-93\t1\t\ta fire raged all through the mountains .\nl-93\t1\t\tall through the mountains raged a fire .\nl-93\t1\t\troses flowered in the garden .\nl-93\t1\t\tthe garden flowered with roses .\nl-93\t1\t\ta fire raged over the fields .\nl-93\t0\t*\tthe farmers raged a fire over the fields .\nl-93\t1\t\ta large flag fluttered .\nl-93\t1\t\ta large flag fluttered over the fort .\nl-93\t1\t\tmany flags fluttered over the fort .\nl-93\t1\t\tover the fort there fluttered a large flag .\nl-93\t1\t\tover the fort fluttered a large flag .\nl-93\t1\t\tthe tree trembled .\nl-93\t1\t\tthe flag waved .\nl-93\t1\t\tthe hall is echoing with voices .\nl-93\t1\t\ta loud cry echoed through the hall .\nl-93\t1\t\tthrough the hall there echoed a loud cry .\nl-93\t1\t\tthrough 
the hall echoed a loud cry .\nl-93\t1\t\tthe music echoed .\nl-93\t0\t*\tthe magician echoed the music .\nl-93\t0\t*\tan echoing voice rang out .\nl-93\t1\t\ta striped fish swam in the aquarium .\nl-93\t1\t\tin the aquarium swam a striped fish .\nl-93\t1\t\tin the aquarium there swam a striped fish .\nl-93\t1\t\tthe cattle are herding in the pasture .\nl-93\t1\t\tthe cattle herded .\nl-93\t1\t\ti herded the cattle .\nl-93\t1\t\tthe bag is bulging with groceries .\nl-93\t0\t*\tgroceries are bulging in the bag .\nl-93\t1\t\tthe bag is bulging .\nl-93\t0\t*\ti had to bulge the bag with groceries .\nl-93\t1\t\ta statue of jefferson stood on the comer .\nl-93\t1\t\tthere stood on the comer a statue of jefferson .\nl-93\t1\t\ta statue of jefferson stood on the comer of the two boulevards .\nl-93\t1\t\ton the comer of the two boulevards stood a statue of jefferson .\nl-93\t1\t\tthe hanging gardens are a sight to behold .\nl-93\t1\t\tthe river runs from the lake to the sea .\nl-93\t1\t\tthe stream winds through the valley .\nl-93\t1\t\tthe stream crawls through the valley .\nl-93\t1\t\tthrough the valley ran a rushing stream .\nl-93\t1\t\tthere ran through the valley a rushing stream .\nl-93\t1\t\titaly borders france .\nl-93\t1\t\tsnow caps the mountain .\nl-93\t1\t\ta ship appeared .\nl-93\t1\t\ta large ship appeared on the horizon .\nl-93\t1\t\ton the horizon appeared a large ship .\nl-93\t1\t\ta solution immediately presented itself .\nl-93\t0\t*\ta solution immediately presented .\nl-93\t1\t\ta solution immediately presented itself to him .\nl-93\t1\t\ta wonderful opportunity presented itself yesterday .\nl-93\t0\t*\tto him presented itself a wonderful opportunity .\nl-93\t1\t\ti presented a solution yesterday .\nl-93\t1\t\ta solution presented itself yesterday .\nl-93\t1\t\tthe crowd vanished .\nl-93\t1\t\ta valuable 13th-century manuscript recently vanished from the library .\nl-93\t1\t\tthe rabbit vanished into thin air .\nl-93\t0\t*\tthe magician vanished a rabbit 
into thin air .\nl-93\t1\t\ta serious accident happened yesterday .\nl-93\t1\t\tthere happened a serious accident yesterday .\nl-93\t1\t\ta serious accident happened in front of them .\nl-93\t1\t\tin front of them happen .\nl-93\t1\t\tthe accident happened .\nl-93\t0\t*\tthe motorist happened the accident .\nl-93\t1\t\tsylvia squirmed .\nl-93\t0\t*\tthe lecture squirmed sylvia .\nl-93\t1\t\tthe dog flopped onto the bed .\nl-93\t1\t\tthe dog flopped in the comer .\nl-93\t1\t\ta dog lay in the comer .\nl-93\t0\t*\tthere lay a dog in the comer .\nl-93\t1\t\ta dog lay in the corner .\nl-93\t0\t*\tin the corner lay a dog .\nl-93\t1\t\tthe convict escaped .\nl-93\t1\t\tthe convict escaped from the police .\nl-93\t1\t\tthe convict escaped the police .\nl-93\t0\t*\tthe collaborators escaped the convict .\nl-93\t1\t\twe abandoned the area .\nl-93\t0\t*\twe abandoned from the area .\nl-93\t1\t\tthe ball rolled .\nl-93\t1\t\tthe ball rolled down the hill .\nl-93\t1\t\tthe ball rolled over the hill .\nl-93\t1\t\tthe ball rolled into the gutter .\nl-93\t1\t\tbill rolled the ball down the hill .\nl-93\t0\t*\tthe ball rolled the hill .\nl-93\t1\t\tthe horse jumped over the stream .\nl-93\t1\t\tthe horse jumped across the stream .\nl-93\t1\t\tthe horse jumped into the stream .\nl-93\t1\t\tthe horse jumped out of the stream .\nl-93\t1\t\tthe lions jumped through the hoop .\nl-93\t1\t\tthe horse jumped the stream .\nl-93\t1\t\ta little white rabbit jumped out of the box .\nl-93\t1\t\tthere jumped out of the box a little white rabbit .\nl-93\t1\t\twe walked ourselves into a state of exhaustion .\nl-93\t1\t\ttom ran the soles off his shoes .\nl-93\t1\t\the skated penny around the rink .\nl-93\t1\t\tthey rowed .\nl-93\t1\t\the rowed penny across the lake .\nl-93\t1\t\tpenny rowed across the lake .\nl-93\t1\t\tthey rowed along the canals of venice .\nl-93\t1\t\tthey rowed the canals of venice .\nl-93\t1\t\tthey waltzed .\nl-93\t1\t\tshe waltzed across the floor .\nl-93\t1\t\the waltzed 
her across the floor .\nl-93\t1\t\tjackie chased after the thief .\nl-93\t1\t\tjackie chased the thief down the street .\nl-93\t1\t\tjackie chased the thief .\nl-93\t0\t*\tthe thief chased down the street .\nl-93\t0\t*\tthe thief chased .\nl-93\t0\t*\trose accompanied .\nl-93\t1\t\tsasha lingered in the museum .\nl-93\t1\t\tsasha lingered over lunch .\nl-93\t0\t*\tphyllis lingered sasha over lunch .\nl-93\t1\t\tmaggie hurried through the museum .\nl-93\t1\t\ther sister hurried .\nl-93\t1\t\tmaggie hurried her sister .\nl-93\t1\t\tthe package weighed ten pounds .\nl-93\t0\t*\tten pounds was weighed by the package .\nl-93\t0\t*\ti weighed the package ten pounds .\nl-93\t1\t\ti weighed the package .\nl-93\t1\t\tthe book costs $ 10 .\nl-93\t0\t*\tthe book valued at $ 200 .\nl-93\t0\t*\tthe book valued $ 200 .\nl-93\t1\t\tthe phone company billed me $ 10 for that phone call .\nl-93\t0\t*\tthe phone company billed $ 10 to me .\nl-93\t1\t\tthe phone company billed me $ 10 .\nl-93\t0\t*\tthe phone company billed $ 10 as me .\nl-93\t1\t\tthe meeting began at 4 p.m .\nl-93\t1\t\ti began the meeting at 4 p.m .\nl-93\t1\t\twilma completed the assignment .\nl-93\t0\t*\tthe assignment completed .\nl-93\t1\t\tmy family always summers at the seashore .\nks08\t1\t\tthe man kicked a ball .\nks08\t1\t\ta man kicked the ball .\nks08\t1\t\tthe ball kicked a man .\nks08\t1\t\ta ball kicked the man .\nks08\t1\t\tthe ball , a man kicked .\nks08\t1\t\tthe man , a ball kicked .\nks08\t0\t*\tkicked the man the ball .\nks08\t0\t*\tman the ball kicked the .\nks08\t0\t*\tthe man a ball kicked .\nks08\t0\t*\tkim lives in the house lee sold it to her .\nks08\t0\t*\tkim fond of lee .\nks08\t1\t\tkim is fond of lee .\nks08\t1\t\tin january 2002 , a dull star in an obscure constellation suddenly became 600,000 times more luminous than our sun , temporarily making it the brightest star in our galaxy .\nks08\t1\t\tthe man kicked the ball .\nks08\t1\t\tthe tall man kicked the ball .\nks08\t1\t\tthe 
handsome , tall man kicked the ball .\nks08\t1\t\tthe handsome , tall , nice man kicked the ball .\nks08\t1\t\tsome sentences can go on .\nks08\t1\t\tsome sentences can go on and on .\nks08\t1\t\tsome sentences can go on and on and on .\nks08\t1\t\tsome sentences can go on and on and on and on .\nks08\t1\t\tall native speakers have a grammatical competence which can generate an infinite set of grammatical sentences from a finite set of resources .\nks08\t0\t*\tthe professor found some strong evidences of water on mars .\nks08\t1\t\tdo not end a sentence with a preposition .\nks08\t1\t\tavoid double negatives .\nks08\t0\t*\tthe evidence that john found was more helpful than the one that smith found .\nks08\t0\t*\twe had hoped to get three new equipments every month , but we only had enough money to get an equipment every two weeks .\nks08\t0\t*\tthe equipment we bought last year was more expensive than the one we bought this year .\nks08\t1\t\tthe student was hoping for a good clue .\nks08\t1\t\tthe clue that john got was more helpful than the one that smith got .\nks08\t1\t\tthe student was hoping for a tool .\nks08\t1\t\tthe tool that jones got was more helpful than the one that smith got .\nks08\t1\t\tmuch evidence is needed .\nks08\t1\t\tmuch equipment is needed .\nks08\t1\t\tmuch information is needed .\nks08\t1\t\tmuch furniture is needed .\nks08\t1\t\tmuch advice is needed .\nks08\t0\t*\tmuch clue is needed .\nks08\t0\t*\tmuch tool is needed .\nks08\t0\t*\tmuch armchair is needed .\nks08\t0\t*\tmuch bags is needed .\nks08\t0\t*\tmany evidence was provided .\nks08\t0\t*\tmany equipment is available .\nks08\t0\t*\tthe room contains many furniture .\nks08\t1\t\tthe paper provides many clues .\nks08\t1\t\tthe box contains many tools .\nks08\t1\t\tjohn offers many suggestions .\nks08\t1\t\tlittle evidence was provided .\nks08\t1\t\tlittle equipment is available .\nks08\t1\t\tjohn offers little advice .\nks08\t1\t\tlittle information was provided 
.\nks08\t0\t*\tlittle clue could be found .\nks08\t0\t*\tthe box contains little tool .\nks08\t0\t*\tjohn offers little suggestion .\nks08\t0\t*\tthe room contains little armchair .\nks08\t0\t*\tfew evidence was provided .\nks08\t0\t*\tfew equipment is available .\nks08\t0\t*\tthe room contains few furniture .\nks08\t0\t*\tjohn offers few advice .\nks08\t0\t*\tfew information was provided .\nks08\t1\t\tfew clues could be found .\nks08\t1\t\tjohn offers few suggestions .\nks08\t1\t\tthe room contains few armchairs .\nks08\t1\t\tthe president was hoping for a good cake .\nks08\t1\t\tthe bartender gave john some good beers .\nks08\t1\t\tno one knows how to tell from a good beer to a bad one .\nks08\t1\t\tmy pastor says i ate too much cake .\nks08\t1\t\tthe students drank too much beer last night .\nks08\t1\t\tpeople now drink less beer .\nks08\t1\t\tin english , the main verb agrees with the head element of the subject .\nks08\t0\t*\tthe recent strike by pilots have cost the country a great deal of money from tourism and so on .\nks08\t0\t*\tthe average age at which people begin to need eyeglasses vary considerably .\nks08\t0\t*\tdespite of his limited educational opportunities , abraham lincoln became one of the greatest intellectuals in the world .\nks08\t0\t*\ta pastor was executed , notwithstanding on many applications in favor of him .\nks08\t1\t\tvisiting relatives can be boring .\nks08\t1\t\the said that that ` that ' that that man used was wrong .\nks08\t1\t\tkim and sandy is looking for a new bicycle .\nks08\t1\t\ti have never put the book .\nks08\t1\t\tthe boat floated down the river sank .\nks08\t1\t\tchris must liking syntax .\nks08\t1\t\tthere is eager to be fifty students in this class .\nks08\t1\t\twhat is john eager to do ?\nks08\t1\t\twhat is john easy to do ?\nks08\t1\t\tis the boy who holding the plate can see the girl ?\nks08\t1\t\twhich chemical did you mix the hydrogen peroxide and ?\nks08\t1\t\tthere seem to be a good feeling developing among 
the students .\nks08\t1\t\tstrings have been pulled many times to get students into that university .\nks08\t1\t\the washed himself .\nks08\t0\t*\the washed herself .\nks08\t0\t*\the washed myself .\nks08\t0\t*\the washed ourselves .\nks08\t1\t\the washed me .\nks08\t1\t\the washed us .\nks08\t1\t\twash yourself .\nks08\t1\t\twash yourselves .\nks08\t0\t*\twash himself .\nks08\t1\t\twash me !\nks08\t1\t\tthe weather is lovely today .\nks08\t1\t\ti am hoping that the weather is lovely today .\nks08\t1\t\tthe birds are singing because the weather is lovely today .\nks08\t1\t\tthey read the book .\nks08\t1\t\the treats john very .\nks08\t1\t\the walked right the wall .\nks08\t1\t\tthey have no tv .\nks08\t1\t\tthey have no car .\nks08\t1\t\tthey have no information .\nks08\t1\t\tthey have no friend .\nks08\t0\t*\tthey have no went .\nks08\t0\t*\tthey have no old .\nks08\t0\t*\tthey have no and .\nks08\t1\t\tthey can sing .\nks08\t1\t\tthey can run .\nks08\t1\t\tthey can smile .\nks08\t1\t\tthey can cry .\nks08\t0\t*\tthey can happy .\nks08\t0\t*\tthey can down .\nks08\t0\t*\tthey can door .\nks08\t0\t*\tthey can very .\nks08\t1\t\tthey read the new book .\nks08\t1\t\tthey read the interesting book .\nks08\t1\t\tthey read the scientific book .\nks08\t0\t*\tthey read the sing book .\nks08\t0\t*\tthey read the under book .\nks08\t0\t*\tthey read the every book .\nks08\t1\t\the treats john very nicely .\nks08\t1\t\the treats john very badly .\nks08\t1\t\the treats john very kindly .\nks08\t0\t*\the treats john very kind .\nks08\t0\t*\the treats john very shame .\nks08\t1\t\the walked right into the wall .\nks08\t0\t*\the walked right happy .\nks08\t0\t*\the walked right the wall .\nks08\t1\t\tjohn sang a song , mary played the piano .\nks08\t1\t\twe found out that very lucrative jobs were in jeopardy .\nks08\t0\t*\tmy these jobs are in jeopardy .\nks08\t0\t*\tthe his jobs are in jeopardy .\nks08\t1\t\ti think learning english is not easy at all .\nks08\t1\t\ti doubt you 
can help me in understanding this .\nks08\t1\t\ti think that learning english is not all that easy .\nks08\t1\t\ti doubt if you can help me in understanding this .\nks08\t1\t\ti am anxious for you to study english grammar hard .\nks08\t0\t*\ti think that learning english to be not all that easy .\nks08\t0\t*\ti doubt if you to help me in understanding this .\nks08\t0\t*\ti am anxious for you should study english grammar hard .\nks08\t1\t\tjohn not leave .\nks08\t1\t\tjohn drink beer last night .\nks08\t1\t\tjohn leave for seoul tomorrow ?\nks08\t1\t\tjohn will study syntax , and mary , too .\nks08\t1\t\the left .\nks08\t1\t\the did not leave .\nks08\t1\t\tstudents wanted to write a letter .\nks08\t1\t\tstudents intended to surprise the teacher .\nks08\t1\t\tstudents objected to the teacher .\nks08\t1\t\tstudents sent letters to the teacher .\nks08\t1\t\tit is crucial for john to show an interest .\nks08\t1\t\tit is crucial that john should show an interest .\nks08\t1\t\ti know i should go to the dentist 's , but i just do n't want to .\nks08\t1\t\ti do n't really want to go to the dentist 's , but i know i should .\nks08\t0\t*\tshe thought it was likely that everyone to fit into the car .\nks08\t1\t\tshe thought it was likely that everyone might fit into the car .\nks08\t1\t\tshe thought it was easy for everyone to fit into the car .\nks08\t0\t*\tshe thought it was easy for everyone would fit into the car .\nks08\t1\t\tthe umpire called off the game .\nks08\t1\t\tthe umpire called the game off .\nks08\t1\t\tthe two boys looked the word up .\nks08\t1\t\tthe umpire fell off the deck .\nks08\t1\t\tthe two boys looked up the high stairs .\nks08\t1\t\tthe two boys looked up the high stairs from the floor .\nks08\t0\t*\tthe umpire fell the deck off .\nks08\t0\t*\tthe students looked the high stairs up from the floor .\nks08\t0\t*\tthe students looked the high stairs up .\nks08\t1\t\tthe umpire called it of .\nks08\t0\t*\tthe umpire called off it .\nks08\t0\t*\tthe umpire 
fell it off .\nks08\t1\t\tthe umpire fell off it .\nks08\t1\t\ta tall boy threw the ball .\nks08\t1\t\tthe cat chased the long string .\nks08\t1\t\tthat ball hit a student .\nks08\t1\t\tthe piano played a song .\nks08\t1\t\tthe piano kicked a student .\nks08\t1\t\tthat ball sang a student .\nks08\t1\t\tthe tall , handsome man kicked the ball .\nks08\t1\t\tthe tall , kind , handsome man kicked the ball .\nks08\t1\t\tthe happy , happy , happy , happy , happy , happy man sang a song .\nks08\t1\t\tthe mother of the boy and the girl is arriving soon .\nks08\t1\t\tthe mother of the boy and the girl are arriving soon .\nks08\t1\t\tjohn saw the man with a telescope .\nks08\t1\t\twe need more intelligent leaders .\nks08\t1\t\tthe student enjoyed his english syntax class last semester .\nks08\t1\t\tthe policeman met several young students in the park last night .\nks08\t1\t\tit was the policeman that met several young students in the park last night .\nks08\t1\t\tit was several young students that the policeman met in the park last night .\nks08\t1\t\tit was last night that the policeman met several young students in the park .\nks08\t0\t*\tit was several young students in that the policeman met the park last night .\nks08\t0\t*\tit was in the park last night that the policeman met several young students .\nks08\t1\t\twhere did the policeman meet several young students ?\nks08\t1\t\twhat did you put in your box ?\nks08\t1\t\twhere did you put the book ?\nks08\t1\t\twhat did you do ?\nks08\t1\t\tjohn looked up the inside of the chimney .\nks08\t1\t\tjohn looked up the meaning of ` chanson ' .\nks08\t1\t\twhat did he look up ?\nks08\t1\t\twhere did he look ?\nks08\t1\t\tup what did he look ?\nks08\t1\t\twhat do you think the man who is standing by the door is doing now ?\nks08\t1\t\twhat do you think he is doing now ?\nks08\t1\t\thave you been to seoul ?\nks08\t1\t\tjohn might go home , so might bill .\nks08\t1\t\tjohn might pass the exam , and as might bill .\nks08\t1\t\tif 
john can speak french fluently – which we all know he can – we will have no problems .\nks08\t1\t\tjohn asked me to put the clothes in the cupboard , and to annoy him i really stuffed them there .\nks08\t1\t\tjohn asked me to put the clothes in the cupboard , and to annoy him i stuffed them there .\nks08\t0\t*\tjohn asked me to put the clothes in the cupboard , but i did so put the clothes in the suitcase .\nks08\t1\t\tthe girls played in the water and swam under the bridge .\nks08\t1\t\tthe children were neither in their rooms nor on the porch .\nks08\t1\t\tmany people drink beer or wine .\nks08\t0\t*\tmary waited for the bus and to go home .\nks08\t0\t*\tlee went to the store and crazy .\nks08\t1\t\tliked ice cream .\nks08\t0\t*\tthe whistle tune was beautiful .\nks08\t0\t*\tthe easily student finished his homework .\nks08\t0\t*\tthe my dog is a terrier .\nks08\t1\t\tthe monkey wants to leave the meeting .\nks08\t0\t*\tthe monkey eager to leave the meeting .\nks08\t1\t\tthe monkeys approved of their leader .\nks08\t1\t\tthe men practice medicine .\nks08\t0\t*\tthe men doctors of medicine .\nks08\t1\t\tjohn read the book loudly .\nks08\t1\t\tjohn sounded happy .\nks08\t1\t\tjohn felt proud that his son won the game .\nks08\t0\t*\tjohn sounded happily .\nks08\t0\t*\tjohn sounded the student .\nks08\t0\t*\tjohn sounded in the park .\nks08\t0\t*\tthe monkeys seem want to leave the meeting .\nks08\t1\t\tthe monkeys seem eager to leave the meeting .\nks08\t0\t*\tjohn seems know about the bananas .\nks08\t1\t\tjohn seems certain about the bananas .\nks08\t1\t\tjohn came from seoul .\nks08\t1\t\tthey put the book in the box .\nks08\t1\t\tthey stayed in the hotel .\nks08\t1\t\tthe fly fell into the soup .\nks08\t1\t\tthe squirrel ran straight .\nks08\t1\t\tthe squirrel ran right up the tree .\nks08\t0\t*\tthe squirrel is right angry .\nks08\t0\t*\tthe squirrel ran straight quickly .\nks08\t0\t*\tthe squirrel ran right quickly .\nks08\t1\t\tthis handsome man chased a dog 
.\nks08\t1\t\ta man kicked that ball .\nks08\t1\t\tthat tall woman chased a cat .\nks08\t1\t\this friend kicked a ball .\nks08\t1\t\tbill claims john believes mary thinks tom is honest .\nks08\t1\t\tjane imagines bill claims john believes mary thinks tom is honest .\nks08\t1\t\tthe little boy hit the child with a toy .\nks08\t1\t\tchocolate cakes and pies are my favorite desserts .\nks08\t0\t*\tthe children were in their rooms or happily .\nks08\t1\t\tjohn suddenly got off the bus .\nks08\t1\t\tjohn suddenly put off the customers .\nks08\t0\t*\tjohn suddenly got the bus off .\nks08\t1\t\tjohn suddenly put the customers off .\nks08\t1\t\this second book came out earlier this year and became an instant best-seller .\nks08\t1\t\twhen you book something such as a hotel room , you arrange to have it .\nks08\t1\t\tprice quotes on selected categories will be sent out upon request .\nks08\t1\t\tno doubt that he was forced to leave his family against his will .\nks08\t1\t\the intended to will the large amount of money to frank .\nks08\t1\t\tjane stood aside to let her pass .\nks08\t1\t\the has a rail pass that 's right for you .\nks08\t1\t\tit is important for us to spend time with children .\nks08\t1\t\the was arrested for being drunk .\nks08\t1\t\ti think that person we met last week is insane .\nks08\t1\t\twe believe that he is quite reasonable .\nks08\t1\t\ti forgot to return the book that i borrowed from the teacher .\nks08\t1\t\ti am anxious that you should arrive on time .\nks08\t1\t\ti am anxious for you to arrive on time .\nks08\t0\t*\ti am anxious for you should arrive on time .\nks08\t1\t\ti wonder whether you 'd be kind enough to give us information .\nks08\t1\t\tif students study hard , teachers will be happy .\nks08\t1\t\twhether they say it or not , most teachers expect their students to study hard .\nks08\t1\t\tjohn put a book on the table .\nks08\t1\t\tshe turned down his offer .\nks08\t1\t\the looked at a book about swimming .\nks08\t1\t\the talked to a 
girl about swimming .\nks08\t1\t\the talked with a girl about swimming .\nks08\t1\t\ti do n't know the people present .\nks08\t0\t*\tcould you turn off the fire and on the light ?\nks08\t0\t*\ti know the truth and that you are innocent .\nks08\t1\t\tjohn refused the offer proudly .\nks08\t1\t\ti consider john the best candidate .\nks08\t1\t\ti saw him leaving the main building .\nks08\t1\t\the took john to the school by the park .\nks08\t1\t\tjohn sang a song and danced to the music .\nks08\t1\t\tjohn wants to study linguistics in near future .\nks08\t1\t\tthey told angelica to arrive early for the award .\nks08\t1\t\tthat louise had abandoned the project surprised everyone .\nks08\t1\t\ti know you like the back of my hand .\nks08\t1\t\ttime flies like an arrow .\nks08\t1\t\ti need to have that report on our web page by tomorrow .\nks08\t1\t\tthe monkey scratched a boy on monday .\nks08\t1\t\tjohn tagged the monkey in the forest .\nks08\t1\t\tthe monkey was tagged in the forest by john .\nks08\t1\t\tthe cat devoured the rat .\nks08\t1\t\tthe rat devoured the cat .\nks08\t1\t\tthis car stinks .\nks08\t1\t\tit rains .\nks08\t1\t\tthe committee disliked her proposal .\nks08\t1\t\tthese books disappoint me .\nks08\t1\t\tour neighbor takes his children to school in his car .\nks08\t0\t*\tour neighbor take his children to school in his car .\nks08\t1\t\tthe book , including all the chapters in the first section , is very interesting .\nks08\t0\t*\tthe book , including all the chapters in the first section , are very interesting .\nks08\t1\t\tthe effectiveness of teaching and learning depends on several factors .\nks08\t0\t*\tthe effectiveness of teaching and learning depend on several factors .\nks08\t1\t\tthe tornadoes that tear through this county every spring are more than just a nuisance .\nks08\t0\t*\tthe tornadoes that tear through this county every spring is more than just a nuisance .\nks08\t1\t\tthe lady singing with a boy is a genius , is n't he 
?\nks08\t0\t*\tthe lady singing with a boy is a genius , is n't she ?\nks08\t1\t\twith their teacher , the kids have arrived safely , have n't they ?\nks08\t0\t*\twith their teacher , the kids have arrived safely , has n't he ?\nks08\t1\t\tthe kids have arrived safely .\nks08\t1\t\tit could be more detrimental .\nks08\t1\t\tis this teacher a genius ?\nks08\t1\t\thave the kids arrived safely ?\nks08\t1\t\tcould it be more detrimental ?\nks08\t1\t\tthe kids in our class have arrived safely .\nks08\t0\t*\thave in our class the kids arrived safely ?\nks08\t1\t\this girlfriend bought this computer .\nks08\t1\t\tthunder frightens the dog .\nks08\t1\t\tthe dog fears thunder .\nks08\t1\t\this girlfriend bought this computer for him .\nks08\t1\t\tthe child broke the teapot by accident .\nks08\t1\t\tthis computer was bought for him by his girlfriend .\nks08\t1\t\tthe teapot was broken by the child by accident .\nks08\t1\t\tthis item belongs to the student .\nks08\t0\t*\tthe student is belonged to by this item .\nks08\t1\t\the remained a good friend to me .\nks08\t0\t*\ta good friend is remained to me .\nks08\t1\t\tjohn gave the boys the cds .\nks08\t1\t\tmy mother baked me a birthday cake .\nks08\t1\t\tshe was sent a review copy of the book by the publisher .\nks08\t1\t\tshe was sent a review copy of the book .\nks08\t1\t\tjohn gave the cds to the boys .\nks08\t1\t\tthe publisher sent a review copy of the book to her .\nks08\t1\t\tmy mother baked a cake for me .\nks08\t1\t\tthe cds were given to the boys by john .\nks08\t1\t\ta review copy of the book was sent to her by the publisher .\nks08\t1\t\tthis nice cake was baked for me by my mother .\nks08\t1\t\tthis is my ultimate goal .\nks08\t1\t\tmichelle became an architect .\nks08\t1\t\tthey elected graham chairman .\nks08\t0\t*\tchairman was elected graham .\nks08\t0\t*\tthe best writer was considered andrew .\nks08\t1\t\tjohn made kim a great doll .\nks08\t1\t\tthe situation became terrible .\nks08\t1\t\tthis map is what he 
wants .\nks08\t1\t\tthe message was that you should come on time .\nks08\t1\t\ti made kim angry .\nks08\t1\t\ti consider him immoral .\nks08\t1\t\ti regard andrew as the best writer .\nks08\t1\t\tthey spoil their kids rotten .\nks08\t1\t\tjohn put books in the box .\nks08\t1\t\tjohn talked to bill about the exam .\nks08\t1\t\tshe reminded him of the last time they met .\nks08\t1\t\tthey would inform mary of any success they have made .\nks08\t1\t\tjohn gave a book to the student .\nks08\t1\t\tjohn bought a book for the student .\nks08\t1\t\tthe bus stopped suddenly .\nks08\t1\t\tshakespeare wrote his plays a long time ago .\nks08\t1\t\tthey went to the theater in london .\nks08\t1\t\the failed chemistry because he ca n't understand it .\nks08\t0\t*\tjohn gave tom a book a record .\nks08\t1\t\ti saw this film several times last year during the summer .\nks08\t1\t\tmy uncle visited today .\nks08\t0\t*\ttoday was visited by my uncle .\nks08\t1\t\tthe termites destroyed the sand castle .\nks08\t1\t\tbeing honest is not an easy task .\nks08\t1\t\tthat john passed surprised her .\nks08\t1\t\tto finish this work on time is almost unexpected .\nks08\t1\t\tunder the bed is a safe place to hide .\nks08\t1\t\ti sent a surprise present to john .\nks08\t1\t\tthey wondered what she did yesterday .\nks08\t1\t\tthey believed that everybody would pass the test .\nks08\t1\t\tare you going on holiday before or after easter ? 
i prefer after easter .\nks08\t1\t\tthat john passed surprised her , did n't it ?\nks08\t1\t\tthat the march should go ahead and that it should be cancelled have been argued by different people at different times .\nks08\t0\t*\tthat the march should go ahead and that it should be cancelled has been argued by different people at different times .\nks08\t1\t\tto finish it on time made quite a statement , did n't it ?\nks08\t1\t\tto delay the march and to go ahead with it have been argued by different people at different times .\nks08\t0\t*\tto delay the march and to go ahead with it has been argued by different people at different times .\nks08\t1\t\tthe little cat devoured a mouse last night .\nks08\t1\t\tjohn left very early .\nks08\t1\t\tjohn studied hard to pass the exam .\nks08\t1\t\tshe disappeared when the main party arrived .\nks08\t1\t\ta boy hit the ball .\nks08\t1\t\tthe students felt comfortable in the class .\nks08\t1\t\tjohn gave a book to the students .\nks08\t1\t\tjohn died last night .\nks08\t1\t\tjohn bought a lot of books for his sons .\nks08\t1\t\tjohn promised bill to leave tomorrow morning .\nks08\t1\t\tjohn deprived his sons of game cards .\nks08\t1\t\tmary received an award from the department .\nks08\t1\t\tjohn told the rumor to his friend .\nks08\t1\t\tjohn put his books in the attic .\nks08\t1\t\tthe government kept all the money .\nks08\t1\t\tjohn hit the ball with a bat .\nks08\t1\t\tjohn wiped the window with a towel .\nks08\t1\t\tthe cat chased pat the mouse .\nks08\t1\t\tthe mouse was chased by the cat .\nks08\t1\t\tthere still remains an issue to be solved .\nks08\t1\t\tthere lived a man with his grandson .\nks08\t1\t\tthere arrived a tall , red haired and incredibly well dressed man .\nks08\t0\t*\tthere sang a man with a pipe .\nks08\t0\t*\tthere dances a man with an umbrella .\nks08\t1\t\tjohn resembles his mother .\nks08\t1\t\ta is similar to b .\nks08\t1\t\tjohn runs into the house .\nks08\t1\t\tmary looked at the sky 
.\nks08\t1\t\tthe school awarded a few of the girls in miss kim 's class scholarships .\nks08\t1\t\tshe was the nicest teacher in the senior school .\nks08\t1\t\tthey elected him america 's 31st president .\nks08\t1\t\tthe next morning we set out for seoul .\nks08\t1\t\tdoing syntax is not easy .\nks08\t1\t\the saw the man with the stick .\nks08\t1\t\tthey parted the best of friends .\nks08\t1\t\tin the summer we always go to france .\nks08\t1\t\tlast year i saw this film several times .\nks08\t1\t\the baked tom the bread last night .\nks08\t1\t\tthat they have completed the course is amazing .\nks08\t1\t\tthe teacher made students happy .\nks08\t1\t\twe reminded him of the agreement .\nks08\t1\t\tin the garden stands a statue .\nks08\t0\t*\tin the garden stand a statue .\nks08\t1\t\tamong the guests was sitting my friend louise .\nks08\t0\t*\tamong the guests were sitting my friend louise .\nks08\t1\t\tthis proved my hypothesis .\nks08\t1\t\tthe students all enjoyed that summer .\nks08\t1\t\tthe students all worked that summer .\nks08\t1\t\tthe scientist made her a robot .\nks08\t1\t\tthe students called me a teacher .\nks08\t1\t\ta big green insect flew into the soup .\nks08\t1\t\tjohn 's mother sent a letter to mary .\nks08\t1\t\twe placed the cheese in the refrigerator .\nks08\t1\t\tfrank threw himself into the sofa .\nks08\t1\t\tthe ice melted .\nks08\t1\t\tthe vacuum cleaner frightens the child .\nks08\t1\t\tscientists found that the birds sang well in the evenings , but performed badly in the mornings .\nks08\t0\t*\tjohn put his gold .\nks08\t0\t*\tjohn put his gold safe .\nks08\t0\t*\tjohn put his gold to be under the bathtub .\nks08\t1\t\tjohn put his gold under the bathtub .\nks08\t1\t\tthis is the box in which john put his gold .\nks08\t1\t\tthis is the gold that john put under the bathtub .\nks08\t0\t*\tthe king kept put his gold under the bathtub .\nks08\t1\t\tthe king kept putting his gold under the bathtub .\nks08\t1\t\tthe defendant denied the 
accusation .\nks08\t0\t*\tthe defendant denied .\nks08\t1\t\tthe teacher handed the student a book .\nks08\t0\t*\tthe teacher handed the student .\nks08\t1\t\tthey want to leave the meeting .\nks08\t0\t*\tthey eager to leave the meeting .\nks08\t1\t\tthe senators know that the president is telling a lie .\nks08\t0\t*\tthe senators certain that the president is telling a lie .\nks08\t1\t\tbe eager to leave the meeting .\nks08\t0\t*\tthe senators to be certain that the president is telling a lie .\nks08\t0\t*\tthe senators be certain that the president is telling a lie .\nks08\t1\t\ttom offered advice to his students in his office .\nks08\t1\t\ttom offered advice to his students with love .\nks08\t1\t\tjohn kept him behind the garage .\nks08\t0\t*\tjohn stayed kim behind the garage .\nks08\t0\t*\tjohn placed him busy .\nks08\t1\t\tjohn kept him busy .\nks08\t0\t*\tjohn stayed him busy .\nks08\t0\t*\tjohn placed behind the counter .\nks08\t0\t*\tjohn kept behind the counter .\nks08\t1\t\tjohn stayed behind the counter .\nks08\t1\t\tjohn deposited some money in the bank .\nks08\t1\t\tjohn deposited some money in the bank on friday .\nks08\t0\t*\tthe un blamed global warming on humans on natural causes .\nks08\t1\t\tkim and sandy met in seoul in the lobby of the lotte hotel in march .\nks08\t1\t\tjohn deposited some money in the checking account and mary did the same thing .\nks08\t1\t\tjohn deposited some money in the checking account on friday and mary did the same thing .\nks08\t1\t\tjohn deposited some money in the checking account on friday and mary did the same thing on monday .\nks08\t0\t*\tjohn deposited some money in the checking account and mary did the same thing in the savings account .\nks08\t0\t*\tjohn gave a present to the student and mary did the same thing to the teacher .\nks08\t0\t*\tjohn locked fido in the garage and mary did so in the room .\nks08\t0\t*\tjohn ate a carrot and mary did so a radish .\nks08\t1\t\tkim jogs on the hill .\nks08\t1\t\tkim 
jogs under the hill .\nks08\t1\t\tkim jogs over the hill .\nks08\t1\t\tkim depends .\nks08\t1\t\tkim relies on sandy .\nks08\t1\t\tkim depends on sandy .\nks08\t0\t*\tkim depends at sandy .\nks08\t1\t\tjohn met a student in the park .\nks08\t0\t*\tjohn met in the park a student .\nks08\t0\t*\tthe problem disappeared the accusation .\nks08\t1\t\tthe problem disappeared .\nks08\t0\t*\tthe boy gave the book .\nks08\t1\t\tthe boy gave the baby the book .\nks08\t1\t\tthe bird devours the worm .\nks08\t1\t\tthe birds devour the worm .\nks08\t1\t\tevery photo of max and sketch by his students appeared in the magazine .\nks08\t1\t\tno photo of max and sketch by his students appeared in the magazine .\nks08\t0\t*\tsketch by his students appeared in the magazine .\nks08\t1\t\tthe present king of country music is more popular than the last one .\nks08\t0\t*\tthe king of rock and roll is more popular than the one of country music .\nks08\t1\t\twhich student were you talking about ?\nks08\t0\t*\tjohn put in the box .\nks08\t0\t*\tin the box put john the book .\nks08\t1\t\tthe election results surprised everybody .\nks08\t1\t\tthat he won the election surprised everybody .\nks08\t1\t\tjohn disappeared .\nks08\t0\t*\tjohn disappeared bill .\nks08\t1\t\tjohn coughed .\nks08\t0\t*\tjohn coughed the money .\nks08\t1\t\tthe president looked weary .\nks08\t1\t\tthe teacher became tired of the students .\nks08\t1\t\tthe lasagna tasted delicious .\nks08\t1\t\tjohn remained somewhat calm .\nks08\t1\t\tthe jury seemed ready to leave .\nks08\t1\t\tjohn became a success .\nks08\t1\t\tjohn seemed a fool .\nks08\t1\t\tjohn remained a student .\nks08\t1\t\tjohn saw fred .\nks08\t1\t\talice typed the letter .\nks08\t1\t\tclinton supported the health care bill .\nks08\t1\t\traccoons destroyed the garden .\nks08\t1\t\tthe school board leader asked the students a question .\nks08\t1\t\tjohn taught new students english syntax .\nks08\t1\t\tthe school board leader asked a question of the students 
.\nks08\t1\t\tthe sexual revolution makes some people uncomfortable .\nks08\t1\t\tad agencies call young people generation x-ers .\nks08\t1\t\thistorians believe fdr to be our most effective president .\nks08\t0\t*\tjohn carried to the door .\nks08\t1\t\ttom locked fido in the garage .\nks08\t1\t\ttom bathed fido in the garage .\nks08\t1\t\ttom placed it under the table .\nks08\t1\t\ttom played it under the table .\nks08\t1\t\ti wonder if you will come back tomorrow .\nks08\t1\t\tyou would have a reply if you come back tomorrow .\nks08\t1\t\ttom hid the manuscript in the cupboard .\nks08\t1\t\tfred hired sharon to change the oil .\nks08\t1\t\tthey pushed the prisoners into the truck .\nks08\t1\t\tfrank hopes to persuade harry to make the cook wash the dishes .\nks08\t1\t\tgeorge mailed the attorney his photograph of the accident .\nks08\t1\t\ttom keeps asking karen 's sister to buy the car .\nks08\t1\t\tjane left the book on the table .\nks08\t1\t\twe have not confirmed whether the flight had been booked .\nks08\t1\t\twe saw him beaten by the champion .\nks08\t1\t\tthey confined his remarks to the matter under discussion .\nks08\t0\t*\toliver ascribed his longevity there .\nks08\t0\t*\toliver mentioned charles the problem .\nks08\t0\t*\toliver fined ten pounds to the prisoner .\nks08\t0\t*\toliver drove me a lunatic .\nks08\t0\t*\toliver addressed the king the letter .\nks08\t1\t\tthe students of english from seoul faced many issues in the process of interpreting , transcribing , and editing the poems .\nks08\t1\t\tthe love of my life and father of my children would never do such a thing .\nks08\t1\t\tthe museum displayed no painting by miro or drawing by klee .\nks08\t1\t\tby law , every dog and cat in the area has to be neutered .\nks08\t1\t\tlearning to use a language freely and fully is a lengthy and arduous process .\nks08\t1\t\tkim put the book in the box .\nks08\t0\t*\tkim put the book .\nks08\t0\t*\tis putting the book in the box .\nks08\t0\t*\ttalked with 
bill about the exam .\nks08\t1\t\tthey wrote to her .\nks08\t1\t\tthey are kind to her .\nks08\t1\t\tthey want to write to her .\nks08\t0\t*\tthey want to wrote to her .\nks08\t1\t\tthey want to be kind to her .\nks08\t0\t*\tthey want to are kind to her .\nks08\t1\t\tthe student knows the answers .\nks08\t1\t\tthe student knew the answers .\nks08\t1\t\tthe students know the answers .\nks08\t0\t*\tthe student knowing the answers .\nks08\t0\t*\tthe student known the answers .\nks08\t1\t\the is writing another long book about beavers .\nks08\t1\t\tbroadly speaking , the project was successful .\nks08\t1\t\the is proud of his son 's passing the bar exam .\nks08\t1\t\tthe chicken has eaten .\nks08\t1\t\tthe chicken was eaten .\nks08\t1\t\tseen from this perspective , there is no easy solution .\nks08\t1\t\tthe monkeys kept forgetting their lines .\nks08\t0\t*\tthe monkeys kept forgot their lines .\nks08\t0\t*\tthe monkeys kept forgotten their lines .\nks08\t0\t*\twe caught them ate the bananas .\nks08\t0\t*\twe caught them eat the bananas .\nks08\t0\t*\twe caught them eaten the bananas .\nks08\t1\t\tjohn made mary cook korean food .\nks08\t0\t*\tjohn made mary to cook korean food .\nks08\t0\t*\tjohn made mary cooking korean food .\nks08\t1\t\tthe monkey seems despondent that it is in a cage .\nks08\t1\t\tthe monkey seems despondent .\nks08\t1\t\the seems intelligent to study medicine .\nks08\t0\t*\the seems intelligent to study medicine .\nks08\t1\t\tmonkeys are eager to leave .\nks08\t0\t*\tmonkeys are eager leaving the compound .\nks08\t0\t*\tthe chickens seem fond with the farmer .\nks08\t1\t\tthe foxes seem compatible with the chickens .\nks08\t0\t*\tthe foxes seem compatible for the chickens .\nks08\t1\t\tthese are similar to the bottles .\nks08\t0\t*\tthese are similar with the bottles .\nks08\t1\t\tthe teacher is proud of his students .\nks08\t0\t*\tthe teacher is proud with his students .\nks08\t1\t\tthe contract is subject to approval by my committee 
.\nks08\t0\t*\tthe contract is subject for approval by my committee .\nks08\t1\t\tthere exists only one truly amphibian mammal .\nks08\t1\t\tthere arose a great storm .\nks08\t1\t\tthere exist few solutions which are cost-effective .\nks08\t1\t\tthere is a riot in the park .\nks08\t1\t\tthere remained just a few problems to be solved .\nks08\t0\t*\tthere runs a man in the park .\nks08\t0\t*\tthere sings a man loudly .\nks08\t1\t\tthey believe that charles darwin 's theory of evolution is just a scientific theory .\nks08\t1\t\tthey believe charles darwin 's theory of evolution is just a scientific theory .\nks08\t1\t\tjohn demanded that she stop phoning him .\nks08\t1\t\tjoe warned the class that the exam would be difficult .\nks08\t1\t\twe told tom that he should consult an accountant .\nks08\t1\t\tmary convinced me that the argument was sound .\nks08\t1\t\ttom intends for sam to review that book .\nks08\t1\t\tjohn would prefer for the children to finish the oatmeal .\nks08\t1\t\tfor john to either make up such a story or repeat it is outrageous .\nks08\t1\t\tfor john either to make up such a story or to repeat it is outrageous .\nks08\t1\t\tfor john to tell bill such a lie and bill to believe it is outrageous .\nks08\t1\t\tjohn intends to review the book .\nks08\t1\t\tjohn would prefer to finish the oatmeal .\nks08\t1\t\ttom tried to ask a question .\nks08\t0\t*\ttom tried for bill to ask a question .\nks08\t1\t\ttom tends to avoid confrontations .\nks08\t0\t*\ttom tends for mary to avoid confrontations .\nks08\t1\t\tjoe hoped to find a solution .\nks08\t0\t*\tjoe hoped for beth to find a solution .\nks08\t1\t\tjohn believed it .\nks08\t1\t\tjohn believed that he is honest .\nks08\t1\t\tjohn mentioned the issue to me .\nks08\t1\t\tjohn mentioned to me that the question is an issue .\nks08\t1\t\tshe pinched his arm as hard as she could .\nks08\t0\t*\tshe pinched that he feels pain .\nks08\t1\t\twe hope that such a vaccine could be available in ten years 
.\nks08\t0\t*\twe hope the availability of such a vaccine in ten years .\nks08\t1\t\tcohen proved the independence of the continuum hypothesis .\nks08\t1\t\tcohen proved that the continuum hypothesis was independent .\nks08\t1\t\tjohn bothers me .\nks08\t1\t\tthat john coughed bothers me .\nks08\t1\t\tjohn loves bill .\nks08\t0\t*\tthat john coughs loves bill .\nks08\t1\t\tthat john sold the ostrich surprised bill .\nks08\t1\t\tfor john to train his horse would be desirable .\nks08\t1\t\tto train his horse would be desirable .\nks08\t1\t\tthat the king or queen be present is a requirement on all royal weddings .\nks08\t1\t\twhich otter you should adopt first is unclear .\nks08\t0\t*\tthat tom missed the lecture was enjoyable .\nks08\t0\t*\tfor john to remove the mother is undeniable .\nks08\t0\t*\thow much money gordon spent is true .\nks08\t1\t\ttom is confident that the elephants respect him .\nks08\t1\t\ttom is insistent that the defendants be truthful .\nks08\t1\t\ttom seems eager for her brother to catch a cold .\nks08\t1\t\ttom seems eager to catch a cold .\nks08\t1\t\ti am ashamed that i neglected you .\nks08\t1\t\ti am delighted that mary finished his thesis .\nks08\t1\t\twe were thankful that no one had been hurt .\nks08\t1\t\twe were glad it was over .\nks08\t1\t\tbill alleged that fred signed the check .\nks08\t1\t\twe believe that the directors were present .\nks08\t1\t\twe convinced him that the operation is safe .\nks08\t0\t*\talan is thinking about that his students are eager to learn english .\nks08\t0\t*\tfred is counting on for tom to make an announcement .\nks08\t1\t\tthe outcome depends on how many candidates participate in the election .\nks08\t1\t\tfred is thinking about whether he should stay in seoul .\nks08\t1\t\tthe offer made smith admire the administrators .\nks08\t1\t\tjohn tried to make sam let george ask bill to keep delivering the mail .\nks08\t1\t\tjohn enjoyed drawing trees for his syntax homework .\nks08\t1\t\tthe picture on the 
wall reminded him of his country .\nks08\t1\t\tfree enterprise is compatible with american values and traditions .\nks08\t1\t\twe need to be in frequent contact with the clients .\nks08\t1\t\tacknowledge that everyone has limits .\nks08\t1\t\twe are aware of the existing problems .\nks08\t0\t*\twhy do n't you leaving me concentrate on my work ?\nks08\t0\t*\tthe general commended that all troops was in dress uniform .\nks08\t0\t*\tmy morning routine features swim free styles slowly for one hour .\nks08\t0\t*\tyou should avoid to travel in the rush hour .\nks08\t0\t*\tyou should attempt answering every question .\nks08\t0\t*\tthe authorities blamed greenpeace with the bombing .\nks08\t0\t*\tthe authorities charged the students of the cheating .\nks08\t0\t*\tsharon has been eager finishing the book .\nks08\t0\t*\twe respect mary 's desire for becoming famous .\nks08\t0\t*\tjohn referred from the building .\nks08\t0\t*\tjohn died to heart disease .\nks08\t0\t*\twe were glad what to do .\nks08\t0\t*\tshe was busy to make lunch .\nks08\t1\t\tthe constant rain forced the abandonment of the next day 's competitions .\nks08\t1\t\taloe may have an analgesic effect on inflammation and minor skin irritations .\nks08\t1\t\tthe public never had faith in his ability to handle the job .\nks08\t1\t\the repeated his claim that the people backed his action .\nks08\t1\t\twe made them take the money .\nks08\t0\t*\twe made them are rude .\nks08\t1\t\tdo not use these words in the beginning of a sentence .\nks08\t1\t\twe know the defendants seem eager to testify against the criminal .\nks08\t1\t\tjane is n't sure whether the students keep the books .\nks08\t0\t*\tbook is available in most countries .\nks08\t0\t*\tstudent studies english for 4 hours a day .\nks08\t1\t\tstudents study english for 4 hours a day .\nks08\t1\t\this friend learned dancing .\nks08\t1\t\tmy bother 's friend learned dancing .\nks08\t1\t\tthe president 's bodyguard learned surveillance .\nks08\t1\t\tthe king of 
rock and roll 's records led to dancing .\nks08\t1\t\tpresident lincoln delivered his gettysburg address in 1863 .\nks08\t0\t*\tpresident lincoln delivered her gettysburg address in 1863 .\nks08\t0\t*\tafter reading the pamphlet , judy threw them into the garbage can .\nks08\t1\t\tafter the party , i asked myself why i had faxed invitations to everyone in my office building .\nks08\t1\t\tedward usually remembered to send a copy of his e-mail to himself .\nks08\t1\t\tno john smiths attended the meeting .\nks08\t1\t\tthis john smith lives in seoul .\nks08\t1\t\tthere are three davids in my class .\nks08\t1\t\tit 's nothing like the america i remember .\nks08\t1\t\tmy brother is an einstein at maths .\nks08\t1\t\tin the book , he talks about his ups and downs at school .\nks08\t1\t\tif john wants to succeed in corporate life , he has to know the rules of the game .\nks08\t1\t\tthe critique of plato 's republic was written from a contemporary point of view .\nks08\t1\t\tthe characters in shakespeare 's twelfth night live in a world that has been turned upside-down .\nks08\t0\t*\tthe characters in shakespeare 's twelfth night lives in a world that has been turned upside-down .\nks08\t1\t\tstudents studying english read conrad 's heart of darkness while at university .\nks08\t1\t\tyou is the only person that i can rely on .\nks08\t0\t*\tyou are the only person that i can rely on .\nks08\t1\t\the is the only person that i can rely on .\nks08\t0\t*\the are the only person that i can rely on .\nks08\t1\t\tthe boy swims .\nks08\t0\t*\tthe boys swim .\nks08\t1\t\tking prawns cooked in chili salt and pepper was very much better , a simple dish deliciously executed .\nks08\t1\t\tfour pounds was quite a bit of money in 1950 and it was not easy to come by .\nks08\t1\t\tfive pounds is a lot of money .\nks08\t0\t*\tfive pounds are a lot of money .\nks08\t1\t\ttwo drops sanitize anything in your house .\nks08\t0\t*\ttwo drops sanitize anything in your house .\nks08\t1\t\tfifteen 
dollars in a week is much .\nks08\t0\t*\tfifteen dollars in a week are not much .\nks08\t1\t\tfifteen years represents a long period of his life .\nks08\t0\t*\tfifteen years represent a long period of his life .\nks08\t1\t\ttwo miles is as far as they can walk .\nks08\t0\t*\ttwo miles are as far as they can walk .\nks08\t1\t\tthis government have been more transparent in the way they have dealt with public finances than any previous government .\nks08\t1\t\tthis government has been more transparent in the way they have dealt with public finances than any previous government .\nks08\t1\t\tin preparation for the return fixture this team has trained more efficiently than they had in recent months .\nks08\t1\t\tshe does n't believe much of that story .\nks08\t1\t\twe listened to as little of his speech as possible .\nks08\t1\t\thow much of the fresco did the flood damage ?\nks08\t0\t*\tshe does n't believe much story .\nks08\t0\t*\twe listened to as little speech as possible .\nks08\t0\t*\thow much fresco did the flood damage ?\nks08\t0\t*\ti read some book .\nks08\t1\t\tone of the people was dying of thirst .\nks08\t1\t\tmany of the people were dying of thirst .\nks08\t0\t*\tone people was dying of thirst .\nks08\t1\t\tmany people were dying of thirst .\nks08\t1\t\teach of the suggestions is acceptable .\nks08\t1\t\tneither of the cars has air conditioning .\nks08\t1\t\tnone of these men wants to be president .\nks08\t1\t\tmost of the children are here .\nks08\t1\t\tsome of the soup needs more salt .\nks08\t1\t\tsome of the diners need menus .\nks08\t1\t\tall of the land belongs to the government .\nks08\t1\t\tall of these cars belong to me .\nks08\t1\t\tjohn is in the room .\nks08\t1\t\ti am fond of him .\nks08\t1\t\tmost of john 's boat has been repainted .\nks08\t1\t\tsome of the record contains evidence of wrongdoing .\nks08\t1\t\tmuch of that theory is unfounded .\nks08\t0\t*\tone of the story has appeared in your newspaper .\nks08\t1\t\the is afraid of foxes 
.\nks08\t1\t\tit is a wooden desk .\nks08\t1\t\tit is the main street .\nks08\t0\t*\tit is an alive fish .\nks08\t0\t*\tthey are afraid people .\nks08\t0\t*\tthis objection is main .\nks08\t0\t*\tthis fact is key .\nks08\t1\t\tthe man eager to start the meeting is john 's sister .\nks08\t1\t\tthe man holding the bottle disappeared .\nks08\t1\t\tthe papers removed from the safe have not been found .\nks08\t1\t\tthe money that you gave me disappeared last night .\nks08\t0\t*\tjohn in the doorway waved to his father .\nks08\t0\t*\the in the doorway waved to his father .\nks08\t1\t\tand index values of the subject and the main verb .\nks08\t1\t\tneither of these men is worthy to lead italy .\nks08\t1\t\tnone of his customary excuses suffices edgar now .\nks08\t1\t\tone of the problems was the robins .\nks08\t1\t\tall of the plant virus web sites have been conveniently collected in one central location .\nks08\t1\t\tsome of the water from melted snow also goes into the ground for plants .\nks08\t1\t\tmost of the milk your baby consumes during breastfeeding is produced during nursing .\nks08\t1\t\tall special rights of voting in the election were abolished .\nks08\t1\t\tone of major factors affecting the value of diamonds was their weight .\nks08\t1\t\teach of these stones has to be cut and polished .\nks08\t1\t\tmost of her free time was spent attending concerts and plays or visiting museums and art galleries .\nks08\t1\t\tthe committee was unanimous in their decision .\nks08\t1\t\tthe committee have all now resigned .\nks08\t0\t*\tthe committee has all now resigned .\nks08\t1\t\tthe crew have both agreed to change sponsor .\nks08\t0\t*\tthe crew has both agreed to change sponsor .\nks08\t1\t\ther family is all avid skiers .\nks08\t0\t*\ther family are all avid skiers .\nks08\t0\t*\ta variety of styles have been in vogue for the last year .\nks08\t1\t\tboth of the workers will wear carnations .\nks08\t1\t\tboth the workers will wear carnations .\nks08\t1\t\tboth will 
wear carnations .\nks08\t1\t\tfew doctors approve of our remedy .\nks08\t1\t\tfew approve of our remedy .\nks08\t1\t\tan example of these substances be tobacco .\nks08\t1\t\tthe effectiveness of teaching and learning depend on several factors .\nks08\t1\t\tone of the most serious problems that some students have be lack of motivation .\nks08\t1\t\tten years be a long time to spend in prison .\nks08\t1\t\teveryone of us be given a prize .\nks08\t1\t\tsome of the fruit be going bad .\nks08\t1\t\tall of his wealth come from real estate investments .\nks08\t1\t\tdo some of your relatives live nearby ?\nks08\t1\t\tfifty pounds seem like a lot of weight to lose in one year .\nks08\t1\t\tnews of persephone and demeter reach the great gods and goddesses of olympus .\nks08\t1\t\thalf of the year be dark and wintry .\nks08\t1\t\tsome of the promoters of ostrich meat compare its taste to beef tenderloin .\nks08\t1\t\tthe committee has n't yet made up its mind .\nks08\t0\t*\tthe committee has n't yet made up their mind .\nks08\t1\t\tthe committee have n't yet made up their mind .\nks08\t0\t*\tthe committee have n't yet made up its mind .\nks08\t0\t*\tthat dog is so ferocious , it even tried to bite himself .\nks08\t0\t*\ti washed me .\nks08\t1\t\ti washed myself .\nks08\t0\t*\tyou washed myself .\nks08\t1\t\ti washed you .\nks08\t1\t\the kicked you .\nks08\t0\t*\ti washed yourself .\nks08\t1\t\tyou washed yourself .\nks08\t1\t\tharry says that sally dislikes him .\nks08\t0\t*\tharry says that sally dislikes himself .\nks08\t0\t*\tsally wishes that everyone would praise herself .\nks08\t1\t\tsally believes that she is brilliant .\nks08\t0\t*\tsally believes that herself is brilliant .\nks08\t1\t\tthe power of your mind and the power of your body have a tight connection .\nks08\t1\t\tjohn tries to fix the computer .\nks08\t1\t\tjohn seems to fix the computer .\nks08\t1\t\tmary persuaded john to fix the computer .\nks08\t1\t\tmary expected john to fix the computer 
.\nks08\t0\t*\tjohn to fix the computer .\nks08\t0\t*\tseems john to fix the computer .\nks08\t1\t\tjohn tries to be honest .\nks08\t1\t\tjohn seems to be honest .\nks08\t1\t\tjohn makes efforts for himself to be honest .\nks08\t1\t\tit seems that john is honest .\nks08\t1\t\tit tends to be warm in september .\nks08\t1\t\tit seems to bother kim that they resigned .\nks08\t0\t*\tit tries to be warm in september .\nks08\t0\t*\tit hopes to bother kim that they resigned .\nks08\t1\t\tit is easy to please kim .\nks08\t1\t\tjohn is eager to please kim .\nks08\t0\t*\tthere tries to be warm in september .\nks08\t0\t*\tthere hopes to bother kim that they resigned .\nks08\t0\t*\tit is eager to please kim .\nks08\t1\t\tstephen seemed to be intelligent .\nks08\t1\t\tit seems to be easy to fool ben .\nks08\t1\t\tthere is likely to be a letter in the mailbox .\nks08\t1\t\ttabs are likely to be kept on participants .\nks08\t0\t*\tjohn seems to be easy to fool ben .\nks08\t0\t*\tjohn is likely to be kept on participants .\nks08\t1\t\tsandy tried to eat oysters .\nks08\t0\t*\tthere tried to be riots in seoul .\nks08\t0\t*\tit tried to bother me that chris lied .\nks08\t0\t*\ttabs try to be kept on bob by the fbi .\nks08\t0\t*\tthat he is clever is eager to be obvious .\nks08\t1\t\tthe king thanked the man .\nks08\t1\t\tthe color red seems to be his favorite color .\nks08\t1\t\tthe cat seems to be out of the bag .\nks08\t1\t\tthe dentist is likely to examine pat .\nks08\t1\t\tpat is likely to be examined by the dentist .\nks08\t1\t\tthe dentist is eager to examine pat .\nks08\t1\t\tpat is eager to be examined by the dentist .\nks08\t1\t\tstephen believed ben to be careful .\nks08\t1\t\tstephen persuaded ben to be careful .\nks08\t1\t\tstephen believed it to be easy to please kim .\nks08\t0\t*\tstephen persuaded it to be easy to please kim .\nks08\t1\t\tstephen believed there to be a fountain in the park .\nks08\t0\t*\tstephen persuaded there to be a fountain in the park 
.\nks08\t1\t\tstephen believed the cat to be out of the bag .\nks08\t0\t*\tstephen persuaded the cat to be out of the bag .\nks08\t1\t\tthe dentist was believed to have examined pat .\nks08\t1\t\tpat was believed to have been examined by the dentist .\nks08\t1\t\tthe dentist was persuaded to examine pat .\nks08\t1\t\tstephen seems to be irritating .\nks08\t1\t\ttom believes stephen to be irritating .\nks08\t1\t\tjohn persuaded stephen to be more careful .\nks08\t0\t*\tit seemed to be intelligent .\nks08\t1\t\tit seemed to rain .\nks08\t1\t\tthere seemed to be a fountain in the park .\nks08\t1\t\tstephen tried to be intelligent .\nks08\t0\t*\tit tried to be intelligent .\nks08\t0\t*\tthere tried to be intelligent .\nks08\t0\t*\tit tried to rain .\nks08\t0\t*\tthere tried to be a fountain in the park .\nks08\t1\t\tsomeone tried to leave the town .\nks08\t1\t\tthere seems to be a fountain in the park .\nks08\t0\t*\tit seems to be a fountain in the park .\nks08\t0\t*\tjohn seems to be a fountain in the park .\nks08\t1\t\twe believed there to be a fountain in the park .\nks08\t0\t*\twe believed it to be a fountain in the park .\nks08\t0\t*\tthere tries to leave the country .\nks08\t0\t*\twe believed it to try to leave the country .\nks08\t0\t*\twe believed there to try to leave the country .\nks08\t1\t\twe believed john to try to leave the country .\nks08\t1\t\tthe cat tries to be out of the bag .\nks08\t1\t\tthey persuaded me to leave .\nks08\t1\t\tthey promised me to leave .\nks08\t0\t*\tthey persuaded it to rain .\nks08\t0\t*\tthey promised it to rain .\nks08\t1\t\tunder the bed is a fun place to hide .\nks08\t0\t*\tunder the bed wants to be a fun place to hide .\nks08\t1\t\tkim may have admitted to let mary mow the lawn .\nks08\t1\t\tgregory appears to have wanted to be loyal to the company .\nks08\t1\t\tjones would prefer for it to be clear to barry that the city plans to sue him .\nks08\t1\t\tjohn continues to avoid the conflict .\nks08\t1\t\tthe captain ordered 
the troops to proceed .\nks08\t1\t\the coaxed his brother to give him the candy .\nks08\t1\t\tjohn wants it to be clear to ben that the city plans to honor him .\nks08\t0\t*\tjohn seems to rain .\nks08\t0\t*\tjohn is likely to appear that he will win the game .\nks08\t0\t*\tbeth tried for bill to ask a question .\nks08\t0\t*\the believed there to be likely that he won the game .\nks08\t0\t*\tit is likely to seem to be arrogant .\nks08\t0\t*\tsandy appears that kim is happy .\nks08\t0\t*\tdana would be unlikely for pat to be called upon .\nks08\t0\t*\trobin is nothing in the box .\nks08\t0\t*\tit said that kim was happy .\nks08\t0\t*\tthere preferred for sandy to get the job .\nks08\t1\t\tthere is only one chemical substance involved in nerve transmission .\nks08\t0\t*\tthere are only one chemical substance involved in nerve transmission .\nks08\t0\t*\tthere is more chemical substances involved in nerve transmission .\nks08\t1\t\tthere are more chemical substances involved in nerve transmission .\nks08\t1\t\tthere is believed to be a sheep in the park .\nks08\t0\t*\tthere is believed to be a sheep in the park .\nks08\t1\t\tthere are believed to be sheep in the park .\nks08\t1\t\tthere seems to be no student absent .\nks08\t0\t*\tthere are likely to be no student absent .\nks08\t1\t\tthere is likely to be no student absent .\nks08\t1\t\tpat expected leslie to be aggressive .\nks08\t1\t\tpat persuaded leslie to be aggressive .\nks08\t1\t\tpat promised leslie to be aggressive .\nks08\t1\t\tkevin urged anne to be loyal to her .\nks08\t1\t\twe expect the dentist to examine us .\nks08\t0\t*\twe expect the dentist to examine ourselves .\nks08\t1\t\twe expect them to examine themselves .\nks08\t1\t\twe persuaded the dentist to examine us .\nks08\t0\t*\twe persuaded the dentist to examine ourselves .\nks08\t1\t\twe persuaded them to examine themselves .\nks08\t0\t*\twe persuaded them to examine them .\nks08\t1\t\tjohn may drink water , and bill drink beer .\nks08\t1\t\ttom 
will not leave .\nks08\t0\t*\ttom kicked not a ball .\nks08\t1\t\twill tom leave the party now ?\nks08\t0\t*\tleft tom the party already ?\nks08\t1\t\tjohn could n't leave the party .\nks08\t0\t*\tjohn left n't the party early .\nks08\t1\t\tif anybody is spoiling the children , john is .\nks08\t0\t*\tif anybody keeps spoiling the children , john keeps .\nks08\t1\t\tyou should leave , should n't you ?\nks08\t0\t*\tyou did n't leave , left you ?\nks08\t1\t\tshe would never believe that story .\nks08\t0\t*\tshe believed never his story .\nks08\t1\t\tthe boys will all be there .\nks08\t0\t*\tour team played all well .\nks08\t1\t\tthe children will have been being entertained .\nks08\t0\t*\tthe house is been remodelling .\nks08\t0\t*\tmargaret has had already left .\nks08\t0\t*\the has will seeing his children .\nks08\t0\t*\the has been must being interrogated by the police at that very moment .\nks08\t1\t\tmary solved the problem .\nks08\t1\t\tmary would solve the problem .\nks08\t1\t\tmary was solving the problem .\nks08\t1\t\tmary would easily solve the problem .\nks08\t0\t*\tmary not avoided bill .\nks08\t1\t\tmary did not avoid bill .\nks08\t1\t\tfred must have been singing songs and probably was drinking beer .\nks08\t1\t\tfred must both have been singing songs and have been drinking beer .\nks08\t1\t\tfred must have both been singing songs and been drinking beer .\nks08\t1\t\tfred must have been both singing songs and drinking beer .\nks08\t1\t\tthere might be a unicorn in the garden .\nks08\t1\t\tit will rain tomorrow .\nks08\t1\t\tjohn will leave the party earlier .\nks08\t0\t*\tthere hopes to finish the project .\nks08\t0\t*\tthe bus hopes to be here at five .\nks08\t0\t*\ti hope to can study in france .\nks08\t1\t\ti hope to study in france .\nks08\t0\t*\tjohn stopped can to sign in tune .\nks08\t0\t*\tjohn stopped canning to sign in tune .\nks08\t0\t*\tjohn wills leave the party early .\nks08\t0\t*\tjohn can kicked the ball .\nks08\t0\t*\tjohn can kicking 
the ball .\nks08\t0\t*\tjohn can to kick the ball .\nks08\t1\t\tjohn will kick the ball .\nks08\t0\t*\tjohn will kicked the ball .\nks08\t0\t*\tjohn will to kick the ball .\nks08\t0\t*\tkim must bakes a cake .\nks08\t0\t*\tkim must baked a cake .\nks08\t0\t*\tkim must will bake a cake .\nks08\t1\t\tthere may exist a man in the park .\nks08\t0\t*\tit may exist a man in the park .\nks08\t0\t*\tit is vital that we will study everyday .\nks08\t1\t\the is a fool .\nks08\t1\t\the has a car .\nks08\t1\t\tjohn is running to the car .\nks08\t1\t\twas the child in the school ?\nks08\t1\t\twas the child running to the car ?\nks08\t1\t\twas the child found ?\nks08\t1\t\tthe child never became crazy .\nks08\t1\t\tthe child was never crazy .\nks08\t1\t\tthe child was never running to the car .\nks08\t1\t\tthe child was never deceived .\nks08\t1\t\tjohn is happy about the outcome .\nks08\t1\t\tjohn was seeing his children .\nks08\t1\t\tthe children are seen in the yard .\nks08\t1\t\tjohn has not sung a song .\nks08\t1\t\thas john sung a song ?\nks08\t1\t\tjohn has n't been singing a song .\nks08\t1\t\tjohn has sung a song and mary has too .\nks08\t1\t\tjohn can have danced .\nks08\t1\t\tjohn can be dancing .\nks08\t1\t\the has seen his children .\nks08\t1\t\the will have been seeing his children .\nks08\t0\t*\tamericans have paying income tax ever since 1913 .\nks08\t0\t*\tgeorge has went to america .\nks08\t1\t\tis out since the following is finite .\nks08\t1\t\tyou are a student .\nks08\t1\t\tyou have not enough money .\nks08\t1\t\thave you enough money ?\nks08\t1\t\tjohn does not like this town .\nks08\t1\t\tin no other circumstances does that distinction matter .\nks08\t1\t\tthey did n't leave any food .\nks08\t0\t*\tthey expected us to do leave him .\nks08\t0\t*\tthey expected us to should leave him .\nks08\t0\t*\ti found myself doing need sleep .\nks08\t0\t*\the does be leaving .\nks08\t0\t*\the does have been eating .\nks08\t0\t*\tthey will do come .\nks08\t1\t\tjohn did 
leave .\nks08\t1\t\tdid john find the solution ?\nks08\t1\t\thow long did it last ?\nks08\t1\t\tjohn may leave .\nks08\t1\t\tit may rain .\nks08\t0\t*\tjohn may rain .\nks08\t1\t\tjohn did not leave .\nks08\t0\t*\tjohn did not rain .\nks08\t1\t\the might have left .\nks08\t0\t*\the might do leave .\nks08\t0\t*\the does can leave here .\nks08\t0\t*\the does may leave here .\nks08\t0\t*\tjim does have supported the theory .\nks08\t0\t*\tthe proposal did be endorsed by clinton .\nks08\t0\t*\ti do not have sung .\nks08\t0\t*\ti do not be happy .\nks08\t1\t\tdo be honest !\nks08\t1\t\tdo n't be silly !\nks08\t0\t*\tjohn believed kim to do not leave here .\nks08\t1\t\tjohn believes kim not to leave here .\nks08\t0\t*\tjohn believed kim to leaving here .\nks08\t0\t*\tjohn did not leaving here .\nks08\t0\t*\tjohn expect to must leave .\nks08\t0\t*\tjohn did not may leave .\nks08\t1\t\ttom wanted to go home , but peter did n't want to .\nks08\t1\t\tlee voted for bill because his father told him to .\nks08\t1\t\tkim regrets not having seen the movie .\nks08\t1\t\tkim regrets never having seen the movie .\nks08\t1\t\twe asked him not to try to call us again .\nks08\t1\t\twe asked him never to try to call us again .\nks08\t1\t\tduty made them not miss the weekly meetings .\nks08\t1\t\tduty made them never miss the weekly meetings .\nks08\t1\t\tnot speaking english is a disadvantage .\nks08\t0\t*\tspeaking not english is a disadvantage .\nks08\t0\t*\tlee likes not kim .\nks08\t1\t\tlee is believed not to like kim .\nks08\t1\t\tlee is believed to not like kim .\nks08\t0\t*\tlee is believed to like not kim .\nks08\t1\t\tthe president could not approve the bill .\nks08\t1\t\tit would be possible for the president not to approve the bill .\nks08\t1\t\tit would not be possible for the president to approve the bill .\nks08\t0\t*\tlee not left .\nks08\t1\t\tlee will never leave .\nks08\t1\t\tlee will not leave .\nks08\t1\t\tjohn could not leave the town .\nks08\t0\t*\tjohn not left 
the town .\nks08\t0\t*\tjohn not could leave the town .\nks08\t1\t\tmary sang a song , but lee never did .\nks08\t0\t*\tmary sang a song , but lee did never .\nks08\t1\t\tmary sang a song , but lee did not .\nks08\t1\t\tthe president could not approve the bill , could n't he ?\nks08\t0\t*\tthe president could not approve the bill , could he ?\nks08\t1\t\tare you studying english syntax ?\nks08\t1\t\twhat are you studying nowadays ?\nks08\t1\t\ti shall go downtown .\nks08\t1\t\tshall i go downtown ?\nks08\t1\t\tmay she live forever !\nks08\t1\t\twas i that stupid ?\nks08\t1\t\tdo n't you even touch that !\nks08\t1\t\tyou better not drink .\nks08\t1\t\tyou can do it , but you better not .\nks08\t0\t*\tbetter you not drink .\nks08\t1\t\tthey 'd leave soon .\nks08\t1\t\tthey would n't leave soon .\nks08\t1\t\tthey should n't leave soon .\nks08\t1\t\tthey can do it , ca n't they ?\nks08\t1\t\tthey ca n't do it , can they ?\nks08\t0\t*\tthey ca n't do it , ca n't they ?\nks08\t0\t*\tthey ca n't do it , can he ?\nks08\t1\t\tkim can dance , and sandy can , too .\nks08\t1\t\tkim has danced , and sandy has , too .\nks08\t1\t\tkim was dancing , and sandy was , too .\nks08\t0\t*\tkim considered joining the navy , but i never considered .\nks08\t0\t*\tkim wanted to go and sandy wanted , too .\nks08\t1\t\tkim is happy and sandy is too .\nks08\t1\t\twhen kim was in china , i was too .\nks08\t1\t\thave you anything to share with the group ?\nks08\t1\t\thave you brought anything to share with the group ?\nks08\t1\t\tsandy must have been , too .\nks08\t1\t\tsandy must have , too .\nks08\t1\t\tsandy must , too .\nks08\t1\t\tbecause john persuaded sally to , he did n't have to talk to the reporters .\nks08\t0\t*\tmary sang a song , but lee could never .\nks08\t1\t\tmary sang a song , but lee could not .\nks08\t1\t\tjohn got sent to prison .\nks08\t1\t\the ought to leave his luggage here .\nks08\t1\t\the dared not argue against his parents .\nks08\t1\t\the used to go there very often 
.\nks08\t1\t\tthe gardener must trim the rose bushes today .\nks08\t1\t\tthis should be the beginning of a beautiful friendship .\nks08\t1\t\ti am removing the shovel from the shed .\nks08\t1\t\tthe travelers have returned from their vacation .\nks08\t1\t\tspringfield would have built a police station with the federal grant .\nks08\t1\t\tsharks could have been cruising near the beach .\nks08\t1\t\tshe seem to have given financial assistance to an important french art dealer .\nks08\t0\t*\tann may spending her vacation in italy .\nks08\t0\t*\tann may spends her vacation in italy .\nks08\t0\t*\tann may spent her vacation in italy .\nks08\t1\t\tit has rained every day for the last week .\nks08\t0\t*\tit has raining every day for the last week .\nks08\t0\t*\tit has rains every day for the last week .\nks08\t0\t*\tit has rain every day for the last week .\nks08\t1\t\ttagalog is spoken in the philippines .\nks08\t0\t*\ttagalog is speak in the philippines .\nks08\t0\t*\ttagalog is speaks in the philippines .\nks08\t0\t*\ttagalog is spoke in the philippines .\nks08\t1\t\tthe roof is leaking .\nks08\t0\t*\tthe roof is leaked .\nks08\t0\t*\tthe roof is leaks .\nks08\t0\t*\tgeorge is having lived in toledo for thirty years .\nks08\t0\t*\tthe house is been remodeling .\nks08\t0\t*\ta medal was been given to the mayor by the sewer commissioner .\nks08\t0\t*\tdoes john have gone to the library ?\nks08\t0\t*\tjohn seems fond of ice cream , and bill seems , too .\nks08\t1\t\tsam may have been being interrogated by the fbi .\nks08\t0\t*\tsam may have been being interrogating by the fbi .\nks08\t0\t*\tsam may be had been interrogating by the fbi .\nks08\t1\t\thave social problems made police work difficult ?\nks08\t1\t\tthe senator should not have forgotten the concerns of her constituents .\nks08\t1\t\ttokyo has not loosened trade restrictions .\nks08\t1\t\tdid the doctor prescribe aspirin ?\nks08\t1\t\tsandy will read your reports , but harold will not .\nks08\t1\t\the can hardly 
believe that it 's already over .\nks08\t1\t\ti could have little known that more trouble was just around the corner .\nks08\t1\t\ti have never been spoken to so rudely !\nks08\t1\t\thardly was there any rain falling .\nks08\t1\t\tlittle did i know that more trouble was just around the corner .\nks08\t1\t\tnever have i been spoken to so rudely !\nks08\t1\t\the had hardly collected the papers on his desk , had he ?\nks08\t0\t*\the had hardly collected the papers on his desk , had n't he ?\nks08\t1\t\the never achieved anything , did he ?\nks08\t0\t*\the never achieved anything , did n't he ?\nks08\t1\t\tas a statesman , he scarcely could do anything worth mentioning .\nks08\t0\t*\tas a statesman , scarcely he could do anything worth mentioning .\nks08\t0\t*\tany zebras ca n't fly .\nks08\t0\t*\tanything has n't happened to his optimism .\nks08\t0\t*\tany of the citizens hardly ever say anything .\nks08\t1\t\ti did n't find any bugs in my bed .\nks08\t1\t\tnobody told them anything .\nks08\t1\t\tnever have i stolen from any members of your family .\nks08\t1\t\twhy have n't any books been returned ?\nks08\t1\t\thardly any of the citizens ever say anything .\nks08\t1\t\tthese lines were written by one of korea 's most famous poets .\nks08\t1\t\tthe unidentified victim was apparently struck during the early morning hours .\nks08\t1\t\ttargets can be observed at any angle .\nks08\t1\t\tduring the early evening , saturn can be found in the north , while jupiter rises in the east .\nks08\t1\t\ti poured 20 liters of acid into the beaker .\nks08\t1\t\tabout 20 liters of acid was poured into the beaker .\nks08\t1\t\tthe executive committee approved the new policy .\nks08\t1\t\tthe new policy was approved by the executive committee .\nks08\t1\t\tjohn has taken bill to the library .\nks08\t1\t\tjohn has chosen bill for the position .\nks08\t0\t*\tjohn has taken to the library .\nks08\t0\t*\tjohn has chosen for the position .\nks08\t0\t*\tthe guide has been taken john to the 
library .\nks08\t0\t*\tthe department has been chosen john for the position .\nks08\t1\t\tjohn has been taken to the library .\nks08\t1\t\tjohn has been chosen for the position .\nks08\t1\t\tpat handed a book to chris .\nks08\t0\t*\tpat handed to chris .\nks08\t0\t*\tpat handed a book .\nks08\t1\t\ta book was handed to chris by pat .\nks08\t0\t*\ta book was handed by pat .\nks08\t1\t\ta book was handed to chris .\nks08\t0\t*\ta book was handed .\nks08\t1\t\tthey believe it to be easy to annoy ben .\nks08\t0\t*\tthey believe stephen to be easy to annoy ben .\nks08\t1\t\tthey believe there to be a dragon in the wood .\nks08\t1\t\tit is believed to be easy to annoy ben .\nks08\t0\t*\tstephen is believed to be easy to annoy ben .\nks08\t1\t\tthere is believed to be a dragon in the wood .\nks08\t1\t\tno one believes that he is a fool .\nks08\t1\t\tno one suspects that he is a fool .\nks08\t1\t\tthat he is a fool is suspected by no one .\nks08\t1\t\tthey believe the cat to be out of the bag .\nks08\t1\t\tthe cat is believed to be out of the bag .\nks08\t1\t\tjohn drove the car .\nks08\t1\t\tjohn was driving the car .\nks08\t1\t\tthe car was being driven .\nks08\t1\t\tjohn will drive the car .\nks08\t1\t\tthe car will be driven .\nks08\t1\t\tjohn has driven the car .\nks08\t1\t\tthe car has been driven .\nks08\t1\t\tjohn has been driving the car .\nks08\t1\t\tthe car has been being driven .\nks08\t1\t\tthe car will have been being driven .\nks08\t1\t\tpat handed chris a note .\nks08\t1\t\tchris was handed a note .\nks08\t1\t\tchris was handed a note by pat .\nks08\t1\t\tideas are put into children 's heads by tv .\nks08\t1\t\tyesterday , the child really kicked a monkey in the street .\nks08\t1\t\tthe model resembles kim in nearly every detail .\nks08\t0\t*\tkim is resembled by the model in nearly every detail .\nks08\t0\t*\tyou are not fitted by the coat .\nks08\t1\t\ti was born in 1970 .\nks08\t1\t\tit is rumored that he is on his way out .\nks08\t1\t\tjohn is said to 
be rich .\nks08\t1\t\the is reputed to be a good scholar .\nks08\t0\t*\tmy mother bore me in 1970 .\nks08\t0\t*\teveryone rumored that he was on his way out .\nks08\t0\t*\tthey said him to be rich .\nks08\t0\t*\tthey reputed him to be a good scholar .\nks08\t1\t\the kicked the ball .\nks08\t1\t\tthe ball was kicked by him .\nks08\t1\t\tjohn kicked him .\nks08\t1\t\the was kicked by john .\nks08\t1\t\tjohn sent her to seoul .\nks08\t1\t\tshe was sent to seoul .\nks08\t1\t\tthey widely believed that john was ill .\nks08\t1\t\tthat john was ill was widely believed .\nks08\t1\t\tthey have n't decided which attorney will give the closing argument .\nks08\t1\t\twhich attorney will give the closing argument has n't been decided .\nks08\t1\t\twhich attorney will give the closing argument has n't been decided by them .\nks08\t1\t\tyou can rely on ben .\nks08\t1\t\tben can be relied on .\nks08\t1\t\tthey talked about the scandal for days .\nks08\t1\t\tthe scandal was talked about for days .\nks08\t1\t\tthe issue was dealt with promptly .\nks08\t1\t\tthat 's not what 's asked for .\nks08\t1\t\tthis should be attended to immediately .\nks08\t0\t*\tthe capital was gathered near by a crowd of people .\nks08\t0\t*\tthe hot sun was played under by the children .\nks08\t1\t\tthat 's something i would have paid twice for .\nks08\t1\t\tthese are the books that we have gone most thoroughly over .\nks08\t1\t\tthey look generally on john as selfish .\nks08\t0\t*\teverything was paid twice for .\nks08\t0\t*\tyour books were gone most thoroughly over .\nks08\t0\t*\the is looked generally on as selfish .\nks08\t1\t\tpavarotti relied on loren and bond on hepburn .\nks08\t0\t*\tpavarotti relied on loren and bond hepburn .\nks08\t1\t\tloren was relied on by pavarotti and hepburn by bond .\nks08\t0\t*\tloren was relied on by pavarotti and hepburn on by bond .\nks08\t1\t\tthe lawyer looked into the document .\nks08\t1\t\tthe document was looked into by the lawyer .\nks08\t1\t\tpeter has been 
asked to resign .\nks08\t1\t\ti assume the matter to have been filed in the appropriate records .\nks08\t1\t\tsmith wants the picture to be removed from the office .\nks08\t1\t\tthe events have been described well .\nks08\t1\t\tover 120 different contaminants have been dumped into the river .\nks08\t1\t\tthe balloon is positioned in an area of blockage and is inflated .\nks08\t1\t\tcancer is now thought to be unlikely to be caused by hot dogs .\nks08\t1\t\twhether this is feasible has n't yet been determined .\nks08\t1\t\tpaying taxes ca n't be avoided .\nks08\t1\t\tit has n't yet been determined whether this is feasible .\nks08\t1\t\tfrances has had the drapes cleaned .\nks08\t1\t\tshirley seems to have fred promoted .\nks08\t1\t\tnina got bill elected to the committee .\nks08\t1\t\twe got our car radio stolen twice on holiday .\nks08\t1\t\tfrances has had her clean the drapes .\nks08\t1\t\tnina got them to elect bill .\nks08\t1\t\tthe news was dealt with carefully .\nks08\t1\t\tthe tree was looked after by kim .\nks08\t1\t\twe can not put up with the noise anymore .\nks08\t1\t\the will keep up with their expectations .\nks08\t1\t\tthis noise can not be put up with .\nks08\t1\t\ttheir expectations will be kept up with .\nks08\t1\t\tthey paid a lot of attention to the matter .\nks08\t1\t\tthe son took care of his parents .\nks08\t1\t\tthe matter was paid a lot of attention to .\nks08\t1\t\ta lot of attention was paid to the matter .\nks08\t0\t*\tnew york was slept in .\nks08\t0\t*\tthe lake was camped beside by my sister .\nks08\t1\t\tthe lake is not to be camped beside by anybody .\nks08\t0\t*\tsix inches were grown by the boy .\nks08\t0\t*\ta mile to work was run by him .\nks08\t1\t\tthe beans were grown by the gardener .\nks08\t1\t\tthe plums were weighed by the grocer .\nks08\t0\t*\tsan francisco has been lived in by my brother .\nks08\t1\t\tthe house has been lived in by several famous personages .\nks08\t0\t*\tseoul was slept in by the businessman last night 
.\nks08\t1\t\tthis bed was surely slept in by a huge guy last night .\nks08\t1\t\trosie got struck by lightning .\nks08\t1\t\ti got phoned by a woman friend .\nks08\t1\t\the got hit in the face with the tip of a surfboard .\nks08\t1\t\tjohn 's bike got fixed or got stolen .\nks08\t0\t*\tthe lesson got read by a priest .\nks08\t0\t*\tthe letter got written by a poet .\nks08\t0\t*\ttom got understood to have asked for a refund .\nks08\t0\t*\tmary got heard to insult her parents .\nks08\t1\t\tis john clever ?\nks08\t1\t\twho is clever ?\nks08\t1\t\thow clever you are !\nks08\t1\t\tbe very clever .\nks08\t1\t\ti ask you if this is what you want .\nks08\t1\t\twould you mind taking out the garbage ?\nks08\t1\t\tcan the child read the book ?\nks08\t1\t\twhat can the child read ?\nks08\t1\t\twhich version did they recommend ?\nks08\t1\t\twith what did the baby eat the food ?\nks08\t1\t\thow did he eat the food ?\nks08\t1\t\twhich man did you talk to ?\nks08\t1\t\tto which man did you talk ?\nks08\t1\t\thow ill has hobbs been ?\nks08\t0\t*\twhich man did you talk ?\nks08\t0\t*\tto which man did you talk to ?\nks08\t1\t\twho do you think hobbs imagined mary said tom saw ?\nks08\t1\t\twho did kim work for and sandy rely on ?\nks08\t0\t*\twho did kim work for and sandy rely ?\nks08\t0\t*\twho did kim work for and sandy rely on mary ?\nks08\t1\t\tyou can rely on edward 's help .\nks08\t1\t\tedward 's help , you can rely on .\nks08\t1\t\twe talked about the fact that he was sick for days .\nks08\t1\t\tthe fact that he was sick for days , we talked about .\nks08\t0\t*\tyou can rely on that he will help you .\nks08\t0\t*\twe talked about that he was sick for days .\nks08\t1\t\tthat he was sick , we talked about for days .\nks08\t1\t\tthat arrows do n't stop in midair is captured by this theory .\nks08\t0\t*\twho did you see and a picture of ?\nks08\t1\t\tthese qualities recommended him to oliver .\nks08\t1\t\tthe un recommended an enlarged peacekeeping force .\nks08\t1\t\tthis is 
the book which the teacher recommended .\nks08\t1\t\twho will they recommend ?\nks08\t1\t\tjohn put the books in a box .\nks08\t1\t\twhich books did john put in the box ?\nks08\t1\t\twhere did john put the books ?\nks08\t1\t\tin which box did john put the book ?\nks08\t1\t\thow happy has john been ?\nks08\t1\t\twho put the book in the box ?\nks08\t1\t\twho did put the book in the box ?\nks08\t1\t\twho can put the book in the box ?\nks08\t1\t\twho do you think visited seoul last year ?\nks08\t1\t\tthat 's the un delegate that the government thinks visits seoul last year .\nks08\t1\t\twho do you believe that sara invited ?\nks08\t1\t\twho do you believe invited sara ?\nks08\t0\t*\twho do you believe that invited sara ?\nks08\t0\t*\twho do you think that would be nominated for the position ?\nks08\t1\t\tthis is the kind of person who i doubt that under normal circumstances would have anything to do with such a scheme .\nks08\t1\t\tjohn asks whose book his son likes .\nks08\t1\t\tjohn has forgotten which player his son shouted at .\nks08\t1\t\the told me how many employees karen introduced to the visitors .\nks08\t1\t\the had been reading the article .\nks08\t0\t*\ttom denied which book he had been reading .\nks08\t0\t*\ttom claimed how much money she had spent .\nks08\t0\t*\tjohn inquired that he should read it .\nks08\t0\t*\tpeter will decide that we should review the book .\nks08\t1\t\tjohn inquired which book he should read .\nks08\t1\t\tpeter will decide which book we should review .\nks08\t1\t\tjohn told us that we should review the book .\nks08\t1\t\tjohn told us which book we should review .\nks08\t1\t\tin which box did he put the book ?\nks08\t1\t\twhich book by his father did he read ?\nks08\t1\t\tjohn asks in which box he put the book .\nks08\t1\t\tjohn asks which book by his father he read .\nks08\t1\t\tkim has wondered in which room gary stayed .\nks08\t1\t\tlee asked me how fond of chocolates the monkeys are .\nks08\t0\t*\tkim has wondered that gary 
stayed in the room .\nks08\t0\t*\tkim asked me that the monkeys are very fond of chocolates .\nks08\t1\t\tjohn knows whose book mary bought and tom borrowed from her .\nks08\t0\t*\tjohn knows whose book mary bought and tom talked .\nks08\t1\t\ti do n't know whether i should agree .\nks08\t1\t\tshe gets upset if i exclude her from anything .\nks08\t1\t\tshe gets upset whether i exclude her from anything .\nks08\t1\t\ti wonder if you 'd be kind enough to give us information .\nks08\t1\t\ti am not certain about when he will come .\nks08\t1\t\ti am not certain about whether he will go or not .\nks08\t0\t*\ti am not certain about if he will come .\nks08\t0\t*\ti am not certain about if he will go or not .\nks08\t1\t\ti do n't know where to go .\nks08\t1\t\ti do n't know what to do .\nks08\t1\t\ti do n't know how to do it .\nks08\t1\t\ti do n't know whether to agree with him or not .\nks08\t0\t*\ti do n't know if to agree with him .\nks08\t0\t*\ti do n't know that to agree with him or not .\nks08\t1\t\tfred knows which politician to support .\nks08\t1\t\tkaren asked where to put the chairs .\nks08\t1\t\tthe student protected him .\nks08\t1\t\twho protected him ?\nks08\t1\t\tto protect him is not an easy task .\nks08\t0\t*\tfred knows which politician for karen to vote for .\nks08\t0\t*\tfred knows which politician for her to vote for .\nks08\t0\t*\tkaren asked where for jerry to put the chairs .\nks08\t0\t*\tkaren asked where for him to put the chairs .\nks08\t1\t\thow carefully have you considered your future career ?\nks08\t1\t\twhen can we register for graduation ?\nks08\t1\t\twhere do we go to register for graduation ?\nks08\t1\t\twhy have you borrowed my pencil ?\nks08\t1\t\twhen did he say that he was fired ?\nks08\t1\t\twhere did he tell you that he met mary ?\nks08\t1\t\twhy do you wonder whether she will invite me ?\nks08\t1\t\thow often did he ask when she will meet at the party ?\nks08\t1\t\twhat causes students to select particular majors ?\nks08\t1\t\twho 
will john ask for information about summer courses ?\nks08\t1\t\twhich textbook did the teacher use in the class last summer ?\nks08\t1\t\twhose car is blocking the entrance to the store ?\nks08\t1\t\twhy do you think he left ?\nks08\t1\t\twho do you guess will be here ?\nks08\t1\t\twho do you think borrowed my book ?\nks08\t1\t\twhich city does fred think that you believe that john lives in ?\nks08\t1\t\ti wonder on which shelf john will put the book ?\nks08\t1\t\twhat proof that he has implicated have you found ?\nks08\t1\t\tjoseph has forgotten how many matches he has won .\nks08\t1\t\tfred will warn martha that she should claim that her brother is patriotic .\nks08\t1\t\tthat bill tried to discover which drawer alice put the money in made us realize that we should have left him in seoul .\nks08\t1\t\tjasper wonders which book he should attempt to persuade his students to buy .\nks08\t0\t*\ti wonder if on which shelve john will put the book .\nks08\t0\t*\ti wonder what city that romans destroyed .\nks08\t0\t*\tjohn was wondering to whom he was referring to .\nks08\t0\t*\twho do you think that has given the tickets to bill ?\nks08\t0\t*\twhat city will fred say that mary thinks that john lives ?\nks08\t0\t*\ton whom does dana believe chris knows sandy trusts ?\nks08\t0\t*\tthe politician denied how the opponent was poisoned .\nks08\t0\t*\tfred knows which book for the children to read during the summer vacation .\nks08\t1\t\tthis needs mending .\nks08\t0\t*\tthis needs mending the shoe .\nks08\t0\t*\the mended .\nks08\t1\t\the mended the shoe .\nks08\t1\t\tthis needs investigating .\nks08\t0\t*\tthis needs investigating the problem .\nks08\t0\t*\tthey investigated .\nks08\t1\t\tthey investigated the problem .\nks08\t1\t\tthe video which you recommended was really terrific .\nks08\t1\t\tthe video which i thought you recommended was really terrific .\nks08\t1\t\tthe video which i thought john told us you recommended was really terrific .\nks08\t1\t\tthe student who 
won the prize left .\nks08\t1\t\tthe student who everyone likes left .\nks08\t1\t\tthe person whom john gave the book to left .\nks08\t1\t\tthe day when i met her was sunny .\nks08\t1\t\tthe president who fred voted for has resigned .\nks08\t1\t\tthe president that fred voted for dislikes his opponents .\nks08\t1\t\tthe president fred voted for has resigned .\nks08\t1\t\thas no relative pronoun at all .\nks08\t1\t\the is the kind of person with whom to consult .\nks08\t1\t\tthese are the things for which to be thankful .\nks08\t1\t\twe will invite volunteers on whom to work .\nks08\t1\t\tthis is the student pictures of whom appeared in the newspaper .\nks08\t0\t*\tpictures of whom appeared in the newspaper ?\nks08\t1\t\tthe people happy with the proposal left .\nks08\t1\t\tthe person standing on my foot is heavy .\nks08\t0\t*\tthe paper to finish by tomorrow is too long .\nks08\t0\t*\tthe person stand on my foot is heavy .\nks08\t0\t*\tthe person stood on my foot is heavy .\nks08\t0\t*\tthe student met the senator john met bill .\nks08\t0\t*\tthe student met the senator that john met bill .\nks08\t0\t*\tthe student met the senator for john to meet bill .\nks08\t1\t\tjack is the person whom jenny fell in love with .\nks08\t1\t\tjack is the person with whom jenny fell in love .\nks08\t0\t*\tjack is the person whom jenny fell in love .\nks08\t1\t\ti met the critic whose remarks i wanted to object to .\nks08\t1\t\tthis is the friend for whose mother kim gave a party .\nks08\t1\t\tthe teacher set us a problem the answer to which we can find in the textbook .\nks08\t1\t\twe called the senators who met fred .\nks08\t1\t\tthe kid picked up the apple that fell down on the ground .\nks08\t0\t*\tthe student met john came .\nks08\t0\t*\tthe problem intrigued us bothered me .\nks08\t1\t\the made a statement which everyone thought was really interesting and important .\nks08\t1\t\tthey all agreed to include those matters which everyone believed had been excluded from the treaty 
.\nks08\t1\t\tmary knows that john was elected .\nks08\t1\t\tthat john was elected surprised frank .\nks08\t1\t\tmary told bill that john was elected .\nks08\t1\t\tthis is the book that we had read .\nks08\t1\t\tthe president abandoned the people that voted for him .\nks08\t1\t\tit is an argument that people think will never end in egypt .\nks08\t0\t*\tevery essay she 's written and which i 've read is on that pile .\nks08\t0\t*\tevery essay she 's written and that i 've read is on that pile .\nks08\t1\t\tevery essay which she 's written and that i 've read is on that pile .\nks08\t1\t\tevery essay that she 's written and which i 've read is on that pile .\nks08\t1\t\tthe student whose turn it was left .\nks08\t0\t*\tthe student that 's turn it was left .\nks08\t1\t\tthe pencil with which he is writing broke .\nks08\t0\t*\tthe pencil with that he is writing broke .\nks08\t1\t\ta pencil with which to write broke .\nks08\t0\t*\ta pencil with that to write broke .\nks08\t0\t*\tthe people in who we placed our trust left .\nks08\t0\t*\tthe person with who we were talking left .\nks08\t1\t\tthe company in which they have invested left .\nks08\t1\t\tthe people in whose house we stayed left .\nks08\t1\t\tthe person with whom he felt most comfortable left .\nks08\t1\t\the bought a bench on which to sit .\nks08\t1\t\the bought a refrigerator in which to put the beer .\nks08\t1\t\tthere is a bench for you to sit on .\nks08\t0\t*\tkaren asked where for washington to put the chairs .\nks08\t1\t\tthe person i met is from boston .\nks08\t1\t\tthe box we put the books in is sealed .\nks08\t1\t\the made a statement everyone thought was interesting and important .\nks08\t1\t\tthey all agreed to include those matters everyone believed had been excluded from the treaty .\nks08\t1\t\ti just know that the big 12 south teams everyone knew would win actually won the game .\nks08\t1\t\tthe person who john asked for help thinks he is foolish .\nks08\t1\t\tmary , who john asked for help , 
thinks he is foolish .\nks08\t1\t\tjohn has two sisters , who became lawyers .\nks08\t1\t\ti met the lady from france who grows peaches .\nks08\t0\t*\ti met john who grows peaches .\nks08\t0\t*\ti met her who grows peaches .\nks08\t1\t\tin the classroom , the teacher praised john , whom i also respect .\nks08\t1\t\treagan , whom the republicans nominated in 1980 , lived most of his life in california .\nks08\t1\t\tevery student who attended the party had a good time .\nks08\t0\t*\tevery student , who attended the party , had a good time .\nks08\t1\t\tno student who scored 80 or more in the exam was ever failed .\nks08\t0\t*\tno student , who scored 80 or more in the exam , was ever failed .\nks08\t1\t\tthe contestant who won the first prize , who is the judge 's brother-in-law , sang dreadfully .\nks08\t0\t*\tthe contestant , who is the judge 's brother-in-law , who won the first prize sang dreadfully .\nks08\t1\t\the who laughs last laughs best .\nks08\t1\t\the who is without sin among you , let him cast the first stone .\nks08\t1\t\twho did he believe that he would one day meet ?\nks08\t1\t\twhich celebrity did he mention that he had run into ?\nks08\t0\t*\twho did he believe the claim that he had never met ?\nks08\t0\t*\twhich celebrity did he mention the fact that he had run into ?\nks08\t1\t\tthe knife which he threw into the sea had a gold handle .\nks08\t1\t\tthe knife that he threw into the sea had a gold handle .\nks08\t1\t\tthe knife , which he threw into the sea had a gold handle .\nks08\t0\t??\tthe knife , that he threw into the sea had a gold handle .\nks08\t1\t\tbill cooked supper and washed the dishes .\nks08\t0\t*\twhat did bill cook and wash the dishes ?\nks08\t0\t*\twhat did bill cook supper and wash ?\nks08\t1\t\the refuted the proof that you can not square it .\nks08\t0\t*\twhat did he refute the proof that you can not square ?\nks08\t1\t\tthey met someone who knows the professor .\nks08\t0\t*\twhich professor did they meet someone who knows 
?\nks08\t1\t\tthat he has met the professor is extremely unlikely .\nks08\t0\t*\twho is that he has met extremely unlikely ?\nks08\t1\t\tshe bought john 's book .\nks08\t1\t\tdid john wonder who would win the game ?\nks08\t0\t*\twhat did john wonder who would win ?\nks08\t1\t\twhat did he get the impression that the problem really was ?\nks08\t1\t\tthis is the paper that we really need to find the linguist who understands .\nks08\t0\t*\twhich rebel leader did you hear cheney 's rumor that the cia assassinated ?\nks08\t1\t\tstudents enter high-level educational institutions might face many problems relating to study habits .\nks08\t1\t\ta fellow student saw this felt sorry for miss kim and offered her his own book .\nks08\t1\t\texperts all agree that dreams cause great anxiety and stress are called nightmares .\nks08\t1\t\tthe victims of the earthquake their property was destroyed in the disaster were given temporary housing by the government .\nks08\t1\t\tthis is the book which i need to read .\nks08\t1\t\tthe person whom they intended to speak with agreed to reimburse us .\nks08\t1\t\tthe motor that martha thinks that joe replaced costs thirty dollars .\nks08\t1\t\tthe official to whom smith loaned the money has been indicted .\nks08\t1\t\tthe man on whose lap the puppet is sitting is ventriloquist .\nks08\t1\t\twe just finished the final exam the result of which we can find out next week .\nks08\t0\t*\twhat did herb start to play only after he drank ?\nks08\t0\t*\twho did herb believe the claim that cheated ?\nks08\t0\t*\twhat was that the vikings ate a real surprise to you ?\nks08\t0\t*\twhat did you meet someone who understands ?\nks08\t1\t\tthe fact that scientists have now established all the genes in the human body is still not widely known .\nks08\t1\t\tthe fact that the scientists used the latest technology to verify was reported at the recent conference .\nks08\t1\t\tthey ignored the suggestion that lee made .\nks08\t1\t\tthey ignored the suggestion that 
lee lied .\nks08\t1\t\tthey denied the claim that we had advanced by ourselves .\nks08\t1\t\tthey denied the claim that they should report only to us .\nks08\t1\t\tthe hotel where gloria stays is being remodelled .\nks08\t1\t\tthe day when jim got fired was a sad day for everyone .\nks08\t1\t\tjohn is tough to persuade .\nks08\t1\t\tjohn made it clear that he would finish it on time .\nks08\t1\t\tit is john that i met last night in the park .\nks08\t1\t\ti wonder whom sandy loves .\nks08\t1\t\tthis is the politician on whom sandy relies .\nks08\t1\t\the is hard to love .\nks08\t1\t\tit is easy to please john .\nks08\t1\t\tjohn is easy to please .\nks08\t0\t*\tto please john is eager .\nks08\t0\t*\tit is eager to please john .\nks08\t1\t\tjohn is eager to please .\nks08\t1\t\tto please john is tough .\nks08\t1\t\tit is tough to please john .\nks08\t1\t\tjohn is tough to please .\nks08\t0\t*\tto please john is ready .\nks08\t0\t*\tit is ready to please john .\nks08\t1\t\tjohn is ready to please .\nks08\t1\t\tkim is easy to please .\nks08\t1\t\tkim is eager to please .\nks08\t1\t\tthis doll is hard to see .\nks08\t1\t\tthe child is impossible to teach .\nks08\t1\t\tthe problem is easy to solve .\nks08\t0\t*\tthis doll is hard to see it .\nks08\t0\t*\tthe child is impossible to teach him .\nks08\t0\t*\tthe problem is easy to solve the question .\nks08\t1\t\tjohn is eager to examine the patient .\nks08\t1\t\tjohn is eager to find a new home .\nks08\t0\t*\tjohn is eager to examine .\nks08\t0\t*\tjohn is eager to find .\nks08\t1\t\thei is easy to please i .\nks08\t1\t\tthis theorem will take only five minutes to prove .\nks08\t1\t\tthis theorem will take only five minutes to establish that he proved in 1930 .\nks08\t1\t\tthis scratch will cost kim $ 500 to fix .\nks08\t1\t\tthis $ 500 bribe will cost the government $ 500,000 to prove that senator jones accepted .\nks08\t0\t*\tkim is eager to recommend .\nks08\t1\t\twho is kim eager to recommend ?\nks08\t1\t\tthis sonata 
is easy to play on this piano .\nks08\t1\t\twhich piano is this sonata easy to play on ?\nks08\t1\t\tthat dogs bark annoys people .\nks08\t1\t\tit annoys people that dogs bark .\nks08\t1\t\twhy she told him is unclear .\nks08\t1\t\tit is unclear why she told him .\nks08\t1\t\tto leave so soon would be inconvenience .\nks08\t1\t\tit would be inconvenience to leave so soon .\nks08\t1\t\tit would be inconvenience for you to leave so soon .\nks08\t1\t\tthat the dalai lama claims tibet independence disturbs the chinese government .\nks08\t1\t\tit disturbs the chinese government that the dalai lama claims tibet independence .\nks08\t1\t\ti believe the problem to be obvious .\nks08\t0\t*\ti believe that the problem is not easy to be obvious .\nks08\t1\t\ti believe it to be obvious that the problem is not easy .\nks08\t1\t\ti do not think it unreasonable to ask for the return of my subscription .\nks08\t1\t\the made it clear he would continue to co-operate with the united nations .\nks08\t1\t\tthey 're not finding it a stress being in the same office .\nks08\t1\t\tthat you came early surprised me .\nks08\t1\t\tit surprised me that you came early .\nks08\t0\t*\tsurprised me that you came early .\nks08\t1\t\tthat chris knew the answer occurred to pat .\nks08\t1\t\tit occurred to pat that chris knew the answer .\nks08\t1\t\tit really freaks me out that we invaded iraq .\nks08\t1\t\tthat we invaded iraq really creeps me out .\nks08\t1\t\tthat we invaded iraq really freaks me out .\nks08\t1\t\tit really bites that we invaded iraq .\nks08\t1\t\tthat fido barks annoys me .\nks08\t1\t\ta man came into the room that no one knew .\nks08\t1\t\ta man came into the room with blond hair .\nks08\t1\t\ti read a book during the vacation which was written by chomsky .\nks08\t1\t\tray found the outcome frustrating .\nks08\t1\t\tray found it frustrating that his policies made little impact on poverty .\nks08\t0\t*\ti made to settle the matter my objective .\nks08\t1\t\ti made it my objective 
to settle the matter .\nks08\t1\t\ti made the settlement of the matter my objective .\nks08\t0\t*\ti owe that the jury acquitted me to you .\nks08\t1\t\ti owe it to you that the jury acquitted me .\nks08\t1\t\ti owe my acquittal to you .\nks08\t1\t\ti believe strongly that the world is round .\nks08\t0\t*\ti believe that the world is round strongly .\nks08\t1\t\tit 's their teaching material that we 're using .\nks08\t1\t\twhat we 're using is their teaching material .\nks08\t1\t\ttheir teaching material is what we are using .\nks08\t1\t\twe are using their teaching material .\nks08\t1\t\ti share your view but i just wonder why you think that 's good .\nks08\t1\t\tit was the man that bought the articles from him .\nks08\t1\t\tit was then that he felt a sharp pain .\nks08\t1\t\tit was to the student that the teacher gave the best advice .\nks08\t1\t\tit was not until i was perhaps twenty-five or thirty that i read and enjoyed them .\nks08\t0\t*\tit was to finish the homework that john tried .\nks08\t0\t*\tit is that bill is honest that john believes .\nks08\t1\t\tit 's the second monday that we get back from easter holiday .\nks08\t1\t\tit was the girl who kicked the ball .\nks08\t1\t\tit 's mainly his attitude which convinced the teacher .\nks08\t1\t\twhat you want is a little greenhouse .\nks08\t1\t\twhat 's actually happening in london at the moment is immensely exciting .\nks08\t1\t\twhat is to come is in this document .\nks08\t1\t\twhat i 've always tended to do is to do my own stretches at home .\nks08\t1\t\twhat i meant was that you have done it really well .\nks08\t1\t\twhat happened is they caught her without a license .\nks08\t1\t\twhat the gentleman seemed to be asking is how policy would have differed .\nks08\t1\t\tinsensitive is how i would describe him .\nks08\t1\t\tin the early morning is when i do my best research .\nks08\t0\t*\twear it like that is what you do .\nks08\t0\t*\tthey caught her without a license is what happened .\nks08\t0\t*\tthat you 
have done it really well is what i meant .\nks08\t1\t\tthat 's when i read .\nks08\t1\t\tthat was why she looked so nice .\nks08\t1\t\tthat 's how they do it .\nks08\t1\t\tthat 's who i played with over christmas .\nks08\t1\t\twhat you heard was an explosion .\nks08\t1\t\tit was an explosion that you heard .\nks08\t1\t\twhat you should do is order one first .\nks08\t0\t*\tit is order one first that you should do first .\nks08\t0\t*\torder one first is what you should do .\nks08\t1\t\tit was not until i was perhaps twenty-five or thirty that i read them and enjoyed them .\nks08\t0\t*\twhen i read them and enjoyed them was not until i was perhaps twenty-five .\nks08\t0\t*\tnot until i was perhaps twenty-five was when i read them and enjoyed them .\nks08\t1\t\tit 's the writer that gets you so involved .\nks08\t0\t*\tthat gets you so involved is the writer .\nks08\t0\t*\tthe writer is that gets you so involved .\nks08\t1\t\tand it was this matter on which i consulted with the chairman of the select committee .\nks08\t0\t*\ton which i consulted with the chairman of the select committee was this matter .\nks08\t0\t*\tthis matter was on which i consulted with the chairman of the select committee .\nks08\t1\t\twhat i ate is an apple .\nks08\t1\t\twhat we are using is their teaching material .\nks08\t1\t\tthe student who got a in the class was very happy .\nks08\t1\t\tthe one who broke the window was mr. 
kim .\nks08\t1\t\the got what he wanted .\nks08\t1\t\the put the money where lee told him to put it .\nks08\t1\t\tthe concert started when the bell rang .\nks08\t0\t*\tlee wants to meet who kim hired .\nks08\t0\t*\tlee solved the puzzle how kim solved it .\nks08\t0\t*\twhich book he read the book was that one .\nks08\t1\t\ti ate what john ate .\nks08\t1\t\ti ate an apple .\nks08\t0\t*\tto whom i gave the cake is john .\nks08\t0\t*\tthat brought the letter is bill .\nks08\t1\t\tthis is how he did it .\nks08\t1\t\tthis is why he came early .\nks08\t1\t\ttype a : it is on bill that john relies .\nks08\t1\t\ttype b : it is bill on whom john relies .\nks08\t1\t\tit was then when we all went to bed .\nks08\t0\t*\tjohn that we are looking for showed up .\nks08\t1\t\tit 's the second monday that we get back from easter .\nks08\t1\t\tit was in 1997 when the in introduced the alien registration receipt card .\nks08\t1\t\tit is uncle john whose address i lost .\nks08\t0\t*\tit is kim on whom that sandy relies .\nks08\t0\t*\tit is kim on whom sandy relies on .\nks08\t0\t*\tit is kim whom sandy relies .\nks08\t1\t\tit was the director that she wants to meet .\nks08\t1\t\tit was the director that she said she wants to meet .\nks08\t1\t\tit was the director that i think she said she wants to meet .\nks08\t1\t\ti wonder who it was who saw you .\nks08\t1\t\ti wonder who it was you saw .\nks08\t1\t\ti wonder in which pocket it was that kim had hidden the jewels .\nks08\t1\t\twho do you think it is that mary met ?\nks08\t0\t*\tto whom do you think it is the book that mary gave ?\nks08\t1\t\tit is difficult for me to concentrate on calculus .\nks08\t1\t\tfor me to concentrate on calculus is difficult .\nks08\t1\t\tcalculus is difficult for me to concentrate on .\nks08\t1\t\tbeing lovely to look at has its advantages .\nks08\t1\t\tletters to grandma are easy to help the children to write .\nks08\t1\t\tit was to boston that they decided to take the patient .\nks08\t1\t\tit was with a 
great deal of regret that i vetoed your proposal .\nks08\t1\t\tit was tom who spilled beer on this couch .\nks08\t1\t\tit is martha whose work critics will praise .\nks08\t1\t\tit was john on whom the sheriff placed the blame .\nks08\t1\t\ti wondered who it was you saw .\nks08\t1\t\ti was wondering in which pocket it was that kim had hidden the jewels .\nks08\t0\t*\tit is on kim on whom sandy relies .\nks08\t1\t\twas it for this that we suffered and toiled ?\nks08\t1\t\twho was it who interviewed you ?\nks08\t1\t\ti believe it to be her father who was primarily responsible .\nks08\t1\t\ti believe it to be the switch that is defective .\nks08\t1\t\ttom ate what mary offered to him .\nks08\t1\t\ti wonder what mary offered to him .\nks08\t1\t\twhat mary offered to him is unclear .\nkl93\t1\t\ti do n't have any potatoes .\nkl93\t0\t*\ti have any potatoes .\nkl93\t1\t\tat most three girls saw anything .\nkl93\t0\t*\tat least three girls saw anything .\nkl93\t1\t\tevery girl who saw anything was happy .\nkl93\t0\t*\tsome girl who saw anything was happy .\nkl93\t1\t\tany owl hunts mice .\nkl93\t1\t\tany lawyer could tell you that .\nkl93\t1\t\ti would dance with anybody .\nkl93\t1\t\talmost every lawyer could answer that question .\nkl93\t1\t\talmost no lawyer could answer that question .\nkl93\t1\t\talmost any lawyer could answer that question .\nkl93\t0\t*\ti do n't have almost any potatoes .\nkl93\t1\t\ti would dance with mary or sue .\nkl93\t1\t\tmary or sue could tell you that .\nkl93\t1\t\tdo you have dry socks ? 
\nkl93\t1\t\tperhaps some dry socks would help ?\nkl93\t1\t\tan owl hunts mice .\nkl93\t1\t\tgenerics allow exceptions .\nkl93\t1\t\ta poodle gives live birth .\nkl93\t1\t\tevery poodle gives live birth .\nkl93\t1\t\ti do n't have potatoes .\nkl93\t1\t\tevery man who has matches is happy .\nkl93\t1\t\tevery man who has any matches is happy .\nkl93\t1\t\tcould we make some french fries ?\nkl93\t1\t\twhy do n't we make some french fries ?\nkl93\t1\t\tare you prepared for school tomorrow ?\nkl93\t1\t\tand then all the owls go on a mice hunt .\nkl93\t1\t\tif you take a dry match and strike it , it lights .\nkl93\t1\t\tat most three teachers assigned homework .\nkl93\t1\t\tat most three teachers assigned any homework .\nkl93\t1\t\tevery student who handed in some homework will get a prize .\nkl93\t1\t\tevery student who handed in any homework will get a prize .\nkl93\t1\t\tbefore you make plans , consult the secretary .\nkl93\t1\t\tbefore you make any plans , consult the secretary .\nkl93\t1\t\tis there anything i can do for you ?\nkl93\t1\t\ta professional dancer would be able to do it .\nkl93\t1\t\tany professional dancer would be able to do it .\nkl93\t1\t\twe do n't have potatoes , or at least not enough .\nkl93\t1\t\tevery man who has any matches is happy .\nkl93\t0\t*\tevery boy has any potatoes .\nkl93\t0\t*\tit 's not the case that every boy has any potatoes .\nkl93\t1\t\ti 'm surprised we had any potatoes .\nkl93\t1\t\tat most three boys did n't see anything .\nkl93\t0\t*\teven sue said anything .\nkl93\t1\t\tsue was the most likely not to say anything .\nkl93\t1\t\tsue said something although she was the most likely not to say anything .\nkl93\t1\t\tcows fly more often than john visits any relatives .\nkl93\t0\t*\teach candidate who has any interest in semantics will be admitted to the department .\nkl93\t1\t\tevery child should have a daily glass of milk .\nkl93\t1\t\teach child should have a daily glass of milk .\nkl93\t1\t\ti 'm surprised that he ever said anything .\nkl93\t1\t\ti 'm sorry that he ever said anything .\nkl93\t0\t*\ti 'm glad that i ever met him .\nkl93\t0\t*\ti 'm sure that i ever met him .\nkl93\t1\t\ti 'm surprised he bought a car .\nkl93\t1\t\tbut these tickets are terrible !\nkl93\t1\t\ti was surprised that he stole the watch , in as far as that was a daring thing to do .\nkl93\t1\t\tgiven my high opinion on his moral character , i was surprised that he stole the watch .\nkl93\t1\t\twere you surprised that he stole the watch ?\nkl93\t1\t\ti 'm sorry that anybody hates me .\nkl93\t1\t\ti want for nobody to hate me .\nkl93\t1\t\ti 'm glad he bought a car .\nkl93\t1\t\ti 'm sorry he bought a car .\nkl93\t1\t\the bought a honda .\nkl93\t0\t*\ti 'm glad i saw anybody .\nkl93\t1\t\ti 'm glad anybody likes me !\nkl93\t1\t\tcould n't you get any tickets better than this ?\nkl93\t1\t\tit 's fine that he paid and apologized , but i do n't really care about his gratitude , or the money , or anything .\nkl93\t0\t*\ti 'm sure we got any tickets !\nkl93\t1\t\ti 'm sure he speaks to me !\nkl93\t1\t\ti 'm glad a linguist likes me .\nkl93\t1\t\ti did n't help him because i have any sympathy for urban guerillas .\nkl93\t0\t*\tit is n't because sue said anything bad about me that i 'm angry , 
although she did say some bad things about me .\nkl93\t1\t\ti do n't have any sympathy for urban guerillas .\nkl93\t0\t*\talmost an owl hunts mice .\nkl93\t0\t*\tabsolutely an owl hunts mice .\nkl93\t1\t\talmost any owl hunts mice .\nkl93\t1\t\tabsolutely any owl hunts mice .\nb_82\t1\t\the began writing poems .\nb_82\t1\t\the kept writing poems .\nb_82\t1\t\the continued writing poems .\nb_82\t1\t\the stopped writing poems .\nb_82\t1\t\tthe men would have all been working .\nb_82\t1\t\tthe men would have been all working .\nb_82\t1\t\twould the men each have been working ?\nb_82\t0\t*\twould each the men have been working ?\nb_82\t1\t\tthe men would not enjoy that .\nb_82\t0\t*\twould not the men enjoy that ?\nb_82\t1\t\twould the men not enjoy that ?\nb_82\t0\t*\tthe men would all not have been working .\nb_82\t1\t\tthe men all would not have been working .\nb_82\t1\t\tthe men would not have all been working .\nb_82\t1\t\tthe men would not all have been working .\nb_82\t1\t\tthe men would not have been all working .\nb_82\t1\t\tthat john is a fool is obvious .\nb_82\t1\t\tit is obvious that john is a fool .\nb_82\t0\t*\tjohn believes that fred likes steak that joe likes pizza .\nb_82\t1\t\tjohn whined that he was hungry .\nb_82\t0\t*\tthat he was hungry was whined by john .\nb_82\t1\t\tjohn is certain that the mets will win .\nb_82\t1\t\tthat he has blood on his hands proves that john is the murderer .\nb_82\t0\t*\tit proves that john is the murderer that he has blood on his hands .\nb_82\t1\t\tto please john would be difficult .\nb_82\t1\t\tit would be difficult to please john .\nb_82\t1\t\tit is believed to be obvious by everyone that fred is crazy .\nb_82\t0\t*\tjohn is believed to be certain by everyone that fred is crazy .\nb_82\t1\t\tit disturbed him that people did n't like fred .\nb_82\t1\t\tit was believed to have disturbed him that people did n't like fred .\nb_82\t0\t*\thow easy to please john is it ?\nb_82\t0\t*\thow difficult to study for the exam 
was it ?\nb_82\t0\t*\thow hard to read the book was it ?\nb_82\t0\t*\thow easy to tease john it is !\nb_82\t0\t*\thow hard to read the book it was !\nb_82\t1\t\thow certain that the mets will win are you ?\nb_82\t1\t\thow likely to win is he ?\nb_82\t1\t\tthis book i enjoyed .\nb_82\t0\t*\tto whom the book did you give .\nb_82\t0\t*\tthe book to whom did you give .\nb_82\t1\t\the 's a man to whom liberty we could never grant .\nb_82\t1\t\tit 's obvious that mary , he ca n't stand .\nb_82\t1\t\ti think that the trolls will take the shepherd tomorrow .\nb_82\t1\t\tas for max , i really like him .\nb_82\t0\t*\the 's a man to whom as for liberty , we could never grant it .\nb_82\t0\t*\the 's a man to whom liberty , we could never grant it .\nb_82\t1\t\tjohn would like that because he 's such a nice guy .\nb_82\t1\t\tjohn , because he 's such a nice guy , would like that .\nb_82\t1\t\tbecause he 's such a nice guy , john would like that .\nb_82\t1\t\tjohn would , because he 's such a nice guy , like that .\nb_82\t1\t\tbecause he 's such a nice guy , what would john like ?\nb_82\t1\t\tit 's obvious that , although he 's a nice guy , john is n't too bright .\nb_82\t0\t*\tjohn ate after getting home the steak .\nb_82\t1\t\ti gave mary a book .\nb_82\t1\t\ti considered fred crazy .\nb_82\t1\t\ti put the book on the table .\nb_82\t1\t\ti worded the telegram tersely .\nb_82\t0\t*\ti considered fred after the party crazy .\nb_82\t0\t*\ti put the book after the party on the table .\nb_82\t0\t*\ti worded the telegram after the party tersely .\nb_82\t0\t*\tbecause she 's so pleasant , mary i really like her .\nb_82\t1\t\tbecause she 's so pleasant , mary i really like .\nb_82\t1\t\tthough he may seem intelligent , he does not seem deep .\nb_82\t1\t\tintelligent though he may seem , he does not seem deep .\nb_82\t1\t\tthough i may love her , that wo n't affect the grade .\nb_82\t1\t\tlove her though i may , that wo n't affect the grade .\nb_82\t0\t*\thandsome though i believe the 
claim that tom is , i still wo n't date him .\nb_82\t0\t*\thandsome though they told me that tom is , i still wo n't date him .\nb_82\t0\t*\thandsome though my friends suggested that mary thinks that tom is , i still wo n't date him .\nb_82\t1\t\thate those who criticize carter though he may , it does n't matter .\nb_82\t1\t\twould john hate that ?\nb_82\t1\t\twould john hate that !\nb_82\t0\t*\twill , after john comes home , sally take a shower ?\nb_82\t1\t\twill sally , after john comes home , take a shower ?\nb_82\t1\t\tafter john comes home , will sally take a shower ?\nb_82\t1\t\ti would prefer that he not have finished .\nb_82\t0\t*\ti would prefer that he have not finished .\nb_82\t1\t\the has not finished .\nb_82\t1\t\the is not finishing .\nb_82\t1\t\the would not finish .\nb_82\t1\t\the does not finish .\nb_82\t0\t*\tthose people will , after the party , not come home .\nb_82\t1\t\tthose people , after the party , will not come home .\nb_73\t1\t\ti 've never seen a man taller than my father .\nb_73\t1\t\ti 've never seen a taller man than my father .\nb_73\t1\t\ti 've never seen a man taller than my mother .\nb_73\t1\t\ti 've never seen a taller man than my mother .\nb_73\t1\t\tjack eats caviar more than he eats mush .\nb_73\t1\t\tjack eats more caviar than he eats mush .\nb_73\t1\t\tjack eats caviar more than he sleeps .\nb_73\t0\t*\tjack eats more caviar than he sleeps .\nb_73\t1\t\ti am more angry today than i was yesterday .\nb_73\t1\t\ti am more angry than sad .\nb_73\t0\t*\ti am angrier than sad .\nb_73\t1\t\tmary is more than six feet tall .\nb_73\t1\t\tmary is taller than six feet .\nb_73\t0\t*\tmary is more than five feet short .\nb_73\t1\t\tmary is shorter than five feet .\nb_73\t1\t\tthey think she has too much independence .\nb_73\t0\t*\tthey think she is too much happy .\nb_73\t0\t*\tmary speaks so much gently .\nb_73\t0\t*\ta tangerine is n't as much different from an orange as i 'd thought .\nb_73\t1\t\ta tangerine is n't as different from 
an orange as i 'd thought .\nb_73\t1\t\tyou and i are as much alike as a horse and a cow .\nb_73\t1\t\tyou and i are as alike as a horse and a cow .\nb_73\t1\t\tyou and i are as little alike as a horse and a cow .\nb_73\t0\t*\tjohn is as much intelligent as mary .\nb_73\t1\t\tjohn is as intelligent as mary .\nb_73\t1\t\tjohn is more than 6 feet tall .\nb_73\t1\t\tjohn is taller than 6 feet .\nb_73\t1\t\tthese plants may grow as much as 6 feet high .\nb_73\t1\t\tthese plants may grow as high as 6 feet .\nb_73\t1\t\tmore has happened in the last week than will happen in the next year .\nb_73\t1\t\the offers more than we had hoped for .\nb_73\t1\t\the was hoping for more than we offered .\nb_73\t1\t\tenough is going on to keep them confused .\nb_73\t1\t\tyou 've said enough to convince me .\nb_73\t1\t\tsally eats caviar more than i had expected .\nb_73\t1\t\tsusan does n't eat her vegetables enough .\nb_73\t1\t\tsally eats the stuff pretty often .\nb_73\t0\t*\tsally eats pretty often the stuff .\nb_73\t1\t\tsally eats the stuff more .\nb_73\t0\t*\tsally eats more the stuff .\nb_73\t0\t*\tsusan does n't eat enough her vegetables .\nb_73\t1\t\tjohn eats more .\nb_73\t1\t\tjohn does n't eat enough .\nb_73\t1\t\tjohn eats more than he sleeps .\nb_73\t1\t\the gave me more of his marbles than i wanted .\nb_73\t0\t*\tsally enough eats caviar .\nb_73\t0\t*\tenough sally eats caviar .\nb_73\t1\t\tjack is more tall than thin .\nb_73\t1\t\ti did it more in jest than in anger .\nb_73\t1\t\tthere is enough of the bread left to have tomorrow .\nb_73\t1\t\tthere is enough bread for all of you .\nb_73\t1\t\tthere is bread enough for all of you .\nb_73\t1\t\tshe has enough of a problem as it is .\nb_73\t0\t*\tshe has enough a problem as it is .\nb_73\t0\t*\tshe has enough problem as it is .\nb_73\t0\t*\tshe has problem enough as it is .\nb_73\t0\t*\tshe has enough of problems as it is .\nb_73\t1\t\tshe has enough problems as it is .\nb_73\t1\t\the looks more formidable than he is 
.\nb_73\t0\t*\the seems enough intelligent for you .\nb_73\t1\t\the seems intelligent enough for you .\nb_73\t1\t\tshe writes more clearly than she speaks .\nb_73\t0\t*\tshe speaks enough clearly to be understood .\nb_73\t1\t\the 's enough of a fool to try it .\nb_73\t1\t\the 's fool enough to try it .\nb_73\t1\t\ti saw more of the man than you did .\nb_73\t1\t\ti saw enough of the fool to be convinced .\nb_73\t1\t\tharry got to be more of a celebrity .\nb_73\t1\t\tharry got to be more of the celebrity .\nb_73\t0\t*\the 's enough of the coward to pull the trigger .\nb_73\t1\t\twhat his father wants him to be is more of a man .\nb_73\t0\t*\tmore of a man is here .\nb_73\t0\t*\ti 've kicked more of a man than you have .\nb_73\t0\t*\ti 've known more of a man than frank .\nb_73\t1\t\ti 've never known more of a man than frank .\nb_73\t1\t\the was hoping for too much .\nb_73\t1\t\tsally eats caviar too much for her own good .\nb_73\t1\t\tjohn eats so much .\nb_73\t1\t\the gave me many marbles .\nb_73\t1\t\ti have much typing to do .\nb_73\t0\t*\the looks so much formidable .\nb_73\t1\t\the looks so formidable .\nb_73\t0\t*\tshe speaks too much clearly .\nb_73\t1\t\tshe speaks too clearly .\nb_73\t1\t\ti 'm as much of a man as you are , my dear .\nb_73\t1\t\tharry got to be as much of a celebrity as his father .\nb_73\t0\t*\tharry got to be as much of the celebrity as his father .\nb_73\t0\t*\tas much of a man is here .\nb_73\t0\t*\ti 've seen as much of a coward as frank .\nb_73\t1\t\tmany are called ; few are chosen .\nb_73\t1\t\tmore are called than are ever chosen .\nb_73\t1\t\twe made enough pudding to last for days .\nb_73\t0\t*\twe ate enough a pudding to satisfy us .\nb_73\t1\t\twe made enough puddings to last for days .\nb_73\t0\t*\twe ate enough the puddings to satisfy us .\nb_73\t1\t\tjohn is the kind of a fool that i told you about .\nb_73\t1\t\tjohn is the kind of the fool that i told you about .\nb_73\t1\t\the 's a bit of a gossip .\nb_73\t0\t*\the 's the 
bit of a gossip .\nb_73\t1\t\the 's something of a gossip .\nb_73\t1\t\tjohn is the kind of fool that i told you about .\nb_73\t0\t*\the 's fool .\nb_73\t0\t*\the 's a fool enough to try it .\nb_73\t0\t*\tshe 's just enough tall .\nb_73\t0\t*\tshe 's enough tall .\nb_73\t0\t*\tshe speaks enough clearly .\nb_73\t1\t\the 's that reliable a man .\nb_73\t0\t*\the 's a that reliable man .\nb_73\t1\t\the 's too reliable a man .\nb_73\t0\t*\the 's a too reliable man .\nb_73\t1\t\the 's as reliable a man .\nb_73\t0\t*\the 's an as reliable man .\nb_73\t1\t\the 's so reliable a man .\nb_73\t0\t*\the 's a so reliable man .\nb_73\t0\t*\the 's more reliable a man .\nb_73\t0\t*\the 's reliable enough a man .\nb_73\t1\t\the 's a reliable enough man .\nb_73\t1\t\ttom was not more reliable than a grasshopper .\nb_73\t1\t\ttom was no more reliable than a grasshopper .\nb_73\t0\t*\tnot more reliable a man could be found .\nb_73\t0\t*\tany more reliable a man could not be found .\nb_73\t1\t\ti do n't want trouble .\nb_73\t0\t*\tjohn is not more reliable a fellow than bill .\nb_73\t1\t\tjohn is not a more reliable fellow than bill .\nb_73\t1\t\tjohn is n't any more reliable a fellow than bill .\nb_73\t0\t*\tjohn is n't an any more reliable fellow than bill .\nb_73\t1\t\tjohn is no more reliable a fellow than bill .\nb_73\t0\t*\tjohn is a no more reliable fellow than bill .\nb_73\t1\t\ti have as many too many marbles as you .\nb_73\t1\t\ti have as many marbles too many as you .\nb_73\t1\t\ti have six too many marbles .\nb_73\t1\t\ti have six marbles too many .\nb_73\t1\t\ti have six more of them .\nb_73\t0\t*\ti have six of them more .\nb_73\t1\t\ti have half a dozen too many of these marbles .\nb_73\t0\t*\ti have half a dozen of these marbles too many .\nb_73\t1\t\tshe writes clearly enough .\nb_73\t1\t\tshe is as brilliant a woman as her mother .\nb_73\t0\t*\tshe is as brilliant the woman as her mother .\nb_73\t1\t\ti 've never known as strong a person as louise .\nb_73\t1\t\tfido is 
a smarter dog than spot .\nb_73\t0\t*\tfido is the smarter dog than spot .\nb_73\t1\t\twhat his father wants him to be is a better pool player .\nb_73\t1\t\ta taller man than bill is here .\nb_73\t1\t\ti 've never known a smarter dog than fido .\nb_73\t1\t\the 's so tall a man that doors are dangerous to him .\nb_73\t1\t\the 's such a tall man that doors are dangerous to him .\nb_73\t1\t\the 's such a tall man .\nb_73\t1\t\the 's such the tall man .\nb_73\t1\t\twhat her mother wants her to be is such a fine surgeon that everyone will respect her .\nb_73\t1\t\tit was as awful a picture as it first seemed .\nb_73\t0\t*\tit was so awful a picture as it first seemed .\nb_73\t1\t\tit was n't as awful a picture as it first seemed .\nb_73\t1\t\tit was n't such an awful picture as it first seemed .\nb_73\t1\t\tit was so awful a picture that i tore it up .\nb_73\t1\t\tit was such an awful picture that i tore it up .\nb_73\t1\t\tmary is such a wit that people are afraid of her .\nb_73\t1\t\tsally is n't such a fool as people think .\nb_73\t0\t*\tsally is such a fool as people think .\nb_73\t1\t\ti love her so much .\nb_73\t1\t\ti gave her so much .\nb_73\t0\t*\ti gave her so .\nb_73\t1\t\thilda is such a scholar .\nb_73\t1\t\thilda is such a scholar that all her work is impeccable .\nb_73\t1\t\thilda is such a scholar as you were speaking of just now .\nb_73\t1\t\tso eminent a scholar as dr. lucille hein was here .\nb_73\t1\t\tsuch an eminent scholar as dr. 
lucille hein was here .\nb_73\t1\t\tso elegant a solution as you have presented us with can elicit only admiration .\nb_73\t1\t\tyou have presented so elegant a solution that we can only admire it .\nb_73\t1\t\tsuch a scholar as you were speaking of just now is here .\nb_73\t0\t*\tso much of a scholar is here .\nb_73\t1\t\ther mother wants mary to be such an eminent woman that everyone will respect her .\nb_73\t1\t\tjohn a decidedly taller man than bill .\nb_73\t0\t*\tjohn is a decidedly too tall man .\nb_73\t1\t\tthat 's an obviously better solution .\nb_73\t0\t*\tthat 's an obviously so good solution .\nb_73\t1\t\tshe made so much better a reply .\nb_73\t0\t*\tshe made such a much better reply .\nb_73\t0\t*\tshe made such a better reply .\nb_73\t0\t*\tthat 's the most kind answer that i ever heard .\nb_73\t0\t*\tthat 's a most kind answer that i ever heard .\nb_73\t0\t*\tthat 's a kindest answer that i ever heard .\nb_73\t1\t\tthat 's the kindest answer that i ever heard .\nb_73\t1\t\tmost helpful advice is unwanted .\nb_73\t1\t\tsally will give me more helpful advice than the advice i got from you .\nb_73\t1\t\ti 've never seen a man who is taller than my mother .\nb_73\t1\t\ti 've never seen the one man taller than my father .\nb_73\t0\t*\ti 've never seen the taller man than my father .\nb_73\t0\t*\ti 've never seen the one taller man than my father .\nb_73\t1\t\tjohn wants to come up with as good a solution as christine did .\nb_73\t1\t\tjohn wants to come up with a solution as good as christine .\nb_73\t1\t\tjohn wants to find a solution better than christine 's .\nb_73\t1\t\tcaviar is eaten by jack more than mush .\nb_73\t1\t\tmore caviar than mush is eaten by jack .\nb_73\t1\t\tjack ate more of this than he ate of that .\nb_73\t1\t\tthe table is longer than the door is wide .\nb_73\t1\t\tmary 's happy about her work , and john 's happy about his children .\nb_73\t0\t*\tmary 's happy about her work , and john 's about his children .\nb_73\t1\t\tmary 's 
happy about her work , and john is about his children .\nb_73\t1\t\tmary is happy with her work , and john is with his children .\nb_73\t1\t\tmary 's happy with her work , and john 's with his children .\nb_73\t0\t*\tthe table is longer than the door 's wide .\nb_73\t1\t\tthe table is long , and the door 's wide .\nb_73\t0\t*\ti was happier there than i 'm here .\nb_73\t1\t\ti 'm sad , more than i 'm angry .\nb_73\t0\t*\ti 'm sadder than i 'm angry .\nb_73\t1\t\ti 'm more sad than angry .\nb_73\t1\t\ti 'm worrying , more than thinking .\nb_73\t0\t*\ti 'm more worrying than thinking .\nb_73\t0\t*\ti 'm sadder than angry .\nb_73\t1\t\ti 'm sad , as much as i 'm angry .\nb_73\t1\t\ti 'm as much sad as angry .\nb_73\t0\t*\ti 'm as sad as angry .\nb_73\t1\t\ti am angrier today that i was yesterday .\nb_73\t1\t\tjohn is taller than six feet .\nb_73\t1\t\tjohn is taller than bill .\nb_73\t1\t\tmary has more than two friends .\nb_73\t0\t*\tmary has more than just bill and pete friends .\nb_73\t1\t\tmary has more friends than two .\nb_73\t1\t\tmary has more friends than just bill and pete .\nb_73\t1\t\tthey may grow as much as six feet high .\nb_73\t0\t*\tthey may grow as much as bamboo high .\nb_73\t1\t\tthey may grow as high as six feet .\nb_73\t1\t\tsome of them made as many as 20 errors .\nb_73\t0\t*\tsome of them made as many as joan errors .\nb_73\t1\t\tsome of them made as many errors as joan .\nb_73\t0\t*\tjohn is taller than six feet is .\nb_73\t1\t\tjohn is taller than pete is .\nb_73\t0\t*\tmary has more friends that two .\nb_73\t0\t*\tmary has more friends than just bill and pete are .\nb_73\t0\t*\tjohn is more than five feet short .\nb_73\t1\t\tjohn is shorter than five feet .\nb_73\t1\t\tmary has more enemies than bill has friends .\nb_73\t0\t*\tmary has more than bill has friends enemies .\nb_73\t1\t\tmary does n't have as many too many too many as jane .\nb_73\t1\t\tjane has more nearly as many too many than mary .\nb_73\t1\t\tmary swam five more laps than 
joan swam .\nb_73\t1\t\tmary swam as many more laps than joan as linda .\nc_13\t1\t\tbill kissed himself .\nc_13\t0\t*\tbill kissed herself .\nc_13\t1\t\tsally kissed herself .\nc_13\t0\t*\tkiss himself .\nc_13\t1\t\tthe robot kissed itself .\nc_13\t1\t\tshe knocked herself on the head with a zucchini .\nc_13\t0\t*\tshe knocked himself on the head with a zucchini .\nc_13\t1\t\tthe snake flattened itself against the rock .\nc_13\t1\t\tthe joneses think themselves the best family on the block .\nc_13\t0\t*\tthe joneses think himself the most wealthy guy on the block .\nc_13\t1\t\tgary and kevin ran themselves into exhaustion .\nc_13\t0\t*\tgary and kevin ran himself into exhaustion .\nc_13\t1\t\tpeople from tucson think very highly of themselves .\nc_13\t1\t\ti gave myself the bucket of ice cream .\nc_13\t0\t*\tshe hit myself with a hammer .\nc_13\t1\t\tshe hit herself with a hammer .\nc_13\t1\t\tdoug blew the building up .\nc_13\t1\t\tdoug blew up the building .\nc_13\t1\t\tdoug blew it up .\nc_13\t1\t\tdoug blew up it .\nc_13\t0\t*\twho do you wonder what bought ?\nc_13\t1\t\ti wonder what fiona bought .\nc_13\t0\t*\ttoothbrush the is blue .\nc_13\t1\t\tcheese mice love stinks .\nc_13\t1\t\tthe dancing chorus line of elephants broke my television set .\nc_13\t1\t\trosie loves magazine ads .\nc_13\t1\t\ti think rosie loves magazine ads .\nc_13\t1\t\tdana doubts that drew believes i think rosie loves magazine ads .\nc_13\t1\t\tdave left .\nc_13\t1\t\tdave and alina left .\nc_13\t1\t\tdave , dan , erin , and alina left .\nc_13\t1\t\twho do you think that ciaran will question first ?\nc_13\t1\t\twho do you think ciaran will question first ?\nc_13\t1\t\twho do you think will question seamus first ?\nc_13\t0\t*\twho do you think that will question seamus first ?\nc_13\t1\t\ti expect soon to see the results .\nc_13\t1\t\ti expect to see the results soon .\nc_13\t1\t\ti expect to soon see the results .\nc_13\t0\t*\ti expect more than to double my profits .\nc_13\t0\t*\ti 
expect to double more than my profits .\nc_13\t0\t*\ti expect to double my profits more than .\nc_13\t1\t\ti expect to more than double my profits .\nc_13\t1\t\twho did you see in las vegas ?\nc_13\t1\t\tyou are taller than me .\nc_13\t0\t*\tmy red is refrigerator .\nc_13\t0\t*\twho do you think that saw bill ?\nc_13\t1\t\tmy friends wanted to quickly leave the party .\nc_13\t0\t*\tbunnies carrots eat .\nc_13\t1\t\tgeorge sang to himself .\nc_13\t0\t*\thimself sang to george .\nc_13\t1\t\tbetsy loves herself in blue leather .\nc_13\t1\t\teveryone should be able to defend himself .\nc_13\t1\t\teveryone should be able to defend herself .\nc_13\t1\t\ti hope nobody will hurt themselves .\nc_13\t1\t\ti hope nobody will hurt himself .\nc_13\t1\t\tdo n't hit yourself !\nc_13\t1\t\tshe is dancing .\nc_13\t1\t\tthey are dancing .\nc_13\t1\t\tthe man is dancing .\nc_13\t1\t\tthe men are dancing .\nc_13\t1\t\tthe students met to discuss the project .\nc_13\t1\t\tzeke cooked and ate the chili .\nc_13\t1\t\tzeke ate and cooked the chili .\nc_13\t1\t\the put the clothes .\nc_13\t1\t\the put in the washing machine .\nc_13\t1\t\ti gave my brother a birthday present .\nc_13\t1\t\ti gave a birthday present to my brother .\nc_13\t1\t\twhere do you guys live at ?\nc_13\t1\t\tit is obvious to everybody that tasha likes misha .\nc_13\t1\t\tthe man loved peanut butter cookies .\nc_13\t1\t\tthe puppy loved peanut butter cookies .\nc_13\t1\t\tthe king loved peanut butter cookies .\nc_13\t0\t*\tthe green loved peanut butter cookies .\nc_13\t0\t*\tthe in loved peanut butter cookies .\nc_13\t0\t*\tthe sing loved peanut butter cookies .\nc_13\t1\t\tjohn went to the store .\nc_13\t1\t\tthe man went to the store .\nc_13\t0\t*\tquickly walks went to the store .\nc_13\t0\t*\tto the washroom kissed the blarney stone .\nc_13\t1\t\tthe destruction of the city bothered the mongols .\nc_13\t1\t\tsincerity is an important quality .\nc_13\t1\t\tthe assassination of the president .\nc_13\t1\t\ttucson is a 
great place to live .\nc_13\t1\t\tgabrielle 's mother is an axe murderer .\nc_13\t1\t\thamsters mother attractive offspring .\nc_13\t1\t\twendy 's mother country is iceland .\nc_13\t1\t\tlouis said that parts of speech intrigued her .\nc_13\t0\t*\tcat ate the spider .\nc_13\t1\t\tthe cat ate the spider .\nc_13\t1\t\tcats ate the spider .\nc_13\t1\t\tthe cats ate the spider .\nc_13\t0\t*\ti ate apple .\nc_13\t1\t\ti ate the apple .\nc_13\t1\t\ti ate sugar .\nc_13\t1\t\ti ate the sugar .\nc_13\t1\t\the is filled with sincerity .\nc_13\t1\t\ti doubt his sincerity .\nc_13\t1\t\tthe dastardly surgeon stole the physician 's lunch .\nc_13\t1\t\ti asked the question .\nc_13\t1\t\ti asked if you knew the answer .\nc_13\t1\t\ti hit the ball .\nc_13\t1\t\ti spared him the trouble .\nc_13\t0\t*\ti put the box the book .\nc_13\t1\t\ti put the book in the box .\nc_13\t1\t\ti gave the box to leah .\nc_13\t1\t\ti gave leah the box .\nc_13\t1\t\ti told daniel the story .\nc_13\t1\t\ti told daniel that the exam was cancelled .\nc_13\t1\t\ti told the story to daniel .\nc_13\t1\t\tthe canadian government uses a parliamentary system of democracy .\nc_13\t1\t\tthe canadian bought himself a barbecue .\nc_13\t1\t\tthe prudish linguist did n't enjoy looking at the internet .\nc_13\t1\t\twe keep those censored copies of the book hidden to protect the sensibilities of the prudish .\nc_13\t1\t\tsusan bought some flowers for her mother .\nc_13\t1\t\tsusan bought some flowers for her birthday .\nc_13\t0\t*\tsusan bought her birthday some flowers .\nc_13\t1\t\ti gave blood .\nc_13\t1\t\ti do n't give a darn .\nc_13\t1\t\tandy gives freely of his time .\nc_13\t1\t\tdan gave his life .\nc_13\t1\t\tdan gives to charity .\nc_13\t1\t\tsorry , i gave last week .\nc_13\t1\t\tthe student loved his phonology readings .\nc_13\t1\t\ti saw these dancers and those musicians smoking something .\nc_13\t1\t\ti am drinking lemonade and eating a brownie .\nc_13\t1\t\twe went through the woods and over the bridge 
.\nc_13\t1\t\tthe man whose car i hit last week sued me .\nc_13\t1\t\tthe big man from ny has often said that he gave peanuts to elephants .\nc_13\t1\t\tthe man killed the king with the knife .\nc_13\t1\t\twe ate at a really fancy restaurant .\nc_13\t0\t*\twe ate at .\nc_13\t1\t\tbig bowls of beans are what i like .\nc_13\t1\t\tthe big boy was kissed by the drooling dog .\nc_13\t1\t\tthe drooling dog kissed the big boy .\nc_13\t1\t\tjohn and the man went to the store .\nc_13\t0\t*\tjohn and very blue went to the store .\nc_13\t1\t\tbruce loved and kelly hated phonology class .\nc_13\t0\t*\tthe with milk coffee is hot .\nc_13\t1\t\tthe kangaroo hopped over the truck .\nc_13\t1\t\ti have n't seen this sentence before .\nc_13\t1\t\tsusan will never sing at weddings .\nc_13\t1\t\tthe officer carefully inspected the license .\nc_13\t1\t\tevery cat always knows the location of her favorite catnip toy .\nc_13\t1\t\tthe cat put her catnip toy on the plastic mat .\nc_13\t1\t\tthe very young child walked from school to the store .\nc_13\t1\t\tjohn paid a dollar for a head of lettuce .\nc_13\t1\t\tteenagers drive rather quickly .\nc_13\t1\t\ta clever magician with the right equipment can fool the audience easily .\nc_13\t1\t\tthe police might plant the drugs in the apartment .\nc_13\t1\t\tthose olympic hopefuls should practice diligently daily .\nc_13\t1\t\tthe latest research on dieting always warns people about the dangers of too much cholesterol .\nc_13\t1\t\tthat annoying faucet was dripping constantly for months .\nc_13\t1\t\tmarian wonders if the package from boston will ever arrive .\nc_13\t1\t\ti said that bonny should do some dances from the middle east .\nc_13\t1\t\tthat dan smokes in the office really bothers alina .\nc_13\t1\t\tthe belief that syntactic theory reveals the inner structure of sentences emboldened the already much too cocky professor .\nc_13\t1\t\ti bought the parrot in the store .\nc_13\t1\t\ti put the milk in the fridge .\nc_13\t1\t\ti mailed the 
sweater to mary .\nc_13\t1\t\ti knew the man with the brown hair .\nc_13\t1\t\tjohn said mary went to the store quickly .\nc_13\t1\t\ti discovered an old english poem .\nc_13\t1\t\tsusanne gave the minivan to george .\nc_13\t1\t\tclyde got a passionate love letter from stacy .\nc_13\t1\t\the blew out the candle .\nc_13\t1\t\the turned off the light .\nc_13\t1\t\the blew up the building .\nc_13\t1\t\the rode out the storm .\nc_13\t0\t*\tshannon kissed quietly the kitten .\nc_13\t1\t\tshannon left quietly every day .\nc_13\t1\t\tjuliet says that romeo lies to his parents a lot .\nc_13\t1\t\tthe puppy licked the kitten 's face .\nc_13\t1\t\tit is raining .\nc_13\t1\t\tfred feels fine .\nc_13\t1\t\tthat bill 's breath smells of onions bothers erin .\nc_13\t1\t\tsusan kissed the clown 's nose .\nc_13\t1\t\tcedric danced a jolly jig .\nc_13\t1\t\tdale said that the lawn was overgrown .\nc_13\t1\t\tgilgamesh cut the steak with a knife .\nc_13\t1\t\twe drove all the way to buenos aires .\nc_13\t1\t\tjohn tagged lewis with a regulation baseball on tuesday .\nc_13\t1\t\tthe big man from new york loves bagels with cream cheese .\nc_13\t1\t\tsusan rode a bright blue train from new york .\nc_13\t1\t\tthe plucky platypus kicked a can of soup from new york to tucson .\nc_13\t1\t\tjohn said martha sang the aria with gusto .\nc_13\t1\t\tmartha said john sang the aria from la bohème .\nc_13\t1\t\tthe book of poems with the bright red cover stinks .\nc_13\t1\t\tlouis hinted mary stole the purse deftly .\nc_13\t1\t\tthe extremely tired students hated syntactic trees with a passion .\nc_13\t1\t\tmany soldiers have claimed bottled water satisfies thirst best .\nc_13\t1\t\tnetworking helps you grow your business .\nc_13\t1\t\ti did n't read a single book the whole time i was in the library .\nc_13\t1\t\ti did not have a red cent .\nc_13\t1\t\tfelicia wrote a fine paper on zapotec .\nc_13\t1\t\theidi hit herself on the head with a zucchini .\nc_13\t0\t*\therself hit heidi on the head with 
a zucchini .\nc_13\t1\t\theidi believes any description of herself .\nc_13\t1\t\tjohn knew that there would be a picture of himself hanging in the post .\nc_13\t1\t\talthough he loves marshmallows , john is not a big fan of chocolate .\nc_13\t1\t\this yearbook picture gives tom the creeps .\nc_13\t1\t\tmarilyn monroe is norma jeane baker .\nc_13\t1\t\tgene simmons was originally named haim goldberg .\nc_13\t1\t\tkevin ate spaghetti with a spoon and geordie did so too .\nc_13\t1\t\tthe chef eats beans and serves salads with forks .\nc_13\t1\t\ti am frightened of tigers .\nc_13\t1\t\ti am afraid of tigers .\nc_13\t1\t\ti am fond of circus performers .\nc_13\t1\t\ti fear tigers .\nc_13\t1\t\ti like circus performers .\nc_13\t1\t\ti am afraid of tigers and fond of clowns without exception .\nc_13\t1\t\ti am frightened of tigers and fond of clowns without exception .\nc_13\t1\t\tbob is very serious about mary , but less so than paul .\nc_13\t1\t\tthe book of poems with a red cover from blackwell by robert burns takes a very long time to read .\nc_13\t1\t\tthe book of poems with a red cover from blackwell by robert burns takes a very long time to read .\nc_13\t1\t\tthe book of poems from blackwell with a red cover by robert burns takes a very long time to read .\nc_13\t1\t\tthe book of poems from blackwell by robert burns with a red cover takes a very long time to read .\nc_13\t1\t\tthe book of poems by robert burns from blackwell with a red cover takes a very long time to read .\nc_13\t1\t\tthe book of poems by robert burns with a red cover from blackwell takes a very long time to read .\nc_13\t1\t\tthe book of poems with a red cover by robert burns from blackwell takes a very long time to read .\nc_13\t0\t*\tthe book with a red cover of poems from blackwell by robert burns takes a very long time to read .\nc_13\t0\t*\tthe book with a red cover from blackwell of poems by robert burns takes a very long time to read .\nc_13\t0\t*\tthe book with a red cover from blackwell 
by robert burns of poems takes a very long time to read .\nc_13\t1\t\tthe book of poems with a red cover and with a blue spine takes a very long time to read .\nc_13\t1\t\tthe book of poems and of fiction from blackwell takes a very long time to read .\nc_13\t0\t*\tthe one of poems with a red cover takes a very long time to read .\nc_13\t1\t\ti loved the policeman intensely with all my heart .\nc_13\t0\t*\ti loved intensely the policeman with all my heart .\nc_13\t0\t*\ti loved the policeman the baker intensely with all my heart .\nc_13\t1\t\tmika loved the policeman intensely and susan did so half heartedly .\nc_13\t0\t*\tsusan did so the baker .\nc_13\t1\t\tjohn fears dogs .\nc_13\t1\t\tjohn is afraid of dogs .\nc_13\t1\t\ttwo or three books take a very long time to read .\nc_13\t0\t*\ttwo or boring books take a very long time to read .\nc_13\t1\t\tthe red dress with the pink stripes looks good on sandy .\nc_13\t1\t\tthe ugly man from brazil found books of poems in the puddle .\nc_13\t1\t\terin never keeps her pencils in the correct drawer .\nc_13\t1\t\tdan walked to new mexico in the rain last year .\nc_13\t1\t\tgeorge wrote a volume of poems in latin for jane .\nc_13\t1\t\tpeople with boxes of old clothes lined up behind the door of the building with the leaky roof .\nc_13\t1\t\tthat automobile factories abound in michigan worries me greatly .\nc_13\t1\t\tno one understands that phrase structure rules explain the little understood phenomenon of the infinite length of sentences .\nc_13\t1\t\tmy favorite language is a language with simple morphology and complicated syntax .\nc_13\t1\t\tivan got a headache on wednesday from the disgruntled students of phonology from michigan .\nc_13\t1\t\tthe collection of syntax articles with the red cover bores students of syntax in tucson .\nc_13\t1\t\tthe red volume of obscene verse from italy shocked the puritan soul of the minister with the beard quite thoroughly yesterday .\nc_13\t1\t\tthe biggest man in the room said that 
john danced an irish jig from county kerry to county tipperary on thursday .\nc_13\t1\t\ta burlap sack of potatoes with mealy skins fell on the professor of linguistics with the terrible taste in t-shirts from the twelfth story .\nc_13\t1\t\tthe bright green filing cabinet was filled to the brim with the most boring articles from a prestigious journal of linguistics with a moderately large readership .\nc_13\t1\t\tthe coat of the panther is dark black .\nc_13\t1\t\tthe roof of the building is leaking .\nc_13\t1\t\tthe hat of the man standing over there impressed me greatly .\nc_13\t1\t\tthe panther 's coat is dark black .\nc_13\t1\t\tthe building 's roof is leaking .\nc_13\t1\t\tthe man standing over there 's hat impressed me greatly .\nc_13\t0\t*\tthe man 's standing over there hat impressed me greatly .\nc_13\t0\t*\tthe man standing over there 's the hat impressed me greatly .\nc_13\t1\t\tthe boy ran .\nc_13\t1\t\thoward is a linguistics student .\nc_13\t1\t\tpeter said that danny danced .\nc_13\t1\t\tbill wants susan to leave .\nc_13\t1\t\tpeter thinks that cathy loves him .\nc_13\t1\t\tpeople selling their stocks caused the crash of 1929 .\nc_13\t1\t\tfor mary to love that boor is a travesty .\nc_13\t1\t\ti said that mary signed my yearbook .\nc_13\t1\t\ti want mary to sign my yearbook .\nc_13\t1\t\ti 've never seen you eat asparagus .\nc_13\t1\t\ti know you ate asparagus .\nc_13\t0\t*\ti 've never seen you ate asparagus .\nc_13\t0\t*\ti 've never seen him eats asparagus .\nc_13\t1\t\ti 've never seen him eat asparagus .\nc_13\t1\t\ti think that he eats asparagus .\nc_13\t1\t\ti want to eat asparagus .\nc_13\t1\t\ti want him to eat asparagus .\nc_13\t1\t\ti wonder if he eats asparagus .\nc_13\t1\t\tfor him to eat asparagus is a travesty .\nc_13\t1\t\ti asked for him to eat the asparagus .\nc_13\t1\t\ti think he will eat asparagus .\nc_13\t1\t\tfabio asked if claus had run a marathon .\nc_13\t0\t*\tfabio asked if had claus run a marathon .\nc_13\t0\t*\tfabio 
asked had if claus run a marathon .\nc_13\t1\t\tyou can lead a horse to water but will it drink ?\nc_13\t1\t\the will go .\nc_13\t1\t\the goes .\nc_13\t1\t\tthe peanut butter has got moldy .\nc_13\t1\t\tthe swing blasted the golf ball across the green .\nc_13\t1\t\tthat harry loves dancing is evidenced by his shiny tap shoes .\nc_13\t1\t\tthe brazilians pumped the oil across the river .\nc_13\t1\t\tlenin believes the tsar to be a power hungry dictator .\nc_13\t1\t\tbrezhnev had said for andropov to leave .\nc_13\t1\t\tyeltsin saw stalin holding the bag .\nc_13\t1\t\trobert thinks that students should eat asparagus .\nc_13\t1\t\trobert thinks that student should eat asparagus .\nc_13\t1\t\tlinguistics students like phonetics tutorials .\nc_13\t1\t\tmartha said that bill loved his cheerios in the morning .\nc_13\t1\t\teloise wants you to study a new language . assume to = t .\nc_13\t1\t\tfor maurice to quarrel with joel frightened maggie .\nc_13\t1\t\tno man has ever beaten the centaur .\nc_13\t0\t*\tsome man has ever beaten the centaur .\nc_13\t0\t*\tevery man has ever beaten the centaur .\nc_13\t1\t\trosemary hates new york .\nc_13\t0\t*\trosemary hates .\nc_13\t1\t\tjennie smiled .\nc_13\t0\t*\tjennie smiled the microwave .\nc_13\t1\t\ttraci gave the whale a lollipop .\nc_13\t0\t*\ttraci gave the whale .\nc_13\t0\t*\ttraci gave a lollipop .\nc_13\t1\t\tryan hit andrew .\nc_13\t1\t\tmichael accidentally broke the glass .\nc_13\t1\t\tleah likes cookies .\nc_13\t1\t\tlorenzo saw the eclipse .\nc_13\t1\t\tsyntax frightens kenny .\nc_13\t1\t\talyssa kept her syntax book .\nc_13\t1\t\tthe arrow hit ben .\nc_13\t1\t\tthe psychologist hates phonology .\nc_13\t1\t\tdoug went to chicago .\nc_13\t1\t\tdave was given the margarita mix .\nc_13\t1\t\tgeorge gave jessica the book .\nc_13\t1\t\tdaniel received a scolding from hanna .\nc_13\t1\t\tbob gave steve the syntax assignment .\nc_13\t1\t\tstacy came directly from linguistics class .\nc_13\t1\t\tandrew is in tucson 's 
finest apartment .\nc_13\t1\t\tchris hacked the computer apart with an axe .\nc_13\t1\t\tthis key will open the door to the linguistics building .\nc_13\t1\t\the bought these flowers for aaron .\nc_13\t1\t\tshe cooked matt dinner .\nc_13\t0\t*\tjohn placed the flute .\nc_13\t1\t\tjohn put the book on the table .\nc_13\t1\t\tjohn put the book on the table with a pair of tongs .\nc_13\t1\t\tmegan loves kevin .\nc_13\t0\t*\tmegan loves .\nc_13\t0\t*\tmegan loves jason .\nc_13\t1\t\tit rained .\nc_13\t1\t\tit snowed .\nc_13\t1\t\tit hailed .\nc_13\t1\t\tthat bill loves chocolate is likely .\nc_13\t1\t\tit is likely that bill likes chocolate .\nc_13\t1\t\ti put a book on it .\nc_13\t1\t\tit bit me on the leg .\nc_13\t1\t\tshannon sent dan an email .\nc_13\t1\t\tstacy hit a baseball to julia .\nc_13\t1\t\tjaime danced a jig .\nc_13\t1\t\tyuko rubbed the pizza with a garlic clove .\nc_13\t1\t\tit is raining in san francisco .\nc_13\t1\t\tthe stodgy professor left with his teaching assistant .\nc_13\t1\t\ti played a tune on my ipod .\nc_13\t1\t\tmolly gave calvin a kiss .\nc_13\t1\t\tmercedes gave a test to the students in the lecture hall .\nc_13\t1\t\tspot ate a cat treat .\nc_13\t1\t\tsusan ate yesterday at the restaurant .\nc_13\t1\t\tgwen looked at a fire truck .\nc_13\t1\t\tmichael asked a question .\nc_13\t1\t\tadam asked if hyacinth likes pineapples .\nc_13\t1\t\ti feel it is unfortunate that television is so vulgar these days .\nc_13\t1\t\tthat angus hates sushi is mysterious .\nc_13\t0\t*\tjennie smiled the sandwich .\nc_13\t0\t*\tplaced the flute on the table .\nc_13\t0\t*\tjohn placed on the table .\nc_13\t0\t*\tjohn placed the flute the violin on the table .\nc_13\t0\t*\tthe rock placed the sky with the fork .\nc_13\t0\t*\tjohn placed the flute the table .\nc_13\t1\t\tjohn bit the apple .\nc_13\t1\t\tsusan forgave louis .\nc_13\t1\t\tthe jockey rides the horse .\nc_13\t1\t\tphillip gave the soldier the medal .\nc_13\t1\t\tthe apple was bitten 
.\nc_13\t1\t\tlouis was forgiven .\nc_13\t1\t\tthe horse was ridden .\nc_13\t1\t\tthe medal was given to the soldier .\nc_13\t1\t\tthe soldier was given the medal .\nc_13\t1\t\tthe apple was bitten by john .\nc_13\t1\t\tlouis was forgiven by susan .\nc_13\t1\t\tthe horse was ridden by the jockey .\nc_13\t1\t\tthe medal was given to the soldier by phillip .\nc_13\t1\t\tthe soldier was given the medal by phillip .\nc_13\t1\t\ti ate a basket of apples .\nc_13\t1\t\ti ate .\nc_13\t1\t\ti think that john likes his beer .\nc_13\t1\t\ti think john likes his beer .\nc_13\t0\t*\ti think for john to like his beer .\nc_13\t0\t*\ti think if john likes his beer .\nc_13\t1\t\ti ordered that john drink his beer .\nc_13\t1\t\ti ordered john drink his beer .\nc_13\t0\t*\ti ordered for john to drink his beer .\nc_13\t1\t\ti ordered john to drink his beer .\nc_13\t0\t*\ti ordered if john drink his beer .\nc_13\t0\t*\ti inquired that john like his beer .\nc_13\t0\t*\ti inquired john likes his beer .\nc_13\t0\t*\ti inquired for john to like his beer .\nc_13\t0\t*\ti inquired john to like his beer .\nc_13\t1\t\ti inquired if john likes his beer .\nc_13\t1\t\theidi thinks that andy is eating salmon flavored candy bars .\nc_13\t1\t\theidi thinks that andy has eaten salmon flavored candy bars .\nc_13\t1\t\theidi thinks that andy will eat salmon flavored candy bars .\nc_13\t1\t\theidi thinks that andy eats salmon flavored candy bars .\nc_13\t1\t\theidi thinks that the salmon flavored candy bars were eaten .\nc_13\t1\t\the has danced .\nc_13\t1\t\ti had eaten the deep fried muffins .\nc_13\t1\t\ti have eaten the beef waffles .\nc_13\t1\t\ti will have eaten the beef waffles .\nc_13\t1\t\tjeff was dancing with sylvia while amy sat angrily at their table .\nc_13\t1\t\tthe soup had been being eaten when it got spilled .\nc_13\t1\t\tjeff must have eaten the deep fried muffin .\nc_13\t0\t*\tjeff has must eaten the deep fried muffin .\nc_13\t1\t\tjeff must not have eaten the deep fried muffin 
.\nc_13\t0\t*\tjeff not must have eaten the deep fried muffin .\nc_13\t1\t\tcalvin has a peanut .\nc_13\t1\t\tsusan has a cold .\nc_13\t1\t\tbill had an accident .\nc_13\t1\t\tcalvin has eaten a peanut .\nc_13\t1\t\tfrank has drunk too much .\nc_13\t1\t\tbill has been dancing .\nc_13\t1\t\tdane is a doctor .\nc_13\t1\t\tjorge was the one .\nc_13\t1\t\talex was eating the popsicle .\nc_13\t1\t\tmegan was sat on by her brother .\nc_13\t1\t\tcatherine did her homework .\nc_13\t1\t\tcatherine did not eat .\nc_13\t1\t\tcalvin did not do a back flip .\nc_13\t1\t\thas bill eaten his tuna ?\nc_13\t1\t\tis bill eating his tuna ?\nc_13\t1\t\tdid bill eat his dinner ?\nc_13\t0\t*\tate bill his dinner ?\nc_13\t0\t*\thas calvin a bowl ?\nc_13\t1\t\tangus is not leaving .\nc_13\t1\t\tcalvin has not eaten his dinner .\nc_13\t1\t\tspot did not play with his mouse .\nc_13\t0\t*\tcalvin ate not his dinner .\nc_13\t0\t*\tcalvin has not any catnip .\nc_13\t0\t*\tangus did not his homework .\nc_13\t1\t\ti should not eat plums .\nc_13\t0\t*\ti have not should eat plums .\nc_13\t0\t*\ti must can eat plums .\nc_13\t0\t*\ti have should eat plums .\nc_13\t0\t*\ti want to should eat plums .\nc_13\t1\t\tcalvin will not eat the beef waffles .\nc_13\t0\t*\tcalvin not will eat the beef waffles .\nc_13\t0\t*\tcalvin could will eat the beef waffles .\nc_13\t0\t*\tcalvin will could eat the beef waffles .\nc_13\t1\t\ti ate deep fried muffins .\nc_13\t1\t\the always eats deep fried muffins .\nc_13\t0\t*\ti might ate deep fried muffins .\nc_13\t0\t*\the always might eats deep fried muffins .\nc_13\t1\t\the might eat deep fried muffins .\nc_13\t1\t\ti might eat deep fried muffins .\nc_13\t1\t\the will eat deep fried muffins .\nc_13\t0\t*\the will eats deep fried muffins .\nc_13\t1\t\tsylvia will be slapping jeff upside the head in martial arts class .\nc_13\t1\t\tsylvia could be slapping jeff upside the head in martial arts class .\nc_13\t1\t\tsylvia is slapping jeff upside the head in martial arts 
class .\nc_13\t1\t\tthe cat had eaten .\nc_13\t1\t\tthe cat had been eating .\nc_13\t1\t\tthe tuna had been eaten .\nc_13\t0\t*\tthe cat had haven eaten .\nc_13\t1\t\tthe cat was leaving .\nc_13\t1\t\tthe tuna was being eaten .\nc_13\t0\t*\tthe cat was being eating .\nc_13\t0\t*\tthe cat was having eaten .\nc_13\t1\t\tthe cake was eaten .\nc_13\t0\t*\tthe cake was been eating .\nc_13\t0\t*\tthe cake was have eaten .\nc_13\t1\t\treggie did not chase the ball .\nc_13\t1\t\tdid calvin eat the beef waffles ?\nc_13\t1\t\twhat did calvin eat ?\nc_13\t0\t*\tjohn must not do have eaten .\nc_13\t0\t*\tjohn must do not have eaten .\nc_13\t1\t\tthe prisoner must have been being interrogated when the supervisor walked into the room and saw what was going on and put a stop to it .\nc_13\t1\t\tfiona must not eat the sauteed candy canes .\nc_13\t1\t\tfiona has not eaten the sauteed candy canes .\nc_13\t1\t\tcan fiona eat sauteed candy canes ?\nc_13\t0\t*\ti wanted that he should leave .\nc_13\t0\t*\ti wanted he should leave .\nc_13\t0\t*\ti wanted if he should leave .\nc_13\t1\t\ti wanted him to leave .\nc_13\t1\t\ti wanted to leave .\nc_13\t0\t*\theidi investigated that john ate the cauliflower .\nc_13\t0\t*\theidi investigated john ate the cauliflower .\nc_13\t1\t\theidi investigated whether john ate the cauliflower .\nc_13\t1\t\theidi investigated if john ate the cauliflower .\nc_13\t0\t*\theidi investigated john to eat the cauliflower .\nc_13\t0\t*\theidi investigated to eat the cauliflower .\nc_13\t1\t\tjohn said heidi was obsessed with broccoli .\nc_13\t0\t*\tjohn said if heidi was obsessed with broccoli .\nc_13\t0\t*\tjohn said heidi to eat the broccoli .\nc_13\t1\t\tandy promised that we would go .\nc_13\t1\t\tandy promised we would go .\nc_13\t0\t*\tandy promised if we would go .\nc_13\t0\t*\tandy promised us to go .\nc_13\t1\t\tandy promised to go .\nc_13\t1\t\tif i were a rich man , i 'd buy a diamond ring .\nc_13\t0\t*\tif he is a rich man , he 'd buy a diamond ring 
.\nc_13\t1\t\trory eats .\nc_13\t1\t\trory ate muffins .\nc_13\t1\t\tthe muffins were eaten .\nc_13\t1\t\trory had eaten the muffins .\nc_13\t1\t\trory has eaten the muffins .\nc_13\t1\t\trory must have eaten the muffins .\nc_13\t1\t\trory may be eating the muffins .\nc_13\t1\t\trory will eat the muffins .\nc_13\t1\t\trory eats muffins .\nc_13\t1\t\trory is eating muffins .\nc_13\t1\t\trory might have been eating the muffins .\nc_13\t1\t\tthe muffins might have been being eaten .\nc_13\t1\t\tthe tuna had been being eaten .\nc_13\t1\t\tcalvin will eat .\nc_13\t1\t\tthe tuna will be eaten .\nc_13\t1\t\tcalvin will be eating .\nc_13\t1\t\tcalvin will have eaten .\nc_13\t1\t\tthe tuna will be being eaten .\nc_13\t1\t\tthe tuna will have been eaten .\nc_13\t1\t\tcalvin will have been eating .\nc_13\t1\t\tcalvin was eating .\nc_13\t1\t\tcalvin had eaten .\nc_13\t1\t\tcalvin had been eating .\nc_13\t1\t\tthe tuna must have been eaten .\nc_13\t1\t\tthe tuna will have been being eaten .\nc_13\t1\t\the has not eaten yet today .\nc_13\t1\t\ti have never seen this movie .\nc_13\t1\t\ti never have a pen when i need it .\nc_13\t1\t\ti have always loved peanut butter .\nc_13\t1\t\ti do not love peanut butter .\nc_13\t1\t\tmartha often thinks kim hates phonology .\nc_13\t1\t\tdo you like peanut butter ?\nc_13\t1\t\thave you always hated peanut butter ?\nc_13\t1\t\tare you always thinking dirty thoughts ?\nc_13\t1\t\tbradley left .\nc_13\t1\t\tstacy left tucson .\nc_13\t1\t\tjohn left his wife .\nc_13\t0\t*\ti want bradley that left .\nc_13\t0\t*\tjohn thinks that left .\nc_13\t1\t\tthat john will leave is likely .\nc_13\t1\t\tit is likely that john will leave .\nc_13\t1\t\tthe policeman kissed the puppy .\nc_13\t1\t\tthe puppy was kissed by the policeman .\nc_13\t1\t\tthe puppy was kissed .\nc_13\t1\t\tjohn laughed .\nc_13\t1\t\tthe audience laughed .\nc_13\t0\t*\tbill is likely john to hit .\nc_13\t0\t*\tit was kissed the puppy .\nc_13\t1\t\tjennifer swatted steve 
.\nc_13\t1\t\tsteve swatted jennifer .\nc_13\t1\t\tshe swatted him .\nc_13\t1\t\the swatted her .\nc_13\t1\t\ti walk .\nc_13\t1\t\tyou walk .\nc_13\t1\t\tit is likely that patrick left .\nc_13\t1\t\tthat patrick left is likely .\nc_13\t0\t*\tpatrick is likely that left .\nc_13\t0\t*\tit is likely patrick to leave .\nc_13\t0\t*\tpatrick to leave is likely .\nc_13\t1\t\tpatrick is likely to leave .\nc_13\t1\t\the kissed her .\nc_13\t1\t\tshe was kissed .\nc_13\t0\t*\tshe was kissed him .\nc_13\t0\t*\tit was kissed her .\nc_13\t1\t\tstacy danced at the palace .\nc_13\t1\t\tstacy arrived at the palace .\nc_13\t0\t*\tthere danced three men at the palace .\nc_13\t0\t*\tthere arrived three men at the palace .\nc_13\t0\t*\tit seems sonny to love cher .\nc_13\t0\t*\tbill was bitten the dog .\nc_13\t0\t*\tdonny is likely that left .\nc_13\t1\t\tthe shah slept in a bed .\nc_13\t1\t\tthe bed was slept in by the shah .\nc_13\t1\t\tdust fell on the bed .\nc_13\t0\t*\tthe bed was fallen on by the dust .\nc_13\t1\t\tbill was hit by the baseball .\nc_13\t0\t*\twas been hit by bill by the baseball .\nc_13\t1\t\tbill gave sue the book .\nc_13\t1\t\tsue was given the book by bill .\nc_13\t0\t*\tthe book was been given by bill by sue .\nc_13\t1\t\ti cut the soft bread .\nc_13\t1\t\tthe soft bread cuts easily .\nc_13\t1\t\tthe boat sank .\nc_13\t1\t\tthe torpedo sank the boat .\nc_13\t1\t\tthe captain sank the boat .\nc_13\t1\t\tthe captain sank the boat with a torpedo .\nc_13\t0\t*\twas sunk by the boat .\nc_13\t1\t\tthe boat was sunk by the captain with a torpedo .\nc_13\t1\t\ti sent a book to louis .\nc_13\t1\t\ti sent louis a book .\nc_13\t1\t\ta book was sent to louis .\nc_13\t0\t*\tlouis was sent a book to .\nc_13\t0\t*\tto louis was sent a book .\nc_13\t1\t\tlouis was sent a book .\nc_13\t0\t*\ta book was sent louis .\nc_13\t1\t\tjohn seems to have left .\nc_13\t1\t\tbill wants john to leave .\nc_13\t1\t\tjohn wants bill to leave .\nc_13\t1\t\tjohn wants him to leave 
.\nc_13\t1\t\tjohn believes him to have been at the game .\nc_13\t1\t\the is believed by john to have been at the game .\nc_13\t1\t\the is believed to have been at the game .\nc_13\t1\t\tbecky bought the syntax book .\nc_13\t1\t\twhat did becky buy ?\nc_13\t1\t\twhat did stacy say becky bought ?\nc_13\t1\t\tmatt kissed her .\nc_13\t1\t\twhom did matt kiss ?\nc_13\t1\t\ti wonder who jim kissed .\nc_13\t1\t\tthe fact that i like strawberry flavored milk shakes is none of your business .\nc_13\t1\t\tshe made the outrageous claim that tuna flavored milkshakes are good for you .\nc_13\t1\t\ti asked where you found it .\nc_13\t1\t\ti wo n't reveal the place .\nc_13\t1\t\ti asked who she kissed .\nc_13\t1\t\ti know several people who she kissed .\nc_13\t1\t\ti know several people she kissed .\nc_13\t1\t\ti know several people that she kissed .\nc_13\t1\t\ti know i bought the book you recommended .\nc_13\t1\t\ti know i bought the book that you recommended .\nc_13\t1\t\tthe guy who is wearing the red hat just hit me !\nc_13\t1\t\tthat guy , who i think might be drunk , just hit me !\nc_13\t0\t*\tthe man , who i think might be drunk , that is escaping hit me .\nc_13\t1\t\twhat did bill claim that he read ?\nc_13\t1\t\twhat do you think matt kissed ?\nc_13\t0\t*\twhat did bill make the claim that he read in the syntax book ?\nc_13\t0\t*\twhich cake did you see the man who baked ?\nc_13\t1\t\ti wonder what john bought .\nc_13\t1\t\thow do you think john bought the sweater ?\nc_13\t0\t*\thow do you wonder what john bought ?\nc_13\t1\t\thow do you think john bought what ?\nc_13\t1\t\ti wonder what john bought how .\nc_13\t1\t\ti wonder what john kissed .\nc_13\t0\t*\twho did you wonder what kissed ?\nc_13\t1\t\ti asked what john kissed .\nc_13\t1\t\tthat the police would arrest several rioters was a certainty .\nc_13\t1\t\ti liked mary and john .\nc_13\t0\t*\twho did you like and john ?\nc_13\t1\t\ti ate some popcorn and drank some soda .\nc_13\t0\t*\twhat did you eat some 
popcorn and drink ?\nc_13\t1\t\twho loves who ?\nc_13\t1\t\twho loves whom ?\nc_13\t1\t\tshelly loves who ?\nc_13\t1\t\tfred saw a spaceship in the linguistics lounge ?\nc_13\t1\t\twhat is bothering you ?\nc_13\t1\t\twho has seen my snorkel ?\nc_13\t1\t\thow was the plot discovered by the authorities ?\nc_13\t1\t\twhich animals appear to have lost their collars ?\nc_13\t1\t\twhat did jean think was likely to have been stolen ?\nc_13\t1\t\tcar sales have surprised the stockbrokers .\nc_13\t1\t\tcan you find the light bulb store ?\nc_13\t1\t\tjohn was bitten by an advertising executive .\nc_13\t1\t\tit is likely that tami will leave new york .\nc_13\t1\t\ttami is likely to leave new york .\nc_13\t1\t\tlucy seems to have been mugged .\nc_13\t1\t\twhat did you buy at the supermarket ?\nc_13\t1\t\twhat is it likely for beth to have bought at the supermarket ?\nc_13\t1\t\twhat is likely to have been bought at the supermarket ?\nc_13\t1\t\tthe trail we walked today was built by slave labor .\nc_13\t1\t\tbill is always complaining about the guys who work near him .\nc_13\t1\t\tthe cost of bagels that are imported from iceland surprised the teacher who mike hired last week .\nc_13\t0\t*\tjosh gave clay carefully a book .\nc_13\t1\t\tjosh gave clay a book carefully .\nc_13\t1\t\tbriana showed justin himself .\nc_13\t0\t*\tbriana showed himself justin .\nc_13\t1\t\ti blew up the building .\nc_13\t1\t\ti blew the building up .\nc_13\t0\t*\ti blew up it .\nc_13\t1\t\ti blew it up .\nc_13\t1\t\tsusan sent the package to heidi .\nc_13\t1\t\ti asked mike if he had seen the yeti .\nc_13\t1\t\ti bought some flowers for manuel .\nc_13\t1\t\ti bought manuel some flowers .\nc_13\t1\t\tjean is likely to leave .\nc_13\t1\t\tjean is reluctant to leave .\nc_13\t0\t*\tjean is likely .\nc_13\t1\t\tjean wants brian to leave .\nc_13\t1\t\tjean persuaded brian to leave .\nc_13\t1\t\tthat jean left is likely .\nc_13\t1\t\tit is likely that jean left .\nc_13\t0\t*\tis likely jean to leave 
.\nc_13\t0\t*\tit is reluctant that jean left .\nc_13\t0\t*\tthat jean left is reluctant .\nc_13\t1\t\tjean is likely to dance .\nc_13\t1\t\tthe cat is out of the bag .\nc_13\t1\t\tthe cat thinks that he is out of the bag .\nc_13\t0\t*\tis likely to jean dance .\nc_13\t1\t\tit is likely that jean will dance .\nc_13\t1\t\tjean wants robert .\nc_13\t1\t\tjean wants him .\nc_13\t0\t*\ti want she to dance .\nc_13\t1\t\ti want jean .\nc_13\t1\t\ti want jean to dance .\nc_13\t1\t\tjean wants herself to dance .\nc_13\t1\t\tjean is reluctant .\nc_13\t1\t\tto find a new mate , go to a dating service .\nc_13\t1\t\tjean tried to behave .\nc_13\t1\t\trobert knows that it is essential to be well behaved .\nc_13\t1\t\trobert knows that it is essential .\nc_13\t1\t\trobert knows it is essential that he is well behaved .\nc_13\t1\t\tlouis begged kate to leave .\nc_13\t0\t*\tlouis begged kate that she leave her job .\nc_13\t0\t*\tlouis begged kate to shave himself .\nc_13\t1\t\tlouis begged kate that he be allowed to shave himself .\nc_13\t1\t\tto behave oneself in public is expected .\nc_13\t1\t\trobert knew that it was necessary to behave himself .\nc_13\t1\t\tmike expected greg incorrectly to take out the trash .\nc_13\t1\t\tthe boys do n't all want to leave .\nc_13\t1\t\trobert is eager to do his homework .\nc_13\t1\t\tjean seems to be in a good mood .\nc_13\t1\t\trosemary tried to get a new car .\nc_13\t1\t\tsusan begged bill to let her sing in the concert .\nc_13\t1\t\tsusan begged to be allowed to sing in the concert .\nc_13\t1\t\tchristina is ready to leave .\nc_13\t1\t\tfred was believed to have wanted to try to dance .\nc_13\t1\t\tsusan consented to try to seem to have been kissed .\nc_13\t1\t\talan told me who wanted to seem to be invincible .\nc_13\t1\t\twhat did john want to eat ?\nc_13\t1\t\tthis book is easy to read .\nc_13\t1\t\tjohn is easy to please .\nc_13\t1\t\tto improve myself is a goal for next year .\nc_13\t1\t\tto improve yourself would be a good idea 
.\nc_13\t1\t\tto improve himself , bruce should consider therapy .\nc_13\t1\t\tto improve herself , jane went to a health spa .\nc_13\t1\t\tkathleen really hates her job .\nc_13\t1\t\tmy brother likes collecting jazz records .\nc_13\t1\t\tmartina is deathly afraid of spiders .\nc_13\t1\t\tthat kind of behavior annoys me .\nc_13\t1\t\tthe news pleased the students .\nc_13\t1\t\thorror films disturb milo .\nc_13\t1\t\tthe exhibition really impressed the critics .\nc_13\t1\t\tkathleen hates those pictures of herself .\nc_13\t1\t\tthe children admired photos of each other .\nc_13\t1\t\tsandra hates reading about herself in the tabloids .\nc_13\t1\t\tpictures of himself always disturb milo .\nc_13\t1\t\tto be able to buy myself a ticket to france would be a dream .\nc_13\t1\t\treading about herself in the tabloids always annoys sandra .\nc_13\t1\t\tbrandon has been reading more novels than he has short stories .\nc_13\t1\t\trobin will eat cabbage but she wo n't ice cream .\nc_13\t1\t\tjohn could bake something , but i 'm not sure what .\nc_13\t1\t\tfrank will eat an apple and morgan will too .\nc_13\t1\t\tfrank will eat an apple and morgan will eat an apple too .\nc_13\t1\t\tcalvin will strike himself .\nc_13\t1\t\tcalvin will strike himself and otto will too .\nc_13\t1\t\tcalvin will strike himself and otto will strike himself too .\nc_13\t1\t\tcalvin has dated every girl who jeff has .\nc_13\t1\t\tcalvin has dated every girl who jeff has dated .\nc_13\t1\t\ti know which guys you 've dated , but i do n't know which guys you have n't .\nc_13\t0\t*\twhich language do you want to hire someone who speaks ?\nc_13\t0\t*\tthey want to hire someone who speaks a balkan language , but i do n't know which language .\nc_13\t1\t\tcalvin will fire someone today , but i do n't know who .\nc_13\t1\t\tpeter was talking with someone but i do n't know who .\nc_13\t1\t\tbrandon read every book that megan did .\nc_13\t1\t\tevery book that megan did brandon read too .\nc_13\t1\t\tdarin has 
eaten more squid than john has octopus .\nc_13\t1\t\twhat does calvin like .\nc_13\t1\t\talexandra wants to catch a fish and sylvia does too .\nc_13\t1\t\tcalvin admired himself in the mirror .\nc_13\t0\t*\tchris said that himself was sad .\nc_13\t1\t\tchris wants himself to win .\nc_13\t1\t\twhich pictures of himself did chris see in the gallery ?\nc_13\t1\t\tchris liked which pictures of himself ?\nc_13\t1\t\twhich pictures of himself did chris like ?\nc_13\t0\t*\theidi believes bill 's description of herself .\nc_13\t1\t\theidi thinks that she has won .\nc_13\t1\t\theidi thinks that pictures of herself are beautiful .\nc_13\t1\t\theidi gave a present to herself .\nc_13\t1\t\tthe army 's destruction of the palace was a tragedy .\nc_13\t1\t\tthe army destroyed the palace .\nc_13\t1\t\theidi wants to kiss herself .\nc_13\t0\t*\theidi believes john 's description of herself .\nc_13\t0\t*\theidi dislikes the tv 's depiction of herself .\nc_13\t1\t\theidi said that pictures of herself were embarrassing .\nc_13\t0\t*\theidi said that bill 's pictures of herself were embarrassing .\nc_13\t0\t*\tchris said that himself was angry .\nc_13\t1\t\theidi saw peter 's picture of her .\nc_13\t1\t\theidi saw drawings of her .\nc_13\t1\t\tjohn loves himself .\nc_13\t1\t\tjohn loves pictures of himself .\nc_13\t0\t*\tjohn loves mary 's pictures of himself .\nc_13\t0\t*\tjohn thinks that mary 's depiction of himself is wrong .\nc_13\t1\t\tjohn thinks that most depictions of himself are wrong .\nc_13\t1\t\tjohn seems to like pictures of himself .\nc_13\t1\t\tjohn believes himself to be the best at baseball .\nc_13\t1\t\tjohn wants to congratulate himself .\nc_13\t0\t*\tjohn loves him .\nc_13\t1\t\tjohn loves his puppy .\nc_13\t1\t\tjohn asked if the unflattering description of his work would be published in the paper .\nc_13\t1\t\tjohn asked if his essay would be published in the paper .\nd_98\t1\t\tany owl can hunt mice .\nd_98\t0\t*\tjohn talked to any woman .\nd_98\t0\t*\tany 
woman contributed to the fund .\nd_98\t1\t\tjohn talked to any woman who came up to him .\nd_98\t1\t\tany woman who heard the news contributed to the fund .\nd_98\t1\t\tany man who saw the fly in the food did n't eat dinner .\nd_98\t1\t\tyou may pick any flower .\nd_98\t0\t*\tyou must pick any flower .\nd_98\t1\t\tany pilot could be flying this plane .\nd_98\t0\t*\tany pilot must be flying this plane .\nd_98\t1\t\tany student must work hard .\nd_98\t1\t\tany doctor will tell you that .\nd_98\t1\t\tany soldier should be prepared to die for her country .\nd_98\t1\t\tjohn talked to a woman .\nd_98\t1\t\tjohn did n't talk to a woman .\nd_98\t1\t\tjohn kissed even the ugliest woman .\nd_98\t1\t\tjohn kissed even the ugliest woman who came up to him .\nd_98\t1\t\ta lion is usually majestic .\nd_98\t0\t*\tany lion is usually majestic .\nd_98\t1\t\ta philosopher is sometimes wrong .\nd_98\t1\t\tany philosopher is sometimes wrong .\nd_98\t1\t\tyou must pick a flower .\nd_98\t1\t\ta pilot must be flying this plane .\nd_98\t1\t\ta student must work hard .\nd_98\t1\t\ta soldier should be prepared to die for her country .\nd_98\t1\t\trarely is any lion majestic .\nd_98\t1\t\tseldom is any lion majestic .\nd_98\t1\t\tnever is any lion majestic .\nd_98\t0\t*\tusually , any lion is majestic .\nd_98\t0\t*\toften , any lion is majestic .\nd_98\t0\t*\talways , any lion is majestic .\nd_98\t1\t\tyou may pick absolutely any flower .\nd_98\t1\t\tyou may pick almost any flower .\nd_98\t1\t\talmost any pilot could be flying this plane .\nd_98\t1\t\tabsolutely any pilot could be flying this plane .\nd_98\t1\t\tyou may pick any flower except the rose .\nd_98\t1\t\tany pilot except sue could be flying this plane .\nd_98\t1\t\tjohn talked to absolutely any woman who came up to him .\nd_98\t1\t\tjohn talked to almost any woman who came up to him .\nd_98\t1\t\tjohn talked to any woman who came up to him except sue .\nd_98\t1\t\tjohn put carrots from his garden in the salad .\nd_98\t1\t\tjohn 
put any carrot from his garden in the salad .\nd_98\t1\t\tjohn talked to a woman who came up to him .\nd_98\t1\t\ta woman who heard the news contributed to the fund .\nd_98\t1\t\ta man who saw the fly in the food did n't eat dinner .\nd_98\t1\t\tjohn talked to every woman who came up to him .\nd_98\t1\t\tevery woman who heard the news contributed to the fund .\nd_98\t1\t\tevery man who saw the fly in the food did n't eat dinner .\nd_98\t1\t\tjohn talked to every woman .\nd_98\t1\t\tmary regretted that she did anything to help him .\nd_98\t0\t*\tmary talked to any man or any woman .\nd_98\t1\t\tevery student who is in mary 's class is working on polarity items .\nd_98\t1\t\tit happens to be true of every student in mary 's class that he is working on polarity items .\nd_98\t1\t\tevery student in mary 's class , by virtue of being in her class , is working on polarity items .\nd_98\t1\t\tevery student in mary 's class happened to vote republican .\nd_98\t1\t\tevery woman standing under that tree is mary 's friend .\nd_98\t1\t\tthe president thanked every soldier who had fought in the .\nd_98\t1\t\teverybody who attended last week 's huge rally signed the petition .\nd_98\t1\t\twe did n't keep a list of the names , but the president thanked every soldier who had fought in the gulf war .\nd_98\t0\t*\tevery student in mary 's class , whoever they were , happened to vote republican .\nd_98\t0\t*\tevery woman standing under that tree , whoever she may be , is mary 's friend .\nd_98\t0\t*\tany student in mary 's class happened to vote .\nd_98\t0\t*\tany woman standing under that tree is mary 's friend .\nd_98\t1\t\tthe president thanked any soldier who had fought in the gulf .\nd_98\t1\t\tevery restaurant that advertises in any of these papers happens to have four stars in the handbook .\nd_98\t1\t\teverybody who is in mary 's semantics seminar is writing a paper on polarity items .\nd_98\t1\t\tjohn talked to any woman at the party .\nd_98\t1\t\tjohn talked to any 
politician who is powerful .\nd_98\t0\t*\tjohn talked to any powerful politician .\nd_98\t1\t\tmary confidently answered any objections .\nd_98\t1\t\tafter the dinner , we threw away any leftovers .\nd_98\t0\t*\tjohn bought any picture of queen elizabeth .\nd_98\t1\t\tjohn bought any picture of queen elizabeth that was on sale .\nd_98\t1\t\tevery philosopher is sometimes wrong , but he usually does n't admit it .\nd_98\t0\t*\tany lion is generally majestic .\nd_98\t0\t*\tany lion is rare .\nd_98\t1\t\tany female tiger has orange fur , marked with black stripes .\nd_98\t1\t\tbirds fly .\nd_98\t1\t\tany bird flies .\nd_98\t1\t\tall fugitives are in jail now .\nd_98\t1\t\tall lizards will die .\nd_98\t0\t*\tyesterday john talked to any woman .\nd_98\t1\t\tyesterday john talked to any woman he saw .\nd_98\t1\t\tsnow is white and snow is not white .\nd_98\t1\t\tany man did n't eat dinner .\nd_98\t1\t\tmary talks to any student .\nd_98\t0\t*\tmary talked to any angry student .\nd_98\t1\t\tmary talked to any student who was angry .\nd_98\t0\t*\tmary talked to any actual student .\nd_98\t0\t*\tany pilot on duty today must be flying this plane .\nd_98\t1\t\tany pilot must be out flying planes today .\nd_98\t1\t\tevery student read any book on giraffes he found .\nd_98\t0\t*\tyou must pick any flower in this bed .\nd_98\t1\t\tyou may pick any of the flowers .\nd_98\t0\t*\tyou must pick any of the flowers .\nd_98\t0\t*\tmary picked any of the flowers .\nd_98\t1\t\tyou may pick every flower .\nd_98\t1\t\tyou may pick any flower , but leave a few for mary .\nd_98\t1\t\tyou may pick any five flowers .\nd_98\t1\t\tmary did n't pick any of the flowers .\nd_98\t1\t\tpick any flower .\nd_98\t1\t\tconfiscate any liquor .\nd_98\t1\t\tpick any of these flowers .\nd_98\t0\t*\tconfiscate any of this liquor .\nd_98\t0\t*\tmary did n't see almost every flower .\nd_98\t0\t*\tmary did n't see almost any flower .\nd_98\t1\t\tevery student in mary 's class is working on negative polarity 
.\nd_98\t1\t\tthere were twenty students at the lecture and every student who was there said it was inspiring .\nd_98\t0\t*\tthere were twenty students at the lecture and any student who was there said it was inspiring .\nd_98\t1\t\twe have many graduate students but this year the graduate director met with every student in the graduate program individually to discuss their progress .\nd_98\t0\t*\twe have many graduate students but this year the graduate director met with any student in the graduate program individually to discuss their progress .\nd_98\t1\t\tsusan found every book she had been looking for at borders .\nd_98\t0\t*\tsusan found any book she had been looking for at borders .\nd_98\t1\t\tpaul has interviewed every student who was at the scene of the crime and kate has interviewed them too .\nd_98\t0\t*\tpaul has interviewed any student who was at the scene of the crime and kate has interviewed them too .\nd_98\t0\t*\tprofessor smith would support sue and prof jones bill .\nd_98\t1\t\tthere is every book by chomsky in this library .\nd_98\t0\t*\tthere is any book by chomsky in this library .\nd_98\t1\t\tthere 's everything mary had asked for in this store .\nd_98\t0\t*\tthere 's anything mary had asked for in this store .\nd_98\t1\t\tthere is any book you could imagine in this library .\nd_98\t1\t\tthere 's anything mary could desire in this store .\nd_98\t1\t\tthat evening john laughed with everybody he talked to .\nd_98\t1\t\tthat evening john laughed with anybody he talked to .\nd_98\t1\t\tjohn talked to everybody who came up to him at the party .\nd_98\t1\t\tjohn talked to anybody who came up to him at the party .\nd_98\t1\t\tbill offered mary everything he had cooked for dinner .\nd_98\t0\t*\tbill offered mary anything he had cooked for dinner .\nd_98\t1\t\tthose days bill offered mary everything he cooked .\nd_98\t1\t\tthose days bill offered mary anything he cooked .\nd_98\t1\t\tjohn made a fool of himself in front of everyone who was there 
.\nd_98\t1\t\tjohn made a fool of himself in front of anyone who was there .\nd_98\t1\t\tmary sang for everyone who wanted to hear her .\nd_98\t1\t\tmary sang for anyone who wanted to hear her .\nd_98\t1\t\tjohn slipped in front of everyone who was there .\nd_98\t0\t*\tjohn slipped in front of anyone who was there .\nd_98\t1\t\tat 4 p.m . i saw john lecturing to everyone who was near him .\nd_98\t0\t*\tat 4 p.m . i saw john lecturing to anyone who was near him .\nd_98\t1\t\tjohn knew every language that we encountered on our trip .\nd_98\t1\t\tjohn knew any language that we encountered on our trip .\nd_98\t1\t\tjohn liked everything that was placed before him .\nd_98\t1\t\tjohn liked anything that was placed before him .\nd_98\t1\t\tat the end of his speech , the president thanked any soldier who had fought in the gulf war .\nd_98\t1\t\tbob does not think that there is anyone from greece in his basement .\nd_98\t1\t\tcan anyone pledge $ 1000 ?\nd_98\t1\t\tis it possible for everyone to to pledge $ 1000 ?\nd_98\t1\t\tis there someone who can pledge $ 1000 ?\nd_98\t1\t\tif anybody comes , he rings the doorbell .\nd_98\t1\t\tevery student who wins any trophy displays it in a prominent place .\nd_98\t0\t*\tjohn saw anything .\nd_98\t1\t\tjohn did n't see anything .\nd_98\t0\t*\tsome who read anything passed .\nd_98\t1\t\tevery who read anything passed .\nd_98\t1\t\tno student who read anything passed .\nd_98\t0\t*\tsome answered any question .\nd_98\t0\t*\tevery student answered any question .\nd_98\t1\t\tany cat does n't like mice .\nd_98\t1\t\tevery cat does n't like mice .\nd_98\t1\t\tevery cat does n't like mice , for example felix does n't .\nd_98\t1\t\talmost every cat likes mice , but felix does n't .\nd_98\t0\t*\tevery cat does n't like mice , but felix does n't .\nd_98\t1\t\talmost every cat likes mice , for example felix does n't .\ng_81\t1\t\tthe dodgers beat the red sox and the dodgers were beaten by the giants .\ng_81\t1\t\tthe dodgers beat the red sox and 
the giants beat the dodgers .\ng_81\t1\t\tdifferent teams beat the red sox and were beaten by the giants .\ng_81\t1\t\tjohn gave the books to mary and the records to sue .\ng_81\t1\t\thow many did you buy of those pies at the fair ?\ng_81\t1\t\thow many have you given of these books to these people .\ng_81\t0\t*\tthe man chased fido returned .\ng_81\t1\t\tthe man that chased fido returned .\ng_81\t1\t\tthe man i think chased fido returned .\ng_81\t0\t*\tthe man i think that chased fido returned .\ng_81\t1\t\tthe man who i think chased fido returned .\ng_81\t0\t*\tthe man who i think that chased fido returned .\ng_81\t1\t\twho did you think mary saw ?\ng_81\t1\t\thow slowly would you say he was driving ?\ng_81\t1\t\thow suspicious was mary ?\ng_81\t1\t\twho saw the man ?\ng_81\t1\t\twho do you think that you saw ?\ng_81\t0\t*\twho do you think that saw you ?\ng_81\t1\t\twho do you regret that you saw ?\ng_81\t0\t*\twho do you regret that saw you ?\ng_81\t1\t\twho do you think you saw ?\ng_81\t1\t\twho do you think saw you ?\ng_81\t0\t*\twho do you regret you saw ?\ng_81\t0\t*\twho do you regret saw you ?\ng_81\t0\t*\twho did you believe that came ?\ng_81\t0\t*\twho did you wonder whether came ?\ng_81\t0\t*\twho did you wonder if came ?\ng_81\t0\t*\twho did you arrange for to come ?\ng_81\t0\t*\twhich table did you wonder on kim put the book ?\ng_81\t0\t*\twhich did you buy the table on kim put the book ?\ng_81\t0\t*\twhat do you believe that iron is to be a fact well known to virtually everybody ?\ng_81\t0\t*\twho did you wonder saw kim ?\ng_81\t0\t*\twhich did you buy the table supported the book ?\ng_81\t0\t*\tthe fact , i put it down to that kim came .\ng_81\t0\t*\tthe table , i put kim on which supported the book .\ng_81\t1\t\twho is it that mary likes ?\ng_81\t1\t\the was talkative .\ng_81\t1\t\the was a bully .\ng_81\t1\t\the was talkative and a bully .\ng_81\t0\t*\tthe talkative and a bully man entered .\ng_81\t0\t*\ttalkative and a bully entered 
.\ng_81\t0\t*\tjohn is easy to please and to love mary .\ng_81\t0\t*\tthe man who mary loves and sally hates george computed my tax .\ng_81\t1\t\tjohn is easy to please and to love .\ng_81\t1\t\tthe kennel which mary made and fido sleeps in has been stolen .\ng_81\t1\t\tthe kennel in which mary keeps drugs and fido sleeps has been stolen .\ng_81\t0\t*\tthe kennel in which mary made and fido sleeps has been stolen .\ng_81\t1\t\tjohn saw more horses than bill saw or pete talked to .\ng_81\t1\t\tjohn saw more horses than bill saw cows or pete talked to cats .\ng_81\t0\t*\tjohn saw more horses than bill saw cows or pete talked to .\ng_81\t1\t\ti know a man who bill saw and mary liked .\ng_81\t1\t\ti know a man who saw bill and liked mary .\ng_81\t0\t*\ti know a man who bill saw and liked mary .\ng_81\t1\t\ti wonder who bill saw and mary liked .\ng_81\t0\t*\ti wonder who bill saw and liked mary .\ng_81\t1\t\ti wonder who mary likes and hopes will win .\ng_81\t0\t*\tjohn asked who and where bill had seen .\ng_81\t1\t\twhich book and which pencil did john buy ?\ng_81\t0\t*\twhere and when did bill put the book ?\ng_81\t1\t\ton which table and under which flower pot did john put the keys ?\ng_81\t1\t\tto which city and to which conference did bill go ?\ng_81\t1\t\tto which city and which conference did bill go ?\ng_81\t1\t\twhich city and which conference did bill go to ?\ng_81\t0\t*\twhich city and which conference did bill go to to ?\ng_81\t0\t*\twhich city and to which conference did bill go to ?\ng_81\t0\t*\tto which city and which conference did bill go to ?\ng_81\t0\t*\tjohn , who and whose friends you saw , is a fool .\ng_81\t1\t\tjohn , to who and to whose friends that letter was addressed , is a fool .\ng_81\t1\t\ti wonder when and how often she went that day .\ng_81\t1\t\ti wonder who and whose friends he handed over to the fbi .\ng_81\t1\t\ti have wanted to know exactly what happened to rosa luxemburg for many years .\ng_81\t1\t\ti have wanted to know for many 
years exactly what happened to rosa .\ng_81\t1\t\ti had hoped that it was true that rosa luxemburg had actually defected to iceland for many years .\ng_81\t1\t\ti had hoped that it was true for many years that rosa luxemburg had actually defected to iceland .\ng_81\t1\t\ti have wanted to meet the man who spent so much money planning the assassination of kennedy for many years .\ng_81\t1\t\ti have wanted to meet for many years the man who spent so much money planning the assassination of kennedy .\ng_81\t1\t\tthe woman believed that the man was ill who was here .\ng_81\t1\t\tthe woman believed that the man who was here was ill .\ng_81\t1\t\tthe woman who was here believed that the man was ill .\ng_81\t1\t\ta woman hit a girl who was pregnant .\ng_81\t1\t\tpeople are said to do crazier things at higher speeds there by dorothy than they are by other people .\ng_81\t1\t\tpeople are said to do such crazy things at such high speeds there by dorothy that i am getting skeptical .\ng_81\t1\t\ta woman hit a pregnant girl .\ng_81\t1\t\ta pregnant woman hit a girl .\ng_81\t1\t\ta man just came in and a woman went out who were similar in all kinds of ways .\ng_81\t1\t\ta man just came in and a woman went out who hate each other like poison and always have .\ng_81\t0\t*\ti find it easy to believe - but joan finds it hard to believe - tom to be dishonest .\ng_81\t0\t*\tjohn offered , and harry gave , sally a cadillac .\ng_81\t0\t*\tjohn told , and harry showed , seymour that sally was a virgin .\ng_81\t1\t\tjack may be and tony certainly is a werewolf .\ng_81\t1\t\tharry has claimed but i do not believe that melvin is a communist .\ng_81\t1\t\ti like but tom does n't like to visit new places .\ng_81\t1\t\ti can tell you when , but i ca n't tell you why , he left me .\ng_81\t1\t\ti 've been wondering whether , but would n't positively want to state that .\ng_81\t1\t\tjohn hummed , and mary sang , the same tune .\ng_81\t1\t\tjohn hummed , and mary sang , at equal volumes 
.\ng_81\t1\t\tjohn gave mary , and joan presented to fred , books which looked .\ng_81\t1\t\tthe red sox beat , and the giants were beaten by , different teams .\ng_81\t1\t\tsmith loaned , and his widow later donated , a valuable collection of manuscripts to the library .\nm_02\t1\t\twhich club did you hit the winning putt with ?\nm_02\t1\t\twith which club did you hit the winning putt ?\nm_02\t1\t\tethel was sitting at her desk .\nm_02\t0\t*\tthe ethel was sitting at her desk .\nm_02\t0\t*\taccountant was sitting at her desk .\nm_02\t1\t\tthe accountant was sitting at her desk .\nm_02\t1\t\taccountants audit our finances every year .\nm_02\t0\t*\ti would like an accountants to sort out my tax return .\nm_02\t1\t\tsome accountants were quietly counting in the back office .\nm_02\t1\t\twould more accountants make any difference to my tax bill ?\nm_02\t1\t\tthe truck spread salt .\nm_02\t1\t\tthe truck spread the salt .\nm_02\t1\t\tthe truck spread salts .\nm_02\t1\t\tthis truck spread less salt than that one .\nm_02\t0\t*\tthis truck spread fewer salt than that one .\nm_02\t1\t\tthere are fewer trucks on the motorway this winter .\nm_02\t1\t\tthere are less trucks on the motorway this winter .\nm_02\t0\t*\tthe white rabbit vanished his watch .\nm_02\t1\t\tdogs chase cats .\nm_02\t0\t*\tdogs chase .\nm_02\t1\t\tflora cooks .\nm_02\t1\t\tflora cooks gourmet meals .\nm_02\t1\t\tthe cat shot into the kitchen on sunday morning carrying a dead mouse .\nm_02\t1\t\tthe cat sauntered into the kitchen carrying a dead mouse .\nm_02\t1\t\tmaisie drove her car from morningside to leith on wednesday .\nm_02\t1\t\ton wednesday maisie drove her car from morningside to leith .\nm_02\t1\t\tmaisie drove her car on wednesday from morningside to leith .\nm_02\t1\t\tjeeves sauntered into the room .\nm_02\t0\t*\tinto jeeves sauntered the room .\nm_02\t1\t\tinto the room sauntered jeeves .\nm_02\t1\t\twhich room did jeeves sauntered into ?\nm_02\t1\t\tinto which room did jeeves sauntered 
?\nm_02\t1\t\tbarbara handed the results to alan on tuesday .\nm_02\t1\t\tthe pupils in this maths class gave cakes to margaret every .\nm_02\t1\t\tcakes were given to margaret every friday by the pupils in this maths class .\nm_02\t1\t\tthis parcel is very heavy .\nm_02\t1\t\tthis very heavy parcel was delivered yesterday .\nm_02\t1\t\tvery heavy , this parcel !\nm_02\t1\t\twhat this parcel is is very heavy .\nm_02\t1\t\twe felled the murder with this chainsaw .\nm_02\t1\t\twith this chainsaw we felled the murder .\nm_02\t1\t\tbarbara handed the intriguing results of the latest examination to alan on tuesday .\nm_02\t1\t\tbarbara handed them to alan on tuesday .\nm_02\t1\t\tthis large parcel is very heavy .\nm_02\t1\t\tthis large parcel is very heavy and so is this small packet .\nm_02\t1\t\tvera is knitting in the lounge .\nm_02\t1\t\tvera is knitting there .\nm_02\t1\t\tgrandma is coming to mr chalky 's school tomorrow .\nm_02\t1\t\tgrandma is coming here tomorrow .\nm_02\t1\t\tthe cat was sleeping in the kitchen .\nm_02\t1\t\tthe cat trotted into the kitchen .\nm_02\t1\t\tthe mouse jumped out of the cheese box .\nm_02\t1\t\tthe mouse was out the cheese box .\nm_02\t1\t\tthe cat trotted in the kitchen .\nm_02\t1\t\tthe cat trotted in .\nm_02\t1\t\tthe mouse jumped out .\nm_02\t1\t\tthe terrier attacked the burglar .\nm_02\t1\t\tthe terrier savaged the burglar 's ankles .\nm_02\t1\t\tthe terrier attacked the burglar and the terrier savaged the burglar 's ankles .\nm_02\t1\t\tthe terrier attacked the burglar and savaged the burglar 's ankles .\nm_02\t1\t\tdid the wealthy young man buy that piano for his secret fiancée ?\nm_02\t1\t\twho bought that piano for his secret fiancée ?\nm_02\t1\t\twhat did the wealthy young man buy for his secret fiancée ?\nm_02\t1\t\twho did the wealthy young man buy that piano for ?\nm_02\t1\t\tthe wealthy young man bought his secret fiancée that piano .\nm_02\t1\t\tthat piano was bought for his secret fiancée by the wealthy young man 
.\nm_02\t1\t\ti do n't like the plum brandy , but the port i just love .\nm_02\t1\t\tfrank bought the piano for jane .\nm_02\t1\t\tfrank bought jane the piano .\nm_02\t1\t\tthe piano was bought for jane by frank .\nm_02\t1\t\tthe piano frank bought for jane .\nm_02\t1\t\tdid frank buy the piano for jane ?\nm_02\t1\t\tdid frank buy jane the piano ?\nm_02\t1\t\twas the piano bought for jane by frank ?\nm_02\t1\t\twhat did frank buy for jane ?\nm_02\t1\t\tfrank bought something for jane .\nm_02\t1\t\tdid frank buy something for jane .\nm_02\t1\t\twhat did frank buy for jane .\nm_02\t1\t\tthe children chased the dog .\nm_02\t1\t\tthe cook saved no scraps for the dog .\nm_02\t1\t\tsarah devoured the cakes in the kitchen last night .\nm_02\t1\t\tmr knightley despaired .\nm_02\t1\t\temma slighted miss bates .\nm_02\t1\t\tjane fairfax seemed upset .\nm_02\t1\t\tmr woodhouse sat in an armchair .\nm_02\t1\t\tmr knightley walked into the drawing room .\nm_02\t1\t\tmr elton handed his wife into the carriage .\nm_02\t1\t\temma gave bad advice to harriet .\nm_02\t1\t\tmr knightley suggested that thieves would break into hartfield .\nm_02\t1\t\teleanor blamed willoughby for marianne 's unhappiness .\nm_02\t1\t\teleanor blamed marianne 's unhappiness on willoughby .\nm_02\t1\t\tthe romans built this aqueduct .\nm_02\t1\t\tthe computer will calculate the value of the variable .\nm_02\t1\t\tthese objections killed the proposal .\nm_02\t0\t*\tlecturer was sitting at her desk .\nm_02\t1\t\ttoo much salt damages vehicles .\nm_02\t0\t*\ttoo much vehicles are damaged by salt .\nm_02\t0\t*\ttoo many salt damages vehicles .\nm_02\t1\t\ttoo many vehicles are damaged by salt .\nm_02\t1\t\tfrank churchill gave a piano to jane fairfax .\nm_02\t1\t\ta piano was given to jane fairfax by frank churchill .\nm_02\t1\t\twickham eloped with lydia .\nm_02\t1\t\tmiss bates can chatter on for hours .\nm_02\t1\t\thenry crawford loved fanny but fanny loved edmund .\nm_02\t1\t\tmr bingley became tired of 
jane or mr d'arcy persuaded mr .\nm_02\t1\t\telizabeth regretted that she had met wickham .\nm_02\t1\t\tcatherine feared that the abbey was haunted .\nm_02\t1\t\tthat anne was in conversation with mr elliott dismayed captain .\nm_02\t1\t\tfanny was delighted by the idea that she could subscribe to a library .\nm_02\t1\t\twho thought up the proposal that the committee be abolished ?\nm_02\t1\t\tthe cottage which mrs dashwood accepted was rather small .\nm_02\t1\t\tthe gentleman who saved marianne was willoughby .\nm_02\t1\t\tthe building that we liked is in thornton lacey .\nm_02\t1\t\tit was anne elliott who loved captain wentworth but who rejected his first proposal .\nm_02\t1\t\ta motorist has reported that the road is blocked by snow at bunker hill .\nm_02\t1\t\tthe labrador ate all the food which we left on the kitchen table .\nm_02\t1\t\tshow me the folder in which you stored the documents .\nm_02\t1\t\ti like the book that you gave me .\nm_02\t1\t\ti love the food they cook in the halls of residence .\nm_02\t1\t\ta motorist has reported the road is blocked at bunker hill .\nm_02\t1\t\ti am delighted at the idea they might demolish the appleton tower .\nm_02\t1\t\tthe cottage which mrs dashwood accepted was very small .\nm_02\t1\t\tanne musgrave has just seen mr elliott in bath street .\nm_02\t1\t\tnurse rooke has discovered where anne elliott stayed .\nm_02\t1\t\tnurse rooke suspected that mrs clay planned to run away with .\nm_02\t1\t\tanne astonished her father .\nm_02\t1\t\tthat captain wentworth married anne astonished her father .\nm_02\t1\t\tsir walter elliott imagined the scene .\nm_02\t1\t\tsir walter elliott imagined that he was still handsome .\nm_02\t1\t\tyesterday lydia eloped with wickham .\nm_02\t1\t\tlydia eloped with wickham yesterday .\nm_02\t1\t\twhen lydia went to brighton , she eloped with wickham .\nm_02\t1\t\tlydia eloped with wickham when she went to brighton .\nm_02\t1\t\tbecause of the strike the commuters travelled by army lorry 
.\nm_02\t1\t\tthe commuters travelled by army lorry because of the strike .\nm_02\t1\t\tbecause the bus drivers were on strike , the commuters travelled by army lorry .\nm_02\t1\t\tthe commuters travelled by army lorry because the bus drivers were on strike .\nm_02\t1\t\talthough mr d'arcy disliked mrs bennet he married elizabeth .\nm_02\t1\t\tin spite of his dislike of mrs bennet , mr d'arcy married elizabeth .\nm_02\t1\t\tif emma had left hartfield , mr woodhouse would have been unhappy .\nm_02\t1\t\tdid captain wentworth write a letter to anne elliott ?\nm_02\t1\t\twrite a letter to anne elliott .\nm_02\t0\t*\tbecause did marianne love willoughby , she refused to .\nm_02\t0\t*\tif did emma leave hartfield , mr woodhouse would be unhappy .\nm_02\t0\t*\twhen did fanny return , she found tom bertram very ill .\nm_02\t0\t*\tthe cottage which did mrs dashwood accept was rather small .\nm_02\t0\t*\tcatherine feared that was the abbey haunted .\nm_02\t1\t\tthe girls wondered who mr bennet had received in his library .\nm_02\t1\t\twe were wondering who did you meet at the conference .\nm_02\t1\t\tshe said that in came aunt norris .\nm_02\t1\t\tshe said that into the room came aunt norris .\nm_02\t0\t*\tthe person who in came at that moment was aunt norris .\nm_02\t0\t*\tbecause in came aunt norris , fanny stopped talking .\nm_02\t0\t*\twhen in came aunt norris , fanny stopped talking .\nm_02\t0\t*\tbecause into the room came aunt norris , fanny stopped talking .\nm_02\t0\t*\twhen into the room came aunt norris , fanny stopped talking .\nm_02\t1\t\tnever had sir thomas been so offended .\nm_02\t0\t*\tthe person who never had he been so offended was sir thomas .\nm_02\t0\t*\tbecause never had sir thomas been so offended , even mr yates left .\nm_02\t0\t*\twhen never had sir thomas been so offended , mr yates left .\nm_02\t1\t\tdr jones habitually ate too much rich food , did n't he ?\nm_02\t0\t*\twe realised that dr jones died because he ate too much rich food , did n't 
he ?\nm_02\t0\t*\tthe person who ate too much rich food did n't he was dr .\nm_02\t0\t*\tbecause dr jones ate too much rich food did n't he , he died of apoplexy .\nm_02\t0\t*\twhen dr jones died of apoplexy did n't he , mary crawford went to live with his wife .\nm_02\t1\t\tfanny stopped talking because in came aunt norris .\nm_02\t0\t*\tbecause in came aunt norris fanny stopped talking .\nm_02\t1\t\tmr yates left because never had sir thomas been so offended .\nm_02\t0\t*\tbecause never had sir thomas been so offended , mr yates left .\nm_02\t0\t*\tfanny stopped talking when in came aunt norris .\nm_02\t0\t*\twhen in came aunt norris fanny stopped talking .\nm_02\t0\t*\tfanny continued talking although in came aunt norris .\nm_02\t0\t*\talthough in came aunt norris , fanny continued talking .\nm_02\t1\t\tfanny had just stopped talking when in came aunt norris .\nm_02\t1\t\tfanny regretted talking to mary .\nm_02\t1\t\thenry wanted to marry fanny .\nm_02\t1\t\tmrs bennet having taken the others upstairs , mr bingley proposed to .\nm_02\t1\t\tall mr collins does is praise lady de bourg .\nm_02\t1\t\tlady de bourg tried to persuade elizabeth to renounce mr d'arcy .\nm_02\t1\t\thenry wanted to have married fanny before edmund returned .\nm_02\t1\t\tfanny regretted having talked to mary .\nm_02\t1\t\twhat mr collins is doing is praising lady de bourg .\nm_02\t0\t*\tfanny regretted being talking to mary .\nm_02\t0\t*\tall mr collins has done is have praised lady de bourg .\nm_02\t1\t\tjulia and maria wanted to be allowed to perform a play .\nm_02\t1\t\tedmund wanted fanny to be able to ride a horse .\nm_02\t0\t*\thenry wanted to possibly marry fanny .\nm_02\t1\t\tfanny loved talking to mary .\nm_02\t1\t\tslamming the door , he ran down the steps .\nm_02\t0\t*\the was knowing the country well .\nm_02\t1\t\twhen ripe , these apples will be delicious .\nm_02\t1\t\tthe tigers hunt prey at night .\nm_02\t1\t\tfiona hoped to meet the prime minister .\nm_02\t1\t\tarthur tried 
to bake a cake .\nm_02\t1\t\tfiona persuaded arthur to bake a cake .\nm_02\t1\t\tsusan wanted jane to study german .\nm_02\t1\t\tayala went to the ball and chatted to jonathan stubbs .\nm_02\t0\t*\tayala went to the ball and jonathan stubbs chatted to .\nm_02\t1\t\tayala went to the ball and was chatted to by jonathan stubbs .\nm_02\t1\t\tall the beatles came to merle park .\nm_02\t1\t\tthe beatles all came to merle park .\nm_02\t1\t\tboth jane and elizabeth were at home .\nm_02\t1\t\tjane and elizabeth were both at home .\nm_02\t1\t\tlarry hunted all the foxes .\nm_02\t0\t*\tlarry all hunted the foxes .\nm_02\t0\t*\tlarry hunted the foxes all .\nm_02\t1\t\tgeorge built both the houses .\nm_02\t0\t*\tgeorge both built the houses .\nm_02\t0\t*\tgeorge built the houses both .\nm_02\t1\t\tall the foxes were hunted by larry .\nm_02\t1\t\taugusta blamed herself for what happened .\nm_02\t1\t\tthese documents elizabeth is checking at this very moment .\nm_02\t1\t\tlouise broke the cup .\nm_02\t1\t\talison drove the car .\nm_02\t1\t\tmartha chewed the bread .\nm_02\t1\t\tthe cup was broken by louise .\nm_02\t1\t\tthe car was driven by alison .\nm_02\t1\t\tthe bread was chewed by martha .\nm_02\t1\t\tthese fields were marched over by all the armies of europe .\nm_02\t1\t\thow is someone to chat to a girl if she does not go out ?\nm_02\t1\t\tall the armies of europe marched over these fields .\nm_02\t1\t\tayala sent back the diamond necklace .\nm_02\t1\t\tayala sent the diamond necklace back .\nm_02\t1\t\tayala sent her cousin the diamond necklace .\nm_02\t0\t*\tayala sent back her cousin the diamond necklace .\nm_02\t1\t\ttatiana wrote to onegin .\nm_02\t1\t\tfrank bought a piano for jane .\nm_02\t1\t\tlucy sent a letter to jane .\nm_02\t1\t\tlucy sent jane a letter .\nm_02\t1\t\tthe company sent china its senior mining engineers to help plan the new mines .\nm_02\t0\t*\tthe experts attributed raphael this picture .\nm_02\t0\t*\ti forwarded winifred the letter 
.\nm_02\t0\t*\tthe manager presented the foreman a gold watch .\nm_02\t0\t*\tkick john the ball .\nm_02\t0\t*\tthe critics ascribe shakespeare this play .\nm_02\t1\t\twho did john send a book to ?\nm_02\t1\t\tto whom did john send a book ?\nm_02\t1\t\twhat place did you travel to ?\nm_02\t1\t\tto what place did you travel ?\nm_02\t1\t\twhat place did john send the book ?\nm_02\t0\t*\twho was the book sent by john .\nm_02\t0\t*\twhat place was the book sent by john ?\nm_02\t1\t\tonly to the best students would he give this book .\nm_02\t0\t*\tonly the best students would he give this book .\nm_02\t1\t\tonly to glasgow would he go by train .\nm_02\t0\t*\tonly glasgow would he travel by train .\nm_02\t1\t\tit is to the best students that he gives this book .\nm_02\t0\t*\tit is the best students he gives this book .\nm_02\t1\t\tit is to ireland that he is going .\nm_02\t0\t*\tit is ireland that he is going .\nm_02\t1\t\the told her the whole story .\nm_02\t1\t\tshe told him the whole story .\nm_02\t1\t\tthe other plan she rejected out of hand .\nm_02\t1\t\tthe vase got broken that sheila had brought all the way from .\nm_02\t1\t\tthe plan was rejected out of hand that traffic should be banned .\nm_02\t1\t\tnorman lemming jumped off the cliff and william lemming did so too .\nm_02\t1\t\tnorman lemming jumped off the cliff and so did william lemming .\nm_02\t1\t\tharriet could n't marry mr knightley but emma could .\nm_02\t1\t\twhat harriet did was marry mr martin .\nm_02\t1\t\tmarry mr martin was what harriet did .\nm_02\t1\t\temma insulted miss bates and annoyed mr knightley .\nm_02\t1\t\tharriet swooned .\nm_02\t1\t\tthe book is astonishingly boring .\nm_02\t1\t\tthe ethel we all know and love wishes to ask you some awkward questions .\nm_02\t1\t\tgolfers can be good company .\nm_02\t1\t\tenthusiastic golfers with large handicaps can be good company .\nm_02\t1\t\tthese enthusiastic golfers that i met at the nineteenth hole can be good company .\nm_02\t0\t*\tgolfer who 
is in training has a pretty powerful swing .\nm_02\t1\t\tmemo ate the spaghetti .\nm_02\t1\t\tmemo liked lasagna .\nm_02\t1\t\temma made harriet her friend .\nm_02\t1\t\tthe quiche and i were cooking .\nm_02\t1\t\terika made her mother an omelet and the kitchen a mess .\nm_02\t1\t\tbill went to london on monday .\nm_02\t1\t\tbill went on monday to london .\nm_02\t1\t\tmy brother lives near strasbourg .\nm_02\t1\t\tnear strasbourg my brother lives .\nm_02\t1\t\the planted the garden with roses last november .\nm_02\t1\t\the planted the garden last november with roses .\nm_02\t1\t\tthe baby chewed the biscuit .\nm_02\t1\t\tthe baby is heavy .\nm_02\t1\t\twhat the baby did was chew the biscuit .\nm_02\t1\t\tthe baby was chewing the biscuit .\nm_02\t1\t\tchew the biscuit !\nm_02\t1\t\thartfield house is in surrey .\nm_02\t1\t\tmr knightley rode to kingston .\nm_02\t1\t\teleanor and marianne travelled from shropshire .\nm_02\t1\t\tfrank gave a piano to jane fairfax .\nm_02\t1\t\tjane fairfax received a piano from frank .\nm_02\t1\t\tthe thief smashed the window with a hammer .\nm_02\t1\t\tcaptain wentworth recovered the property for mrs smith .\nm_02\t1\t\tthe window was broken by a hammer .\nm_02\t1\t\twren built st paul 's cathedral .\nm_02\t1\t\tsiobhan burnt a pattern on the piece of wood .\nm_02\t1\t\tthe dog dug a hole in the lawn .\nm_02\t1\t\tthe vase stood on the table in the hall .\nm_02\t1\t\timogen took the vase to her mother 's .\nm_02\t1\t\timogen broke the vase .\nm_02\t1\t\tsue knows the answer .\nm_02\t1\t\tthe answer is known to sue .\nm_02\t1\t\tjim was happily chopping logs .\nm_02\t1\t\tjim was chopping logs when margaret left and was still at it when she got back .\nm_02\t1\t\tjim was enthusiastically chopping logs .\nm_02\t1\t\tcaptain oates died in order to save his comrades .\nm_02\t1\t\tthis arch supports the weight of the tower .\nm_02\t1\t\twhat this arch does is support the weight of the tower .\nm_02\t1\t\tthis arch is supporting the weight 
of the tower .\nm_02\t1\t\tthe computer is playing six simultaneous games of three dimensional chess .\nm_02\t1\t\tthe intense cold killed the climbers .\nm_02\t1\t\tthe climbers were killed by the intense cold .\nm_02\t1\t\tthe climbers were killed with the intense cold .\nm_02\t1\t\tcatriona opened the door with this key .\nm_02\t1\t\tthe visas are with the passports .\nm_02\t1\t\tsally went to the party with andrew .\nm_02\t1\t\talan made the loaf with strong white flour .\nm_02\t1\t\tthe builders made the wall with concrete blocks .\nm_02\t1\t\tthe gardener planted roses in the garden .\nm_02\t1\t\tit was roses that the gardener planted in the garden .\nm_02\t1\t\tit is the garden that the gardener planted with roses .\nm_02\t1\t\troses are certain to be planted in the garden by the gardener .\nm_02\t1\t\tthe garden is certain to be planted with roses by the gardener .\nm_02\t1\t\thelen sent a scarf to jim for margaret .\nm_02\t1\t\twhat happened was they went home .\nm_02\t0\t*\twhat happened was they knew his parents .\nm_02\t0\t*\twe are knowing this theory .\nm_02\t1\t\tthey 're believing everything you say .\nm_02\t1\t\tyou 'll soon be owning all the land round here .\nm_02\t1\t\twhat she did was e-mail all her friends .\nm_02\t0\t*\twhat she did was know this theory .\nm_02\t0\t*\twhat she did was be very cold .\nm_02\t0\t*\twhat she did was own all the land round here .\nm_02\t1\t\tharriet talked to emma for hours .\nm_02\t1\t\tthe dog chased the cat for days .\nm_02\t1\t\tharriet told emma the whole story .\nm_02\t1\t\tthe dog caught the cat .\nm_02\t1\t\tthe beaver built a dam .\nm_02\t1\t\tanne played the tune on the piano .\nm_02\t1\t\tjane was playing the piano .\nm_02\t1\t\tjane played the piano .\nm_02\t1\t\ttess was knocking at the door .\nm_02\t1\t\ttess knocked at the door .\nm_02\t1\t\tfrank churchill was crossing the street .\nm_02\t1\t\tjane is visiting emma .\nm_02\t1\t\tjane visits emma .\nm_02\t1\t\ttess is knocking at the door 
.\nm_02\t1\t\ttess knocks at the door .\nm_02\t1\t\tfrank churchill is crossing the street .\nm_02\t1\t\tfrank churchill crosses the street .\nm_02\t1\t\treal play valencia next sunday .\nm_02\t1\t\ti leave for paris next week .\nm_02\t0\t*\tthe volcano erupts on tuesday .\nm_02\t1\t\tthe minister has arrived .\nm_02\t1\t\ti 've been at work for six hours .\nm_02\t1\t\thave you ever visited doubtful sound ?\nm_02\t1\t\tthere was an attack yesterday .\nm_02\t1\t\temma and harriet were attacked by those bandits .\nm_02\t1\t\tthose bandits attacked emma and harriet yesterday .\nm_02\t1\t\tthe vase was smashed deliberately .\nm_02\t1\t\tthe sheep got infected with scrapie .\nm_02\t1\t\tthe fans were deliberately provoked by a rival group .\nm_02\t1\t\tthe fans got deliberately provoked by a rival group .\nm_02\t1\t\tsix students got shot accidentally .\nm_02\t1\t\tsome gifts get used a dozen or so times a year .\nm_02\t1\t\tca n't you see i 'm reading ?\nm_02\t1\t\tpeople go hunting in the autumn .\nm_02\t1\t\twe spent yesterday cooking .\nm_02\t1\t\tshe buys for harrods .\nm_02\t1\t\ti saw and he chops .\nm_02\t1\t\tthis sweater washes well .\nm_02\t1\t\tthis book reads well .\nm_02\t1\t\tthese cars sold very quickly last week .\nm_02\t1\t\tit will take years for the mersey to clean .\nm_02\t1\t\tthe course is jumping well .\nm_02\t1\t\tone bomb did n't guide and crashed .\nm_02\t1\t\tfiona may be here by 5 o'clock .\nm_02\t1\t\tif fiona is here by 5 o'clock , we can go to the party .\nm_02\t1\t\tit 's high time fiona got a job .\nm_02\t0\t*\tit 's high time fiona gets a job .\nsgww85\t1\t\tpat is either stupid or a liar .\nsgww85\t1\t\tpat is a republican and proud of it .\nsgww85\t1\t\tpat is healthy and of sound mind .\nsgww85\t1\t\tpat is either asleep or at the office .\nsgww85\t1\t\tthat was a rude remark and in very bad taste .\nsgww85\t1\t\tsandy is either a lunatic or under the influence of drugs .\nsgww85\t1\t\ti am hoping to get an invitation and optimistic 
about my chances .\nsgww85\t1\t\ti am neither an authority on this subject nor trying to portray myself as one .\nsgww85\t1\t\tpat was neither recommended for promotion nor under any illusions about what that meant .\nsgww85\t1\t\tpat has become a banker and very conservative .\nsgww85\t1\t\ti consider that a rude remark and in very [ np and pp ] bad taste .\nsgww85\t1\t\tthe scene of the movie was in chicago .\nsgww85\t0\t*\tthe scene of the movie and that i wrote was in chicago .\nsgww85\t1\t\tjohn sang beautifully .\nsgww85\t1\t\tjohn sang a carol .\nsgww85\t0\t*\tjohn sang beautifully and a carol .\nsgww85\t1\t\tkim sang and sandy danced .\nsgww85\t1\t\tkim and sandy met .\nsgww85\t1\t\tkim sang and was accompanied by sandy .\nsgww85\t0\t*\tthe irritating and a bully man was my brother .\nsgww85\t0\t*\tsoon irritating and a bully started shouting again .\nsgww85\t1\t\tkim was a banker .\nsgww85\t1\t\tdana was quite competent .\nsgww85\t1\t\tleslie was in the flood zone .\nsgww85\t1\t\tronnie was talking to lou .\nsgww85\t1\t\tjean was given a prize .\nsgww85\t1\t\tpat has become a republican .\nsgww85\t1\t\tgerry became quite conservative .\nsgww85\t0\t*\tconnie has become of the opinion that we should get out .\nsgww85\t0\t*\ttracy became awarded a prize .\nsgww85\t0\t*\tchris will become talking to colleagues .\nsgww85\t1\t\tpat became a republican and quite conservative .\nsgww85\t0\t*\ttracy has become a republican and of the opinion that we must place nuclear weapons in europe .\nsgww85\t0\t*\tchris became quite conservative and trying to change their minds .\nsgww85\t0\t*\tgerry became a republican and awarded a prize .\nsgww85\t1\t\twe walked slowly and with great care .\nsgww85\t1\t\tthey wanted to leave tomorrow or on tuesday .\nsgww85\t1\t\twe are open saturdays , any national holiday , and on alternate .\nsgww85\t1\t\tkim alienates cats and beats his dog .\nsgww85\t1\t\tkim alienates cats and beat his dog .\nsgww85\t1\t\tkim alienated cats and beats 
his dog .\nsgww85\t1\t\tkim alienated cats and beat his dog .\nsgww85\t0\t*\tkim alienated cats and beaten his dog .\nsgww85\t0\t*\tkim beating his dog and alienates cats .\nsgww85\t0\t*\tkim to beat his dog and alienated cats .\nsgww85\t0\t*\tkim beaten his dog and alienates cats .\nsgww85\t1\t\twhich student 's grades went unreported ?\nsgww85\t1\t\tthey found pictures of themselves .\nsgww85\t0\t*\twho did you say my talking to would bother hilary ?\nsgww85\t1\t\twho did you say my talking to would bother ?\nsgww85\t1\t\twhich article did terry file without reading ?\nsgww85\t1\t\twhich books did robin read and hate ?\nsgww85\t0\t*\twhich books did robin talk to chris and read ?\nsgww85\t0\t*\twhich books did robin read and talk to chris ?\nsgww85\t0\t*\twho did robin visit and ?\nsgww85\t1\t\tthey talked to kim and to each other .\nsgww85\t1\t\the hated himself and his friends .\nsgww85\t1\t\tthey were wary of themselves and of each other .\nsgww85\t1\t\tthey asked which students and which teachers would get along together .\nsgww85\t1\t\twe called up every man whose father and whose mother had played on the team .\nsgww85\t1\t\ti went to the store and bought some whiskey .\nsgww85\t1\t\tshe 's gone and ruined her dress now .\nsgww85\t1\t\ti 've got to try and find that screw .\nsgww85\t1\t\tshe goes and buys some whiskey .\nsgww85\t1\t\ti have gone and bought some whiskey .\nsgww85\t1\t\ti will go and buy some whiskey .\nsgww85\t1\t\ti will try and buy some whiskey .\nsgww85\t0\t*\ti have gone and buys some whiskey .\nsgww85\t0\t*\tto go and buying whiskey is not the solution to your problem .\nsgww85\t0\t*\ti will go and bought some whiskey .\nsgww85\t0\t*\ti tried and buy some whiskey .\nsgww85\t0\t*\ti was trying and buying some whiskey .\nsgww85\t0\t*\twhat did you say i went and get ?\nsgww85\t0\t*\twhat did you say i go and got ?\nsgww85\t1\t\ti went to the store and i bought some whiskey .\nsgww85\t1\t\ti 've got to try and i 've got to find that screw 
.\nsgww85\t1\t\ti both went to the store and bought some whiskey .\nsgww85\t1\t\ti 've got to both try and find that screw .\nsgww85\t1\t\there 's the whiskey which i went to the store and bought .\nsgww85\t1\t\twhich dress has she gone and ruined now ?\nsgww85\t1\t\tthe screw which i 've got to try and find holds the door to the frame .\nsgww85\t1\t\teither we americans or i myself will get ourselves in trouble .\nsgww85\t1\t\teither you or i will incriminate ourselves .\nsgww85\t1\t\tyou and i may incriminate ourselves .\nsgww85\t1\t\twe americans and the british pamper ourselves .\nsgww85\t1\t\tyou british and you americans pamper yourselves .\nsgww85\t1\t\tyou british or you americans will get yourselves in trouble .\nsgww85\t1\t\tyou and kerry have outdone yourselves .\nsgww85\t1\t\tyou or kerry have perjured yourselves .\nsgww85\t1\t\tthe boys and the girls seem happy .\nsgww85\t0\t*\tthe boys and the girls seems happy .\nsgww85\t1\t\teither the boys or the girls are going to be there .\nsgww85\t1\t\tthe students and professor swansong are meeting in the park .\nsgww85\t1\t\teither professor swansong or the graduate students are going to proctor the exam .\nsgww85\t1\t\teither dana or lee is going to lead the parade .\nsgww85\t1\t\tkim and terry are happy .\nsgww85\t0\t*\teither the boys or the girls is going to be there .\nsgww85\t0\t*\tthe students and professor swansong is meeting in the park .\nsgww85\t0\t*\teither professor swansong or the graduate students is going to proctor the exam .\nsgww85\t1\t\teither dana or lee are going to lead the parade .\nsgww85\t1\t\tkim likes sandy , and lee leslie . to try to go to rome .\nsgww85\t1\t\tpat wanted to try to go to berne , and chris to go to rome . 
to rome .\nsgww85\t1\t\tkim went to the store , and then lou .\nsgww85\t1\t\tsome people go by car , but others by bike .\nsgww85\t1\t\tsome people like bagels , but others cream cheese .\nsgww85\t1\t\ton weekdays , terry eats meat and vegetables , but on weekends , only vegetables .\nsgww85\t0\t*\tjohn drinks coffee at 11 , and mary , tea at 10:30 .\nsgww85\t1\t\tjohn gave the books to mary at christmas , and the records to sue for her birthday .\nsgww85\t1\t\tjohn talked to his supervisor about his thesis , and erich to the dean about department politics .\nsgww85\t1\t\ta businessman will drink a martini to relax , and a health nut , a glass of wine , just to remain healthy .\nsgww85\t0\t*\tjohn left at 11 and at 12 , bill .\nsgww85\t1\t\tjohn left his office at 11 and at 12 , the library .\nsgww85\t1\t\ta policeman walked in at 11 , and at 12 , a fireman .\nsgww85\t1\t\ttwo days ago , we went out to dinner , and this afternoon , to the movies .\nsgww85\t1\t\ton this table , they put a lamp , and on that table , a radio .\nsgww85\t0\t*\tjohn did n't see mary and bill sue .\nsgww85\t1\t\tjohn did n't give the books to mary and the papers to sue .\nsgww85\t0\t*\tkim likes sandy , and lee to leslie .\nsgww85\t0\t*\tpat wanted to go to berne , and chris going to rome .\nsgww85\t0\t*\tkim gave a dollar to bobbie and a dime into his pocket .\nsgww85\t0\t*\tkim likes lee , and to ronnie .\nsgww85\t0\t*\tkim likes sandy and lee likes to leslie .\nsgww85\t1\t\tleslie is rather foolish , and lou a complete idiot .\nsgww85\t1\t\tkim seems to be just surviving , and terry in dire need of our help .\nsgww85\t1\t\twe consider leslie rather foolish , and lou a complete idiot .\nsgww85\t1\t\tpat has become crazy , and chris an incredible bore .\nsgww85\t0\t*\tpat has become crazy , and chris in good spirits .\nsgww85\t0\t*\ti gave a book to john 's mother and a magazine to him .\nsgww85\t1\t\tpat remembered the appointment and that it was important to be on time 
.\nsgww85\t1\t\tthat goldstein appointed heydrich and the implications thereof frightened many observers .\nsgww85\t1\t\twe talked about mr. colson and that he had worked at the .\nsgww85\t1\t\tyou can depend on my assistant and that he will be on time .\nsgww85\t1\t\tpat was annoyed by the children 's noise and that their parents did nothing to stop it .\nsgww85\t0\t*\twe talked about that he had worked at the white house .\nsgww85\t0\t*\tyou can depend on that he will be on time .\nsgww85\t0\t*\tpat was annoyed by that their parents did nothing to stop it .\nsgww85\t1\t\twe talked about the issues we had worked on as students and that our perspectives had changed over the years .\nsgww85\t0\t*\twe talked about that our perspectives had changed over the years and the issues we had worked on as students .\nsgww85\t1\t\tthat our perspectives had changed over the years and the issues we had worked on as students were the topics of discussion .\nsks13\t1\t\tthe clever snake disappeared into a hole in the ground .\nsks13\t0\t*\thole into disappeared ground the the in clever a little .\nsks13\t0\t*\tthe snake clever disappeared into a hole in the ground .\nsks13\t0\t*\tthis girl in the red coat will put \ba picture of bill it on your desk before tomorrow .\nsks13\t0\t*\tthis girl in the red coat will put a picture of bill \bon your desk there before tomorrow .\nsks13\t0\t*\tthis \bgirl in the red coat one will put a picture of bill on your desk before tomorrow .\nsks13\t0\t*\tthis girl in the red coat will put a picture of bill on your desk it before tomorrow .\nsks13\t1\t\tthis girl in the red coat will put a picture of bill on your desk before tomorrow .\nsks13\t1\t\tbill will put a picture of this girl in the red coat on your desk before tomorrow .\nsks13\t1\t\tshe will put a picture of bill on your desk before tomorrow .\nsks13\t0\t*\tbill will put a picture of she on your desk before tomorrow .\nsks13\t1\t\tbill will put a picture of her on your desk before 
tomorrow .\nsks13\t0\t*\ther will put a picture of bill on your desk before tomorrow .\nsks13\t0\t*\tshe her will put a picture of bill on your desk before tomorrow .\nsks13\t0\t*\tbill will put a picture of she her on your desk before tomorrow .\nsks13\t1\t\tclean your desk before tomorrow .\nsks13\t1\t\tthis girl will put a picture of bill on your desk before tomorrow .\nsks13\t1\t\tthis boy must not go to school , and his father must not go to school either .\nsks13\t1\t\tthis boy must not go to france , but his father must go to france .\nsks13\t1\t\tthis actress must play in this movie and she will play in this movie .\nsks13\t1\t\tcan mary win the race and will sue win the race too ?\nsks13\t1\t\tthis girl will buy bread and so will that one buy bread .\nsks13\t1\t\tthe tourists will go to the park .\nsks13\t1\t\twill the tourists go to the park ?\nsks13\t1\t\tsome student from australia speaks chinese .\nsks13\t1\t\tdoes some student from australia speak chinese ?\nsks13\t1\t\tthey would have been walking for hours .\nsks13\t1\t\twould they have been walking for hours ?\nsks13\t1\t\tthis girl will not buy bread , will she buy bread ?\nsks13\t1\t\tsean penn can act well in many kinds of movies , ca n't he act well in many kinds of movies ?\nsks13\t1\t\tyou will put a picture of bill on your desk before tomorrow .\nsks13\t1\t\tthis girl in the red coat or you will put a picture of bill on your desk before tomorrow .\nsks13\t1\t\tno boys will put a picture of bill on your desk before tomorrow .\nsks13\t1\t\tthis girl in the red coat but no boys will put a picture of bill on your desk before tomorrow .\nsks13\t1\t\tthis girl in the red coat will put it and a picture of bill on your desk before tomorrow .\nsks13\t1\t\tthis girl in the red coat will put a picture of bill in the mailbox before tomorrow .\nsks13\t1\t\tthis girl in the red coat will put a picture of bill on your desk after the dinner .\nsks13\t1\t\tthis girl in the red coat will put a picture of bill 
on your desk after the dinner and before tomorrow .\nsks13\t1\t\tthis girl in the red coat will eat her breakfast before tomorrow .\nsks13\t1\t\tthis girl in the red coat will eat her breakfast before tomorrow and put a picture of bill on your desk before tomorrow .\nsks13\t1\t\tthis girl in the red coat will eat her breakfast and will put a picture of bill on your desk before tomorrow .\nsks13\t1\t\tthis girl in the red coat will put a picture of bill on your desk .\nsks13\t1\t\tthis girl in the red dress must put a picture of bill on your desk .\nsks13\t0\t*\tthis girl in the red coat will and dress must put a picture of bill on your desk .\nsks13\t0\t*\tthis girl in the or on the red coat will put a picture of bill on your desk .\nsks13\t1\t\tjohn and mary will play with henry and with sue .\nsks13\t1\t\tthey play unusual music , and i listen to unusual music .\nsks13\t1\t\tthey play and i listen to unusual music .\nsks13\t1\t\ti love ice milk tea but you hate ice milk tea .\nsks13\t1\t\ti love but you hate ice milk tea .\nsks13\t1\t\tshe may have thawed the roast and should have thawed the roast .\nsks13\t1\t\tshe may have and should have thawed the roast .\nsks13\t1\t\tsmith loaned a valuable collection of manuscripts to the library , and his widow later donated a valuable collection of manuscripts to the library .\nsks13\t1\t\tsmith loaned and his widow later donated a valuable collection of manuscripts to the library .\nsks13\t1\t\ti borrowed large sums of money from the bank , and my sister stole large sums of money from the bank .\nsks13\t1\t\ti borrowed and my sister stole large sums of money from the bank .\nsks13\t0\t*\tput a picture of bill on your desk , this girl in the red coat will put a picture of bill on your desk before tomorrow .\nsks13\t1\t\tmary should know that you must go to the station .\nsks13\t1\t\tthat you must go to the station , mary should know that you must go to the station .\nsks13\t0\t*\tthis your , this girl in the red coat will 
put a picture of bill on your desk before tomorrow .\nsks13\t0\t*\twill bill , this girl in the red coat will put a picture of bill on your desk before tomorrow .\nsks13\t0\t*\tred picture desk , this girl in the red coat will put a picture of .\nsks13\t0\t*\tbefore your , this girl in the red coat will put a picture of bill on your desk before tomorrow .\nsks13\t0\t*\tgirl in the red coat , this girl in the red coat will put a picture of bill on your desk before tomorrow .\nsks13\t0\t*\twill put a picture of bill on your desk before tomorrow , this girl in the red coat will put a picture of bill on your desk before tomorrow .\nsks13\t0\t*\tthe red , this girl in the red coat will put a picture of bill on your desk before tomorrow .\nsks13\t0\t*\tof bill on , this girl in the red coat will put a picture of bill on your desk before tomorrow .\nsks13\t0\t*\twill put , this girl in the red coat will put a picture of bill on your desk before tomorrow .\nsks13\t0\t*\tyour desk before , this girl in the red coat will put a picture of bill on your desk before tomorrow .\nsks13\t1\t\tit is your notes that john wants to look at after class .\nsks13\t1\t\tit is after class that john wants to look at your notes .\nsks13\t1\t\tit is john who wants to look at your notes after class .\nsks13\t1\t\tit was ann who bought a first edition of richard iii for $ 1000 .\nsks13\t1\t\tit was a first edition of richard iii that ann bought for $ 1000 .\nsks13\t1\t\tit was for $ 1000 that ann bought a first edition of richard iii .\nsks13\t0\t*\tit is before tomorrow that this girl in the red coat will put a picture of bill on your desk before tomorrow .\nsks13\t1\t\tmary saw the tall man coming from england .\nsks13\t0\t*\tit is the tall man coming from england that mary saw the tall man coming from england .\nsks13\t1\t\tmary saw the tall man come from the back .\nsks13\t0\t*\tit is the tall man come from the back that mary saw the tall man come from the back .\nsks13\t1\t\tit is a picture 
of bill that this girl in the red coat will put on your desk before tomorrow .\nsks13\t0\t*\tit is put a picture of bill on your desk before tomorrow that this girl in the red coat will .\nsks13\t1\t\twhat john wants to look at now is your notes .\nsks13\t0\t*\twhat mary gave was a book to john .\nsks13\t0\t*\twhat mary donated was a lot of money to npr .\nsks13\t1\t\tit is to cleveland that john drove the truck .\nsks13\t1\t\twhat john became was deadly afraid of flying .\nsks13\t0\t*\tit is deadly afraid of flying that john became .\nsks13\t1\t\tjohn told us that he wants to quit school .\nsks13\t0\t*\tit is that he wants to quit school that john told us .\nsks13\t1\t\twhat john told us is that he wants to quit school .\nsks13\t1\t\tjohn promised us to be gentle .\nsks13\t0\t*\tit is to be gentle that john promised .\nsks13\t1\t\tmary will arrive tomorrow .\nsks13\t0\t*\tit is arrive tomorrow that mary will .\nsks13\t1\t\thenri wants the book which is on the top shelf .\nsks13\t1\t\twhat henri wants is the book which is on the top shelf .\nsks13\t1\t\tthe spy became too friendly with his new contacts .\nsks13\t1\t\twhat the spy became was too friendly with his new contacts .\nsks13\t1\t\twhat this girl in the red coat will do is put a picture of bill on your desk before tomorrow .\nsks13\t1\t\thenri wants to buy these books about cooking .\nsks13\t1\t\twhich books about cooking does henri want to buy ?\nsks13\t1\t\ti sent it to you .\nsks13\t0\t*\ti sent to you it .\nsks13\t0\t*\ti sent to you recipes .\nsks13\t1\t\tbill 's mother 's friends are waiting at the restaurant .\nsks13\t1\t\tbill 's mother 's friends and john are waiting at the restaurant .\nsks13\t1\t\tit was john that was waiting at the restaurant .\nsks13\t0\t*\tit was john bill that were waiting at the restaurant .\nsks13\t1\t\tit was john and bill that were waiting at the restaurant .\nsks13\t1\t\ti will eat spaghetti on sunday with marco .\nsks13\t1\t\ti will speak to hector about this 
.\nsks13\t1\t\ti doubt that mary reads mysteries .\nsks13\t1\t\the muttered that the visitors will leave .\nsks13\t1\t\tthe fact that john is snoring is informative .\nsks13\t1\t\tthe man that mary saw knew me .\nsks13\t1\t\tthat the visiting team won the race could surprise them .\nsks13\t1\t\tthat is what you should see .\nsks13\t1\t\tjohn knows that she left .\nsks13\t1\t\tjohn knows whether she will come back .\nsks13\t1\t\tjohn knows that she left and whether she will come back .\nsks13\t1\t\tjohn knows that she left and john knows whether she will come back .\nsks13\t1\t\tjohn asked whether she left .\nsks13\t1\t\ti doubt if she kicks perfect goals every time .\nsks13\t1\t\tthey think that she can do it .\nsks13\t1\t\twhether she left is most unclear .\nsks13\t1\t\tthat the girl put a picture there proves her guilt .\nsks13\t1\t\ti prefer for the girl to put a picture there .\nsks13\t1\t\tfor the girl to put a picture there is what i prefer .\nsks13\t1\t\tfor the girl to put a picture there would surprise you .\nsks13\t1\t\ti prefer for the girl to win .\nsks13\t0\t*\ti prefer for the girl to will win .\nsks13\t0\t*\ti prefer for the girl to wins .\nsks13\t1\t\tlet 's walk .\nsks13\t1\t\ti run on the beach .\nsks13\t1\t\tthe three sunbathers went swimming .\nsks13\t1\t\ti hope that mary wins .\nsks13\t1\t\tthey know if mary won .\nsks13\t1\t\ti wonder whether mary will win .\nsks13\t1\t\tthey prefer for mary to leave .\nsks13\t1\t\tjohn wonders whether mary will win .\nsks13\t1\t\tjohn wonders whether to win .\nsks13\t1\t\twhether she will win is a question mary never considered .\nsks13\t1\t\twhether to win is a question mary never considered .\nsks13\t1\t\ti think that you will see that the girl will put a picture on your desk .\nsks13\t1\t\tthey understand that you will prefer for the girl to put a picture on your desk .\nsks13\t1\t\tmary cuts the paper easily .\nsks13\t1\t\tthe paper cuts easily .\nsks13\t1\t\tthat he won the race could surprise them 
.\nsks13\t0\t*\tthat him won the race could surprise them .\nsks13\t1\t\tfor him to win the race would surprise them .\nsks13\t0\t*\tfor he to win the race would surprise them .\nsks13\t1\t\tjohn saw mary .\nsks13\t1\t\tharry likes movies .\nsks13\t1\t\tfor mary to leave on time is important .\nsks13\t0\t*\tthink about linguistics all night , she does think about linguistics all night .\nsks13\t0\t*\tclimb to the top , they do climb to the top .\nsks13\t1\t\tjohn can go to the market on his bike .\nsks13\t1\t\tmary should buy some flowers on sunday .\nsks13\t1\t\tmy niece could write me letters before her third birthday .\nsks13\t1\t\tmy nephew could write letters to his parents with a fountain pen .\nsks13\t1\t\tjohn can go to the market quickly .\nsks13\t1\t\tmary should buy some flowers for her mother to arrange .\nsks13\t1\t\tmy niece could write me letters more faithfully .\nsks13\t1\t\tjohn can quickly go to the market .\nsks13\t1\t\tmy niece could more faithfully write me letters .\nsks13\t0\t*\tjohn can go to the market to india .\nsks13\t0\t*\tmary should buy some flowers some bread .\nsks13\t0\t*\tmy niece could write me you letters .\nsks13\t0\t*\tmy nephew could write letters the postcards to his parents .\nsks13\t1\t\tjohn can go to the market on his bike on a truck .\nsks13\t1\t\tmary should buy some flowers on sunday at 5 o'clock .\nsks13\t1\t\tmy nephew could write letters to his parents with a fountain pen with your help .\nsks13\t1\t\tpelé visited his uncle .\nsks13\t1\t\tshe sold the car to sam for five dollars .\nsks13\t1\t\tshe ran the car on propane from reno to vegas .\nsks13\t1\t\tthe process changed the substance from solid to liquid to gas to energy .\nsks13\t1\t\twe associated their subsidiaries with our corporate office .\nsks13\t1\t\ti cycled around france .\nsks13\t1\t\tmary drank some beer in the barn from 6 to nine .\nsks13\t1\t\tit was in the barn or it took place in the barn .\nsks13\t0\t*\tit was some beer or it took place some 
beer .\nsks13\t1\t\tthey wonder whether mary will run .\nsks13\t1\t\tthey wonder about this .\nsks13\t1\t\tthey wonder .\nsks13\t1\t\ti know that she runs .\nsks13\t1\t\ti know .\nsks13\t1\t\ti said that she runs .\nsks13\t1\t\ti said that .\nsks13\t0\t*\ti said .\nsks13\t1\t\ti prefer for mary to run .\nsks13\t1\t\ti prefer this .\nsks13\t0\t*\ti prefer .\nsks13\t1\t\ti said for mary to run .\nsks13\t1\t\ti said this .\nsks13\t1\t\ti put the book on the shelf .\nsks13\t0\t*\ti put the book .\nsks13\t0\t*\ti put .\nsks13\t1\t\ttwo ships appeared , arrived , remained , emerged .\nsks13\t1\t\tsuddenly , there appeared two ships on the horizon .\nsks13\t1\t\ttwo inspectors from the ins appeared , arrived , remained , emerged .\nsks13\t1\t\tthe ice melts , breaks .\nsks13\t1\t\tthe door opens , closes .\nsks13\t1\t\tthey melted , broke the ice .\nsks13\t1\t\tthey opened , closed the door .\nsks13\t1\t\tthey cooked , thickened the soup .\nsks13\t1\t\ti go , run , swim , jump , fly , crawl , dance , walk .\nsks13\t0\t*\tthey went me , ran me , swam me , jumped me , flew me , crawled me , danced me , walked me .\nsks13\t1\t\ti danced a dance .\nsks13\t1\t\the walked the walk .\nsks13\t1\t\tthe time elapsed slowly .\nsks13\t0\t*\tthe time elapsed the day .\nsks13\t1\t\ti see stars .\nsks13\t1\t\ti see .\nsks13\t1\t\ti liked mary .\nsks13\t0\t*\ti liked .\nsks13\t1\t\tthey surrounded the fort .\nsks13\t0\t*\tthey surrounded .\nsks13\t1\t\ti gave the charity .\nsks13\t1\t\ti gave money .\nsks13\t1\t\ti gave .\nsks13\t1\t\ti handed the ball to reg .\nsks13\t0\t*\ti handed the ball .\nsks13\t0\t*\ti handed to reg .\nsks13\t0\t*\ti handed .\nsks13\t1\t\tjohn ate .\nsks13\t1\t\tjohn knows .\nsks13\t0\t*\tjohn needed .\nsks13\t0\t*\tjohn criticized .\nsks13\t1\t\tjohn saw .\nsks13\t1\t\tjohn told .\nsks13\t1\t\tthe agency classified the documents .\nsks13\t0\t*\tthe agency classified .\nsks13\t1\t\tthe war intensified the poverty .\nsks13\t1\t\tthis project is manageable 
.\nsks13\t1\t\tit mattered on sunday .\nsks13\t1\t\ti saw john on sunday .\nsks13\t1\t\ti put the book on the desk on sunday .\nsks13\t1\t\ti saw john with a telescope .\nsks13\t0\t*\tit mattered with a telescope .\nsks13\t1\t\ti covered the bread with butter .\nsks13\t0\t*\ti emptied it with butter .\nsks13\t1\t\tmary will complete her exam within an hour .\nsks13\t0\t*\tmary will complete her exam for an hour .\nsks13\t1\t\tthe hiker will reach the top of the mountain within an hour .\nsks13\t0\t*\tthe hiker will reach the top of the mountain for an hour .\nsks13\t1\t\thenri will paint the floor for an hour .\nsks13\t1\t\ti will read linguistics for an hour .\nsks13\t1\t\tthe student left .\nsks13\t1\t\tonly the student left .\nsks13\t1\t\teven the student left .\nsks13\t1\t\tall the students left .\nsks13\t1\t\ti saw the student .\nsks13\t1\t\ti saw only the student .\nsks13\t1\t\ti saw all the students .\nsks13\t1\t\tjohn , who i saw yesterday , will visit us .\nsks13\t1\t\ti saw the brilliant student .\nsks13\t1\t\ti saw the brilliant one .\nsks13\t1\t\ti saw the brilliant student with long hair .\nsks13\t1\t\ti saw the brilliant one with long hair .\nsks13\t1\t\ti saw the one with long hair .\nsks13\t1\t\ti saw the physics student .\nsks13\t0\t*\ti saw the physics one .\nsks13\t1\t\ti saw the student of physics .\nsks13\t0\t*\ti saw the one of physics .\nsks13\t1\t\ti saw the student of physics with long hair .\nsks13\t1\t\tthe big student of physics with long hair in the library .\nsks13\t1\t\tit is big .\nsks13\t1\t\tit is with long hair .\nsks13\t0\t*\tit is of physics .\nsks13\t1\t\tit is in the library .\nsks13\t1\t\tthey are intense .\nsks13\t0\t*\tthey are intense of bill .\nsks13\t1\t\tthey intensified .\nsks13\t1\t\tthey are special .\nsks13\t0\t*\tthey are special of bill .\nsks13\t1\t\tthey specialized .\nsks13\t1\t\tshe is proud .\nsks13\t1\t\tshe is the mother .\nsks13\t1\t\tshe is the mother of john .\nsks13\t1\t\tthey read the paper 
.\nsks13\t1\t\tthe paper is readable .\nsks13\t0\t*\tit is readable of the paper .\nsks13\t0\t*\tthey are readable of the paper .\nsks13\t1\t\tthe driver of the car thinks that mary should leave dallas for boise tomorrow .\nsks13\t1\t\ther little sister will disagree with her .\nsks13\t1\t\tthe girl he met at the departmental party will very surely call him .\nsks13\t1\t\tbeavers build dams .\nsks13\t1\t\tjohn will see you .\nsks13\t1\t\tjohn thinks that mary left .\nsks13\t1\t\tjohn thinks mary left .\nsks13\t1\t\tjohn whispered that mary left .\nsks13\t1\t\tjohn will carefully study russian .\nsks13\t1\t\tjohn carefully studies russian .\nsks13\t0\t*\tjohn studies carefully russian .\nsks13\t1\t\ti wonder if she will use paints .\nsks13\t1\t\tyes , she will .\nsks13\t0\t*\tyes , she .\nsks13\t0\t*\tyes , she will use .\nsks13\t1\t\ti wonder if she used paints .\nsks13\t1\t\tyes , she did .\nsks13\t0\t*\tyes , she used .\nsks13\t1\t\tjohn will have been eating cake .\nsks13\t0\t*\tmary wo n't have been eating cake , but john .\nsks13\t1\t\tmary wo n't have been eating cake , but john will .\nsks13\t1\t\tmary wo n't have been eating cake , but john will have .\nsks13\t1\t\tmary wo n't have been eating cake , but john will have been .\nsks13\t1\t\tjohn will enthusiastically have been eating cake .\nsks13\t1\t\tjohn will have enthusiastically been eating cake .\nsks13\t0\t*\tjohn will have been eating enthusiastically cake .\nsks13\t1\t\tjohn will have been eating cake enthusiastically .\nsks13\t0\t*\tjohn studied carefully russian .\nsks13\t1\t\tjohn has carefully studied russian .\nsks13\t1\t\tjohn had carefully studied russian .\nsks13\t1\t\tjohn is carefully studying russian .\nsks13\t1\t\tjohn was carefully studying russian .\nsks13\t1\t\tjohn goes to school .\nsks13\t0\t*\tgoes john to school ?\nsks13\t1\t\tmary thinks that bill will come .\nsks13\t0\t*\tmary thinks whether bill will come .\nsks13\t0\t*\tmary thinks for bill to come .\nsks13\t1\t\tmary wonders 
whether bill will come .\nsks13\t0\t*\tmary wonders for bill to come .\nsks13\t0\t*\tmary prefers that bill will come .\nsks13\t0\t*\tmary prefers whether bill will come .\nsks13\t1\t\tmary prefers for bill to come .\nsks13\t1\t\ti wonder has mary worked for microsoft .\nsks13\t1\t\ti wonder whether mary has worked for microsoft .\nsks13\t0\t*\ti wonder whether has mary worked for microsoft .\nsks13\t0\t*\ti wonder has whether mary worked for microsoft .\nsks13\t1\t\twill john not go to school ?\nsks13\t1\t\thas henri not studied for his exam ?\nsks13\t1\t\tdid sue not pass her exam ?\nsks13\t1\t\two n't john go to school ?\nsks13\t1\t\tshould n't mary taste the soup ?\nsks13\t1\t\thas n't henri studied for his exam ?\nsks13\t1\t\tis n't bill sick ?\nsks13\t1\t\tdid n't sue pass her exam ?\nsks13\t0\t*\twill not john go to school ?\nsks13\t0\t*\tshould not mary taste the soup ?\nsks13\t0\t*\thas not henri studied for his exam ?\nsks13\t0\t*\tis not bill sick ?\nsks13\t0\t*\tdid not sue pass her exam ?\nsks13\t0\t*\tsue put .\nsks13\t0\t*\thenri arrived bill .\nsks13\t0\t*\tmary wonders that john said if bill left .\nsks13\t0\t*\thenri told sue in the drawer that bill put socks .\nsks13\t1\t\tshe will win the race .\nsks13\t0\t*\ther will the race .\nsks13\t1\t\telmer finished the cake and john did too , finish the cake .\nsks13\t1\t\twe need to provide two trees and .\nsks13\t1\t\twe also need to explain the relation between these trees .\nsks13\t0\t*\tjohn not liked mary .\nsks13\t0\t*\tjohn liked not mary .\nsks13\t1\t\tjohn did not like mary .\nsks13\t1\t\tjohn will endorse the treaty , but georges will not endorse the treaty .\nsks13\t1\t\twill george indeed not endorse the treaty ?\nsks13\t0\t*\the will indeed not endorse the treaty .\nsks13\t1\t\the will indeed endorse the treaty .\nsks13\t1\t\the will not endorse the treaty ; and indeed .\nsks13\t1\t\tjohn thinks that bill left .\nsks13\t1\t\tjohn asked whether bill left .\nsks13\t1\t\tjohn was wondering 
whether to leave or not .\nsks13\t1\t\tjohn was wondering whether to leave .\nsks13\t0\t*\ti read these big three books .\nsks13\t0\t*\tmary sent .\nsks13\t1\t\tmary sent a book to bill .\nsks13\t1\t\tmary send a book .\nsks13\t1\t\tmary sent bill a book , … .\nsks13\t0\t*\tbill examined a book .\nsks13\t0\t*\tsincerity examined a book .\nsks13\t0\t*\twe put .\nsks13\t1\t\twe put a book on the table .\nsks13\t1\t\twe think that bill left .\nsks13\t0\t*\twe think for bill left .\nsks13\t0\t*\twe think if bill left .\nsks13\t1\t\twe wonder whether bill left .\nsks13\t1\t\twe wonder if bill left .\nsks13\t0\t*\twe wonder that bill left .\nsks13\t1\t\tjohn came in .\nsks13\t1\t\tthen , john left .\nsks13\t1\t\the took his umbrella .\nsks13\t1\t\the hurt himself with it when he tried to open it .\nsks13\t1\t\tthe idiot ca n't even open an umbrella !\nsks13\t0\t*\tjohn hurt john with john 's umbrella when john tried to open it .\nsks13\t1\t\tjohn ca n't even open an umbrella !\nsks13\t1\t\tjohn said he was sick .\nsks13\t1\t\tthe ta who graded him says that john did really well .\nsks13\t0\t*\thimself should decide soon .\nsks13\t0\t*\tmary wrote a letter to himself last year .\nsks13\t1\t\the should decide soon .\nsks13\t1\t\tmary wrote a letter to him last year .\nsks13\t1\t\tour rabbit and the neighbor 's cat like each other .\nsks13\t1\t\tthe boys fought with each other .\nsks13\t1\t\teach of our rabbit and the neighbor 's cat likes the other .\nsks13\t1\t\teach of the boys fought with the other boys .\nsks13\t1\t\tthe boy likes himself .\nsks13\t0\t*\tthe boy likes herself .\nsks13\t0\t*\tthe boy likes themselves .\nsks13\t1\t\tthe girls likes themselves .\nsks13\t0\t*\tthe girls likes herself .\nsks13\t1\t\teach of the girls likes herself .\nsks13\t0\t*\tthe girls likes yourselves .\nsks13\t0\t*\thimself likes john .\nsks13\t1\t\tmary 's pictures of herself surprised bill .\nsks13\t1\t\ti noticed john 's excessive appreciation of himself .\nsks13\t1\t\tmary noticed 
john 's excessive appreciation of himself .\nsks13\t0\t*\tmary noticed john 's excessive appreciation of herself .\nsks13\t0\t*\tmary noticed that john excessively appreciates herself .\nsks13\t1\t\tjohn loved the new pictures of himself .\nsks13\t1\t\ti showed mary several portraits of herself .\nsks13\t0\t*\tjohn believes that mary saw himself .\nsks13\t1\t\tmary noticed that john excessively appreciates himself .\nsks13\t1\t\tmary appreciates only john and herself .\nsks13\t0\t*\tmary appreciates john and himself .\nsks13\t1\t\tmary really appreciates and constantly praises herself and sue knows it .\nsks13\t0\t*\tmary really appreciates and constantly praises himself and bill knows it .\nsks13\t1\t\tjohn heard their criticism of each other .\nsks13\t1\t\tjohn heard their criticism of themselves .\nsks13\t0\t*\tthey heard john 's criticism of each other .\nsks13\t0\t*\tthey heard john 's criticism of themselves .\nsks13\t0\t*\tjohn heard that they criticized each other .\nsks13\t0\t*\tthey heard that john criticized each other .\nsks13\t1\t\tjohn likes himself .\nsks13\t1\t\tthe students are proud of themselves .\nsks13\t1\t\teveryone likes himself .\nsks13\t1\t\tno spy betrayed himself .\nsks13\t1\t\ti heard john 's criticism of himself .\nsks13\t0\t*\ti heard john 's criticism of myself .\nsks13\t1\t\tjohn heard that i criticized myself .\nsks13\t0\t*\ti heard that john criticized myself .\nsks13\t1\t\tmary likes herself .\nsks13\t0\t*\tour rabbit and the neighbor 's cat like them .\nsks13\t0\t*\tbill likes herself .\nsks13\t0\t*\thimself laughs .\nsks13\t1\t\tthe girls likes them .\nsks13\t1\t\tjohn 's mother likes him .\nsks13\t1\t\tjohn believes that bill saw himself .\nsks13\t1\t\tjohn believes that bill saw him .\nsks13\t0\t*\tmary believes that bill saw herself .\nsks13\t1\t\tthey like their books .\nsks13\t1\t\teveryone thinks he is smart .\nsks13\t1\t\twho in this class thinks he is smart ?\nsks13\t1\t\tbill 's mother saw him .\nsks13\t0\t*\tno one 's 
mother saw himself .\nsks13\t1\t\tthe mayor of john 's hometown wrote to him .\nsks13\t1\t\tthe builder of his house visited peter .\nsks13\t1\t\tthat is a bird .\nsks13\t1\t\tthat 's the truth .\nsks13\t1\t\the is john .\nsks13\t1\t\tbob dylan is robert zimmerman .\nsks13\t1\t\ti like mary and she likes me .\nsks13\t0\t*\ti like mary and she does too .\nsks13\t0\t*\ti like mary and she does like mary too .\nsks13\t1\t\tshe considers john proud of his work .\nsks13\t1\t\tthey saw bill leave .\nsks13\t1\t\tmary prefers that her ice cream is in a cone .\nsks13\t1\t\thenry saw that bill left .\nsks13\t1\t\twhat mary prefers is her ice cream in a cone .\nsks13\t0\t*\twhat she considers is john proud of his work .\nsks13\t0\t*\twhat henry found is bill sad .\nsks13\t0\t*\twhat they saw is bill leave .\nsks13\t0\t*\twhat henry find was bill sad .\nsks13\t0\t*\tjohn heard mary describe himself .\nsks13\t1\t\tjohn heard mary describe herself .\nsks13\t0\t*\tmary considers john proud of herself .\nsks13\t1\t\tmary considers john proud of her .\nsks13\t1\t\tmary considers john proud of himself .\nsks13\t1\t\tjohn believes himself to be proud of mary .\nsks13\t1\t\tthe pictures of bill she put on your desk .\nsks13\t1\t\twhich pictures of bill did she put on your desk .\nsks13\t1\t\tsusan wanted to sleep .\nsks13\t1\t\tshe put the pictures of bill on your desk .\nsks13\t1\t\tthe pictures of bill , she put on your desk .\nsks13\t0\t*\tthe picture of bill she slept .\nsks13\t0\t*\tshe slept the picture of bill .\nsks13\t1\t\tyou put which picture of bill on his desk ?\nsks13\t1\t\twhich picture of bill did you put on his desk ?\nsks13\t1\t\thow many strings did you say she had to pull in order to do that ?\nsks13\t1\t\thow much care do you think he would be taking of his patients under those circumstances ?\nsks13\t1\t\thow much headway is he likely to make .\nsks13\t1\t\twho left bill .\nsks13\t0\t*\twhom left bill .\nsks13\t1\t\twho did bill leave .\nsks13\t1\t\twhom did bill 
leave .\nsks13\t1\t\tis there anything to do today ?\nsks13\t1\t\tthere are two main characters in the novel .\nsks13\t1\t\tthere are 3 firemen available .\nsks13\t0\t*\tthere stabbed an animal .\nsks13\t0\t*\tthere ran many people .\nsks13\t0\t*\tmary judged there .\nsks13\t0\t*\ti had a realization of there .\nsks13\t1\t\tthere were seven people .\nsks13\t1\t\tthere were several doctors available .\nsks13\t1\t\trodney was eating some squid , was n't he ?\nsks13\t1\t\tthere is a man ready to jump from the roof , is n't there ?\nsks13\t1\t\tsharks seem to swim slowly in the tropics .\nsks13\t1\t\tthe cat seems to be out of the bag .\nsks13\t1\t\tthe shit seems to have hit the fan .\nsks13\t0\t*\tthere run many people .\nsks13\t1\t\tthere seems to be a nurse available .\nsks13\t0\t*\tthere seems to stab an animal .\nsks13\t0\t*\tthere seems to run many people to the station .\nsks13\t1\t\tit seems that john left .\nsks13\t1\t\tseveral people seem sick .\nsks13\t1\t\tjohn considers several people sick .\nsks13\t1\t\tthere are several people sick .\nsks13\t1\t\tseveral people seem several people sick .\nsks13\t1\t\tseveral people are sick .\nsks13\t1\t\tbill is sick .\nsks13\t1\t\tsusan hopes to sleep .\nsks13\t1\t\tsusan hopes that she will sleep .\nsks13\t0\t*\tsusan hopes susan to sleep .\nsks13\t0\t*\teveryone hopes him to sleep .\nsks13\t1\t\teveryone hopes to sleep .\nsks13\t1\t\teveryone hopes that everyone will sleep .\nsks13\t0\t*\tsusan hopes her to sleep .\nsks13\t1\t\tonly churchill remembered giving the blood , sweat and tears speech .\nsks13\t1\t\tonly churchill remembered his giving the blood , sweat and tears speech .\nsks13\t1\t\tonly churchill remembered himself giving the blood , sweat and .\nsks13\t1\t\tsusan hopes herself to sleep .\nsks13\t1\t\tfor john to hurt his friends is stupid .\nsks13\t1\t\tto hurt his friends is stupid .\nsks13\t1\t\tfor john to hurt himself is stupid .\nsks13\t1\t\tto hurt oneself is stupid .\nsks13\t0\t*\tfor john to 
hurt oneself is stupid .\nsks13\t1\t\tjohn promised bill to leave .\nsks13\t1\t\tjohn promised mary that he would leave .\nsks13\t1\t\tjohn promised mary to cut the grass .\nsks13\t1\t\tjohn promise mary to control himself .\nsks13\t0\t*\tjohn promised mary to control herself .\nsks13\t0\t*\tjohn promised mary to shave herself .\nsks13\t1\t\tjohn seems to sleep all day .\nsks13\t1\t\tjohn hopes to sleep .\nsks13\t1\t\tjohn tried to sleep .\nsks13\t0\t*\tjohn believes to have slept .\nsks13\t1\t\tjohn believes bill to have slept .\nsks13\t0\t*\tjohn believes for bill to have slept .\nsks13\t1\t\tjohn believes that bill has slept .\nsks13\t0\t*\tjohn believes bill that mary has slept .\nsks13\t0\t*\tjohn convinced to sleep .\nsks13\t1\t\tjohn convinced bill to sleep .\nsks13\t0\t*\tjohn convinced bill for mary to sleep .\nsks13\t0\t*\tjohn convinced that bill has slept .\nsks13\t0\t*\tit convinced bill that mary should sleep .\nsks13\t1\t\tjohn believes it to be obvious that bill left .\nsks13\t1\t\tjohn believes it to be raining .\nsks13\t0\t*\tjohn convinced it to be obvious that bill left .\nsks13\t0\t*\tjohn convinced it to be raining .\nsks13\t0\t*\tjohn convinced there to be several firemen available .\nsks13\t1\t\tbill cooked the rice .\nsks13\t1\t\tthe rice was cooked by bill .\nsks13\t1\t\tbill visited mary .\nsks13\t1\t\tmary was visited by bill .\nsks13\t1\t\tjohn believes bill to have cooked the rice .\nsks13\t1\t\tjohn believes the rice to have been cooked by bill .\nsks13\t1\t\tjohn believes bill to have visited mary .\nsks13\t1\t\tjohn believes mary to have been visited by bill .\nsks13\t1\t\tjohn convinced bill to cook the rice .\nsks13\t0\t*\tjohn convinced the rice to be cooked by bill .\nsks13\t1\t\tjohn convinced bill to visit mary .\nsks13\t1\t\tjohn believes that bill slept .\nsks13\t1\t\ti sent money .\nsks13\t1\t\ti sent mary money .\nsks13\t1\t\ti sent money to mary .\nsks13\t0\t*\ti sent bill money to mary to sam .\nsks13\t1\t\ti worked on 
sunday in the city on that project without a break .\nsks13\t1\t\ti praised mary .\nsks13\t0\t*\ti praised .\nsks13\t1\t\tthe moon glows in the darkness .\nsks13\t1\t\tthe moon glows .\nsks13\t1\t\ti sang a song with mary while you did so with bill .\nsks13\t1\t\twhat mary did with bill was sing a song .\nad03\t1\t\tshe tried to leave\nad03\t1\t\twho said he would give the cloak to lee ?\nad03\t0\t*\tgilgamesh does n't be in the dungeon\nad03\t1\t\twhich book about herself did jenny say that anson had written .\nad03\t1\t\tpaul had eighty eight billion sixty three million forty-four thousand nine hundred at\nad03\t0\t*\twhat i said that was we would go .\nad03\t1\t\tthe boy thought she was happy .\nad03\t1\t\tthe landlord donated a helicopter\nad03\t1\t\tmost dragons have been neutered .\nad03\t1\t\twho did you meet all when you were in derry ?\nad03\t1\t\tjason persuaded medea to desert her family .\nad03\t1\t\tmichael abandoned an old friend at mardi gras\nad03\t1\t\tyou friends of the king are all the same\nad03\t1\t\the is that kind of actor\nad03\t0\t*\tlucy 's gomez 's wallet\nad03\t1\t\tmedea tended to appear to be evil .\nad03\t1\t\the 's bound to could do it\nad03\t1\t\tnathan received the cloak from benjamin\nad03\t1\t\tthat the world is round is obvious .\nad03\t1\t\tposeidon wept , after the executioner left .\nad03\t1\t\ti asked who did medea poison .\nad03\t1\t\ti never liked his analysis .\nad03\t0\t*\tpeter is some happy pigs which can fly .\nad03\t0\t*\tgilgamesh not left .\nad03\t0\t*\tthere arrived by medea .\nad03\t1\t\ti might have eaten some seaweed .\nad03\t1\t\tthere appears to be a problem with this solution .\nad03\t1\t\twhat julie became was fond of lloyd .\nad03\t1\t\tbill did not defeat the gods but gilgamesh did .\nad03\t1\t\taphrodite frees animals\nad03\t0\t*\tthe hospital was donated the book to .\nad03\t1\t\tmedea , jason poisoned .\nad03\t0\t*\tthey kicked himself\nad03\t1\t\temily showed benjamin himself in the mirror 
.\nad03\t1\t\tjason was killed by medea .\nad03\t1\t\thow did you eat the cake ?\nad03\t1\t\ti asked who medea poisoned .\nad03\t1\t\taphrodite wanted to live and ishtar tried to\nad03\t1\t\ti was sitting not under the tree but under the bush\nad03\t1\t\tthe child wails\nad03\t1\t\tgilgamesh has n't left\nad03\t0\t*\twhiskey do i drink .\nad03\t1\t\tdracula thought that he was the prince of darkness .\nad03\t1\t\the looked up the number .\nad03\t1\t\tshe has kissed her .\nad03\t1\t\tagamemnon stopped jason casting the spell\nad03\t1\t\thumans love to eat some disgruntled old pigs in those ditches .\nad03\t1\t\tjason whispered that the phoenix had escaped\nad03\t1\t\tron definitely has bought a dog .\nad03\t0\t*\the book\nad03\t1\t\twho is it obvious that plato loves .\nad03\t0\t*\twhich god the statue ?\nad03\t0\t*\tkiss pigs is my happiest memory\nad03\t0\t*\tdante accused\nad03\t1\t\tthat picture of jenny in a rubber dress does n't flatter her .\nad03\t1\t\the might could go\nad03\t1\t\tbenjamin gave lee the cloak and nathan the chalice .\nad03\t1\t\tthat monkey is ate the banana\nad03\t1\t\ti bought a book about harry\nad03\t0\t*\tthe children wails\nad03\t1\t\twho was it obvious that plato loved ?\nad03\t1\t\tit was for jenny that i intended to be present .\nad03\t1\t\ti think she is pregnant\nad03\t1\t\tit 's extremely windy today .\nad03\t0\t*\twho did you believe that to kiss seemed wrong ?\nad03\t1\t\tjason would prefer medea to have cursed agamemnon .\nad03\t0\t*\tthe therapist 's analysis of lucy 's\nad03\t1\t\twho did athena introduce to whom ?\nad03\t1\t\tit appears that poseidon owns a dragon\nad03\t1\t\ti have often eaten muffins .\nad03\t1\t\tgilgamesh can seek ishtar\nad03\t1\t\tyou kicked yourself\nad03\t1\t\tagamemnon seems to have left .\nad03\t1\t\tthe dragons had all eaten the pigs .\nad03\t1\t\tanson shot the dinosaur with his rifle in the jungle\nad03\t0\t*\tgenie intoned the mirror .\nad03\t1\t\ti often have eaten muffins .\nad03\t1\t\the 
kicked himself\nad03\t1\t\tgilgamesh has not read the cuneiform tablets .\nad03\t1\t\the might maybe do that , might n't he ?\nad03\t1\t\ti intended for jenny to be present .\nad03\t0\t*\twe believed to be omnipotent .\nad03\t1\t\twhose poem about achilles did homer persuade jason that he should read ?\nad03\t1\t\tjason would prefer for medea to have cursed agamemnon .\nad03\t1\t\ti asked who medea gave what ?\nad03\t1\t\ti have every hope that you will defeat him .\nad03\t1\t\tparis is no more\nad03\t1\t\the will can do it\nad03\t1\t\twe believed him to be the headmaster\nad03\t1\t\twho kissed who ?\nad03\t1\t\twho did you say that john thought would leave early ?\nad03\t0\t*\tany boy saw no one .\nad03\t0\t*\twhat i arranged for jenny was to be present .\nad03\t0\t*\the kicked herself\nad03\t1\t\tcassandra has warned agamemnon again .\nad03\t1\t\tgilgamesh has been fighting the dragon .\nad03\t1\t\tlucy 's photograph of jane\nad03\t1\t\twho did jason think medea had poisoned ?\nad03\t1\t\tgilgamesh may have quickly cast the spell\nad03\t0\t*\thaving read of shakespeare satisfied me\nad03\t0\t*\tmedea tried her to leave .\nad03\t1\t\tthe potion boiled over\nad03\t1\t\tthere arrived a new actor .\nad03\t1\t\ti ate fruit\nad03\t1\t\ti hoped that you would defeat him .\nad03\t1\t\twho seems to be certain to leave first ?\nad03\t0\t*\tshe liked moya 's football .\nad03\t1\t\this hen loves anson .\nad03\t1\t\ti ate a mango and gillian did too .\nad03\t1\t\twhy did you eat the cake ?\nad03\t0\t*\the would can go\nad03\t1\t\tperhaps gilgamesh should be leaving\nad03\t1\t\ti want to can do it\nad03\t1\t\tjason intended for him to learn magic .\nad03\t1\t\ti went to the shop for to get bread .\nad03\t1\t\ti asked which king invaded which city .\nad03\t1\t\twe made the claim that perseus killed the gorgon .\nad03\t1\t\tplato listened to dp demosthenes ' oration about philip .\nad03\t1\t\tthe old house collapsed .\nad03\t0\t*\ti believed she is pregnant\nad03\t1\t\thow are 
you feeling ?\nad03\t1\t\taphrodite misses gilgamesh .\nad03\t1\t\tanson very happily demonized david .\nad03\t1\t\tthat plato loved aster proved to be his undoing .\nad03\t1\t\thas n't the potion worked ?\nad03\t1\t\tbill 's reading shakespeare satisfied me\nad03\t1\t\tevery vampire slept .\nad03\t1\t\ti might be leaving soon .\nad03\t0\t*\tit 's arrived first that julie and jenny\nad03\t1\t\tthe man i saw left .\nad03\t0\t*\the replied his answer .\nad03\t1\t\tbecause they hated him , the druids forced jason to live in a cupboard\nad03\t1\t\twe kicked ourselves\nad03\t1\t\tdid medea poison jason ?\nad03\t1\t\taphrodite freed animals\nad03\t1\t\tthe book was donated to the hospital .\nad03\t1\t\tmedea poisoned more children than jason did .\nad03\t1\t\tnathan showed benjamin himself in the mirror .\nad03\t1\t\tthat plato loved aster deeply was obvious .\nad03\t1\t\the kicked him\nad03\t1\t\tjason expected the doctor to treat medea\nad03\t1\t\tthe therapist 's analysis of lucy\nad03\t1\t\twhere are you living ?\nad03\t1\t\twho showed what to who ?\nad03\t1\t\tmedea thought that , after the executioner had left , poseidon would be relieved .\nad03\t1\t\tthe consul 's gift of the gladiator to himself .\nad03\t1\t\tall the dragons have been slain .\nad03\t1\t\tgilgamesh should slowly be tickling the mandrake .\nad03\t1\t\todysseus planned to hear the sirens .\nad03\t1\t\tbill reading shakespeare and maureen singing schubert satisfies me\nad03\t1\t\tthe shooting of the hunters was very loud .\nad03\t0\t*\tthe librarians likes books .\nad03\t0\t*\tcan he will do it ?\nad03\t0\t*\ti ordered there to be three books on the subject .\nad03\t1\t\ttruman punched johnson\nad03\t1\t\the became fond of peanuts .\nad03\t1\t\tthe therapist analysed lucy\nad03\t1\t\tdracula thought himself to be the prince of darkness .\nad03\t0\t*\twhich poem did you hear those recitals of last night ?\nad03\t1\t\tathena introduced medea to jason\nad03\t1\t\the 'll no can do it , will he 
?\nad03\t1\t\tanson is incredibly difficult to please .\nad03\t1\t\tit was claimed by everyone that the poison was neutralised\nad03\t1\t\tthe banana is being eaten by that monkey .\nad03\t1\t\ti want to kiss pigs\nad03\t1\t\tburn letters to him !\nad03\t1\t\this analysis of her was flawed\nad03\t1\t\tdid the potion boil over ?\nad03\t1\t\ti did n't see him ever .\nad03\t0\t*\tshe said moya liked football .\nad03\t1\t\twe all thought him to be unhappy\nad03\t1\t\twhich book are you reading ?\nad03\t1\t\tthat monkey is eating the banana .\nad03\t1\t\tthat bottle of water might have cracked open .\nad03\t1\t\twho did gilgamesh believe to have kissed aphrodite ?\nad03\t1\t\tpaul had three affairs . . .\nad03\t1\t\tclose the door !\nad03\t1\t\ti was eating not a peach but an apple\nad03\t1\t\twhich poem did you go to hear a recital of last night ?\nad03\t0\t*\twhen time will you be there .\nad03\t1\t\ti have sent 0 letters to environmental heath .\nad03\t1\t\twhy did you kill pegasus ?\nad03\t1\t\taphrodite does free animals\nad03\t1\t\tgilgamesh will seek ishtar\nad03\t1\t\ti assumed him to be innocent\nad03\t1\t\ti am being whipped\nad03\t1\t\tnever will i do syntax again .\nad03\t1\t\tthe children wail\nad03\t1\t\tmary fell .\nad03\t1\t\ti inquired if we could leave early .\nad03\t1\t\tbenjamin gave the cloak and sent the book to lee\nad03\t1\t\thera tried to appear to be happy .\nad03\t0\t*\ti arranged for to see her .\nad03\t0\t*\tbill 's reading shakespeare and maureen 's singing schubert satisfies me\nad03\t0\t*\tmyself shaved me .\nad03\t0\t*\tno reading shakespeare satisfied me\nad03\t1\t\tthe emperor 's every wish was immediately carried out .\nad03\t1\t\tjenny has eaten a cake .\nad03\t0\t*\tmoya played football with her\nad03\t0\t*\ti intoned fruit\nad03\t1\t\tthe sheep cry\nad03\t1\t\the ca n't possibly do that , can he\nad03\t1\t\twe believed that aphrodite was omnipotent .\nad03\t1\t\twhich book about ulysses did you say that you would read 
?\nad03\t0\t*\ti wanted any cake .\nad03\t1\t\tgilgamesh is not reading the cuneiform tablets .\nad03\t1\t\tjason persuaded medea to try to run away .\nad03\t1\t\twe believed aphrodite to be omnipotent\nad03\t1\t\tthat bottle of water might have .\nad03\t1\t\ti do n't remember what all i said ?\nad03\t0\t*\taphrodite said he freed the animals and freed the animals he\nad03\t1\t\tthat aphrodite was so promiscuous astounded the other gods .\nad03\t0\t*\tgilgamesh does n't ate the honey\nad03\t1\t\ti claimed that she was pregnant\nad03\t0\t*\taphrodite do freed animals .\nad03\t1\t\tdavid wrote that you said that anson thought that julie had fainted\nad03\t0\t*\tgilgamesh failed often biology\nad03\t1\t\tit rained\nad03\t1\t\tposeidon was asleep , when the executioner arrived .\nad03\t1\t\tpeople are in the garden\nad03\t1\t\tanson became happy\nad03\t1\t\tit is tough to teach syntax .\nad03\t1\t\tthere 's going to be a party , is n't there ?\nad03\t0\t*\ti have might be flying helicopters .\nad03\t1\t\tthey brought the hat to the teacher\nad03\t1\t\twhat medea attempted was to poison her children .\nad03\t1\t\tbenjamin gave the cloak 0 and sent the book to lee\nad03\t1\t\tthe man chuckles\nad03\t1\t\tmilena will make pasta .\nad03\t1\t\taphrodite did free animals\nad03\t1\t\tgilgamesh should seek ishtar\nad03\t1\t\tthey depend on mary .\nad03\t0\t*\tthe greeks arrived all .\nad03\t0\t*\thas not the potion worked\nad03\t0\t*\tgomez 's photograph of pugsley of lucy 's\nad03\t0\t*\twill can he do it ?\nad03\t1\t\thumans love to eat the old pigs .\nad03\t0\t*\the could might go\nad03\t1\t\tit 's under the bed that 's the best place to hide\nad03\t1\t\the left .\nad03\t1\t\tthat picture of her pleases jenny .\nad03\t1\t\tconstantly reading shakespeare satisfied me\nad03\t1\t\tthere was a dragon in the cave .\nad03\t1\t\tpeople like lard .\nad03\t1\t\tthese ones are to be smuggled from hungary .\nad03\t0\t*\temily showed himself to benjamin in the mirror .\nad03\t0\t*\ti 
said that that jason was jealous annoyed medea\nad03\t1\t\twe donated a chopper to the new hospital\nad03\t1\t\tthese expensive and illegal bottles of absinthe are to be smuggled from hungary .\nad03\t1\t\ta programme about euripides is on a radio 4 tonight .\nad03\t1\t\tlucy 's analysis was the most successful\nad03\t0\t*\ti am having eaten seaweed .\nad03\t1\t\tmedea tried to poison her children .\nad03\t0\t*\tanson demonized old\nad03\t1\t\twhat i said was that we would go .\nad03\t1\t\tthere are many fish in the sea .\nad03\t1\t\tjason gave the poisoned clothes to who ?\nad03\t0\t*\tby is eaten monkey banana that the being\nad03\t1\t\tbenjamin said he would run away and he did .\nad03\t1\t\twho is sailing to ithaca ?\nad03\t1\t\tthey sat on mary .\nad03\t1\t\tjulie filed letters to herself .\nad03\t1\t\the looked it up\nad03\t1\t\twho 's there ?\nad03\t0\t*\tthere was he in the garden .\nad03\t1\t\the might no could have done it\nad03\t0\t*\tgilgamesh did n't ate the honey\nad03\t1\t\tthe sheep cries\nad03\t1\t\tfor aphrodite to appear to be happy would be impossible .\nad03\t0\t*\twho did plato listen to dp demosthenes ' oration about ?\nad03\t0\t*\tme gave it to him .\nad03\t1\t\tif one were to steal talismans from witches , then that would be dangerous .\nad03\t1\t\tbill did not destroy the world .\nad03\t1\t\tboth the twins might have been at the party .\nad03\t1\t\tthat plato loved aster was obvious .\nad03\t1\t\tthe pigs grunt\nad03\t0\t*\twhere place are you living .\nad03\t0\t*\twho was for medea to poison awful ?\nad03\t0\t*\tjulie maintained if the barman was sober .\nad03\t1\t\tthe analysis of lucy took longer than that of gomez .\nad03\t0\t*\tjulie became a fond .\nad03\t1\t\ti climbed up the tree .\nad03\t1\t\ti inquired when we could leave .\nad03\t1\t\twhere alison and david soaked their feet was in the kitchen\nad03\t0\t*\twhat medea wondered if was the potion was ready\nad03\t1\t\tthat photograph of jane of lucy 's\nad03\t1\t\tthe constant 
reading of shakespeare satisfied me\nad03\t0\t*\tmedea wondered if that the potion was ready\nad03\t1\t\twhat she thought was that the poison was neutralised\nad03\t1\t\tbecause she had got the highest marks , medea was happy\nad03\t1\t\twhen did you arrive ?\nad03\t1\t\twhich poisonous plant is it certain that we will find in amazonia ?\nad03\t1\t\tthe microphone salesman 's 0 irritating patter was relentless .\nad03\t1\t\tthe paris i used to know is no more\nad03\t1\t\tsam gave the cloak to lee and gave the magic chalice to matthew .\nad03\t1\t\tgilgamesh has eaten the honey\nad03\t1\t\ti will eat a mango , and gillian will too .\nad03\t1\t\tcomputer viruses increased in virulence last year .\nad03\t1\t\tat trade , anson danced extremely frantically\nad03\t1\t\trichard is going to chop some wood .\nad03\t1\t\tthe poem that homer wrote .\nad03\t1\t\twho did drink the poison ?\nad03\t1\t\tevan 's every idea was completely insane .\nad03\t1\t\tsally is making scones , and gillian is too .\nad03\t1\t\teveryone claimed that the poison was neutralized .\nad03\t0\t*\tjonathan persuaded kate to lick himself .\nad03\t1\t\tthat jason arrived infuriated medea .\nad03\t1\t\tmedea was happy , because she had got the highest marks\nad03\t1\t\tkeep yourself clean !\nad03\t1\t\tcassandra has foretold disaster again .\nad03\t1\t\tbill 's reading of shakespeare satisfied me\nad03\t1\t\twho poisoned who ?\nad03\t1\t\tpigs love truffles .\nad03\t1\t\towners of pigs love truffles\nad03\t1\t\tso quickly did the vampire move , that we barely saw him .\nad03\t1\t\thumans love to eat those pigs .\nad03\t0\t*\tshe has kissed she .\nad03\t0\t*\tjason intended for he to learn magic .\nad03\t1\t\tjason persuaded medea to be treated by the doctor\nad03\t1\t\tit is true that i might be doing something other than going to the party .\nad03\t1\t\tjason expected medea to be treated by the doctor\nad03\t0\t*\ti found there .\nad03\t1\t\tmoya said she liked football .\nad03\t1\t\tanson became the 
mayor\nad03\t1\t\tkane ate dirt .\nad03\t1\t\tbenjamin gave the cloak to nathan\nad03\t0\t*\tthe fig chuckled\nad03\t1\t\tposeidon had run away , because the executioner murdered hera .\nad03\t1\t\ta description of aristotle is in the book .\nad03\t1\t\tjulie and jenny did .\nad03\t1\t\tit 's quarter past four .\nad03\t1\t\towners of a pig love to eat truffles .\nad03\t0\t*\tthat whether the world is round is unknown bothered athena .\nad03\t1\t\tno one expected agamemnon to to win\nad03\t1\t\teuclid was interested in plato 's description of geometry .\nad03\t0\t*\tevery reading shakespeare satisfied me\nad03\t0\t*\tcan will he do it ?\nad03\t1\t\tmedea poisoned who ?\nad03\t0\t*\the looked up it\nad03\t0\t*\twho guy did you see .\nad03\t0\t*\twe kicked myself\nad03\t0\t*\twho would poseidon run away , if the executioner murdered ?\nad03\t0\t*\tanson kissed him\nad03\t1\t\twhich city the claim that philip would invade .\nad03\t1\t\ti have n't left yet\nad03\t0\t*\ti am eating a mango and gillian has too .\nad03\t0\t*\tletter is on the table\nad03\t1\t\twho ate the cake ?\nad03\t1\t\twhy did you say that you were leaving ?\nad03\t1\t\tmichael left meg\nad03\t0\t*\taphrodite quickly may free the animals\nad03\t1\t\tthe reading of shakespeare satisfied me\nad03\t0\t*\tthe weather rained\nad03\t0\t*\tgilgamesh seek may ishtar\nad03\t1\t\tno one expected to win .\nad03\t0\t*\twho did that plato loved seem to be known by everyone .\nad03\t1\t\tthe bear sniffs\nad03\t1\t\tit hung on the wall .\nad03\t0\t*\tjason killed .\nad03\t0\t*\tmany people were there playing on the beach\nad03\t1\t\tknow yourself !\nad03\t1\t\tagamemnon attempted to behave well .\nad03\t1\t\tjulie felt he was there\nad03\t1\t\the thought that dracula was the prince of darkness .\nad03\t1\t\ti have eaten already\nad03\t1\t\tit is not true that i have left yet .\nad03\t0\t*\tthat monkeys is eating the banana .\nad03\t1\t\ti could have been flying helicopters by now .\nad03\t0\t*\tanson put a 
book\nad03\t0\t*\tgilgamesh might have not been reading the cuneiform tablets .\nad03\t1\t\ti asked if medea poisoned jason .\nad03\t1\t\twho did you persuade to go ?\nad03\t1\t\twhat did you get all for xmas ?\nad03\t1\t\tsome disgruntled old pigs in those ditches love truffles\nad03\t1\t\tjason was killed .\nad03\t0\t*\ti would like to might do it\nad03\t0\t*\tpeter is some disgruntled old pigs in those ditches .\nad03\t0\t*\tthere was him in the garden .\nad03\t1\t\tgilgamesh is in the dungeon .\nad03\t1\t\tanson will come to the party .\nad03\t1\t\tgilgamesh has never flown a dragon .\nad03\t0\t*\tjulie maintained her own questions over the course of the argument .\nad03\t1\t\this analysis , i never liked .\nad03\t1\t\tthat bottle of water might .\nad03\t0\t*\tdid medea poison who ?\nad03\t1\t\tshe took a picture of the phoenix\nad03\t0\t*\tlook after herself !\nad03\t1\t\twho did medea poison ?\nad03\t1\t\ti tried for to get them .\nad03\t1\t\twho did you introduce athena to ?\nad03\t1\t\tcan i keep the screwdriver just like a carpenter keep the screwdriver ?\nad03\t1\t\tjason refrained from casting the spell\nad03\t1\t\tandrew likes lard on his sandwiches\nad03\t1\t\twho seemed to have left first ?\nad03\t0\t*\tron asked that the potion was ready\nad03\t1\t\thierarchy of projections :\nad03\t1\t\twe decided to paint the bathroom a lurid lime green colour .\nad03\t1\t\tshe kicked her\nad03\t0\t*\the knows he .\nad03\t1\t\ti believed there to be three books on the subject .\nad03\t0\t*\tthe child wail\nad03\t1\t\twhich girl ate the cake ?\nad03\t1\t\tthat plato lived in the city of athens was well-known .\nad03\t0\t*\tcollapsed harry .\nad03\t1\t\tfor you to do that would be a mistake .\nad03\t0\t*\tjason thinks who medea had poisoned .\nad03\t1\t\ti believe she is pregnant\nad03\t1\t\tno one expected him to to win .\nad03\t1\t\the 'll no can do it , can he ?\nad03\t0\t*\twhich poem did you hear homer 's recital of last night ?\nad03\t1\t\traffi slept well , 
and gillian will too\nad03\t1\t\the 's bound to should do it\nad03\t1\t\tit might have cracked open\nad03\t1\t\twhere did perseus see the gorgon ?\nad03\t1\t\tthe scissors are lost\nad03\t1\t\tgilgamesh should be slowly tickling the mandrake\nad03\t0\t*\tagamemnon seems pro to be a maniac\nad03\t0\t*\tmyself saw me\nad03\t1\t\ti believed she was pregnant\nad03\t1\t\tanson gave fluffy to jenny .\nad03\t1\t\tthe very old and extremely wise owl .\nad03\t0\t*\twho did that plato loved prove to be his undoing .\nad03\t0\t*\twhat medea believed was jason to be a murderer .\nad03\t1\t\tthe owl hated the evil bat and loved the wise eagle .\nad03\t1\t\tno one could remove the blood on the wall\nad03\t0\t*\the can can go\nad03\t0\t*\tgillian has made pasta and david is too .\nad03\t0\t*\tjason intended for pro to learn magic .\nad03\t1\t\tthe boys should could all go\nad03\t0\t*\ti assumed to be innocent\nad03\t1\t\tanson danced extremely frantically at trade .\nad03\t0\t*\tthe gorgon is easy to believe the claim that perseus slew .\nad03\t0\t*\tshe kicked itself\nad03\t1\t\tjulie became a fond of lloyd .\nad03\t1\t\tlee 's youngest and dawn 's oldest son ran away .\nad03\t1\t\tanson kicked the cat\nad03\t1\t\tmerlin is extremely evil .\nad03\t1\t\tsyntax is easy to pretend that you can teach .\nad03\t1\t\ti want to eat macaroni\nad03\t1\t\twhich ode did which poet write ?\nad03\t0\t*\twhat she thought that was the poison was neutralised\nad03\t1\t\twho drank the poison ?\nad03\t1\t\twhat medea arranged was for her children to be poisoned .\nad03\t1\t\tno one 's mother had baked anything .\nad03\t1\t\twhat kind of actor is he ?\nad03\t1\t\twhat did she eat ?\nad03\t0\t*\tfrantically at , anson danced extremely trade\nad03\t1\t\ti have often a cold .\nad03\t1\t\twho did maria say that she 'd kiss and kick ?\nad03\t1\t\twhere did they go all for their holidays ?\nad03\t1\t\tthey came running over the hill and through the woods\nad03\t0\t*\tthe airport yawned\nad03\t1\t\thow 
quickly did you eat the cake ?\nad03\t1\t\tmany fish are in the sea .\nad03\t1\t\tthey arrived first\nad03\t1\t\tpeople were playing on the beach .\nad03\t0\t*\tbenjamin gave to lee it .\nad03\t0\t*\the liked anson .\nad03\t0\t*\tthe bear sniff\nad03\t0\t*\ti inquired could we leave early .\nad03\t1\t\tthe bears sniff\nad03\t0\t*\ti persuaded there to be a problem .\nad03\t1\t\this book\nad03\t1\t\the looked the number up\nad03\t1\t\thas jenny eaten a cake ?\nad03\t1\t\twhich goddess helped us ?\nad03\t1\t\tmedea killed jason .\nad03\t1\t\tron certainly will buy a dog .\nad03\t0\t*\tthey shaved david and anson .\nad03\t0\t*\twe believed to be the headmaster\nad03\t1\t\twhich king did you wonder invaded which city ?\nad03\t1\t\tno one expected agamemnon to win .\nad03\t0\t*\tthe day snowed\nad03\t1\t\tgilgamesh never flies dragons .\nad03\t0\t*\tkeep myself clean !\nad03\t1\t\tthe dragons have all been slain .\nad03\t0\t*\tdid that medea killed her children upset jason ?\nad03\t1\t\tthe amoeba coughed and then it fainted .\nad03\t1\t\ti want to sing\nad03\t1\t\the will can go\nad03\t0\t*\tmedea seemed that has poisoned jason .\nad03\t1\t\thaving read shakespeare satisfied me\nad03\t0\t*\tpeter is owners of pigs .\nad03\t0\t*\todysseus attempted the helmsman to hear the sirens .\nad03\t1\t\tgilgamesh may seek ishtar\nad03\t1\t\tthe librarian likes books .\nad03\t1\t\talison and david soaked their feet after dinner\nad03\t1\t\tmary is faster than john is .\nad03\t1\t\talison and david soaked their feet in the kitchen\nad03\t0\t*\tyou kicked you\nad03\t1\t\tdid you see mary ?\nad03\t1\t\traffi has made pasta , and david has too .\nad03\t1\t\tthere seemed to be three men in the garden .\nad03\t1\t\tthat medea murdered jason did n't surprise anyone .\nad03\t1\t\tmoya 's football team loved her\nad03\t0\t*\ti sent she away .\nad03\t1\t\tjason persuaded medea that she should desert her family\nad03\t0\t*\taphrodite stinks to be omnipotent .\nad03\t1\t\tevery reading of 
shakespeare satisfied me\nad03\t0\t*\tbill reading shakespeare and maureen singing schubert satisfy me\nad03\t1\t\twhen the executioner arrived , poseidon was asleep\nad03\t1\t\tthey kicked themselves\nad03\t1\t\tmany vampires have become vegetarian .\nad03\t0\t*\tthat that the world is round is obvious upset hermes .\nad03\t0\t*\tbill not destroyed the world .\nad03\t1\t\tjohn saw stephan\nad03\t0\t*\ti destroyed there .\nad03\t0\t*\twhat was euclid interested in plato 's description of ?\nad03\t1\t\ti like anson\nad03\t1\t\tthe dragons simply all died out .\nad03\t1\t\tgilgamesh did not fly the dragon .\nad03\t1\t\twhich goddess might help us ?\nad03\t1\t\thumans love to eat pigs .\nad03\t1\t\twhich poem about achilles did homer recite ?\nad03\t1\t\tthe boys should all could go\nad03\t0\t*\tthe owl hated the evil and the wise eagle .\nad03\t1\t\tthe shield that saved achilles life .\nad03\t1\t\tevan 's every desire\nad03\t1\t\ti wondered whether medea had fled .\nad03\t1\t\ti have eaten my hat already\nad03\t0\t*\the will could go\nad03\t1\t\tjenny swallowed the fly\nad03\t1\t\tthe flying car hit the tree in the air\nad03\t1\t\ti have a book .\nad03\t1\t\tjason thought of defending the dragon\nad03\t1\t\tit seems that agamemnon is a maniac\nad03\t0\t*\twhich city do you believe the claim that philip would invade ?\nad03\t1\t\twe claimed that perseus had killed the gorgon\nad03\t1\t\twe need some technician to help us .\nad03\t0\t*\tthe scissors is lost\nad03\t1\t\ti have been flying helicopters for years .\nad03\t1\t\tsam gave the cloak to lee and the magic chalice to matthew .\nad03\t0\t*\twe kicked us\nad03\t1\t\tno reading of shakespeare satisfied me\nad03\t1\t\twhat did he reply ?\nad03\t0\t*\tit was claimed that by everyone the poison was neutralised\nad03\t1\t\ti asked which city which king invaded .\nad03\t1\t\traffi makes pesto pasta , and david does too\nad03\t1\t\teat dirt !\nad03\t1\t\tlook after yourself !\nad03\t0\t*\tshe wanted to can 
leave\nad03\t1\t\tarthur gave the tapestry to lancelot .\nad03\t1\t\twe took the car to the town\nad03\t1\t\tbenjamin gave the cloak to lee .\nad03\t1\t\tnot reading shakespeare satisfied me\nad03\t0\t*\tthere were killed three men .\nad03\t1\t\tgilgamesh has not been reading the cuneiform tablets\nad03\t0\t*\tthe imposition of the government of a fine .\nad03\t1\t\twhen alison and david soaked their feet was after dinner\nad03\t1\t\tthis problem 's analysis is made a lot easier when you understand differential equations .\nad03\t0\t*\tdracula thought that himself was the prince of darkness .\nad03\t0\t*\tgilgamesh might have been not reading the cuneiform tablets .\nad03\t1\t\twho asked which statue which tourist had taken a photo of ?\nad03\t1\t\twillow said that she 'd kiss tara and kick xander .\nad03\t1\t\ti climbed right up the tree .\nad03\t1\t\tall the dragons had escaped .\nad03\t1\t\twho did you attempt to force jason to kill ?\nad03\t1\t\ti thought of the moon\nad03\t0\t*\tbenjamin thought he would give the cloak to lee and the cloak to lee he gave .\nad03\t1\t\ti wondered had he left yet .\nad03\t1\t\ti thought she was pregnant\nad03\t1\t\ti arranged for him to see her .\nad03\t1\t\tit was over the hill and through the woods that they came running\nad03\t1\t\twho did you ask saw what ?\nad03\t0\t*\tgilgamesh might can seek ishtar\nad03\t1\t\tgilgamesh arrived\nad03\t0\t*\tjason arrived by medea .\nad03\t1\t\toil spread over the sea shore .\nad03\t1\t\twhat jason asked was whether the potion was ready\nad03\t0\t*\tjason asked whether that the potion was ready\nad03\t1\t\thave you seen mary ? 
i have vp seen mary\nad03\t1\t\tit seems that agamemnon left .\nad03\t1\t\tthose monkeys are eating the banana .\nad03\t0\t*\ti introduced her to he .\nad03\t0\t*\tnathan showed to benjamin it .\nad03\t0\t*\the kicked yourself\nad03\t1\t\tanson tried to shave himself .\nad03\t0\t*?\tgilgamesh never has flown a dragon .\nad03\t0\t*\twhat julie did of lloyd was become fond .\nad03\t1\t\tit is not allowed to incriminate oneself .\nad03\t1\t\tthe analysis of the problem was flawed\nad03\t1\t\twhich goddess did help us ?\nad03\t1\t\tposeidon appears to have turned out to have left .\nad03\t0\t*\tgilgamesh has been not reading the cuneiform tablets .\nad03\t0\t*\tdanced extremely , anson frantically at trade\nad03\t0\t*\taphrodite wanted to live and ishtar tried to do\nad03\t0\t*\ti kicked yourself\nad03\t1\t\thow fond of esther is agamemnon ?\nad03\t1\t\tron heard a discussion in the foyer\nad03\t0\t*\tmy mother hated myself\nad03\t1\t\tthe students demonstrated the technique this morning\nad03\t1\t\the walked up the hill .\nad03\t0\t*\twe wanted to ate cake\nad03\t0\t*\tjason knew those medea had cast the spell\nad03\t0\t*\tgilgamesh must should seek ishtar\nad03\t1\t\taphrodite said he freed the animals and free the animals he did\nad03\t1\t\tdid you drink the poison ?\nad03\t1\t\twhether agamemnon had triumphed was unknown .\nad03\t0\t*\ther has kissed her .\nad03\t1\t\ti often have a cold .\nad03\t0\t*\tjason whispered the phoenix had escaped\nad03\t0\t*\tbill reading of shakespeare satisfied me\nad03\t1\t\tdid n't the magic work ?\nad03\t1\t\tanson thought julie had fainted\nad03\t1\t\tthe horse fell\nad03\t0\t*\todysseus attempted odysseus to hear the sirens .\nad03\t1\t\tburn letters to peter !\nad03\t1\t\tgenie intoned the prayer\nad03\t1\t\tgilgamesh did n't fly the broomstick .\nad03\t1\t\tron 's likely to be on the web , is n't he ?\nad03\t1\t\tbill 's reading shakespeare and maureen 's singing schubert satisfy me\nad03\t0\t*\towners of a pig loves to eat 
truffles\nad03\t0\t*\tgilgamesh might loved ishtar\nad03\t1\t\tpaul had an affair\nad03\t1\t\tposeidon appears to own a dragon\nad03\t1\t\tthe twins might have both been at the party .\nad03\t0\t*\tthat jason had arrived was obvious infuriated medea .\nad03\t1\t\tthat i should evaporate is my fondest dream\nad03\t1\t\twhat gilgamesh may do is seek ishtar\nad03\t1\t\tyou said that anson thought that julie had fainted\nad03\t1\t\tthe owl hated the evil bat and the wise eagle\nad03\t1\t\twhat did john buy ?\nad03\t1\t\tagamemnon forced aphrodite to leave the school .\nad03\t1\t\tthere is a description of aristotle in the book .\nad03\t0\t*\tmedea exclaimed if the potion was ready\nad03\t1\t\thumans love to eat them .\nad03\t0\t*\tsomeone did medea poison .\nad03\t1\t\tperhaps iphigenia will have murdered oedipus by tomorrow .\nad03\t1\t\tso that he could escape , jason became invisible\nad03\t0\t*\ti wondered who had medea poisoned .\nad03\t1\t\ti asked did medea poison jason .\nad03\t1\t\tagamemnon stopped jason from casting the spell\nad03\t1\t\tno one wanted any cake .\nad03\t1\t\ti wanted jimmy for to come with me .\nad03\t0\t*\the walked the hill up .\nad03\t1\t\tthey should have all sent oedipus to thebes\nad03\t0\t*\tthose monkey are eating the banana .\nad03\t0\t*\twho had poseidon run away , before the executioner murdered ?\nad03\t1\t\ti asked anson if he was happy\nad03\t1\t\tdaniel became a blond .\nad03\t0\t*\thas that we have arrived back at our starting point proved that the world is round ?\nad03\t1\t\tit was the man i saw that you wanted to meet .\nad03\t1\t\tthat photograph by gomez of pugsley of lucy 's\nad03\t1\t\ti ate that .\nad03\t1\t\tit snowed\nad03\t1\t\taphrodite said he would free the animals and free the animals he will\nad03\t1\t\tthat the golden thread would show jason his path through the labyrinth was\nad03\t1\t\tjulie and jenny arrived first\nad03\t1\t\twhat have you eaten ?\nad03\t0\t*\tpeter is owners .\nad03\t0\t*\ti said this he 
left\nad03\t1\t\twho has drunk my whiskey ?\nad03\t0\t*\tyou said she liked yourself\nad03\t0\t*\tshe tried to left\nad03\t1\t\ti 'd planned to have finished , and finished i have\nad03\t1\t\tron expected the sack .\nad03\t1\t\tthat i am here proves that i care .\nad03\t0\t*\tshe tried to may leave\nad03\t1\t\tgilgamesh misses aphrodite\nad03\t0\t*\twho seemed had poisoned jason ?\nad03\t1\t\tthat plato loved aster seemed to be known by everyone .\nad03\t1\t\twhen dining with evil crocodiles , it is advisable to wear armour .\nad03\t0\t*\tbenjamin said he would give the cloak to lee and give the cloak he did to lee .\nad03\t1\t\tdid the magic work ?\nad03\t1\t\twho has drunk the poison ?\nad03\t1\t\tbenjamin said he would give the cloak to lee and give the cloak to lee he did .\nad03\t1\t\tjason became invisible , so that he could escape\nad03\t1\t\taphrodite may quickly free the animals .\nad03\t1\t\tthe horse galloped\nad03\t1\t\thow quickly did the greeks take troy ?\nad03\t1\t\tsome happy pigs which can fly love truffles\nad03\t1\t\tjulie felt a twinge in her arm\nad03\t1\t\tthe wizard turned the beetle into beer with a wave of his wand\nad03\t0\t*\twho seemed that had poisoned jason ?\nad03\t1\t\tkick me !\nad03\t1\t\twe wanted to eat cake\nad03\t1\t\tgomez 's photograph of pugsley belonging to lucy .\nad03\t1\t\tall the boys should could go\nad03\t1\t\tjulie maintained her own ideas over the course of the argument .\nad03\t1\t\tthe intrepid pirate and the fearful captain 's mate sunk the galleon .\nad03\t1\t\tgilgamesh might not have been reading the cuneiform tablets .\nad03\t1\t\tit was obvious that plato loved aster obvious .\nad03\t1\t\the loves him\nad03\t1\t\twe all thought he was unhappy\nad03\t0\t*\temily showed himself benjamin in the mirror .\nad03\t1\t\tanson believed the report .\nad03\t1\t\ti looked the number up .\nad03\t0\t*\tanson is incredibly difficult to be pleased .\nad03\t1\t\tno vampire slept .\nad03\t1\t\tafter the executioner left , 
poseidon wept .\nad03\t1\t\tpeter was at the party\nad03\t0\t*\twhales have i seen .\nad03\t0\t*\ti thought she is pregnant\nad03\t0\t*\thimself saw him\nad03\t1\t\tthat he is coming is clear .\nad03\t0\t*\tthere seem three men to be in the garden .\nad03\t0\t*\the analysis her was flawed\nad03\t1\t\twhere all did they go for their holidays ?\nad03\t1\t\tgilgamesh decided not to kill ishtar\nad03\t1\t\tbill reading shakespeare satisfied me\nad03\t1\t\tperseus saw the gorgon in his shield .\nad03\t1\t\tposeidon would run away , if the executioner murdered hera .\nad03\t0\t*\twho did a statue of surprise medea ?\nad03\t1\t\twhat did you say ( that ) the poet had written ?\nad03\t1\t\ti saw people playing there on the beach .\nad03\t0\t*\twho was that plato loved obvious ?\nad03\t1\t\ti did n't want any cake .\nad03\t1\t\tthat i should kiss pigs is my fondest dream\nad03\t0\t*\tgilgamesh flew not the broomstick .\nad03\t1\t\tron failed biology , unfortunately\nad03\t1\t\tthe men chuckle\nad03\t1\t\ti expected there to be a problem .\nad03\t1\t\tgilgamesh wanted to seduce ishtar , and seduce ishtar he did .\nad03\t1\t\tharry collapsed .\nad03\t1\t\ti asked who saw what .\nad03\t0\t*\tthe doctor arrived a new actor .\nad03\t0\t*\thim loves him\nad03\t0\t*\twho had poseidon run away , because the executioner murdered ?\nad03\t1\t\the has been happy\nad03\t1\t\tposeidon had run away , before the executioner murdered hera .\nad03\t0\t*\twhich the poem did homer recite ?\nad03\t0\t*\tnot reading of shakespeare satisfied me\nad03\t0\t*\twho did athena introduce who to ?\nad03\t1\t\tmerlin is a dangerous sorcerer .\nad03\t1\t\tanson saw anson .\nad03\t1\t\ti am to eat macaroni .\nad03\t1\t\tposeidon had escaped , before the executioner arrived .\nad03\t1\t\towners love truffles\nad03\t0\t*\tthe dragons were slain all .\nad03\t0\t*\ti saw him ever .\nad03\t1\t\thumans love to eat owners of pigs .\nad03\t0\t*\ti have sent 0 letter to environmental heath\nad03\t0\t*\twhat jason 
asked whether was the potion was ready\nad03\t1\t\tthose pigs love truffles\nad03\t0\t*\twe all thought he to be unhappy\nad03\t1\t\ti 'd planned to have finished by now .\nad03\t1\t\thas the potion not worked ?\nad03\t1\t\twhat i love is toast and sun dried tomatoes\nad03\t1\t\tmary ran .\nad03\t0\t*\tthe man i saw shaved myself .\nad03\t0\t*\treadings shakespeare satisfied me\nad03\t0\t*\tthe picture of no one hung upon any wall .\nad03\t1\t\the replied that he was happy .\nad03\t1\t\tno one could remove the blood from the wall\nad03\t1\t\tjulie maintained that the barman was sober .\nad03\t0\t*\ti kicked me\nad03\t1\t\tbenjamin gave lee the cloak .\nad03\t1\t\taphrodite wanted hera to persuade athena to leave .\nad03\t1\t\tgilgamesh is fighting the dragon .\nad03\t1\t\ti claimed she was pregnant\nad03\t1\t\tfor jenny , i intended to be present .\nad03\t1\t\tgilgamesh missed aphrodite\nad03\t1\t\tshe might be pregnant .\nad03\t0\t*\tthe pig grunt\nad03\t1\t\tanson demonized david at the club .\nad03\t1\t\tjason asked whether the potion was ready\nad03\t1\t\tfrieda closed the door\nad03\t0\t*\tpeter is the old pigs .\nad03\t1\t\tmedea might have given jason a poisoned robe ( just treat a poisoned robe as an np\nad03\t1\t\tquickly kiss anson !\nad03\t0\t*\tanson believed jenny to have hurt himself .\nad03\t1\t\tjulie felt hot\nad03\t1\t\tagamemnon expected esther to seem to be happy .\nad03\t0\t*\thim book\nad03\t1\t\tthat the answer is obvious upset hermes .\nad03\t0\t*\tthe consul 's gift of himself to the gladiator .\nad03\t1\t\thomer recited the poem about achilles ?\nad03\t1\t\tno vampire can survive sunrise .\nad03\t1\t\tunder the bed is the best place to hide\nad03\t1\t\tanson appeared\nad03\t1\t\tthere seems to be a problem .\nad03\t0\t*\tanson became that he was happy\nad03\t1\t\ti intoned that she was happy\nad03\t0\t*\twe all thought him was unhappy\nad03\t1\t\tmedea saw who ?\nad03\t1\t\tno one expected that agamemnon would win .\nad03\t1\t\tbelieving 
that the world is flat gives one some solace .\nad03\t1\t\tkick them !\nad03\t0\t*\tthe bears sniffs\nad03\t0\t*\twhere did you disappear before you hid the gold ?\nad03\t0\t*\tshe tried to do go .\nad03\t1\t\tmedea wondered if the potion was ready\nad03\t1\t\twho all did you meet when you were in derry ?\nad03\t1\t\twho did you hear an oration about ?\nad03\t1\t\talison ran\nad03\t1\t\tromeo sent letters to juliet .\nad03\t1\t\trichard 's gift of the helicopter to the hospital and of the bus to the school .\nad03\t1\t\tnathan caused benjamin to see himself in the mirror .\nad03\t1\t\ta. madeleine planned to catch the sardines and she did .\nad03\t0\t*\tmedea tried medea to poison her children .\nad03\t0\t*\twhich temple did athena contemplate the reason that her devotees had built ?\nad03\t1\t\ti did not understand .\nad03\t1\t\tgilgamesh loved ishtar and aphrodite did too\nad03\t1\t\twe believed him to be omnipotent\nad03\t0\t*\tron captured quickly a phoenix\nad03\t1\t\tdavid ate mangoes and raffi should too .\nad03\t1\t\tjulie and fraser ate those delicious pies in julie 's back garden .\nad03\t1\t\tthe old pigs love truffles\nad03\t1\t\tthe boys all should could go\nad03\t1\t\taphrodite quickly freed the animals\nad03\t1\t\tpaul had two affairs\nad03\t1\t\twhat alison and david did was soak their feet in a bucket\nad03\t1\t\tanson demonized david almost constantly .\nad03\t0\t*\tagamemnon seemed that left .\nad03\t1\t\tanson 's hen nibbled his ear .\nad03\t0\t*\twhat a kind of actor is he ?\nad03\t0\t*\tthe constantly reading shakespeare satisfied me\nad03\t1\t\tbefore the executioner arrived , poseidon had escaped\nad03\t1\t\tgilgamesh did n't leave .\nad03\t1\t\tgenie intoned that she was tired\nad03\t1\t\tlook at all these books . 
which book would you like ?\nad03\t0\t*\tthere were killed three men by the assassin .\nad03\t0\t*\tpeter is those pigs .\nad03\t1\t\ti do n't remember what i said all ?\nad03\t1\t\tthe pig grunts\nad03\t0\t*\tthe poison was neutralised was claimed that by everyone\nad03\t1\t\tpeople are stupid\nad03\t1\t\twhat i arranged was for jenny to be present .\nad03\t1\t\ti compared ginger to fred\nad03\t0\t*\tpeter is pigs\nad03\t1\t\twhich poet wrote which ode ?\nad03\t1\t\thow did julie ask if jenny left ?\nad03\t1\t\tdracula thought him to be the prince of darkness .\nad03\t0\t*\the ca n't possibly do that , possibly he ?\nad03\t1\t\ti must eat macaroni .\nad03\t1\t\ti asked who john would introduce to who .\nad03\t0\t*\tthe owl hated the and loved the bat .\nad03\t1\t\treading shakespeare satisfied me\nad03\t1\t\thumans love to eat owners .\nad03\t1\t\tgilgamesh fears death and achilles does as well\nad03\t0\t*\tthe pigs grunts\nad03\t0\t*\tconstant reading shakespeare satisfied me\nad03\t0\t*\tanson believed to be happy .\nad03\t1\t\thow did julie say that jenny left ?\nad03\t1\t\tshow me letters !\nad03\t1\t\tthe readings of shakespeare satisfied me\nad03\t1\t\tanson demonized david every day .\nad03\t1\t\tthe students demonstrated this morning\nad03\t1\t\twe believed aphrodite to be omnipotent .\nad03\t1\t\temily caused benjamin to see himself in the mirror .\nad03\t0\t*\tanson left before jenny saw himself .\nad03\t1\t\tnothing like that would i ever eat again .\nad03\t1\t\twhere has he put the cake ?\nad03\t1\t\tjason persuaded medea to desert her family\nad03\t1\t\tgilgamesh perhaps should be leaving .\nad03\t1\t\tgilgamesh has n't kissed ishtar .\nad03\t0\t*\tanson thought that himself was going to the club .\nad03\t0\t*\tposeidon appears to own a dragon\nad03\t0\t*\tdigitize is my happiest memory\nad03\t1\t\tit is easy to slay the gorgon .\nad03\t1\t\ti had the strangest feeling that i knew you .\nad03\t1\t\twhat all did you get for christmas ?\n"
  },
  {
    "path": "Chapter02/out_of_domain_dev.tsv",
    "content": "clc95\t1\t\tsomebody just left - guess who .\nclc95\t1\t\tthey claimed they had settled on something , but it was n't clear what they had settled on .\nclc95\t1\t\tif sam was going , sally would know where .\nclc95\t1\t\tthey 're going to serve the guests something , but it 's unclear what .\nclc95\t1\t\tshe 's reading . i ca n't imagine what .\nclc95\t1\t\tjohn said joan saw someone from her graduating class .\nclc95\t0\t*\tjohn ate dinner but i do n't know who .\nclc95\t0\t*\tshe mailed john a letter , but i do n't know to whom .\nclc95\t1\t\ti served leek soup to my guests .\nclc95\t1\t\ti served my guests .\nclc95\t0\t*\tshe was bathing , but i could n't make out who .\nclc95\t0\t*\tshe knew french for tom .\nclc95\t0\t*\tjohn is tall on several occasions .\nclc95\t0\t*\tthe ship sank , but i do n't know with what .\nclc95\t0\t*\tthey noticed the painting , but i do n't know for how long .\nclc95\t0\t*\tjohn was tall , but i do n't know on what occasions .\nclc95\t1\t\tjoan ate dinner with someone but i do n't know who .\nclc95\t1\t\tjoan ate dinner with someone but i do n't know who with .\nclc95\t0\t*\ti know that meg 's attracted to harry , but they do n't know who .\nclc95\t0\t*\tsince jill said joe had invited sue , we did n't have to ask who .\nclc95\t1\t\ti know that meg 's attracted to harry , but they do n't know who .\nclc95\t0\t*\tshe said she had spoken to everybody , but he was n't sure who .\nclc95\t0\t*\teach of the performers came in , but were sitting so far back that we could n't see who .\nclc95\t1\t\tshe did n't talk to one student .\nclc95\t0\t*\tshe does n't meet anyone for dinner , but they ca n't figure out who .\nclc95\t1\t\teveryone relies on someone . 
it 's unclear who .\nclc95\t1\t\teach student wrote a paper on a mayan language , but i do n't remember which one .\nclc95\t1\t\tthe newspaper has reported that they are about to appoint someone , but i ca n't remember who the newspaper has reported that they are about to appoint .\nclc95\t1\t\tthe newspaper has reported that they are about to appoint someone , but i ca n't remember who they are about to appoint .\nclc95\t1\t\tmost columnists claim that a senior white house official has been briefing them , and the newspaper today reveals which one .\nclc95\t1\t\tmost columnists claim that a senior white house official has been briefing them , but none will reveal which one .\nclc95\t1\t\tbill wondered how many papers sandy had read , but he did n't care which ones .\nclc95\t1\t\ti never know which papers sandy has read , but i usually know how many .\nclc95\t1\t\tsandy had read how many papers ? !\nclc95\t1\t\teverybody gets on well with a certain relative , but often only his therapist knows which one .\nclc95\t1\t\twhich book did each author recommend ?\nclc95\t1\t\this or her least known work .\nclc95\t1\t\tthey were going to meet sometime on sunday , but the faculty did n't know when .\nclc95\t1\t\tjohn likes some students , but i do n't know who .\nclc95\t1\t\ti do n't know who john likes .\nclc95\t0\t*\tjohn likes some students , but i do n't know who john likes some students .\nclc95\t0\t*\tjoan said she talked to the students , but fred could n't figure out who .\nclc95\t0\t*\the announced he had eaten the asparagus , but we did n't know what .\nclc95\t1\t\tshe was reading the books under the table , but fred did n't know what books .\nclc95\t1\t\the announced he would marry the woman he loved most , but none of his relatives could figure out who .\nclc95\t1\t\tshe talked to john or mary but i do n't know which .\nclc95\t1\t\tshe talked to john or mary but i do n't know which one .\nclc95\t1\t\tshe talked to harry , but i do n't know who else 
.\nclc95\t1\t\ti will see them , but i do n't know how many of them .\nclc95\t1\t\teveryone who knows either susan or laura likes her .\nclc95\t0\t*\tshe said she talked to three students but i do n't know how many .\nclc95\t0\t*\tshe said she talked to those students but i do n't know how many .\nclc95\t1\t\the shouted again , but i do n't know who to .\nclc95\t1\t\tshe was dancing with somebody , but i do n't know who with .\nclc95\t1\t\tseveral firefighters were injured , but it 's not known .\nclc95\t1\t\tmeg is attracted to harry , but they do n't know who she is attracted to .\nclc95\t1\t\tsandy was trying to work out which students would be able to solve a certain problem , but she would n't tell us which one .\nclc95\t0\t*\tsandy was trying to work out which students would be able to solve a certain problem , but she would n't tell us which one .\nclc95\t0\t*\tjohn and someone were dancing together , but i do n't know who .\nclc95\t1\t\tthe ta 's have been arguing about whether some student or other should pass , but i ca n't now remember which one .\nclc95\t0\t*\tit has been determined that somebody will be appointed ; it 's just not clear yet who .\nclc95\t0\t*\tsally asked if somebody was going to fail math class , but i ca n't remember who .\nclc95\t0\t*\tthe ta 's have been arguing about whether some student or other should pass , but i ca n't now remember which one .\nclc95\t1\t\tsandy is very anxious to see if the students will be able to solve the homework problem in a particular way , but she wo n't tell us which .\nclc95\t1\t\tsandy is very anxious to see if the students will be able to solve the homework problem in a particular way , but she wo n't tell us in which way .\nclc95\t1\t\tclinton is anxious to find out which budget dilemmas panetta would be willing to tackle in a certain way , but he wo n't say in which .\nclc95\t1\t\tsandy is wondering whether there will be students who have to drop the class for a certain reason , but she wo n't 
reveal what .\nclc95\t0\t*\tin which way is sandy very anxious to see if the students will be able to solve the homework problem ?\nclc95\t0\t*\tin which way is clinton anxious to find out which budget dilemmas panetta would be willing to solve ?\nclc95\t1\t\ti know how many assignments i 've graded , but i do n't know how many bill has .\nclc95\t0\t*\twhat did you leave before they did ?\nclc95\t0\t*\twhat did you leave before they started playing ?\nclc95\t1\t\tsandy was trying to work out which students would be able to solve a certain problem .\nclc95\t1\t\tthe administration has issued a statement that it is willing to meet with one of the student groups .\nclc95\t1\t\tsandy was trying to work out which students would be able to solve a problem .\nclc95\t1\t\tthe administration has issued a statement that it is willing to meet a student group .\nclc95\t1\t\tthe administration has issued a statement that it is willing to meet a student group , but i 'm not sure which one .\nclc95\t1\t\ti think agnes said that bill would speak , but i do n't remember what about .\nclc95\t0\t*\tagnes wondered how john could eat but it 's not clear what .\nclc95\t0\t*\ttony sent mo a picture that he painted , but it 's not clear with what .\nclc95\t1\t\tshe 's been dancing but we do n't know with whom .\nclc95\t0\t*\twho did they see someone ?\nc-05\t1\t\tit was believed by everybody that mary was a thief .\nc-05\t1\t\tthat professor is feared by all students .\nc-05\t1\t\tmary was respected by john .\nc-05\t1\t\tted was bitten by the spider .\nc-05\t0\t*\tthe book was by john written .\nc-05\t0\t*\tthe argument was summed by the coach up .\nc-05\t1\t\tthe paper was written up by john .\nc-05\t0\t*\tthe paper was written by john up .\nc-05\t1\t\tjohn was spoken to by mary .\nc-05\t0\t*\tjohn was spoken by mary to .\nc-05\t1\t\tthe book was seen by mary .\nc-05\t0\t*\tjohn was seen the book .\nc-05\t1\t\tthe book was written .\nc-05\t0\t*\tjohn was spoke by mary to 
.\nc-05\t1\t\tthe table was wiped clean by john .\nc-05\t0\t*\tthe table was wiped by john clean .\nc-05\t0\t*\tmary was given by john the book .\nc-05\t1\t\tjohn was believed to be telling the truth by mary .\nc-05\t1\t\tjohn was believed by mary to be telling the truth .\nc-05\t1\t\tthe car was driven by john to maine .\nc-05\t1\t\tit was believed by the students that they would have an exam .\nc-05\t0\t*\tthe magazines were sent to herself by mary .\nc-05\t0\t*\tchocolate eggs were hidden from each other by the children .\nc-05\t1\t\tthe magazines were sent by mary to herself .\nc-05\t1\t\tchocolate eggs were hidden from no child by any adult .\nc-05\t1\t\ttabs were kept on each agent by the other .\nc-05\t1\t\tchocolate eggs were hidden from every child by his mother .\nc-05\t1\t\tbooks were taken from no student and given to mary .\nc-05\t0\t*\tbooks were taken from no student and given to mary by any professor .\nc-05\t1\t\tbooks were taken from each student by the other .\nc-05\t1\t\tbooks were taken from each student and given to mary .\nc-05\t0\t*\tbooks were taken from each student and given to mary by the other .\nj_71\t1\t\tjack hates sue and is loved by mary .\nj_71\t1\t\tvera sent a baby alligator to max and a leather dinosaur to phyllis .\nj_71\t1\t\teither sam plays the bassoon or jekyll the oboe .\nj_71\t1\t\tsam does n't play bassoon , nor medusa oboe .\nj_71\t0\t*\tbill ate the peaches , but harry the grapes .\nj_71\t1\t\ti no more could have stolen that steak than jack the diamonds .\nj_71\t1\t\tbill ate more peaches than harry did grapes .\nj_71\t0\t*\tbill ate the peaches and harry did the grapes .\nj_71\t0\t*\ttom will smoke the grass , and reuben has the hash .\nj_71\t1\t\tif the ants were called elephants and elephants ants , i 'd be able to squash an elephant .\nj_71\t1\t\tsimon quickly dropped the gold , and jack the diamonds .\nj_71\t1\t\tbob tried to wash himself , and mary to read the funnies .\nj_71\t1\t\tharry told sue that albania 
is a lovely place for a vacation , and tom told sally that albania is a lovely place for a vacation .\nj_71\t1\t\tharry told sue that albania is a lovely place for a vacation , and tom .\nj_71\t1\t\tmax seemed to be trying to begin to love harriet , and fred to be trying to begin to love sue .\nj_71\t1\t\tmax seemed to be trying to force ted to leave the room , and walt , ira .\nj_71\t0\t*\tmax seemed to be trying to force ted to leave the room , and walt to stay a little longer .\nj_71\t0\t*\tarizona elected goldwater senator , and massachusetts , mccormack .\nj_71\t0\t*\tmillie will send the president an obscene telegram , and paul , the secretary a rude letter .\nj_71\t0\t*\tmaytag will give a brand-new dryer to the winner of the mrs .\nj_71\t0\t*\tbill did n't eat the peaches , nor harry .\nj_71\t1\t\tbill ate the peaches , and harry did , too .\nj_71\t0\t*\tbill must quickly eat the peaches , and harry must slowly .\nj_71\t1\t\twhenever russia has made a major political blunder , the u.s. 
has too .\nj_71\t1\t\tbill 's story about sue and max 's about kathy both amazed me .\nj_71\t1\t\ti bought three quarts of wine and two of clorox .\nj_71\t1\t\tscientists at the south hanoi institute of technology have succeeded in raising one dog with five legs , another with a cow 's liver , and a third with no head .\nj_71\t1\t\tbill 's story about sue may be amazing , but max 's is virtually incredible .\nj_71\t1\t\ti like bill 's yellow shirt , but not max 's .\nj_71\t1\t\tbill 's funny story about sue and max 's boring one about kathy both amazed me .\nj_71\t1\t\tbill 's wine from france and ted 's from california can not be compared .\nj_71\t0\t*\tas a teacher , you have to deal simultaneously with the administration 's pressure on you to succeed , and the children 's to be a nice guy .\nj_71\t1\t\tneither von karajan 's recording of beethoven 's 6th on columbia nor klemperer 's on angel has the right tempo .\nj_71\t0\t*\tgould 's performance of bach on the piano does n't please me anywhere as much as ross 's on the harpsichord .\nj_71\t0\t*\ttom 's dog with one eye attacked frank 's with three legs .\nj_71\t0\t*\tbecause steve 's of a spider 's eye had been stolen , i borrowed fred 's diagram of a snake 's fang .\nj_71\t1\t\tneither von karajan 's recording of beethoven 's 6th on columbia nor klemperer 's has the right tempo .\nj_71\t1\t\ttom 's dog with one eye attacked fred 's .\nj_71\t1\t\ti borrowed fred 's diagram of a snake 's eye because steve 's had been stolen .\nj_71\t1\t\tjerry attempted to blow up the pentagon .\nj_71\t1\t\tso fast did he run that nobody could catch him .\nj_71\t1\t\tbill bought a red house , and max bought one too .\ns_97\t1\t \twho always drinks milk ?\ns_97\t1\t\tthe book which inspired them was very long .\ns_97\t0\t*\tthe book what inspired them was very long .\ns_97\t1\t\ti know the person whose mother died .\ns_97\t1\t\tthe person whose mother 's dog we were all fond of .\ns_97\t1\t\ti wonder whose mother died 
.\ns_97\t1\t\ti wonder whose mother 's dog died .\ns_97\t1\t\ti wonder to whom they dedicated the building .\ns_97\t1\t\tgive me the phone number of the person whose mother 's dog died .\ns_97\t1\t\tthis is the senator to whose mother 's friend 's sister 's i sent the letter .\ns_97\t0\t*\ti want goes to the store .\ns_97\t0\t*\ti wonder what to be a clown on the cover of .\ns_97\t0\t*\tbother you that kim left !\ns_97\t0\t*\ta student who to talk to us just walked in .\ns_97\t1\t\twhose bagels do you like ?\ns_97\t1\t\ti wonder in whom to place my trust .\ns_97\t1\t\tthere were several old rocks songs that she and i were the only two who knew .\ns_97\t0\t*\tit was to to amuse us that kim was singing that they wanted .\ns_97\t0\t*\twhat they feared most was to be no one available to help them .\ns_97\t0\t*\twe tried to amuse them that kim was singing .\ns_97\t1\t\tmary asked me if , in st. louis , john could rent a house cheap .\ns_97\t0\t*\tmary arranged for , in st. louis , john to rent a house cheap .\ns_97\t1\t\tit would be unwise for there to be no fire exit .\ns_97\t1\t\ti believe there to be no way out .\ns_97\t0\t*\ti wonder in whom them to place their trust .\ns_97\t0\t*\ti wonder whom us to trust .\ns_97\t0\t*\ti wonder who for us to trust .\ns_97\t1\t\ti wonder who to place my trust in .\ns_97\t1\t\ti know the people that voted in the election .\ns_97\t1\t\ti threw away a book that sandy thought we had read .\ns_97\t1\t\ti thought that you were sick .\ns_97\t0\t*\ti dislike the people in who we placed our trust .\ns_97\t1\t\ti dislike the company in which we placed our trust .\ns_97\t1\t\ti dislike the people in whose house we stayed .\ns_97\t1\t\ti dislike the person with whom we were talking .\ns_97\t0\t*\tjones , that we were talking to last night , always watches football games alone .\ns_97\t1\t\ta letter was received that jones would be upset by .\ns_97\t0\t*\ta letter was received jones would be upset by .\ns_97\t1\t\ti saw someone yesterday i had 
n't seen for years .\ns_97\t1\t\tsomething happened i could n't really talk about .\ns_97\t1\t\tthe only person that i like whose kids dana is willing to put up with is pat .\ns_97\t1\t\tthe book that i like which everyone else in the class hates was written by john .\ns_97\t0\t*\tthe only person whose kids dana is willing to put up with was written by john .\ns_97\t0\t*\tthe book that i like - everyone else in the class hates .\ns_97\t1\t\tthe only person whose kids dana is willing to put up with is pat .\ns_97\t1\t\twhich book 's , author did you meet ?\ns_97\t0\t*\twhich boy 's , mother , did you meet who you liked ?\ns_97\t0\t*\twhich book 's , author did you meet who you liked ?\ns_97\t1\t\twhich boy 's , mother , did you meet ?\ns_97\t1\t\tall who lost money in the scam are eligible for the program .\ns_97\t0\t*\twho for sandy to talk to is still enrolled in the class ?\ns_97\t1\t\twho who you like does sandy also like ?\ns_97\t1\t\teverything you like is on the table .\ns_97\t1\t\tthe bills passed by the house yesterday that we objected to were vetoed .\ns_97\t1\t\tthe only people being added to our group who were at harvard were students .\nswb04\t1\t\twe like ourselves .\nswb04\t1\t\tnobody likes us .\nswb04\t0\t*\tleslie likes ourselves .\nswb04\t0\t*\tourselves like ourselves .\nswb04\t1\t\tshe voted for herself .\nswb04\t0\t*\twe gave us presents .\nswb04\t1\t\twe gave ourselves presents .\nswb04\t1\t\twe gave presents to ourselves .\nswb04\t0\t*\twe gave us to the cause .\nswb04\t1\t\twe gave ourselves to the cause .\nswb04\t0\t*\tleslie told us about us .\nswb04\t0\t*\tleslie told ourselves about us .\nswb04\t1\t\twe think that leslie likes us .\nswb04\t0\t*\twe think that leslie likes ourselves .\nswb04\t1\t\tour friends like us .\nswb04\t1\t\tthose pictures of us offended us .\nswb04\t0\t*\twe found your letter to ourselves in the trash .\nswb04\t0\t*\tvote for you !\nswb04\t1\t\tvote for yourself !\nswb04\t0\t*\twe appeared to them to vote for 
themselves .\nswb04\t1\t\twe admired the pictures of us in the album .\nswb04\t1\t\twe admired the pictures of ourselves in the album .\nswb04\t1\t\tleslie used a pen .\nswb04\t1\t\twe put the pigs in a pen .\nswb04\t1\t\twe need to pen the pigs to keep them from getting into the corn .\nswb04\t1\t\tthey should pen the letter quickly .\nswb04\t1\t\tthe car wo n't run .\nswb04\t1\t\tthis dye will run .\nswb04\t1\t\tshe can run an accelerator .\nswb04\t1\t\tthese stockings will run .\nswb04\t1\t\twe need another run to win .\nswb04\t1\t\tlee saw the student with a telescope .\nswb04\t1\t\ti forgot how good beer tastes .\nswb04\t1\t\tvisiting relatives can be boring .\nswb04\t1\t\tif only superman would stop flying planes !\nswb04\t1\t\tthat 's a new car dealership .\nswb04\t1\t\ti know you like the back of my hand .\nswb04\t1\t\tmax is on the phone now .\nswb04\t1\t\ti saw her duck .\nswb04\t1\t\ti 'm creating a committee . kim – you 're in charge .\nswb04\t1\t\tlights go out at ten . there will be no talking afterwards .\nswb04\t1\t\tthey found the book on the atom .\nswb04\t1\t\twhich experts testified against defendants who exposed them ?\nswb04\t1\t\tlist all experts for the defendant who represented himself .\nswb04\t1\t\tlist associates of each defendant who speaks spanish .\nswb04\t0\t*\tthey lost themselves ' books .\nswb04\t1\t\tsome sentences go on and on and on .\nswb04\t0\t*\tsentences some go on and on and on and on .\nswb04\t1\t\tthat surprised me .\nswb04\t0\t*\ti noticed the .\nswb04\t1\t\tthey were interested in his .\nswb04\t1\t\tthis is my favorite .\nswb04\t1\t\ta large dog chased a small cat .\nswb04\t1\t\tsome people yell at noisy dogs in my neighborhood .\nswb04\t1\t\tsome people yell at the dogs in my neighborhood .\nswb04\t1\t\tsome people yell at the dogs .\nswb04\t1\t\tsome people yell at noisy dogs .\nswb04\t1\t\tsome people yell at dogs .\nswb04\t1\t\tsome people consider the noisy dogs dangerous .\nswb04\t1\t\tsome people consider the 
dogs in my neighborhood dangerous .\nswb04\t1\t\tsome people consider noisy dogs in my neighborhood dangerous .\nswb04\t1\t\tsome people consider the dogs dangerous .\nswb04\t1\t\tsome people consider noisy dogs dangerous .\nswb04\t1\t\tsome people consider dogs in my neighborhood dangerous .\nswb04\t1\t\tsome people consider dogs dangerous .\nswb04\t1\t\tpeople with children who use drugs should be locked up .\nswb04\t1\t\tthis disease gave leslie a fever in rome .\nswb04\t1\t\tthe love of my life and mother of my children would never do such a thing .\nswb04\t1\t\tmost elections are quickly forgotten , but the election of 2000 , everyone will remember for a long time .\nswb04\t0\t*\tit is painting by klee or drawing by miro that the museum displays no .\nswb04\t1\t\tthe defendant denied the accusation .\nswb04\t0\t*\tthe teacher disappeared the problem .\nswb04\t0\t*\tthe teacher handed the student .\nswb04\t1\t\tthe bird sings .\nswb04\t0\t*\tthe bird sing .\nswb04\t0\t*\tbirds sings .\nswb04\t1\t\tthe birds give the worm a tug .\nswb04\t0\t*\tthe bird give the worm a tug .\nswb04\t0\t*\tthe birds gives the worm a tug .\nswb04\t1\t\tterry delighted in my pain .\nswb04\t0\t*\tterry delighted .\nswb04\t0\t*\tterry delighted my pain .\nswb04\t1\t\tkerry remarked it was late .\nswb04\t1\t\twhat additional categories and rules would be required to handle these verbs ?\nswb04\t1\t\twe created a monster .\nswb04\t0\t*\ti was already aware of fact .\nswb04\t0\t*\tthe defendant deny the allegation .\nswb04\t0\t*\tthe defendants denies the allegation .\nswb04\t1\t\tthe defendant walks .\nswb04\t0\t*\tthe defendant walk .\nswb04\t0\t*\tthe defendants walks .\nswb04\t1\t\thow many feature structures categories can label the first daughter ?\nswb04\t1\t\tthe child put the toy on the table .\nswb04\t1\t\tthe teacher became angry with the students .\nswb04\t0\t*\tthe teacher became .\nswb04\t1\t\tthe jury believed the defendant lied .\nswb04\t1\t\tthe guests dined 
.\nswb04\t1\t\twe relied on leslie .\nswb04\t0\t*\twe relied above leslie .\nswb04\t1\t\twe celebrated in the streets .\nswb04\t1\t\twe celebrated in the streets in the rain on tuesday in the morning .\nswb04\t0\t*\tthe children are happy of ice cream .\nswb04\t0\t*\tthe children are fond with the ice cream .\nswb04\t0\t*\tthe children are fond that they have ice cream .\nswb04\t1\t\ta magazine appeared on the newsstands .\nswb04\t1\t\ta magazine about crime appeared on the newsstands .\nswb04\t1\t\tnewsweek appeared on the newsstands .\nswb04\t0\t*\tnewsweek about crime appeared on the newsstands .\nswb04\t1\t\tthe report that crime was declining surprised many people .\nswb04\t1\t\tthe book surprised many people .\nswb04\t0\t*\tthe book that crime was declining surprised many people .\nswb04\t1\t\tthe storm arrived after the picnic .\nswb04\t0\t*\tthe storm arrived while the picnic .\nswb04\t1\t\tthe storm arrived while we ate lunch .\nswb04\t0\t*\tthis dogs barked .\nswb04\t1\t\tthese dogs barked .\nswb04\t1\t\ta chair was broken .\nswb04\t1\t\tthey want them arrested .\nswb04\t1\t\tthey preferred them arrested .\nswb04\t1\t\twe preferred them on our team .\nswb04\t1\t\twith my parents as supportive as they are , i 'll be in fine shape .\nswb04\t0\t*\twe walks .\nswb04\t0\t*\tfew dog barked .\nswb04\t1\t\tthe dogs barked .\nswb04\t1\t\ti walk and dana runs .\nswb04\t1\t\tthey like us .\nswb04\t0\t*\tus like them .\nswb04\t1\t\tkim likes dogs .\nswb04\t1\t\tdogs like kim .\nswb04\t1\t\tthe person responsible confessed .\nswb04\t0\t*\tthe person confessed responsible .\nswb04\t0\t*\tthe cat slept soundly and furry .\nswb04\t0\t*\tthe soundly and furry cat slept .\nswb04\t1\t\tchris walks , pat eats broccoli , and sandy plays squash .\nswb04\t1\t\tthere was some particular dog who saved every family .\nswb04\t1\t\tsusan frightens her .\nswb04\t1\t\tsusan told her a story .\nswb04\t1\t\tsusan told a story to her .\nswb04\t1\t\tsusan devoted herself to linguistics 
.\nswb04\t1\t\tnobody told susan about herself .\nswb04\t1\t\tthat picture of susan offended her .\nswb04\t1\t\the offended sandy .\nswb04\t0\t*\ti enjoy yourself .\nswb04\t1\t\tthey talk to themselves .\nswb04\t1\t\tnobody told susan .\nswb04\t1\t\tprotect yourself !\nswb04\t0\t*\tprotect you !\nswb04\t1\t\ti met the person who left .\nswb04\t1\t\tleslie slept .\nswb04\t0\t*\tchris handed bo .\nswb04\t1\t\tdana walked and leslie ran .\nswb04\t0\t*\tdana walking and leslie ran .\nswb04\t0\t*\tdana walking and leslie running .\nswb04\t0\t*\tthe putter of books left .\nswb04\t1\t\tkris donated a book to the library .\nswb04\t1\t\tthe police sprayed the protesters with water .\nswb04\t1\t\tthe police sprayed water on the protesters .\nswb04\t1\t\tthe students drove cars .\nswb04\t1\t\tthese cars drive easily .\nswb04\t1\t\tthe horse kicked me black and blue .\nswb04\t1\t\tthey yelled .\nswb04\t1\t\tthe horse raced past the barn fell .\nswb04\t1\t\tthe horse that was raced past the barn fell .\nswb04\t1\t\tthe boat seen down the river sank .\nswb04\t1\t\tthe evidence assembled by the prosecution convinced the jury .\nswb04\t1\t\tlou forgot the umbrella .\nswb04\t1\t\tlou forgot the umbrella in the closet .\nswb04\t1\t\tlou hoped the umbrella was broken .\nswb04\t0\t*\tlou hoped the umbrella in the closet .\nswb04\t0\t*\tlou put the umbrella was broken .\nswb04\t1\t\tlou put the umbrella in the closet .\nswb04\t1\t\tthe artist drew the child with a pencil .\nswb04\t1\t\tthe dog bit the cat .\nswb04\t0\t*\tthe cat was bitten the mouse .\nswb04\t0\t*\tthe cat was bitten the mouse by the dog .\nswb04\t0\t*\tchris was handed sandy a note by pat .\nswb04\t1\t\tchris was handed a note .\nswb04\t0\t*\tchris was handed sandy a note .\nswb04\t1\t\ttv puts dumb ideas in children 's heads .\nswb04\t1\t\tdumb ideas are put in children 's heads by tv .\nswb04\t1\t\tdumb ideas are put in children 's heads .\nswb04\t0\t*\tdumb ideas are put notions in children 's heads by tv 
.\nswb04\t1\t\tthe patient died .\nswb04\t0\t*\tthe patient was died .\nswb04\t0\t*\tchris was handed .\nswb04\t0\t*\ttv puts dumb ideas .\nswb04\t1\t\the was arrested by the police .\nswb04\t1\t\tthe cat got bitten .\nswb04\t0\t*\tthe cat were bitten by the dog .\nswb04\t1\t\tthere is a monster in loch ness .\nswb04\t1\t\tit is obvious that pat is lying .\nswb04\t1\t\tpat is the captain of the team .\nswb04\t0\t*\tpat is hate chris .\nswb04\t1\t\tthere is a unicorn in the garden .\nswb04\t1\t\tthere was a felon elected to the city council .\nswb04\t1\t\tthere is a seat available .\nswb04\t0\t*\ta seat available was in the last row .\nswb04\t1\t\tmany people were fond of pat .\nswb04\t1\t\tpeople are looking through the window .\nswb04\t1\t\ta felon was elected to the city council .\nswb04\t0\t*\tthere loved sandy .\nswb04\t0\t*\twe talked to them about there .\nswb04\t1\t\tit mattered that the giants had lost .\nswb04\t1\t\tthat dogs bark annoys people .\nswb04\t1\t\tit annoys people that dogs bark .\nswb04\t1\t\tthat chris knew the answer occurred to pat .\nswb04\t1\t\tit never occurred to pat that chris knew the answer .\nswb04\t1\t\tthat the cardinal won the game gave sandy a thrill .\nswb04\t1\t\tit gave sandy a thrill that the cardinal won the game .\nswb04\t0\t*\tthat sandy had lied suggested .\nswb04\t0\t*\tit loved sandy .\nswb04\t1\t\tcohen proved the independence of the continuum hypothesis .\nswb04\t1\t\tcohen proved that the continuum hypothesis was independent .\nswb04\t1\t\twe forgot our invitations .\nswb04\t1\t\tnobody saw pat .\nswb04\t1\t\tthat fido barks annoys me .\nswb04\t1\t\tfido barks .\nswb04\t1\t\tchris dreads the bucket .\nswb04\t1\t\tthe candidates bring advantage to the voters .\nswb04\t1\t\ttabs are kept on suspected drug dealers by the fbi .\nswb04\t1\t\tadvantage is taken of every opportunity for improvement .\nswb04\t1\t\tthe bucket was kicked by pat .\nw_80\t1\t\tjohn is sad .\nw_80\t1\t\tjohn loaded the wagon full with hay 
.\nw_80\t0\t*\tjohn loaded the wagon with hay green .\nw_80\t0\t*\ti presented john with it dead .\nw_80\t1\t\tof whom are you thinking ?\nw_80\t1\t\tjohn became rich .\nw_80\t1\t\ti gave john gold apples .\nw_80\t1\t\thow silly is bill considered ?\nw_80\t1\t\thow mad was bill made ?\nw_80\t1\t\tjohn is sick .\nw_80\t1\t\tjohn left singing .\nw_80\t1\t\tjohn is near larry .\nw_80\t1\t\tjohn gave bill the dog dead .\nw_80\t0\t*\tbill was struck by john as stupid .\nw_80\t0\t*\tjohn was struck as sick .\nw_80\t1\t\tjohn was struck by bill 's idiocy .\nw_80\t1\t\tjohn promised bill to leave .\nw_80\t1\t\tjohn tried to leave .\nw_80\t1\t\tto leave would be a pleasure .\nw_80\t0\t*\tjohn was struck by bill as pompous .\nw_80\t0\t*\tjohn was promised by bill to leave .\nw_80\t1\t\tthey make good cooks .\nw_80\t1\t\tthere is nothing to do .\nw_80\t1\t\tjohn has something for bill to do .\nw_80\t1\t\ti am counting on bill to incriminate himself .\nw_80\t1\t\ton whom are you counting to incriminate himself ?\nw_80\t1\t\ti am counting on bill to get there on time .\nw_80\t1\t\ti would prefer to leave .\nw_80\t1\t\ti would hate for john to leave .\nw_80\t1\t\ti would prefer for john to leave .\nw_80\t0\t*\tit was hated for john to leave .\nw_80\t0\t*\tjohn decided for bill to get the prize .\nw_80\t0\t*\tjohn decided bill to get the prize .\nw_80\t1\t\tto die is no fun .\nw_80\t1\t\tjohn wants to leave .\nw_80\t1\t\tjohn counted on bill to get there on time .\nw_80\t1\t\ti bought bill a book to read .\nw_80\t1\t\tjohn told mary that it would be important to leave early .\nw_80\t1\t\tjohn told mary that it was important to fred to leave early .\nw_80\t1\t\tjohn , told mary that it would be appropriate to leave together .\nw_80\t0\t*\tthe election of john president surprised me .\nw_80\t1\t\tjohn 's arriving dead surprised me .\nw_80\t1\t\tthe attempt by john to leave surprised me .\nw_80\t1\t\tjohn left orders to follow pete .\nw_80\t1\t\tjohn left us orders to follow pete 
.\nw_80\t1\t\tjohn left orders not to be disturbed .\nw_80\t1\t\tthat he is here is clear .\nw_80\t1\t\tit is a problem that he is here .\nw_80\t1\t\tit bothers me that he is here .\nw_80\t1\t\tjohn regretted it that bill had a good time .\nw_80\t0\t*\tjohn believes it that bill is here .\nw_80\t0\t*\tjohn believes it sincerely that bill is here .\nw_80\t0\t*\tjohn is aware of it that bill is here .\nw_80\t0\t*\tjohn felt it that bill was tardy .\nw_80\t0\t*\tjohn believed it that bill was tardy .\nw_80\t1\t\tit was believed that bill was tardy .\nw_80\t0\t*\tthat john is reluctant seems .\nw_80\t0\t*\tit is the problem that he is here .\nw_80\t1\t\tthat he is here is the problem .\nw_80\t1\t\tthe problem we are discussing is george .\nw_80\t0\t*\tit is to give up to leave .\nw_80\t1\t\tit would prove our theory to be untenable for carrots to be vegetables .\nw_80\t0\t*\tit was believed to be illegal by them to do that .\nw_80\t1\t\tjohn grudgingly accepted judgments of his incompetence as an auto mechanic .\nw_80\t1\t\tit was to john that i gave the book .\nw_80\t1\t\ti bought it to read .\nw_80\t1\t\ti bought it to give to pete .\nw_80\t1\t\ti gave it to pete to take to the fair .\nw_80\t0\t*\ti gave pete the book to impress .\nw_80\t1\t\ti wrote to bill .\nw_80\t1\t\ti presented it to bill to read .\nw_80\t0\t*\ti presented bill with it to read .\nw_80\t1\t\ti gave a book to bill to read .\nw_80\t1\t\tjohn thinks it would upset himself to die .\nw_80\t1\t\tjohn made bill mad at himself .\nw_80\t1\t\tjohn made bill master of himself .\nw_80\t1\t\tthe correspondence school made bill a good typist .\nw_80\t1\t\tthe correspondence school sent bill a good typist .\nw_80\t1\t\tjohn considers bill silly .\nw_80\t1\t\tjohn considers bill to be silly .\nw_80\t0\t*\tjohn bought a dog for himself to play with .\nw_80\t1\t\tjohn arranged for himself to get the prize .\nw_80\t1\t\tjohn talked to bill about himself .\n"
  },
  {
    "path": "Chapter03/KantaiBERT.ipynb",
"content": "{\"nbformat\":4,\"nbformat_minor\":0,\"metadata\":{\"accelerator\":\"GPU\",\"colab\":{\"name\":\"KantaiBERT_2.ipynb\",\"provenance\":[],\"collapsed_sections\":[],\"toc_visible\":true,\"machine_shape\":\"hm\"},\"kernelspec\":{\"display_name\":\"Python 3\",\"name\":\"python3\"}},\"cells\":[{\"cell_type\":\"markdown\",\"metadata\":{\"id\":\"M1oqh0F6W3ad\"},\"source\":[\"# How to train a new language model from scratch using Transformers and Tokenizers\\n\",\"\\n\",\"Copyright 2020, Denis Rothman. Denis Rothman adapted a Hugging Face reference notebook to pretrain a transformer model. The next steps would be to build a larger dataset and to test several transformer models. \\n\",\"\\n\",\"The Transformer model in this notebook is named ***KantaiBERT***. ***KantaiBERT*** is trained as a RoBERTa Transformer with a DistilBERT architecture. The dataset was compiled from three books by Immanuel Kant downloaded from the [Gutenberg Project](https://www.gutenberg.org/). \\n\",\"\\n\",\"<img src=\\\"https://eco-ai-horizons.com/data/Kant.jpg\\\" style=\\\"margin: auto; display: block; width: 260px;\\\">\\n\",\"\\n\",\"![](https://commons.wikimedia.org/wiki/Kant_gemaelde_1.jpg)\\n\",\"\\n\",\"***KantaiBERT*** was pretrained as a small model of 84 million parameters using the same number of layers and heads as DistilBERT, i.e., 6 layers, a hidden size of 768, and 12 attention heads. 
***KantaiBERT*** is then fine-tuned for a downstream masked Language Modeling task.\\n\",\"\\n\",\"### The Hugging Face original Reference and notes:\\n\",\"\\n\",\"Notebook edition (link to original of the reference blogpost [link](https://huggingface.co/blog/how-to-train)).\\n\"]},{\"cell_type\":\"code\",\"metadata\":{\"colab\":{\"base_uri\":\"https://localhost:8080/\"},\"id\":\"HOk4iZ9YZvec\",\"executionInfo\":{\"status\":\"ok\",\"timestamp\":1611319407103,\"user_tz\":-330,\"elapsed\":1307,\"user\":{\"displayName\":\"Karan Sonawane\",\"photoUrl\":\"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64\",\"userId\":\"05479461208077736330\"}},\"outputId\":\"312e6d71-acb6-43e3-a4b1-dcd25f27c5f3\"},\"source\":[\"#@title Step 1: Loading the Dataset\\n\",\"#1.Load kant.txt using the Colab file manager\\n\",\"#2.Downloading the file from GitHub\\n\",\"!curl -L https://raw.githubusercontent.com/PacktPublishing/Transformers-for-Natural-Language-Processing/master/Chapter03/kant.txt --output \\\"kant.txt\\\"\"],\"execution_count\":2,\"outputs\":[{\"output_type\":\"stream\",\"text\":[\"  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current\\n\",\"                                 Dload  Upload   Total   Spent    Left  Speed\\n\",\"100 10.7M  100 10.7M    0     0  31.0M      0 --:--:-- --:--:-- --:--:-- 30.9M\\n\"],\"name\":\"stdout\"}]},{\"cell_type\":\"code\",\"metadata\":{\"colab\":{\"base_uri\":\"https://localhost:8080/\"},\"id\":\"5duRggBRZKvP\",\"executionInfo\":{\"elapsed\":48685,\"status\":\"ok\",\"timestamp\":1611302298137,\"user\":{\"displayName\":\"Karan Sonawane\",\"photoUrl\":\"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64\",\"userId\":\"05479461208077736330\"},\"user_tz\":-330},\"outputId\":\"52e4d19b-8a7b-412c-83ab-d12a4759e508\"},\"source\":[\"#@title Step 2:Installing Hugging Face Transformers\\n\",\"# We won't need TensorFlow here\\n\",\"!pip uninstall -y 
tensorflow\\n\",\"# Install `transformers` from master\\n\",\"!pip install git+https://github.com/huggingface/transformers\\n\",\"!pip list | grep -E 'transformers|tokenizers'\\n\",\"# transformers version at notebook update --- 2.9.1\\n\",\"# tokenizers version at notebook update --- 0.7.0\"],\"execution_count\":null,\"outputs\":[{\"output_type\":\"stream\",\"text\":[\"Uninstalling tensorflow-2.4.0:\\n\",\"  Successfully uninstalled tensorflow-2.4.0\\n\",\"Collecting git+https://github.com/huggingface/transformers\\n\",\"  Cloning https://github.com/huggingface/transformers to /tmp/pip-req-build-c75zlcml\\n\",\"  Running command git clone -q https://github.com/huggingface/transformers /tmp/pip-req-build-c75zlcml\\n\",\"  Installing build dependencies ... \\u001b[?25l\\u001b[?25hdone\\n\",\"  Getting requirements to build wheel ... \\u001b[?25l\\u001b[?25hdone\\n\",\"    Preparing wheel metadata ... \\u001b[?25l\\u001b[?25hdone\\n\",\"Requirement already satisfied: tqdm>=4.27 in /usr/local/lib/python3.6/dist-packages (from transformers==4.3.0.dev0) (4.41.1)\\n\",\"Collecting sacremoses\\n\",\"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/7d/34/09d19aff26edcc8eb2a01bed8e98f13a1537005d31e95233fd48216eed10/sacremoses-0.0.43.tar.gz (883kB)\\n\",\"\\u001b[K     |████████████████████████████████| 890kB 5.7MB/s \\n\",\"\\u001b[?25hRequirement already satisfied: filelock in /usr/local/lib/python3.6/dist-packages (from transformers==4.3.0.dev0) (3.0.12)\\n\",\"Requirement already satisfied: importlib-metadata; python_version < \\\"3.8\\\" in /usr/local/lib/python3.6/dist-packages (from transformers==4.3.0.dev0) (3.3.0)\\n\",\"Requirement already satisfied: requests in /usr/local/lib/python3.6/dist-packages (from transformers==4.3.0.dev0) (2.23.0)\\n\",\"Requirement already satisfied: numpy in /usr/local/lib/python3.6/dist-packages (from transformers==4.3.0.dev0) (1.19.5)\\n\",\"Requirement already satisfied: dataclasses; python_version < \\\"3.7\\\" in 
/usr/local/lib/python3.6/dist-packages (from transformers==4.3.0.dev0) (0.8)\\n\",\"Requirement already satisfied: packaging in /usr/local/lib/python3.6/dist-packages (from transformers==4.3.0.dev0) (20.8)\\n\",\"Requirement already satisfied: regex!=2019.12.17 in /usr/local/lib/python3.6/dist-packages (from transformers==4.3.0.dev0) (2019.12.20)\\n\",\"Collecting tokenizers==0.9.4\\n\",\"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/0f/1c/e789a8b12e28be5bc1ce2156cf87cb522b379be9cadc7ad8091a4cc107c4/tokenizers-0.9.4-cp36-cp36m-manylinux2010_x86_64.whl (2.9MB)\\n\",\"\\u001b[K     |████████████████████████████████| 2.9MB 18.6MB/s \\n\",\"\\u001b[?25hRequirement already satisfied: six in /usr/local/lib/python3.6/dist-packages (from sacremoses->transformers==4.3.0.dev0) (1.15.0)\\n\",\"Requirement already satisfied: click in /usr/local/lib/python3.6/dist-packages (from sacremoses->transformers==4.3.0.dev0) (7.1.2)\\n\",\"Requirement already satisfied: joblib in /usr/local/lib/python3.6/dist-packages (from sacremoses->transformers==4.3.0.dev0) (1.0.0)\\n\",\"Requirement already satisfied: typing-extensions>=3.6.4; python_version < \\\"3.8\\\" in /usr/local/lib/python3.6/dist-packages (from importlib-metadata; python_version < \\\"3.8\\\"->transformers==4.3.0.dev0) (3.7.4.3)\\n\",\"Requirement already satisfied: zipp>=0.5 in /usr/local/lib/python3.6/dist-packages (from importlib-metadata; python_version < \\\"3.8\\\"->transformers==4.3.0.dev0) (3.4.0)\\n\",\"Requirement already satisfied: chardet<4,>=3.0.2 in /usr/local/lib/python3.6/dist-packages (from requests->transformers==4.3.0.dev0) (3.0.4)\\n\",\"Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.6/dist-packages (from requests->transformers==4.3.0.dev0) (2020.12.5)\\n\",\"Requirement already satisfied: idna<3,>=2.5 in /usr/local/lib/python3.6/dist-packages (from requests->transformers==4.3.0.dev0) (2.10)\\n\",\"Requirement already satisfied: 
urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in /usr/local/lib/python3.6/dist-packages (from requests->transformers==4.3.0.dev0) (1.24.3)\\n\",\"Requirement already satisfied: pyparsing>=2.0.2 in /usr/local/lib/python3.6/dist-packages (from packaging->transformers==4.3.0.dev0) (2.4.7)\\n\",\"Building wheels for collected packages: transformers\\n\",\"  Building wheel for transformers (PEP 517) ... \\u001b[?25l\\u001b[?25hdone\\n\",\"  Created wheel for transformers: filename=transformers-4.3.0.dev0-cp36-none-any.whl size=1744849 sha256=274a76dfeb1da6a19cf68c3c28455142689a60d1cfdb99f1464d3e4d31cd010d\\n\",\"  Stored in directory: /tmp/pip-ephem-wheel-cache-f7yzuk0p/wheels/70/d3/52/b3fa4f8b8ef04167ac62e5bb2accb62ae764db2a378247490e\\n\",\"Successfully built transformers\\n\",\"Building wheels for collected packages: sacremoses\\n\",\"  Building wheel for sacremoses (setup.py) ... \\u001b[?25l\\u001b[?25hdone\\n\",\"  Created wheel for sacremoses: filename=sacremoses-0.0.43-cp36-none-any.whl size=893261 sha256=f75b02ce1a1faa3820b27ed3b46d0d8b84e18a5aa12510611c055c2dc7dbc5f2\\n\",\"  Stored in directory: /root/.cache/pip/wheels/29/3c/fd/7ce5c3f0666dab31a50123635e6fb5e19ceb42ce38d4e58f45\\n\",\"Successfully built sacremoses\\n\",\"Installing collected packages: sacremoses, tokenizers, transformers\\n\",\"Successfully installed sacremoses-0.0.43 tokenizers-0.9.4 transformers-4.3.0.dev0\\n\",\"tokenizers                    0.9.4          \\n\",\"transformers                  4.3.0.dev0     \\n\"],\"name\":\"stdout\"}]},{\"cell_type\":\"code\",\"metadata\":{\"colab\":{\"base_uri\":\"https://localhost:8080/\"},\"id\":\"IMnymRDLe0hi\",\"executionInfo\":{\"elapsed\":2860,\"status\":\"ok\",\"timestamp\":1611303247694,\"user\":{\"displayName\":\"Karan 
Sonawane\",\"photoUrl\":\"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64\",\"userId\":\"05479461208077736330\"},\"user_tz\":-330},\"outputId\":\"706de1c8-715a-41e2-bdcf-3caa67125bf8\"},\"source\":[\"#@title Step 3: Training a Tokenizer\\n\",\"%%time \\n\",\"from pathlib import Path\\n\",\"\\n\",\"from tokenizers import ByteLevelBPETokenizer\\n\",\"\\n\",\"paths = [str(x) for x in Path(\\\".\\\").glob(\\\"**/*.txt\\\")]\\n\",\"# Initialize a tokenizer\\n\",\"tokenizer = ByteLevelBPETokenizer()\\n\",\"\\n\",\"# Customize training\\n\",\"tokenizer.train(files=paths, vocab_size=52_000, min_frequency=2, special_tokens=[\\n\",\"    \\\"<s>\\\",\\n\",\"    \\\"<pad>\\\",\\n\",\"    \\\"</s>\\\",\\n\",\"    \\\"<unk>\\\",\\n\",\"    \\\"<mask>\\\",\\n\",\"])\"],\"execution_count\":null,\"outputs\":[{\"output_type\":\"stream\",\"text\":[\"CPU times: user 6.04 s, sys: 449 ms, total: 6.49 s\\n\",\"Wall time: 1.76 s\\n\"],\"name\":\"stdout\"}]},{\"cell_type\":\"code\",\"metadata\":{\"colab\":{\"base_uri\":\"https://localhost:8080/\"},\"id\":\"nqYKX1XYyRI-\",\"executionInfo\":{\"elapsed\":1506,\"status\":\"ok\",\"timestamp\":1611303250245,\"user\":{\"displayName\":\"Karan Sonawane\",\"photoUrl\":\"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64\",\"userId\":\"05479461208077736330\"},\"user_tz\":-330},\"outputId\":\"3247a100-2230-4c7f-92e2-41442e76f3b9\"},\"source\":[\"#@title Step 4: Saving the files to disk\\n\",\"import os\\n\",\"token_dir = '/content/KantaiBERT'\\n\",\"if not os.path.exists(token_dir):\\n\",\"  os.makedirs(token_dir)\\n\",\"tokenizer.save_model('KantaiBERT')\"],\"execution_count\":null,\"outputs\":[{\"output_type\":\"execute_result\",\"data\":{\"text/plain\":[\"['KantaiBERT/vocab.json', 'KantaiBERT/merges.txt']\"]},\"metadata\":{\"tags\":[]},\"execution_count\":7}]},{\"cell_type\":\"code\",\"metadata\":{\"id\":\"tKVWB8WShT-z\"},\"source\":[\"#@title Step 5: Loading the Trained 
Tokenizer Files \\n\",\"from tokenizers.implementations import ByteLevelBPETokenizer\\n\",\"from tokenizers.processors import BertProcessing\\n\",\"\\n\",\"tokenizer = ByteLevelBPETokenizer(\\n\",\"    \\\"./KantaiBERT/vocab.json\\\",\\n\",\"    \\\"./KantaiBERT/merges.txt\\\",\\n\",\")\"],\"execution_count\":null,\"outputs\":[]},{\"cell_type\":\"code\",\"metadata\":{\"colab\":{\"base_uri\":\"https://localhost:8080/\"},\"id\":\"I9hQqVS_qZWg\",\"executionInfo\":{\"elapsed\":1393,\"status\":\"ok\",\"timestamp\":1611303257943,\"user\":{\"displayName\":\"Karan Sonawane\",\"photoUrl\":\"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64\",\"userId\":\"05479461208077736330\"},\"user_tz\":-330},\"outputId\":\"ed5a7467-b61e-4210-d2e5-c506edd44268\"},\"source\":[\"tokenizer.encode(\\\"The Critique of Pure Reason.\\\").tokens\"],\"execution_count\":null,\"outputs\":[{\"output_type\":\"execute_result\",\"data\":{\"text/plain\":[\"['The', 'ĠCritique', 'Ġof', 'ĠPure', 'ĠReason', '.']\"]},\"metadata\":{\"tags\":[]},\"execution_count\":9}]},{\"cell_type\":\"code\",\"metadata\":{\"colab\":{\"base_uri\":\"https://localhost:8080/\"},\"id\":\"OGjAwZVGrfyS\",\"executionInfo\":{\"elapsed\":1499,\"status\":\"ok\",\"timestamp\":1611303260078,\"user\":{\"displayName\":\"Karan Sonawane\",\"photoUrl\":\"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64\",\"userId\":\"05479461208077736330\"},\"user_tz\":-330},\"outputId\":\"fa7923d2-939c-485a-a064-fb43966357cc\"},\"source\":[\"tokenizer.encode(\\\"The Critique of Pure Reason.\\\")\"],\"execution_count\":null,\"outputs\":[{\"output_type\":\"execute_result\",\"data\":{\"text/plain\":[\"Encoding(num_tokens=6, attributes=[ids, type_ids, tokens, offsets, attention_mask, special_tokens_mask, overflowing])\"]},\"metadata\":{\"tags\":[]},\"execution_count\":10}]},{\"cell_type\":\"code\",\"metadata\":{\"id\":\"hO5M3vrAhcuj\"},\"source\":[\"tokenizer._tokenizer.post_processor = 
BertProcessing(\\n\",\"    (\\\"</s>\\\", tokenizer.token_to_id(\\\"</s>\\\")),\\n\",\"    (\\\"<s>\\\", tokenizer.token_to_id(\\\"<s>\\\")),\\n\",\")\\n\",\"tokenizer.enable_truncation(max_length=512)\"],\"execution_count\":null,\"outputs\":[]},{\"cell_type\":\"code\",\"metadata\":{\"colab\":{\"base_uri\":\"https://localhost:8080/\"},\"id\":\"kD140sFjh0LQ\",\"executionInfo\":{\"elapsed\":1546,\"status\":\"ok\",\"timestamp\":1611303265026,\"user\":{\"displayName\":\"Karan Sonawane\",\"photoUrl\":\"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64\",\"userId\":\"05479461208077736330\"},\"user_tz\":-330},\"outputId\":\"466229fc-5980-4ced-ffb7-0035fba3ff73\"},\"source\":[\"#@title Step 6: Checking Resource Constraints: GPU and NVIDIA \\n\",\"!nvidia-smi\"],\"execution_count\":null,\"outputs\":[{\"output_type\":\"stream\",\"text\":[\"Fri Jan 22 08:14:23 2021       \\n\",\"+-----------------------------------------------------------------------------+\\n\",\"| NVIDIA-SMI 460.32.03    Driver Version: 418.67       CUDA Version: 10.1     |\\n\",\"|-------------------------------+----------------------+----------------------+\\n\",\"| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |\\n\",\"| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |\\n\",\"|                               |                      |               MIG M. |\\n\",\"|===============================+======================+======================|\\n\",\"|   0  Tesla P100-PCIE...  Off  | 00000000:00:04.0 Off |                    0 |\\n\",\"| N/A   34C    P0    25W / 250W |      0MiB / 16280MiB |      0%      Default |\\n\",\"|                               |                      |                 ERR! 
|\\n\",\"+-------------------------------+----------------------+----------------------+\\n\",\"                                                                               \\n\",\"+-----------------------------------------------------------------------------+\\n\",\"| Processes:                                                                  |\\n\",\"|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |\\n\",\"|        ID   ID                                                   Usage      |\\n\",\"|=============================================================================|\\n\",\"|  No running processes found                                                 |\\n\",\"+-----------------------------------------------------------------------------+\\n\"],\"name\":\"stdout\"}]},{\"cell_type\":\"code\",\"metadata\":{\"colab\":{\"base_uri\":\"https://localhost:8080/\"},\"id\":\"VNZZs-r6iKAV\",\"executionInfo\":{\"elapsed\":1562,\"status\":\"ok\",\"timestamp\":1611303382926,\"user\":{\"displayName\":\"Karan Sonawane\",\"photoUrl\":\"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64\",\"userId\":\"05479461208077736330\"},\"user_tz\":-330},\"outputId\":\"6b3f8b32-4ccd-4661-d58a-74b324766495\"},\"source\":[\"#@title Checking that PyTorch Sees CUDA\\n\",\"import torch\\n\",\"torch.cuda.is_available()\"],\"execution_count\":null,\"outputs\":[{\"output_type\":\"execute_result\",\"data\":{\"text/plain\":[\"True\"]},\"metadata\":{\"tags\":[]},\"execution_count\":14}]},{\"cell_type\":\"code\",\"metadata\":{\"id\":\"LTXXutqeDzPi\"},\"source\":[\"#@title Step 7: Defining the configuration of the Model\\n\",\"from transformers import RobertaConfig\\n\",\"\\n\",\"config = RobertaConfig(\\n\",\"    vocab_size=52_000,\\n\",\"    max_position_embeddings=514,\\n\",\"    num_attention_heads=12,\\n\",\"    num_hidden_layers=6,\\n\",\"    
type_vocab_size=1,\\n\",\")\"],\"execution_count\":null,\"outputs\":[]},{\"cell_type\":\"code\",\"metadata\":{\"colab\":{\"base_uri\":\"https://localhost:8080/\"},\"id\":\"5-UsuK9Ps0H7\",\"executionInfo\":{\"elapsed\":1631,\"status\":\"ok\",\"timestamp\":1611303394881,\"user\":{\"displayName\":\"Karan Sonawane\",\"photoUrl\":\"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64\",\"userId\":\"05479461208077736330\"},\"user_tz\":-330},\"outputId\":\"405400e1-733f-490b-de7b-ae249d95ac01\"},\"source\":[\"print(config)\"],\"execution_count\":null,\"outputs\":[{\"output_type\":\"stream\",\"text\":[\"RobertaConfig {\\n\",\"  \\\"attention_probs_dropout_prob\\\": 0.1,\\n\",\"  \\\"bos_token_id\\\": 0,\\n\",\"  \\\"eos_token_id\\\": 2,\\n\",\"  \\\"gradient_checkpointing\\\": false,\\n\",\"  \\\"hidden_act\\\": \\\"gelu\\\",\\n\",\"  \\\"hidden_dropout_prob\\\": 0.1,\\n\",\"  \\\"hidden_size\\\": 768,\\n\",\"  \\\"initializer_range\\\": 0.02,\\n\",\"  \\\"intermediate_size\\\": 3072,\\n\",\"  \\\"layer_norm_eps\\\": 1e-12,\\n\",\"  \\\"max_position_embeddings\\\": 514,\\n\",\"  \\\"model_type\\\": \\\"roberta\\\",\\n\",\"  \\\"num_attention_heads\\\": 12,\\n\",\"  \\\"num_hidden_layers\\\": 6,\\n\",\"  \\\"pad_token_id\\\": 1,\\n\",\"  \\\"position_embedding_type\\\": \\\"absolute\\\",\\n\",\"  \\\"transformers_version\\\": \\\"4.3.0.dev0\\\",\\n\",\"  \\\"type_vocab_size\\\": 1,\\n\",\"  \\\"use_cache\\\": true,\\n\",\"  \\\"vocab_size\\\": 52000\\n\",\"}\\n\",\"\\n\"],\"name\":\"stdout\"}]},{\"cell_type\":\"code\",\"metadata\":{\"id\":\"4keFBUjQFOD1\"},\"source\":[\"#@title Step 8: Re-creating the Tokenizer in Transformers\\n\",\"from transformers import RobertaTokenizer\\n\",\"tokenizer = RobertaTokenizer.from_pretrained(\\\"./KantaiBERT\\\", 
max_length=512)\"],\"execution_count\":null,\"outputs\":[]},{\"cell_type\":\"code\",\"metadata\":{\"colab\":{\"base_uri\":\"https://localhost:8080/\"},\"id\":\"BzMqR-dzF4Ro\",\"executionInfo\":{\"elapsed\":4263,\"status\":\"ok\",\"timestamp\":1611303404170,\"user\":{\"displayName\":\"Karan Sonawane\",\"photoUrl\":\"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64\",\"userId\":\"05479461208077736330\"},\"user_tz\":-330},\"outputId\":\"e71ab069-0d78-4592-f4cc-158050b47f75\"},\"source\":[\"#@title Step 9: Initializing a Model From Scratch\\n\",\"from transformers import RobertaForMaskedLM\\n\",\"\\n\",\"model = RobertaForMaskedLM(config=config)\\n\",\"print(model)\"],\"execution_count\":null,\"outputs\":[{\"output_type\":\"stream\",\"text\":[\"RobertaForMaskedLM(\\n\",\"  (roberta): RobertaModel(\\n\",\"    (embeddings): RobertaEmbeddings(\\n\",\"      (word_embeddings): Embedding(52000, 768, padding_idx=1)\\n\",\"      (position_embeddings): Embedding(514, 768, padding_idx=1)\\n\",\"      (token_type_embeddings): Embedding(1, 768)\\n\",\"      (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\"      (dropout): Dropout(p=0.1, inplace=False)\\n\",\"    )\\n\",\"    (encoder): RobertaEncoder(\\n\",\"      (layer): ModuleList(\\n\",\"        (0): RobertaLayer(\\n\",\"          (attention): RobertaAttention(\\n\",\"            (self): RobertaSelfAttention(\\n\",\"              (query): Linear(in_features=768, out_features=768, bias=True)\\n\",\"              (key): Linear(in_features=768, out_features=768, bias=True)\\n\",\"              (value): Linear(in_features=768, out_features=768, bias=True)\\n\",\"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\"            )\\n\",\"            (output): RobertaSelfOutput(\\n\",\"              (dense): Linear(in_features=768, out_features=768, bias=True)\\n\",\"              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\"            
  (dropout): Dropout(p=0.1, inplace=False)\\n\",\"            )\\n\",\"          )\\n\",\"          (intermediate): RobertaIntermediate(\\n\",\"            (dense): Linear(in_features=768, out_features=3072, bias=True)\\n\",\"          )\\n\",\"          (output): RobertaOutput(\\n\",\"            (dense): Linear(in_features=3072, out_features=768, bias=True)\\n\",\"            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\"          )\\n\",\"        )\\n\",\"        (1): RobertaLayer(\\n\",\"          (attention): RobertaAttention(\\n\",\"            (self): RobertaSelfAttention(\\n\",\"              (query): Linear(in_features=768, out_features=768, bias=True)\\n\",\"              (key): Linear(in_features=768, out_features=768, bias=True)\\n\",\"              (value): Linear(in_features=768, out_features=768, bias=True)\\n\",\"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\"            )\\n\",\"            (output): RobertaSelfOutput(\\n\",\"              (dense): Linear(in_features=768, out_features=768, bias=True)\\n\",\"              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\"            )\\n\",\"          )\\n\",\"          (intermediate): RobertaIntermediate(\\n\",\"            (dense): Linear(in_features=768, out_features=3072, bias=True)\\n\",\"          )\\n\",\"          (output): RobertaOutput(\\n\",\"            (dense): Linear(in_features=3072, out_features=768, bias=True)\\n\",\"            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\"          )\\n\",\"        )\\n\",\"        (2): RobertaLayer(\\n\",\"          (attention): RobertaAttention(\\n\",\"            (self): RobertaSelfAttention(\\n\",\"              (query): Linear(in_features=768, out_features=768, 
bias=True)\\n\",\"              (key): Linear(in_features=768, out_features=768, bias=True)\\n\",\"              (value): Linear(in_features=768, out_features=768, bias=True)\\n\",\"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\"            )\\n\",\"            (output): RobertaSelfOutput(\\n\",\"              (dense): Linear(in_features=768, out_features=768, bias=True)\\n\",\"              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\"            )\\n\",\"          )\\n\",\"          (intermediate): RobertaIntermediate(\\n\",\"            (dense): Linear(in_features=768, out_features=3072, bias=True)\\n\",\"          )\\n\",\"          (output): RobertaOutput(\\n\",\"            (dense): Linear(in_features=3072, out_features=768, bias=True)\\n\",\"            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\"          )\\n\",\"        )\\n\",\"        (3): RobertaLayer(\\n\",\"          (attention): RobertaAttention(\\n\",\"            (self): RobertaSelfAttention(\\n\",\"              (query): Linear(in_features=768, out_features=768, bias=True)\\n\",\"              (key): Linear(in_features=768, out_features=768, bias=True)\\n\",\"              (value): Linear(in_features=768, out_features=768, bias=True)\\n\",\"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\"            )\\n\",\"            (output): RobertaSelfOutput(\\n\",\"              (dense): Linear(in_features=768, out_features=768, bias=True)\\n\",\"              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\"            )\\n\",\"          )\\n\",\"          (intermediate): RobertaIntermediate(\\n\",\"            (dense): Linear(in_features=768, out_features=3072, bias=True)\\n\",\"          )\\n\",\"          
(output): RobertaOutput(\\n\",\"            (dense): Linear(in_features=3072, out_features=768, bias=True)\\n\",\"            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\"          )\\n\",\"        )\\n\",\"        (4): RobertaLayer(\\n\",\"          (attention): RobertaAttention(\\n\",\"            (self): RobertaSelfAttention(\\n\",\"              (query): Linear(in_features=768, out_features=768, bias=True)\\n\",\"              (key): Linear(in_features=768, out_features=768, bias=True)\\n\",\"              (value): Linear(in_features=768, out_features=768, bias=True)\\n\",\"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\"            )\\n\",\"            (output): RobertaSelfOutput(\\n\",\"              (dense): Linear(in_features=768, out_features=768, bias=True)\\n\",\"              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\"            )\\n\",\"          )\\n\",\"          (intermediate): RobertaIntermediate(\\n\",\"            (dense): Linear(in_features=768, out_features=3072, bias=True)\\n\",\"          )\\n\",\"          (output): RobertaOutput(\\n\",\"            (dense): Linear(in_features=3072, out_features=768, bias=True)\\n\",\"            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\"          )\\n\",\"        )\\n\",\"        (5): RobertaLayer(\\n\",\"          (attention): RobertaAttention(\\n\",\"            (self): RobertaSelfAttention(\\n\",\"              (query): Linear(in_features=768, out_features=768, bias=True)\\n\",\"              (key): Linear(in_features=768, out_features=768, bias=True)\\n\",\"              (value): Linear(in_features=768, out_features=768, bias=True)\\n\",\"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\"            
)\\n\",\"            (output): RobertaSelfOutput(\\n\",\"              (dense): Linear(in_features=768, out_features=768, bias=True)\\n\",\"              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\"            )\\n\",\"          )\\n\",\"          (intermediate): RobertaIntermediate(\\n\",\"            (dense): Linear(in_features=768, out_features=3072, bias=True)\\n\",\"          )\\n\",\"          (output): RobertaOutput(\\n\",\"            (dense): Linear(in_features=3072, out_features=768, bias=True)\\n\",\"            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\"          )\\n\",\"        )\\n\",\"      )\\n\",\"    )\\n\",\"  )\\n\",\"  (lm_head): RobertaLMHead(\\n\",\"    (dense): Linear(in_features=768, out_features=768, bias=True)\\n\",\"    (layer_norm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\"    (decoder): Linear(in_features=768, out_features=52000, bias=True)\\n\",\"  )\\n\",\")\\n\"],\"name\":\"stdout\"}]},{\"cell_type\":\"code\",\"metadata\":{\"colab\":{\"base_uri\":\"https://localhost:8080/\"},\"id\":\"jU6JhBSTKiaM\",\"executionInfo\":{\"elapsed\":1417,\"status\":\"ok\",\"timestamp\":1611303407295,\"user\":{\"displayName\":\"Karan Sonawane\",\"photoUrl\":\"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64\",\"userId\":\"05479461208077736330\"},\"user_tz\":-330},\"outputId\":\"7cefac4a-263c-4785-d91f-f9020bfd3d1c\"},\"source\":[\"print(model.num_parameters())\"],\"execution_count\":null,\"outputs\":[{\"output_type\":\"stream\",\"text\":[\"83504416\\n\"],\"name\":\"stdout\"}]},{\"cell_type\":\"code\",\"metadata\":{\"colab\":{\"base_uri\":\"https://localhost:8080/\"},\"id\":\"-BXhhe7twTxb\",\"executionInfo\":{\"elapsed\":1327,\"status\":\"ok\",\"timestamp\":1611303409350,\"user\":{\"displayName\":\"Karan 
Sonawane\",\"photoUrl\":\"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64\",\"userId\":\"05479461208077736330\"},\"user_tz\":-330},\"outputId\":\"5f78d978-6e80-477e-aebc-3a1c9119b5e9\"},\"source\":[\"#@title Exploring the Parameters\\n\",\"LP=list(model.parameters())\\n\",\"lp=len(LP)\\n\",\"print(lp)\\n\",\"for p in range(0,lp):\\n\",\"  print(LP[p])\"],\"execution_count\":null,\"outputs\":[{\"output_type\":\"stream\",\"text\":[\"106\\n\",\"Parameter containing:\\n\",\"tensor([[ 0.0235,  0.0174, -0.0312,  ..., -0.0200,  0.0193,  0.0241],\\n\",\"        [ 0.0223,  0.0050, -0.0057,  ...,  0.0110, -0.0061,  0.0102],\\n\",\"        [-0.0245, -0.0372,  0.0108,  ...,  0.0088, -0.0083,  0.0045],\\n\",\"        ...,\\n\",\"        [-0.0384, -0.0139,  0.0199,  ..., -0.0005,  0.0123,  0.0251],\\n\",\"        [ 0.0307,  0.0179,  0.0046,  ..., -0.0197,  0.0076, -0.0035],\\n\",\"        [ 0.0019,  0.0276, -0.0056,  ...,  0.0491, -0.0172,  0.0045]],\\n\",\"       requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([[-0.0130, -0.0137,  0.0190,  ..., -0.0165, -0.0319,  0.0139],\\n\",\"        [-0.0254,  0.0061,  0.0060,  ...,  0.0178,  0.0224,  0.0162],\\n\",\"        [-0.0003,  0.0218, -0.0115,  ..., -0.0138, -0.0128,  0.0331],\\n\",\"        ...,\\n\",\"        [-0.0115,  0.0210,  0.0268,  ..., -0.0152,  0.0361, -0.0047],\\n\",\"        [ 0.0272,  0.0065,  0.0166,  ...,  0.0208, -0.0169, -0.0053],\\n\",\"        [ 0.0158,  0.0003,  0.0151,  ..., -0.0129,  0.0220, -0.0140]],\\n\",\"       requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([[ 7.1800e-03,  6.6437e-03,  8.4003e-03, -9.4242e-03,  3.5971e-02,\\n\",\"          1.5493e-02,  8.6217e-03, -2.5297e-02, -3.4352e-02, -2.7498e-04,\\n\",\"          3.5180e-03,  2.6678e-02,  8.6635e-03, -2.7617e-02,  1.0993e-02,\\n\",\"          8.0117e-03, -1.4683e-02, -4.3267e-03, -2.5300e-02, -1.5936e-02,\\n\",\"          1.4094e-02, -1.0912e-02, -1.1442e-02,  3.0577e-03, 
-1.4531e-02,\\n\",\"         -1.6202e-04, -3.8920e-03, -3.7224e-02,  2.1012e-02, -4.2987e-03,\\n\",\"         -1.0250e-02, -4.4596e-02,  9.0525e-03,  3.5725e-02, -2.5446e-02,\\n\",\"         -1.5833e-02, -1.2881e-02,  1.3866e-02,  1.3644e-02,  1.7893e-05,\\n\",\"          1.5038e-02,  5.3411e-03, -3.6112e-03, -3.5338e-03,  9.4914e-03,\\n\",\"         -4.9824e-04, -5.1193e-03, -1.0588e-02,  2.9083e-02,  3.1769e-02,\\n\",\"          5.6870e-04,  1.6663e-03,  2.3251e-02,  1.7677e-02, -6.0966e-02,\\n\",\"          2.4458e-02, -9.8698e-03,  8.8821e-03,  2.2937e-03, -2.7309e-02,\\n\",\"         -6.4727e-03, -1.2389e-02,  2.5948e-02,  2.4857e-02,  3.5420e-02,\\n\",\"         -1.8719e-02, -1.3693e-03, -3.0626e-02,  2.7882e-03, -5.6030e-03,\\n\",\"          3.6367e-02,  4.4579e-03, -2.8664e-03, -8.1722e-03,  2.7467e-02,\\n\",\"          2.4763e-02, -1.5073e-02,  1.6423e-02,  1.3034e-02, -2.8786e-02,\\n\",\"         -1.8404e-02,  2.0704e-02,  7.7605e-04, -3.0608e-02, -1.2811e-02,\\n\",\"          1.7842e-02,  4.0206e-02,  1.2923e-02,  4.2755e-02, -2.9865e-02,\\n\",\"         -8.5164e-03, -1.5878e-02, -2.4676e-02, -5.8619e-03,  1.0806e-02,\\n\",\"         -2.4007e-02, -1.7718e-02,  1.9903e-02,  8.6628e-03,  3.6344e-03,\\n\",\"          8.1299e-03,  2.5190e-02,  6.9107e-03, -4.3021e-02,  1.5741e-02,\\n\",\"          5.8291e-03, -8.6341e-03, -4.1600e-03, -6.1461e-03, -8.9474e-03,\\n\",\"          3.9467e-03,  1.9342e-02,  1.6835e-02, -1.1109e-02, -3.2239e-03,\\n\",\"         -9.3587e-05,  2.1938e-02, -1.9343e-02,  1.8259e-02,  8.5760e-03,\\n\",\"         -6.0120e-03,  2.0020e-02, -1.2867e-03,  3.0612e-02, -2.8033e-02,\\n\",\"          2.3411e-02, -2.7851e-02,  8.3895e-03,  2.0522e-02,  2.8449e-02,\\n\",\"          5.3295e-03,  1.7460e-03, -1.0284e-02,  3.9836e-03, -2.9588e-03,\\n\",\"         -1.7243e-02,  1.5264e-02, -2.8806e-03,  7.4932e-03, -1.2714e-03,\\n\",\"         -3.7502e-02, -3.0640e-02, -1.9682e-02,  4.5279e-03, -1.8760e-02,\\n\",\"          1.1877e-02, -2.1112e-03, 
0.0063,  ...,  0.0437, -0.0100,  0.0192],\\n\",\"        [ 0.0144, -0.0082, -0.0022,  ..., -0.0005,  0.0238,  0.0159]],\\n\",\"       requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([0., 0., 0.,  ..., 0., 0., 0.], requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([[ 0.0047,  0.0345,  0.0557,  ..., -0.0219, -0.0153, -0.0291],\\n\",\"        [ 0.0389, -0.0243,  0.0141,  ...,  0.0330,  0.0041, -0.0300],\\n\",\"        [ 0.0149, -0.0104,  0.0221,  ..., -0.0033,  0.0114, -0.0043],\\n\",\"        ...,\\n\",\"        [ 0.0028, -0.0109, -0.0051,  ..., -0.0174,  0.0029, -0.0119],\\n\",\"        [-0.0236,  0.0067,  0.0196,  ..., -0.0007, -0.0110,  0.0118],\\n\",\"        [-0.0044, -0.0271,  0.0073,  ...,  0.0059,  0.0052, -0.0130]],\\n\",\"       requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\\n\",\"       requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 
1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 
1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.], requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\" 
       0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\\n\",\"       requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([[ 0.0371, -0.0192,  0.0182,  ..., -0.0041, -0.0052,  0.0179],\\n\",\"        [ 0.0274,  0.0165,  0.0110,  ..., -0.0003,  0.0148,  0.0029],\\n\",\"       
 [ 0.0241, -0.0423, -0.0193,  ...,  0.0032, -0.0114,  0.0185],\\n\",\"        ...,\\n\",\"        [-0.0191,  0.0087, -0.0006,  ...,  0.0065,  0.0221, -0.0228],\\n\",\"        [-0.0435, -0.0280, -0.0225,  ...,  0.0004, -0.0100,  0.0038],\\n\",\"        [ 0.0104,  0.0024,  0.0126,  ...,  0.0063, -0.0131,  0.0316]],\\n\",\"       requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\\n\",\"       requires_grad=True)\\n\",\"Parameter 
containing:\\n\",\"tensor([[-0.0023,  0.0078,  0.0455,  ...,  0.0066, -0.0019, -0.0146],\\n\",\"        [-0.0186,  0.0153, -0.0109,  ..., -0.0367, -0.0061, -0.0016],\\n\",\"        [ 0.0156,  0.0037, -0.0166,  ...,  0.0102,  0.0307,  0.0078],\\n\",\"        ...,\\n\",\"        [ 0.0219, -0.0116, -0.0122,  ..., -0.0252,  0.0032,  0.0406],\\n\",\"        [ 0.0203,  0.0145, -0.0515,  ...,  0.0131,  0.0013, -0.0063],\\n\",\"        [ 0.0067, -0.0223, -0.0189,  ...,  0.0266,  0.0110, -0.0115]],\\n\",\"       requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\\n\",\"       requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([[-0.0287, -0.0230,  0.0014,  ...,  0.0075,  0.0079,  0.0613],\\n\",\"        [ 0.0374,  0.0188, -0.0121,  ...,  0.0040,  0.0162, -0.0196],\\n\",\"        [ 0.0042, -0.0110, -0.0315,  ..., -0.0221, -0.0409,  0.0357],\\n\",\"        ...,\\n\",\"        [-0.0087, -0.0071,  0.0022,  ...,  0.0310,  0.0067,  0.0144],\\n\",\"        [ 0.0077,  0.0096, -0.0059,  ..., -0.0267,  0.0289, -0.0156],\\n\",\"        [ 0.0087, -0.0253, -0.0012,  ..., -0.0169, -0.0123, -0.0010]],\\n\",\"       requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\\n\",\"       requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([[-0.0215, -0.0247, -0.0500,  ...,  0.0362, -0.0077,  0.0157],\\n\",\"        [ 0.0178, -0.0209,  0.0173,  ...,  0.0163, -0.0242,  0.0330],\\n\",\"        [-0.0260,  0.0015, -0.0006,  ...,  0.0037, -0.0195,  0.0091],\\n\",\"        ...,\\n\",\"        [ 0.0184,  0.0291,  0.0384,  ..., -0.0104,  0.0043,  0.0370],\\n\",\"        [-0.0538,  0.0278,  0.0242,  ..., -0.0162, -0.0008, -0.0071],\\n\",\"        [ 0.0257,  0.0098,  0.0103,  ..., -0.0066, -0.0165,  0.0016]],\\n\",\"       requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
Parameter containing:
tensor([0., 0., 0.,  ..., 0., 0., 0.], requires_grad=True)
Parameter containing:
tensor([1., 1., 1.,  ..., 1., 1., 1.], requires_grad=True)
Parameter containing:
tensor([[ 0.0114,  0.0074,  0.0016,  ..., -0.0198, -0.0020, -0.0096],
        ...,
        [-0.0014,  0.0099,  0.0002,  ...,  0.0407, -0.0093, -0.0057]],
       requires_grad=True)
[Output truncated: the notebook prints every model parameter in full — alternating zero-initialized bias tensors, ones-initialized LayerNorm weight tensors, and small randomly initialized weight matrices, each with requires_grad=True.]
1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 
1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.], requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\\n\",\"       requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([[ 0.0029, -0.0024, -0.0296,  ...,  0.0012,  0.0428, -0.0092],\\n\",\"        [-0.0169,  0.0191,  0.0063,  ...,  0.0139,  0.0459, -0.0038],\\n\",\"        [-0.0046, -0.0121, -0.0403,  ...,  0.0099,  0.0046,  0.0167],\\n\",\"        ...,\\n\",\"        [ 0.0467,  0.0041,  0.0186,  ...,  0.0034, -0.0331, -0.0016],\\n\",\"        [ 0.0091, -0.0312, -0.0105,  ...,  0.0193,  0.0108, -0.0265],\\n\",\"        [-0.0092,  0.0173, -0.0221,  ..., -0.0228,  0.0626, -0.0301]],\\n\",\"       requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([0., 0., 0.,  ..., 0., 0., 0.], requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([[ 0.0288,  0.0347, -0.0111,  ...,  0.0066, -0.0029, -0.0072],\\n\",\"        [ 0.0079, -0.0051,  0.0204,  ...,  0.0104, -0.0036,  0.0418],\\n\",\"        [-0.0095, -0.0150,  0.0176,  ..., -0.0050, -0.0217,  0.0044],\\n\",\"        ...,\\n\",\"        [-0.0206, -0.0033,  0.0309,  ...,  0.0207, -0.0070,  0.0111],\\n\",\"        [ 0.0137,  0.0179,  0.0485,  ..., -0.0346, -0.0086,  0.0238],\\n\",\"        [ 0.0257, -0.0012, -0.0298,  ...,  0.0205, -0.0131, -0.0042]],\\n\",\"       requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\\n\",\"       requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 
1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 
1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.], requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\\n\",\"       requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([[-0.0357, -0.0351,  0.0085,  ...,  0.0087,  0.0193, -0.0061],\\n\",\"        [ 0.0101,  0.0315, -0.0053,  ...,  0.0200, -0.0208, -0.0224],\\n\",\"        [ 0.0202, -0.0045, -0.0026,  ...,  0.0071,  0.0281, -0.0238],\\n\",\"        ...,\\n\",\"        [-0.0048,  0.0365, -0.0033,  ...,  0.0502,  0.0061, -0.0374],\\n\",\"        [ 0.0037, -0.0032, -0.0197,  ..., -0.0347,  0.0175, -0.0231],\\n\",\"        [ 0.0091, -0.0069,  0.0172,  ..., -0.0107,  0.0019,  0.0130]],\\n\",\"       requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\\n\",\"       requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([[ 0.0235, -0.0070, -0.0124,  ..., -0.0014,  0.0005,  0.0221],\\n\",\"        [-0.0234,  0.0223,  0.0084,  ...,  0.0203, -0.0004,  0.0217],\\n\",\"        [-0.0277,  0.0151, -0.0243,  ..., -0.0004,  0.0123,  0.0058],\\n\",\"        ...,\\n\",\"        [ 0.0311, -0.0026,  0.0344,  ..., -0.0022, -0.0060, -0.0143],\\n\",\"        [-0.0079, -0.0146,  0.0151,  ..., -0.0138, -0.0092, -0.0064],\\n\",\"        [-0.0063,  0.0249, -0.0266,  ..., -0.0188, -0.0208,  0.0085]],\\n\",\"       requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\\n\",\"       requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([[ 0.0048,  0.0173, -0.0011,  ...,  0.0227,  0.0020,  0.0363],\\n\",\"        [ 0.0359, -0.0102, -0.0046,  ...,  0.0402,  0.0201,  0.0505],\\n\",\"        [-0.0343,  0.0053, -0.0294,  ...,  0.0225,  0.0183, -0.0022],\\n\",\"        ...,\\n\",\"        [ 0.0146,  0.0148, -0.0042,  ..., -0.0033,  0.0284, -0.0066],\\n\",\"        [ 0.0114, -0.0046, -0.0007,  ..., -0.0223,  0.0237,  0.0019],\\n\",\"        [ 0.0199, -0.0108, -0.0134,  ..., -0.0157, -0.0056,  0.0061]],\\n\",\"       requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0., 0.,  ...], requires_grad=True)\n","[output truncated: the cell prints every model parameter; the printout repeats the same pattern of zero-initialized bias vectors, all-ones normalization weight vectors, and small randomly initialized weight matrices, each with requires_grad=True]\n","
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\\n\",\"       requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 
1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 
1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.], requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\\n\",\"       requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([[-0.0118, -0.0135,  0.0127,  ...,  0.0226,  0.0109, -0.0040],\\n\",\"        [-0.0076,  0.0183,  0.0090,  ..., -0.0140,  0.0563, -0.0068],\\n\",\"        [ 0.0183,  0.0099,  0.0319,  ...,  0.0139,  0.0437,  0.0120],\\n\",\"        ...,\\n\",\"        [-0.0376, -0.0088,  0.0195,  ...,  0.0481,  0.0247, -0.0063],\\n\",\"        [-0.0007, -0.0104, -0.0115,  ..., -0.0122, -0.0012, -0.0033],\\n\",\"        [-0.0102,  0.0107,  0.0317,  ...,  0.0021,  0.0174, -0.0022]],\\n\",\"       requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([0., 0., 0.,  ..., 0., 0., 0.], requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([[-0.0080, -0.0068, -0.0147,  ..., -0.0147, -0.0201,  0.0060],\\n\",\"        [-0.0216, -0.0156,  0.0003,  ..., -0.0099, -0.0240, -0.0519],\\n\",\"        [-0.0034, -0.0163, -0.0302,  ...,  0.0365, -0.0289,  0.0012],\\n\",\"        ...,\\n\",\"        [-0.0215, -0.0266,  0.0209,  ..., -0.0011, -0.0199, -0.0020],\\n\",\"        [ 0.0053, -0.0307,  0.0172,  ...,  0.0115, -0.0379,  0.0196],\\n\",\"        [ 0.0070,  0.0208, -0.0073,  ...,  0.0190, -0.0255,  0.0458]],\\n\",\"       requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\\n\",\"       requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"   
     1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"  
      1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\\n\",\"        1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.], requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\\n\",\"       requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([[ 6.9585e-03, -1.6945e-02,  2.1462e-02,  ...,  2.5339e-02,\\n\",\"          2.0430e-02, -1.4363e-02],\\n\",\"        [ 4.4982e-02, -2.5055e-02,  1.1185e-02,  ..., -1.7393e-02,\\n\",\"         -1.4757e-02, -4.4690e-02],\\n\",\"        [-2.1813e-02,  2.6028e-02,  3.0546e-04,  ...,  8.1186e-03,\\n\",\"         -3.3577e-03, -1.8338e-02],\\n\",\"        ...,\\n\",\"        [ 2.9144e-02, -6.2635e-05, -2.2587e-02,  ...,  3.1348e-02,\\n\",\"          7.0294e-03,  1.0935e-02],\\n\",\"        [ 2.6010e-02,  3.3259e-02,  1.3087e-02,  ...,  2.6066e-02,\\n\",\"          1.1860e-02,  1.2322e-02],\\n\",\"        [ 1.1514e-02,  5.3906e-03,  2.4099e-02,  ..., -4.5759e-03,\\n\",\"         -3.0175e-03, -1.0820e-02]], requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\\n\",\"       requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([[-0.0131, -0.0151, -0.0092,  ..., -0.0124, -0.0029,  0.0157],\\n\",\"        [-0.0058,  0.0491, -0.0354,  ...,  0.0302, -0.0023,  0.0292],\\n\",\"        [ 0.0319,  0.0409,  0.0114,  ...,  0.0136, -0.0021,  0.0093],\\n\",\"        ...,\\n\",\"        [-0.0085,  0.0182, -0.0265,  ..., -0.0299, -0.0108,  0.0093],\\n\",\"        [-0.0116, -0.0126, -0.0154,  ..., -0.0226,  0.0024, -0.0118],\\n\",\"        [-0.0020,  0.0256, -0.0206,  ..., -0.0158, -0.0241,  0.0084]],\\n\",\"       requires_grad=True)\\n\",\"Parameter containing:\\n\",\"tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0.],\\n\",\"       requires_grad=True)\\n\",\"[... output truncated: repeated Parameter dumps omitted — weight matrices, zero-initialized bias vectors, and ones-initialized LayerNorm weight vectors, all with requires_grad=True ...]\\n\",\"Parameter containing:\\n\",\"tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\"        0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\\n\",\"       requires_grad=True)\\n\"],\"name\":\"stdout\"}]},{\"cell_type\":\"code\",\"metadata\":{\"colab\":{\"base_uri\":\"https://localhost:8080/\"},\"id\":\"Ej82kG6K3akQ\",\"executionInfo\":{\"elapsed\":1596,\"status\":\"ok\",\"timestamp\":1611303413503,\"user\":{\"displayName\":\"Karan Sonawane\",\"photoUrl\":\"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64\",\"userId\":\"05479461208077736330\"},\"user_tz\":-330},\"outputId\":\"09db7a80-a3c2-4cfb-9a80-dd172a1bffa1\"},\"source\":[\"#@title Counting the parameters\\n\",\"np=0\\n\",\"for p in range(0,lp):#number of tensors\\n\",\"  PL2=True\\n\",\"  try:\\n\",\"    L2=len(LP[p][0]) #check if 2D\\n\",\"  except:\\n\",\"    L2=1             #not 2D but 1D\\n\",\"    PL2=False\\n\",\"  L1=len(LP[p])      \\n\",\"  L3=L1*L2\\n\",\"  np+=L3             # number of parameters per tensor\\n\",\"  if PL2==True:\\n\",\"    print(p,L1,L2,L3)  # displaying the sizes of the parameters\\n\",\"  if PL2==False:\\n\",\"    print(p,L1,L3)  # displaying the sizes of the 
parameters\\n\",\"\\n\",\"print(np)              # total number of parameters\"],\"execution_count\":null,\"outputs\":[{\"output_type\":\"stream\",\"text\":[\"0 52000 768 39936000\\n\",\"1 514 768 394752\\n\",\"2 1 768 768\\n\",\"3 768 768\\n\",\"4 768 768\\n\",\"5 768 768 589824\\n\",\"6 768 768\\n\",\"7 768 768 589824\\n\",\"8 768 768\\n\",\"9 768 768 589824\\n\",\"10 768 768\\n\",\"11 768 768 589824\\n\",\"12 768 768\\n\",\"13 768 768\\n\",\"14 768 768\\n\",\"15 3072 768 2359296\\n\",\"16 3072 3072\\n\",\"17 768 3072 2359296\\n\",\"18 768 768\\n\",\"19 768 768\\n\",\"20 768 768\\n\",\"21 768 768 589824\\n\",\"22 768 768\\n\",\"23 768 768 589824\\n\",\"24 768 768\\n\",\"25 768 768 589824\\n\",\"26 768 768\\n\",\"27 768 768 589824\\n\",\"28 768 768\\n\",\"29 768 768\\n\",\"30 768 768\\n\",\"31 3072 768 2359296\\n\",\"32 3072 3072\\n\",\"33 768 3072 2359296\\n\",\"34 768 768\\n\",\"35 768 768\\n\",\"36 768 768\\n\",\"37 768 768 589824\\n\",\"38 768 768\\n\",\"39 768 768 589824\\n\",\"40 768 768\\n\",\"41 768 768 589824\\n\",\"42 768 768\\n\",\"43 768 768 589824\\n\",\"44 768 768\\n\",\"45 768 768\\n\",\"46 768 768\\n\",\"47 3072 768 2359296\\n\",\"48 3072 3072\\n\",\"49 768 3072 2359296\\n\",\"50 768 768\\n\",\"51 768 768\\n\",\"52 768 768\\n\",\"53 768 768 589824\\n\",\"54 768 768\\n\",\"55 768 768 589824\\n\",\"56 768 768\\n\",\"57 768 768 589824\\n\",\"58 768 768\\n\",\"59 768 768 589824\\n\",\"60 768 768\\n\",\"61 768 768\\n\",\"62 768 768\\n\",\"63 3072 768 2359296\\n\",\"64 3072 3072\\n\",\"65 768 3072 2359296\\n\",\"66 768 768\\n\",\"67 768 768\\n\",\"68 768 768\\n\",\"69 768 768 589824\\n\",\"70 768 768\\n\",\"71 768 768 589824\\n\",\"72 768 768\\n\",\"73 768 768 589824\\n\",\"74 768 768\\n\",\"75 768 768 589824\\n\",\"76 768 768\\n\",\"77 768 768\\n\",\"78 768 768\\n\",\"79 3072 768 2359296\\n\",\"80 3072 3072\\n\",\"81 768 3072 2359296\\n\",\"82 768 768\\n\",\"83 768 768\\n\",\"84 768 768\\n\",\"85 768 768 589824\\n\",\"86 768 768\\n\",\"87 768 768 
589824\\n\",\"88 768 768\\n\",\"89 768 768 589824\\n\",\"90 768 768\\n\",\"91 768 768 589824\\n\",\"92 768 768\\n\",\"93 768 768\\n\",\"94 768 768\\n\",\"95 3072 768 2359296\\n\",\"96 3072 3072\\n\",\"97 768 3072 2359296\\n\",\"98 768 768\\n\",\"99 768 768\\n\",\"100 768 768\\n\",\"101 52000 52000\\n\",\"102 768 768 589824\\n\",\"103 768 768\\n\",\"104 768 768\\n\",\"105 768 768\\n\",\"83504416\\n\"],\"name\":\"stdout\"}]},{\"cell_type\":\"code\",\"metadata\":{\"colab\":{\"base_uri\":\"https://localhost:8080/\"},\"id\":\"GlvP_A-THEEl\",\"executionInfo\":{\"elapsed\":22936,\"status\":\"ok\",\"timestamp\":1611303439451,\"user\":{\"displayName\":\"Karan Sonawane\",\"photoUrl\":\"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64\",\"userId\":\"05479461208077736330\"},\"user_tz\":-330},\"outputId\":\"ce117d0d-56d0-473f-eb4e-9efffc7b25dc\"},\"source\":[\"#@title Step 10: Building the Dataset\\n\",\"%%time\\n\",\"from transformers import LineByLineTextDataset\\n\",\"\\n\",\"dataset = LineByLineTextDataset(\\n\",\"    tokenizer=tokenizer,\\n\",\"    file_path=\\\"./kant.txt\\\",\\n\",\"    block_size=128,\\n\",\")\"],\"execution_count\":null,\"outputs\":[{\"output_type\":\"stream\",\"text\":[\"/usr/local/lib/python3.6/dist-packages/transformers/data/datasets/language_modeling.py:128: FutureWarning: This dataset will be removed from the library soon, preprocessing should be handled with the 🤗 Datasets library. 
You can have a look at this example script for pointers: https://github.com/huggingface/transformers/blob/master/examples/language-modeling/run_mlm.py\\n\",\"  FutureWarning,\\n\"],\"name\":\"stderr\"},{\"output_type\":\"stream\",\"text\":[\"CPU times: user 20.2 s, sys: 661 ms, total: 20.9 s\\n\",\"Wall time: 20.9 s\\n\"],\"name\":\"stdout\"}]},{\"cell_type\":\"code\",\"metadata\":{\"id\":\"zTgWPa9Dipk2\"},\"source\":[\"#@title Step 11: Defining a Data Collator\\n\",\"from transformers import DataCollatorForLanguageModeling\\n\",\"\\n\",\"data_collator = DataCollatorForLanguageModeling(\\n\",\"    tokenizer=tokenizer, mlm=True, mlm_probability=0.15\\n\",\")\"],\"execution_count\":null,\"outputs\":[]},{\"cell_type\":\"code\",\"metadata\":{\"id\":\"YpvnFFmZJD-N\"},\"source\":[\"#@title Step 12: Initializing the Trainer\\n\",\"from transformers import Trainer, TrainingArguments\\n\",\"\\n\",\"training_args = TrainingArguments(\\n\",\"    output_dir=\\\"./KantaiBERT\\\",\\n\",\"    overwrite_output_dir=True,\\n\",\"    num_train_epochs=1,\\n\",\"    per_device_train_batch_size=64,\\n\",\"    save_steps=10_000,\\n\",\"    save_total_limit=2,\\n\",\")\\n\",\"\\n\",\"trainer = Trainer(\\n\",\"    model=model,\\n\",\"    args=training_args,\\n\",\"    data_collator=data_collator,\\n\",\"    train_dataset=dataset,\\n\",\")\"],\"execution_count\":null,\"outputs\":[]},{\"cell_type\":\"code\",\"metadata\":{\"colab\":{\"base_uri\":\"https://localhost:8080/\",\"height\":285},\"id\":\"VmaHZXzmkNtJ\",\"executionInfo\":{\"elapsed\":351691,\"status\":\"ok\",\"timestamp\":1611303814910,\"user\":{\"displayName\":\"Karan Sonawane\",\"photoUrl\":\"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64\",\"userId\":\"05479461208077736330\"},\"user_tz\":-330},\"outputId\":\"998acefe-df4b-4d07-8059-d25b323587e1\"},\"source\":[\"#@title Step 13: Pre-training the 
Model\\n\",\"%%time\\n\",\"trainer.train()\"],\"execution_count\":null,\"outputs\":[{\"output_type\":\"display_data\",\"data\":{\"text/html\":[\"\\n\",\"    <div>\\n\",\"        <style>\\n\",\"            /* Turns off some styling */\\n\",\"            progress {\\n\",\"                /* gets rid of default border in Firefox and Opera. */\\n\",\"                border: none;\\n\",\"                /* Needs to be in here for Safari polyfill so background images work as expected. */\\n\",\"                background-size: auto;\\n\",\"            }\\n\",\"        </style>\\n\",\"      \\n\",\"      <progress value='2672' max='2672' style='width:300px; height:20px; vertical-align: middle;'></progress>\\n\",\"      [2672/2672 05:50, Epoch 1/1]\\n\",\"    </div>\\n\",\"    <table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\"  <thead>\\n\",\"    <tr style=\\\"text-align: left;\\\">\\n\",\"      <th>Step</th>\\n\",\"      <th>Training Loss</th>\\n\",\"    </tr>\\n\",\"  </thead>\\n\",\"  <tbody>\\n\",\"    <tr>\\n\",\"      <td>500</td>\\n\",\"      <td>4.755200</td>\\n\",\"    </tr>\\n\",\"    <tr>\\n\",\"      <td>1000</td>\\n\",\"      <td>4.046900</td>\\n\",\"    </tr>\\n\",\"    <tr>\\n\",\"      <td>1500</td>\\n\",\"      <td>3.770500</td>\\n\",\"    </tr>\\n\",\"    <tr>\\n\",\"      <td>2000</td>\\n\",\"      <td>3.549800</td>\\n\",\"    </tr>\\n\",\"    <tr>\\n\",\"      <td>2500</td>\\n\",\"      <td>3.431600</td>\\n\",\"    </tr>\\n\",\"  </tbody>\\n\",\"</table><p>\"],\"text/plain\":[\"<IPython.core.display.HTML object>\"]},\"metadata\":{\"tags\":[]}},{\"output_type\":\"stream\",\"text\":[\"CPU times: user 4min 14s, sys: 1min 37s, total: 5min 51s\\n\",\"Wall time: 5min 50s\\n\"],\"name\":\"stdout\"},{\"output_type\":\"execute_result\",\"data\":{\"text/plain\":[\"TrainOutput(global_step=2672, training_loss=3.8793241306693256, metrics={'train_runtime': 350.6061, 'train_samples_per_second': 7.621, 'total_flos': 1689347110470912, 'epoch': 
1.0})\"]},\"metadata\":{\"tags\":[]},\"execution_count\":25}]},{\"cell_type\":\"code\",\"metadata\":{\"id\":\"QDNgPls7_l13\"},\"source\":[\"#@title Step 14: Saving the Final Model(+tokenizer + config) to disk\\n\",\"trainer.save_model(\\\"./KantaiBERT\\\")\"],\"execution_count\":null,\"outputs\":[]},{\"cell_type\":\"code\",\"metadata\":{\"colab\":{\"base_uri\":\"https://localhost:8080/\"},\"id\":\"ltXgXyCbAJLY\",\"executionInfo\":{\"elapsed\":6144,\"status\":\"ok\",\"timestamp\":1611304118693,\"user\":{\"displayName\":\"Karan Sonawane\",\"photoUrl\":\"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64\",\"userId\":\"05479461208077736330\"},\"user_tz\":-330},\"outputId\":\"36ddf0da-9b07-4f97-b8ad-834827e4bc25\"},\"source\":[\"#@title Step 15: Language Modeling with the FillMaskPipeline\\n\",\"from transformers import pipeline\\n\",\"\\n\",\"fill_mask = pipeline(\\n\",\"    \\\"fill-mask\\\",\\n\",\"    model=\\\"./KantaiBERT\\\",\\n\",\"    tokenizer=\\\"./KantaiBERT\\\"\\n\",\")\"],\"execution_count\":null,\"outputs\":[{\"output_type\":\"stream\",\"text\":[\"Some weights of RobertaModel were not initialized from the model checkpoint at ./KantaiBERT and are newly initialized: ['roberta.pooler.dense.weight', 'roberta.pooler.dense.bias']\\n\",\"You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\\n\"],\"name\":\"stderr\"}]},{\"cell_type\":\"code\",\"metadata\":{\"colab\":{\"background_save\":true},\"id\":\"UIvgZ3S6AO0z\",\"outputId\":\"19c8ae6c-b5b3-4be9-c84e-772fabc5a5c9\"},\"source\":[\"fill_mask(\\\"Human thinking involves<mask>.\\\")\"],\"execution_count\":null,\"outputs\":[{\"output_type\":\"execute_result\",\"data\":{\"text/plain\":[\"[{'score': 0.010303723625838757,\\n\",\"  'sequence': '<s>Human thinking involves reason.</s>',\\n\",\"  'token': 394,\\n\",\"  'token_str': 'Ġreason'},\\n\",\" {'score': 0.010289391502737999,\\n\",\"  'sequence': '<s>Human thinking 
involves priori.</s>',\\n\",\"  'token': 578,\\n\",\"  'token_str': 'Ġpriori'},\\n\",\" {'score': 0.009549057111144066,\\n\",\"  'sequence': '<s>Human thinking involves conceptions.</s>',\\n\",\"  'token': 610,\\n\",\"  'token_str': 'Ġconceptions'},\\n\",\" {'score': 0.008349979296326637,\\n\",\"  'sequence': '<s>Human thinking involves experience.</s>',\\n\",\"  'token': 535,\\n\",\"  'token_str': 'Ġexperience'},\\n\",\" {'score': 0.00743826711550355,\\n\",\"  'sequence': '<s>Human thinking involves will.</s>',\\n\",\"  'token': 487,\\n\",\"  'token_str': 'Ġwill'}]\"]},\"metadata\":{\"tags\":[]},\"execution_count\":0}]}]}"
  },
  {
    "path": "Chapter04/Transformer_tasks.ipynb",
    "content": "{\n  \"nbformat\": 4,\n  \"nbformat_minor\": 0,\n  \"metadata\": {\n    \"colab\": {\n      \"name\": \"Transformer tasks.ipynb\",\n      \"provenance\": [],\n      \"collapsed_sections\": [],\n      \"toc_visible\": true\n    },\n    \"kernelspec\": {\n      \"name\": \"python3\",\n      \"display_name\": \"Python 3\"\n    },\n    \"widgets\": {\n      \"application/vnd.jupyter.widget-state+json\": {\n        \"3f41ce53fe774a86aeab5cedb2217a5b\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HBoxModel\",\n          \"state\": {\n            \"_view_name\": \"HBoxView\",\n            \"_dom_classes\": [],\n            \"_model_name\": \"HBoxModel\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"box_style\": \"\",\n            \"layout\": \"IPY_MODEL_1a64601ca3c04b5e98e1d19375a47751\",\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"children\": [\n              \"IPY_MODEL_edef5417bf564ca58e080362d7ff66a7\",\n              \"IPY_MODEL_83bc31d46193435cb8d2ad65d99a457b\"\n            ]\n          }\n        },\n        \"1a64601ca3c04b5e98e1d19375a47751\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            
\"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"edef5417bf564ca58e080362d7ff66a7\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"FloatProgressModel\",\n          \"state\": {\n            \"_view_name\": \"ProgressView\",\n            \"style\": \"IPY_MODEL_e78397c4b7bd471191db36e12639e024\",\n            \"_dom_classes\": [],\n            \"description\": \"Downloading: 100%\",\n            \"_model_name\": \"FloatProgressModel\",\n            \"bar_style\": \"success\",\n            \"max\": 442,\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": 442,\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"orientation\": \"horizontal\",\n            \"min\": 0,\n            \"description_tooltip\": 
null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_42ae51d8832e45be807d436f41f8ea51\"\n          }\n        },\n        \"83bc31d46193435cb8d2ad65d99a457b\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HTMLModel\",\n          \"state\": {\n            \"_view_name\": \"HTMLView\",\n            \"style\": \"IPY_MODEL_84785747983f453cae73f9596a7ec6f0\",\n            \"_dom_classes\": [],\n            \"description\": \"\",\n            \"_model_name\": \"HTMLModel\",\n            \"placeholder\": \"​\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": \" 442/442 [00:02&lt;00:00, 183B/s]\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_91a64b6bf1bf4c4aa24ad0c4d08dd3df\"\n          }\n        },\n        \"e78397c4b7bd471191db36e12639e024\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"ProgressStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"ProgressStyleModel\",\n            \"description_width\": \"initial\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"bar_color\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"42ae51d8832e45be807d436f41f8ea51\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            
\"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"84785747983f453cae73f9596a7ec6f0\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"DescriptionStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"DescriptionStyleModel\",\n            \"description_width\": \"\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            
\"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"91a64b6bf1bf4c4aa24ad0c4d08dd3df\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n     
       \"display\": null,\n            \"left\": null\n          }\n        },\n        \"acdd285b20eb4823a8dfffe6ecd76201\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HBoxModel\",\n          \"state\": {\n            \"_view_name\": \"HBoxView\",\n            \"_dom_classes\": [],\n            \"_model_name\": \"HBoxModel\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"box_style\": \"\",\n            \"layout\": \"IPY_MODEL_30c1e11f7e9c4ada902dc6edabf234f8\",\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"children\": [\n              \"IPY_MODEL_5783ab24f4aa44ecbd2c437f01ec8bd5\",\n              \"IPY_MODEL_35e5cd80564a43749c73a0458cc0d6da\"\n            ]\n          }\n        },\n        \"30c1e11f7e9c4ada902dc6edabf234f8\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            
\"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"5783ab24f4aa44ecbd2c437f01ec8bd5\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"FloatProgressModel\",\n          \"state\": {\n            \"_view_name\": \"ProgressView\",\n            \"style\": \"IPY_MODEL_236898968f1e46d2bee145d6d369d0f4\",\n            \"_dom_classes\": [],\n            \"description\": \"Downloading: 100%\",\n            \"_model_name\": \"FloatProgressModel\",\n            \"bar_style\": \"success\",\n            \"max\": 231508,\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": 231508,\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"orientation\": \"horizontal\",\n            \"min\": 0,\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_c23ef21b59b141afa84133ee50e5a329\"\n          }\n        },\n        \"35e5cd80564a43749c73a0458cc0d6da\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HTMLModel\",\n          
\"state\": {\n            \"_view_name\": \"HTMLView\",\n            \"style\": \"IPY_MODEL_1852f772c435440e897bbbdf5913598e\",\n            \"_dom_classes\": [],\n            \"description\": \"\",\n            \"_model_name\": \"HTMLModel\",\n            \"placeholder\": \"​\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": \" 232k/232k [00:00&lt;00:00, 298kB/s]\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_a481c009790841bf912e0413788f2776\"\n          }\n        },\n        \"236898968f1e46d2bee145d6d369d0f4\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"ProgressStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"ProgressStyleModel\",\n            \"description_width\": \"initial\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"bar_color\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"c23ef21b59b141afa84133ee50e5a329\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": 
null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"1852f772c435440e897bbbdf5913598e\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"DescriptionStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"DescriptionStyleModel\",\n            \"description_width\": \"\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"a481c009790841bf912e0413788f2776\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          
\"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"a608132fb0c247928252b7b3011fcf7d\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HBoxModel\",\n          \"state\": {\n            \"_view_name\": \"HBoxView\",\n            
\"_dom_classes\": [],\n            \"_model_name\": \"HBoxModel\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"box_style\": \"\",\n            \"layout\": \"IPY_MODEL_9c0ead55753243999715167582feb852\",\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"children\": [\n              \"IPY_MODEL_da90cce7734d450e8e39ebf5e659658f\",\n              \"IPY_MODEL_f2b8cbe27e4c4a168a9f5c8771e6f54c\"\n            ]\n          }\n        },\n        \"9c0ead55753243999715167582feb852\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": 
null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"da90cce7734d450e8e39ebf5e659658f\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"FloatProgressModel\",\n          \"state\": {\n            \"_view_name\": \"ProgressView\",\n            \"style\": \"IPY_MODEL_3a2e5325f1e04541baf033054d514e2a\",\n            \"_dom_classes\": [],\n            \"description\": \"Downloading: 100%\",\n            \"_model_name\": \"FloatProgressModel\",\n            \"bar_style\": \"success\",\n            \"max\": 629,\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": 629,\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"orientation\": \"horizontal\",\n            \"min\": 0,\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_2266c15cbb48451bbbd5655d6435b62b\"\n          }\n        },\n        \"f2b8cbe27e4c4a168a9f5c8771e6f54c\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HTMLModel\",\n          \"state\": {\n            \"_view_name\": \"HTMLView\",\n            \"style\": \"IPY_MODEL_9fbbb8870a95419b90cfbaa8c7db4ae1\",\n            \"_dom_classes\": [],\n            \"description\": \"\",\n            \"_model_name\": \"HTMLModel\",\n            \"placeholder\": \"​\",\n            \"_view_module\": 
\"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": \" 629/629 [00:01&lt;00:00, 376B/s]\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_f473dd5bf92f4a5eaec7a709d37a1601\"\n          }\n        },\n        \"3a2e5325f1e04541baf033054d514e2a\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"ProgressStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"ProgressStyleModel\",\n            \"description_width\": \"initial\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"bar_color\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"2266c15cbb48451bbbd5655d6435b62b\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            
\"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"9fbbb8870a95419b90cfbaa8c7db4ae1\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"DescriptionStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"DescriptionStyleModel\",\n            \"description_width\": \"\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"f473dd5bf92f4a5eaec7a709d37a1601\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n        
    \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"f6abfb99d9c24695ab8a5db242947f54\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HBoxModel\",\n          \"state\": {\n            \"_view_name\": \"HBoxView\",\n            \"_dom_classes\": [],\n            \"_model_name\": \"HBoxModel\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"box_style\": \"\",\n            \"layout\": 
\"IPY_MODEL_7110b475ad774c75a7855636d4212f30\",\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"children\": [\n              \"IPY_MODEL_802be925656042b19f8c5ded138045bb\",\n              \"IPY_MODEL_59f4bcea6eb54e269d687cf9618376ea\"\n            ]\n          }\n        },\n        \"7110b475ad774c75a7855636d4212f30\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            
\"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"802be925656042b19f8c5ded138045bb\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"FloatProgressModel\",\n          \"state\": {\n            \"_view_name\": \"ProgressView\",\n            \"style\": \"IPY_MODEL_4919fefe558047d6b7f248898ac62f6f\",\n            \"_dom_classes\": [],\n            \"description\": \"Downloading: 100%\",\n            \"_model_name\": \"FloatProgressModel\",\n            \"bar_style\": \"success\",\n            \"max\": 230,\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": 230,\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"orientation\": \"horizontal\",\n            \"min\": 0,\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_eee41447bbd7413f826225573a3836f0\"\n          }\n        },\n        \"59f4bcea6eb54e269d687cf9618376ea\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HTMLModel\",\n          \"state\": {\n            \"_view_name\": \"HTMLView\",\n            \"style\": \"IPY_MODEL_28e9463b30a14ee59a7b65fa99f029e5\",\n            \"_dom_classes\": [],\n            \"description\": \"\",\n            \"_model_name\": \"HTMLModel\",\n            \"placeholder\": \"​\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": \" 230/230 [00:00&lt;00:00, 1.90kB/s]\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"description_tooltip\": null,\n            \"_model_module\": 
\"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_5786117a2bbb44d1aa70e3ef08872ad5\"\n          }\n        },\n        \"4919fefe558047d6b7f248898ac62f6f\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"ProgressStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"ProgressStyleModel\",\n            \"description_width\": \"initial\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"bar_color\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"eee41447bbd7413f826225573a3836f0\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            
\"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"28e9463b30a14ee59a7b65fa99f029e5\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"DescriptionStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"DescriptionStyleModel\",\n            \"description_width\": \"\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"5786117a2bbb44d1aa70e3ef08872ad5\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            
\"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"e9d5f842308740368a11ed1b46aca768\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HBoxModel\",\n          \"state\": {\n            \"_view_name\": \"HBoxView\",\n            \"_dom_classes\": [],\n            \"_model_name\": \"HBoxModel\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"box_style\": \"\",\n            \"layout\": \"IPY_MODEL_d8d5e37ace9b42c5b8fbe0e4763db2a5\",\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"children\": [\n              \"IPY_MODEL_cec94035159243f9ab03a5034ed26d66\",\n              \"IPY_MODEL_651c6adc8a064096bf306e5ebc1275c7\"\n            ]\n          
}\n        },\n        \"d8d5e37ace9b42c5b8fbe0e4763db2a5\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"cec94035159243f9ab03a5034ed26d66\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n        
  \"model_name\": \"FloatProgressModel\",\n          \"state\": {\n            \"_view_name\": \"ProgressView\",\n            \"style\": \"IPY_MODEL_54a8b16ce66040c2b297f1b662b350c1\",\n            \"_dom_classes\": [],\n            \"description\": \"Downloading: 100%\",\n            \"_model_name\": \"FloatProgressModel\",\n            \"bar_style\": \"success\",\n            \"max\": 267844284,\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": 267844284,\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"orientation\": \"horizontal\",\n            \"min\": 0,\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_f1c4230f55e148338f80ffff65afd1cb\"\n          }\n        },\n        \"651c6adc8a064096bf306e5ebc1275c7\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HTMLModel\",\n          \"state\": {\n            \"_view_name\": \"HTMLView\",\n            \"style\": \"IPY_MODEL_c2d1d30ef23346f9971c11cff4824012\",\n            \"_dom_classes\": [],\n            \"description\": \"\",\n            \"_model_name\": \"HTMLModel\",\n            \"placeholder\": \"​\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": \" 268M/268M [00:11&lt;00:00, 22.4MB/s]\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_f62dfd28a2ef4429a8809a7b83d3cfdc\"\n          }\n        },\n        \"54a8b16ce66040c2b297f1b662b350c1\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"ProgressStyleModel\",\n          
\"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"ProgressStyleModel\",\n            \"description_width\": \"initial\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"bar_color\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"f1c4230f55e148338f80ffff65afd1cb\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            
\"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"c2d1d30ef23346f9971c11cff4824012\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"DescriptionStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"DescriptionStyleModel\",\n            \"description_width\": \"\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"f62dfd28a2ef4429a8809a7b83d3cfdc\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": 
null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"8502995caabc474c91bca08b97cfaa58\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n       
     \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"98b911d9620c40a79c7fee410461d039\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"FloatProgressModel\",\n          \"state\": {\n            \"_view_name\": \"ProgressView\",\n            \"style\": \"IPY_MODEL_21acd0fcd9214093aa0e93845052ef7e\",\n            \"_dom_classes\": [],\n            \"description\": \"Downloading: 100%\",\n            \"_model_name\": \"FloatProgressModel\",\n            \"bar_style\": \"success\",\n            \"max\": 213450,\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": 213450,\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"orientation\": \"horizontal\",\n            \"min\": 0,\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_aa67bf7e7cf940848b4061bee967052c\"\n          }\n        },\n        \"8fb914c5733747e3bae86fcb13073767\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HTMLModel\",\n          \"state\": {\n            \"_view_name\": \"HTMLView\",\n            \"style\": \"IPY_MODEL_b1d11f8f842540a682717d350d254155\",\n            \"_dom_classes\": [],\n            \"description\": \"\",\n            \"_model_name\": \"HTMLModel\",\n            \"placeholder\": \"​\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": \" 213k/213k [00:00&lt;00:00, 278kB/s]\",\n            \"_view_count\": null,\n            \"_view_module_version\": 
\"1.5.0\",\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_8418d2e012a14387b231fe12ce0b9a1a\"\n          }\n        },\n        \"21acd0fcd9214093aa0e93845052ef7e\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"ProgressStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"ProgressStyleModel\",\n            \"description_width\": \"initial\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"bar_color\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"aa67bf7e7cf940848b4061bee967052c\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n      
      \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"b1d11f8f842540a682717d350d254155\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"DescriptionStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"DescriptionStyleModel\",\n            \"description_width\": \"\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"8418d2e012a14387b231fe12ce0b9a1a\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            
\"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"50993327f1d04882a66df50dd30cff3e\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HBoxModel\",\n          \"state\": {\n            \"_view_name\": \"HBoxView\",\n            \"_dom_classes\": [],\n            \"_model_name\": \"HBoxModel\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"box_style\": \"\",\n            \"layout\": \"IPY_MODEL_61c52c61d9c740efab94997706257ba4\",\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"children\": [\n              \"IPY_MODEL_e23a4ab1a468460a9e4b0f34ae67ef76\",\n              
\"IPY_MODEL_3f1629314ca6407b832db9349a508461\"\n            ]\n          }\n        },\n        \"61c52c61d9c740efab94997706257ba4\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        
\"e23a4ab1a468460a9e4b0f34ae67ef76\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"FloatProgressModel\",\n          \"state\": {\n            \"_view_name\": \"ProgressView\",\n            \"style\": \"IPY_MODEL_d577e0e2741d43b8a73489c1a0df2406\",\n            \"_dom_classes\": [],\n            \"description\": \"Downloading: 100%\",\n            \"_model_name\": \"FloatProgressModel\",\n            \"bar_style\": \"success\",\n            \"max\": 998,\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": 998,\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"orientation\": \"horizontal\",\n            \"min\": 0,\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_76ccbe1be44a4e019e6a50d32c9abf98\"\n          }\n        },\n        \"3f1629314ca6407b832db9349a508461\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HTMLModel\",\n          \"state\": {\n            \"_view_name\": \"HTMLView\",\n            \"style\": \"IPY_MODEL_c6eb63c7e9e34d6d989427b9dbe9457c\",\n            \"_dom_classes\": [],\n            \"description\": \"\",\n            \"_model_name\": \"HTMLModel\",\n            \"placeholder\": \"​\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": \" 998/998 [00:01&lt;00:00, 621B/s]\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_3d9b5bff09414a5dadc8f6f3ea279227\"\n          }\n        },\n        \"d577e0e2741d43b8a73489c1a0df2406\": {\n          \"model_module\": 
\"@jupyter-widgets/controls\",\n          \"model_name\": \"ProgressStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"ProgressStyleModel\",\n            \"description_width\": \"initial\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"bar_color\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"76ccbe1be44a4e019e6a50d32c9abf98\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            
\"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"c6eb63c7e9e34d6d989427b9dbe9457c\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"DescriptionStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"DescriptionStyleModel\",\n            \"description_width\": \"\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"3d9b5bff09414a5dadc8f6f3ea279227\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": 
null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"ffd924b0cc9d492d888e5da831481033\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HBoxModel\",\n          \"state\": {\n            \"_view_name\": \"HBoxView\",\n            \"_dom_classes\": [],\n            \"_model_name\": \"HBoxModel\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"box_style\": \"\",\n            \"layout\": \"IPY_MODEL_555dfb4c02df4930ae64f4f56ed158b7\",\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"children\": [\n              \"IPY_MODEL_dc47306a92f648d4a690da79d39ac4cc\",\n              \"IPY_MODEL_c4dfc9d018534634a056aa3da58fcfef\"\n            ]\n          }\n        },\n        \"555dfb4c02df4930ae64f4f56ed158b7\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": 
\"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"dc47306a92f648d4a690da79d39ac4cc\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"FloatProgressModel\",\n          \"state\": {\n            \"_view_name\": \"ProgressView\",\n            \"style\": \"IPY_MODEL_0a03eaaa65144c829bf93cacc2f66e69\",\n            \"_dom_classes\": 
[],\n            \"description\": \"Downloading: 100%\",\n            \"_model_name\": \"FloatProgressModel\",\n            \"bar_style\": \"success\",\n            \"max\": 230,\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": 230,\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"orientation\": \"horizontal\",\n            \"min\": 0,\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_116c0c19179c4c5d847257a531f9269b\"\n          }\n        },\n        \"c4dfc9d018534634a056aa3da58fcfef\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HTMLModel\",\n          \"state\": {\n            \"_view_name\": \"HTMLView\",\n            \"style\": \"IPY_MODEL_aa9be08c10e44dfc8732f3419b5cc967\",\n            \"_dom_classes\": [],\n            \"description\": \"\",\n            \"_model_name\": \"HTMLModel\",\n            \"placeholder\": \"​\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": \" 230/230 [01:38&lt;00:00, 2.33B/s]\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_86cbe5290e654770888f9cfba100ae4e\"\n          }\n        },\n        \"0a03eaaa65144c829bf93cacc2f66e69\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"ProgressStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"ProgressStyleModel\",\n            \"description_width\": \"initial\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            
\"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"bar_color\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"116c0c19179c4c5d847257a531f9269b\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n  
          \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"aa9be08c10e44dfc8732f3419b5cc967\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"DescriptionStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"DescriptionStyleModel\",\n            \"description_width\": \"\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"86cbe5290e654770888f9cfba100ae4e\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n  
          \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"5b66512a4f6944b2ab0a78631d502da3\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HBoxModel\",\n          \"state\": {\n            \"_view_name\": \"HBoxView\",\n            \"_dom_classes\": [],\n            \"_model_name\": \"HBoxModel\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"box_style\": \"\",\n            \"layout\": \"IPY_MODEL_f861ad27060640a49d34dbc6384a236e\",\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"children\": [\n              \"IPY_MODEL_63c8f259ede54eab8b7eacc2cd393191\",\n              \"IPY_MODEL_611edbb55ee74d808993a39de5044275\"\n            ]\n          }\n        },\n        \"f861ad27060640a49d34dbc6384a236e\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n  
          \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"63c8f259ede54eab8b7eacc2cd393191\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"FloatProgressModel\",\n          \"state\": {\n            \"_view_name\": \"ProgressView\",\n            \"style\": \"IPY_MODEL_336b49d4e3d741aea03fecc236e6333a\",\n            \"_dom_classes\": [],\n            \"description\": \"Downloading: 100%\",\n            \"_model_name\": \"FloatProgressModel\",\n            \"bar_style\": \"success\",\n            \"max\": 1334448817,\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": 1334448817,\n        
    \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"orientation\": \"horizontal\",\n            \"min\": 0,\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_b8c1e428241d4e67abb5df0be3eca758\"\n          }\n        },\n        \"611edbb55ee74d808993a39de5044275\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HTMLModel\",\n          \"state\": {\n            \"_view_name\": \"HTMLView\",\n            \"style\": \"IPY_MODEL_37c31cab7d4745d9a7944e4bdda8b970\",\n            \"_dom_classes\": [],\n            \"description\": \"\",\n            \"_model_name\": \"HTMLModel\",\n            \"placeholder\": \"​\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": \" 1.33G/1.33G [01:35<00:00, 13.9MB/s]\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_8fe12fc6327d4027ab66cb5815760e75\"\n          }\n        },\n        \"336b49d4e3d741aea03fecc236e6333a\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"ProgressStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"ProgressStyleModel\",\n            \"description_width\": \"initial\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"bar_color\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"b8c1e428241d4e67abb5df0be3eca758\": {\n          \"model_module\": 
\"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"37c31cab7d4745d9a7944e4bdda8b970\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"DescriptionStyleModel\",\n          \"state\": {\n            
\"_view_name\": \"StyleView\",\n            \"_model_name\": \"DescriptionStyleModel\",\n            \"description_width\": \"\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"8fe12fc6327d4027ab66cb5815760e75\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": 
\"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"10323e3b6d3f43b6b0d0e23de45b5729\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HBoxModel\",\n          \"state\": {\n            \"_view_name\": \"HBoxView\",\n            \"_dom_classes\": [],\n            \"_model_name\": \"HBoxModel\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"box_style\": \"\",\n            \"layout\": \"IPY_MODEL_8bc14cbd5a01449eb01f2e01e9db2fa2\",\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"children\": [\n              \"IPY_MODEL_43faf48017b84e8b9b0c5bf29337ed30\",\n              \"IPY_MODEL_7eccb6f09783498ab44020360c1e0062\"\n            ]\n          }\n        },\n        \"8bc14cbd5a01449eb01f2e01e9db2fa2\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            
\"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"43faf48017b84e8b9b0c5bf29337ed30\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"FloatProgressModel\",\n          \"state\": {\n            \"_view_name\": \"ProgressView\",\n            \"style\": \"IPY_MODEL_6738833ebeb94b60912151d886ca083e\",\n            \"_dom_classes\": [],\n            \"description\": \"Downloading: 100%\",\n            \"_model_name\": \"FloatProgressModel\",\n            \"bar_style\": \"success\",\n            \"max\": 1199,\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": 1199,\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"orientation\": \"horizontal\",\n            \"min\": 0,\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": 
\"IPY_MODEL_6b4c9bb4710541b29c2894739c24472c\"\n          }\n        },\n        \"7eccb6f09783498ab44020360c1e0062\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HTMLModel\",\n          \"state\": {\n            \"_view_name\": \"HTMLView\",\n            \"style\": \"IPY_MODEL_f2febefe55c64bea8d5fa39b8c94ba01\",\n            \"_dom_classes\": [],\n            \"description\": \"\",\n            \"_model_name\": \"HTMLModel\",\n            \"placeholder\": \"​\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": \" 1.20k/1.20k [00:04<00:00, 292B/s]\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_1a8391734975484ca3feaf86d6da1161\"\n          }\n        },\n        \"6738833ebeb94b60912151d886ca083e\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"ProgressStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"ProgressStyleModel\",\n            \"description_width\": \"initial\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"bar_color\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"6b4c9bb4710541b29c2894739c24472c\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": 
\"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"f2febefe55c64bea8d5fa39b8c94ba01\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"DescriptionStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"DescriptionStyleModel\",\n            \"description_width\": \"\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            
\"_view_module_version\": \"1.2.0\",\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"1a8391734975484ca3feaf86d6da1161\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n       
 \"1314be84dd424e94b13ee568840c7ea2\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HBoxModel\",\n          \"state\": {\n            \"_view_name\": \"HBoxView\",\n            \"_dom_classes\": [],\n            \"_model_name\": \"HBoxModel\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"box_style\": \"\",\n            \"layout\": \"IPY_MODEL_63392f66c53d479199e24d8d8a45823a\",\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"children\": [\n              \"IPY_MODEL_0cdc54e5dc434c73be4ef332025d0d97\",\n              \"IPY_MODEL_bd0ab5247d6349819f4ba64533e5cc80\"\n            ]\n          }\n        },\n        \"63392f66c53d479199e24d8d8a45823a\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            
\"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"0cdc54e5dc434c73be4ef332025d0d97\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"FloatProgressModel\",\n          \"state\": {\n            \"_view_name\": \"ProgressView\",\n            \"style\": \"IPY_MODEL_e31316c758bb4236905d4291302a9e15\",\n            \"_dom_classes\": [],\n            \"description\": \"Downloading: 100%\",\n            \"_model_name\": \"FloatProgressModel\",\n            \"bar_style\": \"success\",\n            \"max\": 791656,\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": 791656,\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"orientation\": \"horizontal\",\n            \"min\": 0,\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_20af50bd6ae74e31b3c0b7f2b6e054d5\"\n          }\n        },\n        \"bd0ab5247d6349819f4ba64533e5cc80\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HTMLModel\",\n          \"state\": {\n            \"_view_name\": \"HTMLView\",\n            \"style\": 
\"IPY_MODEL_8b035f4d02f2447a961ce83766868e09\",\n            \"_dom_classes\": [],\n            \"description\": \"\",\n            \"_model_name\": \"HTMLModel\",\n            \"placeholder\": \"​\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": \" 792k/792k [00:02<00:00, 316kB/s]\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_28c23c928d8c440783419e71bfb917d7\"\n          }\n        },\n        \"e31316c758bb4236905d4291302a9e15\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"ProgressStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"ProgressStyleModel\",\n            \"description_width\": \"initial\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"bar_color\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"20af50bd6ae74e31b3c0b7f2b6e054d5\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n        
    \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"8b035f4d02f2447a961ce83766868e09\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"DescriptionStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"DescriptionStyleModel\",\n            \"description_width\": \"\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"28c23c928d8c440783419e71bfb917d7\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": 
\"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"f5c80d9b2f804af3a4bf07f97ba06bf1\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HBoxModel\",\n          \"state\": {\n            \"_view_name\": \"HBoxView\",\n            \"_dom_classes\": [],\n            \"_model_name\": \"HBoxModel\",\n            \"_view_module\": 
\"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"box_style\": \"\",\n            \"layout\": \"IPY_MODEL_bcfe5e255d8a422baf1175c3b23e52e5\",\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"children\": [\n              \"IPY_MODEL_b50f807793aa404f9ce54c55403f8dd3\",\n              \"IPY_MODEL_6edc69c824bc4f63914e5850e589448a\"\n            ]\n          }\n        },\n        \"bcfe5e255d8a422baf1175c3b23e52e5\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            
\"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"b50f807793aa404f9ce54c55403f8dd3\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"FloatProgressModel\",\n          \"state\": {\n            \"_view_name\": \"ProgressView\",\n            \"style\": \"IPY_MODEL_224b69b50c9846a1b244134d23635efd\",\n            \"_dom_classes\": [],\n            \"description\": \"Downloading: 100%\",\n            \"_model_name\": \"FloatProgressModel\",\n            \"bar_style\": \"success\",\n            \"max\": 230,\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": 230,\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"orientation\": \"horizontal\",\n            \"min\": 0,\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_e26e0596f8534c79b7595361eda687ce\"\n          }\n        },\n        \"6edc69c824bc4f63914e5850e589448a\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HTMLModel\",\n          \"state\": {\n            \"_view_name\": \"HTMLView\",\n            \"style\": \"IPY_MODEL_d04989abce5d481dbee28ac450a32d2d\",\n            \"_dom_classes\": [],\n            \"description\": \"\",\n            \"_model_name\": \"HTMLModel\",\n            \"placeholder\": \"​\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            
\"value\": \" 230/230 [00:21<00:00, 10.7B/s]\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_38a758bfadeb443686bc615481cd9da9\"\n          }\n        },\n        \"224b69b50c9846a1b244134d23635efd\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"ProgressStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"ProgressStyleModel\",\n            \"description_width\": \"initial\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"bar_color\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"e26e0596f8534c79b7595361eda687ce\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": 
null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"d04989abce5d481dbee28ac450a32d2d\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"DescriptionStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"DescriptionStyleModel\",\n            \"description_width\": \"\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"38a758bfadeb443686bc615481cd9da9\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            
\"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"0d335835f44548efadcf0ecd8a49e391\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HBoxModel\",\n          \"state\": {\n            \"_view_name\": \"HBoxView\",\n            \"_dom_classes\": [],\n            \"_model_name\": \"HBoxModel\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"box_style\": \"\",\n            \"layout\": \"IPY_MODEL_b08f64e0d5994eb9adffdfc1d48e9088\",\n            \"_model_module\": 
\"@jupyter-widgets/controls\",\n            \"children\": [\n              \"IPY_MODEL_45ea2974a40c4786a43ee07cdbbcd693\",\n              \"IPY_MODEL_35dd5a9f5c0843838571880dc058661a\"\n            ]\n          }\n        },\n        \"b08f64e0d5994eb9adffdfc1d48e9088\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            
\"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"45ea2974a40c4786a43ee07cdbbcd693\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"FloatProgressModel\",\n          \"state\": {\n            \"_view_name\": \"ProgressView\",\n            \"style\": \"IPY_MODEL_c1e3435e66d64c498e2334da0c521b60\",\n            \"_dom_classes\": [],\n            \"description\": \"Downloading: 100%\",\n            \"_model_name\": \"FloatProgressModel\",\n            \"bar_style\": \"success\",\n            \"max\": 891691430,\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": 891691430,\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"orientation\": \"horizontal\",\n            \"min\": 0,\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_39500a63ce0c401a8f52d3f1610e14a3\"\n          }\n        },\n        \"35dd5a9f5c0843838571880dc058661a\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HTMLModel\",\n          \"state\": {\n            \"_view_name\": \"HTMLView\",\n            \"style\": \"IPY_MODEL_e1be44708f6d4fc0a64a36953787eef0\",\n            \"_dom_classes\": [],\n            \"description\": \"\",\n            \"_model_name\": \"HTMLModel\",\n            \"placeholder\": \"​\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": \" 892M/892M [00:18&lt;00:00, 47.7MB/s]\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": 
\"IPY_MODEL_13edcd9f5be5479d894bbe150c64e3d0\"\n          }\n        },\n        \"c1e3435e66d64c498e2334da0c521b60\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"ProgressStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"ProgressStyleModel\",\n            \"description_width\": \"initial\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"bar_color\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"39500a63ce0c401a8f52d3f1610e14a3\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n        
    \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"e1be44708f6d4fc0a64a36953787eef0\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"DescriptionStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"DescriptionStyleModel\",\n            \"description_width\": \"\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"13edcd9f5be5479d894bbe150c64e3d0\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n        
    \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        }\n      }\n    },\n    \"accelerator\": \"GPU\"\n  },\n  \"cells\": [\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"aZSBwt0M5Hmf\",\n        \"colab_type\": \"text\"\n      },\n      \"source\": [\n        \"#Usage of Transformers\\n\",\n        \"\\n\",\n        \"[Original Hugging Face Code Reference](https://huggingface.co/transformers/usage.html)\\n\",\n        \"\\n\",\n        \"[Paper citation: HuggingFace's Transformers: State-of-the-art Natural Language Processing](https://arxiv.org/abs/1910.03771)\\n\",\n        \"\\n\",\n        \"Copyright Denis Rothman 2020, MIT License. 
The original usage examples have been changed for educational purposes.\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"nQ0myH1cLaQ7\",\n        \"colab_type\": \"code\",\n        \"outputId\": \"6532ebf7-6c77-4b6b-c81c-c46b0defb059\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 119\n        }\n      },\n      \"source\": [\n        \"#@title Transformer and Torch Installation\\n\",\n        \"try:\\n\",\n        \"  import transformers\\n\",\n        \"except:\\n\",\n        \"  print(\\\"Installing transformers\\\")\\n\",\n        \"  !pip -qq install transformers\\n\",\n        \"\\n\",\n        \"try:\\n\",\n        \"  import torch\\n\",\n        \"except:\\n\",\n        \"  print(\\\"Installing Torch\\\")\\n\",\n        \"  !pip -qq install torch\"\n      ],\n      \"execution_count\": 1,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Installing transformers\\n\",\n            \"\\u001b[K     |████████████████████████████████| 675kB 2.8MB/s \\n\",\n            \"\\u001b[K     |████████████████████████████████| 3.8MB 12.9MB/s \\n\",\n            \"\\u001b[K     |████████████████████████████████| 1.1MB 32.9MB/s \\n\",\n            \"\\u001b[K     |████████████████████████████████| 890kB 42.5MB/s \\n\",\n            \"\\u001b[?25h  Building wheel for sacremoses (setup.py) ... 
\\u001b[?25l\\u001b[?25hdone\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"foamjwawe2OX\",\n        \"colab_type\": \"code\",\n        \"outputId\": \"a3944c8e-f677-4ea6-b1e8-0894ce5d1bd4\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 296,\n          \"referenced_widgets\": [\n            \"3f41ce53fe774a86aeab5cedb2217a5b\",\n            \"1a64601ca3c04b5e98e1d19375a47751\",\n            \"edef5417bf564ca58e080362d7ff66a7\",\n            \"83bc31d46193435cb8d2ad65d99a457b\",\n            \"e78397c4b7bd471191db36e12639e024\",\n            \"42ae51d8832e45be807d436f41f8ea51\",\n            \"84785747983f453cae73f9596a7ec6f0\",\n            \"91a64b6bf1bf4c4aa24ad0c4d08dd3df\",\n            \"acdd285b20eb4823a8dfffe6ecd76201\",\n            \"30c1e11f7e9c4ada902dc6edabf234f8\",\n            \"5783ab24f4aa44ecbd2c437f01ec8bd5\",\n            \"35e5cd80564a43749c73a0458cc0d6da\",\n            \"236898968f1e46d2bee145d6d369d0f4\",\n            \"c23ef21b59b141afa84133ee50e5a329\",\n            \"1852f772c435440e897bbbdf5913598e\",\n            \"a481c009790841bf912e0413788f2776\",\n            \"a608132fb0c247928252b7b3011fcf7d\",\n            \"9c0ead55753243999715167582feb852\",\n            \"da90cce7734d450e8e39ebf5e659658f\",\n            \"f2b8cbe27e4c4a168a9f5c8771e6f54c\",\n            \"3a2e5325f1e04541baf033054d514e2a\",\n            \"2266c15cbb48451bbbd5655d6435b62b\",\n            \"9fbbb8870a95419b90cfbaa8c7db4ae1\",\n            \"f473dd5bf92f4a5eaec7a709d37a1601\",\n            \"f6abfb99d9c24695ab8a5db242947f54\",\n            \"7110b475ad774c75a7855636d4212f30\",\n            \"802be925656042b19f8c5ded138045bb\",\n            \"59f4bcea6eb54e269d687cf9618376ea\",\n            \"4919fefe558047d6b7f248898ac62f6f\",\n            \"eee41447bbd7413f826225573a3836f0\",\n            
\"28e9463b30a14ee59a7b65fa99f029e5\",\n            \"5786117a2bbb44d1aa70e3ef08872ad5\",\n            \"e9d5f842308740368a11ed1b46aca768\",\n            \"d8d5e37ace9b42c5b8fbe0e4763db2a5\",\n            \"cec94035159243f9ab03a5034ed26d66\",\n            \"651c6adc8a064096bf306e5ebc1275c7\",\n            \"54a8b16ce66040c2b297f1b662b350c1\",\n            \"f1c4230f55e148338f80ffff65afd1cb\",\n            \"c2d1d30ef23346f9971c11cff4824012\",\n            \"f62dfd28a2ef4429a8809a7b83d3cfdc\"\n          ]\n        }\n      },\n      \"source\": [\n        \"#@title SST-2 Binary Classification\\n\",\n        \"from transformers import pipeline\\n\",\n        \"\\n\",\n        \"nlp = pipeline(\\\"sentiment-analysis\\\")\\n\",\n        \"\\n\",\n        \"print(nlp(\\\"If you sometimes like to go to the movies to have fun , Wasabi is a good place to start .\\\"),\\\"If you sometimes like to go to the movies to have fun , Wasabi is a good place to start .\\\")\\n\",\n        \"print(nlp(\\\"Effective but too-tepid biopic.\\\"),\\\"Effective but too-tepid biopic.\\\")\"\n      ],\n      \"execution_count\": 2,\n      \"outputs\": [\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"application/vnd.jupyter.widget-view+json\": {\n              \"model_id\": \"3f41ce53fe774a86aeab5cedb2217a5b\",\n              \"version_minor\": 0,\n              \"version_major\": 2\n            },\n            \"text/plain\": [\n              \"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=442.0, style=ProgressStyle(description_…\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            
\"application/vnd.jupyter.widget-view+json\": {\n              \"model_id\": \"acdd285b20eb4823a8dfffe6ecd76201\",\n              \"version_minor\": 0,\n              \"version_major\": 2\n            },\n            \"text/plain\": [\n              \"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=231508.0, style=ProgressStyle(descripti…\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"application/vnd.jupyter.widget-view+json\": {\n              \"model_id\": \"a608132fb0c247928252b7b3011fcf7d\",\n              \"version_minor\": 0,\n              \"version_major\": 2\n            },\n            \"text/plain\": [\n              \"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=629.0, style=ProgressStyle(description_…\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"application/vnd.jupyter.widget-view+json\": {\n              \"model_id\": \"f6abfb99d9c24695ab8a5db242947f54\",\n              \"version_minor\": 0,\n              \"version_major\": 2\n            },\n            \"text/plain\": [\n              \"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=230.0, style=ProgressStyle(description_…\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            
\"\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"application/vnd.jupyter.widget-view+json\": {\n              \"model_id\": \"e9d5f842308740368a11ed1b46aca768\",\n              \"version_minor\": 0,\n              \"version_major\": 2\n            },\n            \"text/plain\": [\n              \"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=267844284.0, style=ProgressStyle(descri…\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\n\",\n            \"[{'label': 'POSITIVE', 'score': 0.999825656414032}] If you sometimes like to go to the movies to have fun , Wasabi is a good place to start .\\n\",\n            \"[{'label': 'NEGATIVE', 'score': 0.9974064230918884}] Effective but too-tepid biopic.\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"colab_type\": \"code\",\n        \"outputId\": \"ac628ecd-9aac-46ba-bd8c-f52a80122508\",\n        \"id\": \"iILfeaHLlivA\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 215,\n          \"referenced_widgets\": [\n            \"7ce0e4d211f34e298db9bde71aafd31d\",\n            \"ca96ab0cd02644d2897f14ef256f9ab9\",\n            \"c8d3c1a200884dfe8cc74efc73643d66\",\n            \"7a9c6953595d4ab39267c4dfadbf72b4\",\n            \"c21b433bb763464b99dbc52cd180ae85\",\n            \"71fda054324a40bc9d852cbb94ae3240\",\n            \"e8ae1c2f79564550beb3df70d9e08295\",\n            \"757eaa4714064d93b99918fb9ea3cd43\",\n            \"812815f9249b4f6cb138aed2e6a03fd4\",\n            \"513fd9d4b17d47e385c7ec7399d7a355\",\n            \"c0e34e4be46b4ea395b978fd7108f420\",\n            
\"a4c8291f9c0a44d28e4c89b2a6092373\",\n            \"04ff1050a97d46fcba82145640ff6b78\",\n            \"9bad4a56293449f68e79f5b5dd0d41c1\",\n            \"f0f68e55618a44fd9491524d7e0d8dd5\",\n            \"fed685ae5a7645c28bdb58e3f9703384\",\n            \"b88724d6b16e472f8ede902cac4ae6f2\",\n            \"163770117e5a4d0d95926e3a5d0fbf82\",\n            \"7f2b4c0c78994c83a064056dc8e79bb3\",\n            \"3d95ba7c826c4c3a8265755fd5738434\",\n            \"1d5c9a930b5a4558ab0647c90d78f085\",\n            \"61f23475c541487899f4e559125e7b46\",\n            \"40212ae99d6e40e6a75836d1e6874dc3\",\n            \"890f3d9f1fa5441f9f6c0e8fb8a89c8f\"\n          ]\n        }\n      },\n      \"source\": [\n        \"#@title Sequence Classification : paraphrase classification\\n\",\n        \"from transformers import AutoTokenizer, TFAutoModelForSequenceClassification\\n\",\n        \"import tensorflow as tf\\n\",\n        \"\\n\",\n        \"tokenizer = AutoTokenizer.from_pretrained(\\\"bert-base-cased-finetuned-mrpc\\\")\\n\",\n        \"model = TFAutoModelForSequenceClassification.from_pretrained(\\\"bert-base-cased-finetuned-mrpc\\\")\\n\",\n        \"\\n\",\n        \"classes = [\\\"not paraphrase\\\", \\\"is paraphrase\\\"]\\n\",\n        \"\\n\",\n        \"sequence_A = \\\"The DVD-CCA then appealed to the state Supreme Court.\\\"\\n\",\n        \"sequence_B = \\\"The DVD CCA appealed that decision to the U.S. 
Supreme Court.\\\"\\n\",\n        \"\\n\",\n        \"paraphrase = tokenizer.encode_plus(sequence_A, sequence_B, return_tensors=\\\"tf\\\")\\n\",\n        \"\\n\",\n        \"paraphrase_classification_logits = model(paraphrase)[0]\\n\",\n        \"\\n\",\n        \"paraphrase_results = tf.nn.softmax(paraphrase_classification_logits, axis=1).numpy()[0]\\n\",\n        \"\\n\",\n        \"print(sequence_B, \\\"should be a paraphrase\\\")\\n\",\n        \"for i in range(len(classes)):\\n\",\n        \"    print(f\\\"{classes[i]}: {round(paraphrase_results[i] * 100)}%\\\")\"\n      ],\n      \"execution_count\": 3,\n      \"outputs\": [\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"application/vnd.jupyter.widget-view+json\": {\n              \"model_id\": \"7ce0e4d211f34e298db9bde71aafd31d\",\n              \"version_minor\": 0,\n              \"version_major\": 2\n            },\n            \"text/plain\": [\n              \"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=433.0, style=ProgressStyle(description_…\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"application/vnd.jupyter.widget-view+json\": {\n              \"model_id\": \"812815f9249b4f6cb138aed2e6a03fd4\",\n              \"version_minor\": 0,\n              \"version_major\": 2\n            },\n            \"text/plain\": [\n              \"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=213450.0, style=ProgressStyle(descripti…\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": 
[\n            \"\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"application/vnd.jupyter.widget-view+json\": {\n              \"model_id\": \"b88724d6b16e472f8ede902cac4ae6f2\",\n              \"version_minor\": 0,\n              \"version_major\": 2\n            },\n            \"text/plain\": [\n              \"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=433518744.0, style=ProgressStyle(descri…\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\n\",\n            \"The DVD CCA appealed that decision to the U.S. Supreme Court. should be a paraphrase\\n\",\n            \"not paraphrase: 8.0%\\n\",\n            \"is paraphrase: 92.0%\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"PyQKscwtYgCW\",\n        \"colab_type\": \"code\",\n        \"outputId\": \"8ea9d4b7-0c4c-4bd9-d9c2-b497db7253f3\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 299,\n          \"referenced_widgets\": [\n            \"d6106a736cf046599bc3836b40ad804f\",\n            \"76f684a27781484f9cd5ef43df693943\",\n            \"d8d9185e24604408a59bb75404fd7daa\",\n            \"6c0bd986d2664ca896244e3858448962\",\n            \"a76a6ab098f1470cafcba41f27c75e74\",\n            \"1df0601050254a59a1954c4c5d1d2a76\",\n            \"ac8d253331bb458c8d9f764303bc9f0b\",\n            \"93bad43a579342e79fdafdf673a3f8a2\",\n            \"cf32487a5c3d4a898ca91e270c3f266b\",\n            \"8502995caabc474c91bca08b97cfaa58\",\n            \"98b911d9620c40a79c7fee410461d039\",\n            \"8fb914c5733747e3bae86fcb13073767\",\n            
\"21acd0fcd9214093aa0e93845052ef7e\",\n            \"aa67bf7e7cf940848b4061bee967052c\",\n            \"b1d11f8f842540a682717d350d254155\",\n            \"8418d2e012a14387b231fe12ce0b9a1a\",\n            \"50993327f1d04882a66df50dd30cff3e\",\n            \"61c52c61d9c740efab94997706257ba4\",\n            \"e23a4ab1a468460a9e4b0f34ae67ef76\",\n            \"3f1629314ca6407b832db9349a508461\",\n            \"d577e0e2741d43b8a73489c1a0df2406\",\n            \"76ccbe1be44a4e019e6a50d32c9abf98\",\n            \"c6eb63c7e9e34d6d989427b9dbe9457c\",\n            \"3d9b5bff09414a5dadc8f6f3ea279227\",\n            \"ffd924b0cc9d492d888e5da831481033\",\n            \"555dfb4c02df4930ae64f4f56ed158b7\",\n            \"dc47306a92f648d4a690da79d39ac4cc\",\n            \"c4dfc9d018534634a056aa3da58fcfef\",\n            \"0a03eaaa65144c829bf93cacc2f66e69\",\n            \"116c0c19179c4c5d847257a531f9269b\",\n            \"aa9be08c10e44dfc8732f3419b5cc967\",\n            \"86cbe5290e654770888f9cfba100ae4e\",\n            \"5b66512a4f6944b2ab0a78631d502da3\",\n            \"f861ad27060640a49d34dbc6384a236e\",\n            \"63c8f259ede54eab8b7eacc2cd393191\",\n            \"611edbb55ee74d808993a39de5044275\",\n            \"336b49d4e3d741aea03fecc236e6333a\",\n            \"b8c1e428241d4e67abb5df0be3eca758\",\n            \"37c31cab7d4745d9a7944e4bdda8b970\",\n            \"8fe12fc6327d4027ab66cb5815760e75\"\n          ]\n        }\n      },\n      \"source\": [\n        \"#@title Named Entity Recognition(NER)\\n\",\n        \"from transformers import pipeline\\n\",\n        \"nlp = pipeline(\\\"ner\\\")\\n\",\n        \"sequence = \\\"Hugging Face Inc. is a company based in New York City. 
Its headquarters are in DUMBO, therefore very\\\" \\\\\\n\",\n        \"           \\\"close to the Manhattan Bridge which is visible from the window.\\\"\\n\",\n        \"print(nlp(sequence))\"\n      ],\n      \"execution_count\": 4,\n      \"outputs\": [\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"application/vnd.jupyter.widget-view+json\": {\n              \"model_id\": \"d6106a736cf046599bc3836b40ad804f\",\n              \"version_minor\": 0,\n              \"version_major\": 2\n            },\n            \"text/plain\": [\n              \"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=625.0, style=ProgressStyle(description_…\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"application/vnd.jupyter.widget-view+json\": {\n              \"model_id\": \"cf32487a5c3d4a898ca91e270c3f266b\",\n              \"version_minor\": 0,\n              \"version_major\": 2\n            },\n            \"text/plain\": [\n              \"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=213450.0, style=ProgressStyle(descripti…\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"application/vnd.jupyter.widget-view+json\": {\n              \"model_id\": \"50993327f1d04882a66df50dd30cff3e\",\n              \"version_minor\": 0,\n              \"version_major\": 2\n            },\n      
      \"text/plain\": [\n              \"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=998.0, style=ProgressStyle(description_…\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"application/vnd.jupyter.widget-view+json\": {\n              \"model_id\": \"ffd924b0cc9d492d888e5da831481033\",\n              \"version_minor\": 0,\n              \"version_major\": 2\n            },\n            \"text/plain\": [\n              \"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=230.0, style=ProgressStyle(description_…\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"application/vnd.jupyter.widget-view+json\": {\n              \"model_id\": \"5b66512a4f6944b2ab0a78631d502da3\",\n              \"version_minor\": 0,\n              \"version_major\": 2\n            },\n            \"text/plain\": [\n              \"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=1334448817.0, style=ProgressStyle(descr…\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\n\",\n            \"[{'word': 'Hu', 'score': 0.9995632767677307, 'entity': 'I-ORG', 'index': 1}, {'word': '##gging', 'score': 0.9915938377380371, 'entity': 'I-ORG', 'index': 2}, {'word': 'Face', 'score': 
0.9982671737670898, 'entity': 'I-ORG', 'index': 3}, {'word': 'Inc', 'score': 0.9994403719902039, 'entity': 'I-ORG', 'index': 4}, {'word': 'New', 'score': 0.9994346499443054, 'entity': 'I-LOC', 'index': 11}, {'word': 'York', 'score': 0.9993270635604858, 'entity': 'I-LOC', 'index': 12}, {'word': 'City', 'score': 0.9993864893913269, 'entity': 'I-LOC', 'index': 13}, {'word': 'D', 'score': 0.9825621843338013, 'entity': 'I-LOC', 'index': 19}, {'word': '##UM', 'score': 0.936983048915863, 'entity': 'I-LOC', 'index': 20}, {'word': '##BO', 'score': 0.8987101316452026, 'entity': 'I-LOC', 'index': 21}, {'word': 'Manhattan', 'score': 0.9758241176605225, 'entity': 'I-LOC', 'index': 29}, {'word': 'Bridge', 'score': 0.9902493953704834, 'entity': 'I-LOC', 'index': 30}]\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"lbwBChUX7grO\",\n        \"colab_type\": \"text\"\n      },\n      \"source\": [\n        \"prosody represented by \\\"ha,ha\\\" with sure. Could be positive or negative. Context required. 
\\\"Not\\\" \\\"else\\\", \\\"however\\\" are too strong in this model.\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"l871bLNcNWiA\",\n        \"colab_type\": \"code\",\n        \"outputId\": \"37e7bce6-1c08-4872-e78a-1f9ebc9f32dd\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 213,\n          \"referenced_widgets\": [\n            \"10323e3b6d3f43b6b0d0e23de45b5729\",\n            \"8bc14cbd5a01449eb01f2e01e9db2fa2\",\n            \"43faf48017b84e8b9b0c5bf29337ed30\",\n            \"7eccb6f09783498ab44020360c1e0062\",\n            \"6738833ebeb94b60912151d886ca083e\",\n            \"6b4c9bb4710541b29c2894739c24472c\",\n            \"f2febefe55c64bea8d5fa39b8c94ba01\",\n            \"1a8391734975484ca3feaf86d6da1161\",\n            \"1314be84dd424e94b13ee568840c7ea2\",\n            \"63392f66c53d479199e24d8d8a45823a\",\n            \"0cdc54e5dc434c73be4ef332025d0d97\",\n            \"bd0ab5247d6349819f4ba64533e5cc80\",\n            \"e31316c758bb4236905d4291302a9e15\",\n            \"20af50bd6ae74e31b3c0b7f2b6e054d5\",\n            \"8b035f4d02f2447a961ce83766868e09\",\n            \"28c23c928d8c440783419e71bfb917d7\",\n            \"f5c80d9b2f804af3a4bf07f97ba06bf1\",\n            \"bcfe5e255d8a422baf1175c3b23e52e5\",\n            \"b50f807793aa404f9ce54c55403f8dd3\",\n            \"6edc69c824bc4f63914e5850e589448a\",\n            \"224b69b50c9846a1b244134d23635efd\",\n            \"e26e0596f8534c79b7595361eda687ce\",\n            \"d04989abce5d481dbee28ac450a32d2d\",\n            \"38a758bfadeb443686bc615481cd9da9\",\n            \"0d335835f44548efadcf0ecd8a49e391\",\n            \"b08f64e0d5994eb9adffdfc1d48e9088\",\n            \"45ea2974a40c4786a43ee07cdbbcd693\",\n            \"35dd5a9f5c0843838571880dc058661a\",\n            \"c1e3435e66d64c498e2334da0c521b60\",\n            \"39500a63ce0c401a8f52d3f1610e14a3\",\n            
\"e1be44708f6d4fc0a64a36953787eef0\",\n            \"13edcd9f5be5479d894bbe150c64e3d0\"\n          ]\n        }\n      },\n      \"source\": [\n        \"#@title Winograd\\n\",\n        \"from transformers import pipeline\\n\",\n        \"translator = pipeline(\\\"translation_en_to_fr\\\")\"\n      ],\n      \"execution_count\": 5,\n      \"outputs\": [\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"application/vnd.jupyter.widget-view+json\": {\n              \"model_id\": \"10323e3b6d3f43b6b0d0e23de45b5729\",\n              \"version_minor\": 0,\n              \"version_major\": 2\n            },\n            \"text/plain\": [\n              \"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=1199.0, style=ProgressStyle(description…\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"application/vnd.jupyter.widget-view+json\": {\n              \"model_id\": \"1314be84dd424e94b13ee568840c7ea2\",\n              \"version_minor\": 0,\n              \"version_major\": 2\n            },\n            \"text/plain\": [\n              \"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=791656.0, style=ProgressStyle(descripti…\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"application/vnd.jupyter.widget-view+json\": {\n              \"model_id\": 
\"f5c80d9b2f804af3a4bf07f97ba06bf1\",\n              \"version_minor\": 0,\n              \"version_major\": 2\n            },\n            \"text/plain\": [\n              \"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=230.0, style=ProgressStyle(description_…\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"application/vnd.jupyter.widget-view+json\": {\n              \"model_id\": \"0d335835f44548efadcf0ecd8a49e391\",\n              \"version_minor\": 0,\n              \"version_major\": 2\n            },\n            \"text/plain\": [\n              \"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=891691430.0, style=ProgressStyle(descri…\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"Jslzg16dTa0K\",\n        \"colab_type\": \"code\",\n        \"outputId\": \"083a682a-2b86-47c7-c6fd-ffc83f5cd829\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 34\n        }\n      },\n      \"source\": [\n        \"print(translator(\\\"The car could not go in the garage because it was too big.\\\", max_length=40))\"\n      ],\n      \"execution_count\": 6,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"[{'translation_text': \\\"La voiture ne pouvait pas aller dans le garage parce qu'elle était trop 
grosse.\\\"}]\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    }\n  ]\n}"
  },
  {
    "path": "Chapter05/BLEU.py",
    "content": "#BLEU : Bilingual Evaluation Understudy Score\n#Copyright 2020, MIT License BLEU Examples\n#REF PAPER: Kishore Papineni, et al.,2002,“BLEU: a Method for Automatic Evaluation of Machine Translation“. \n#                                                https://www.aclweb.org/anthology/P02-1040.pdf\n#NLTK : Natural Language Toolkit\n#NLTK sentence_bleu doc: http://www.nltk.org/api/nltk.translate.html#nltk.translate.bleu_score.sentence_bleu\n#NLTK smoothing doc: https://www.nltk.org/api/nltk.translate.html\n#NLTK REF PAPER for smoothing():Chen et al.,http://acl2014.org/acl2014/W14-33/pdf/W14-3346.pdf\n#REF DOC  : https://machinelearningmastery.com/calculate-bleu-score-for-text-python/\n\nfrom nltk.translate.bleu_score import sentence_bleu\nfrom nltk.translate.bleu_score import SmoothingFunction\n\n#Example 1\nreference = [['the', 'cat', 'likes', 'milk'], ['cat', 'likes' 'milk']]\ncandidate = ['the', 'cat', 'likes', 'milk']\nscore = sentence_bleu(reference, candidate)\nprint('Example 1', score)\n\n#Example 2\nreference = [['the', 'cat', 'likes', 'milk']]\ncandidate = ['the', 'cat', 'likes', 'milk']\nscore = sentence_bleu(reference, candidate)\nprint('Example 2', score)\n\n#Example 3\nreference = [['the', 'cat', 'likes', 'milk']]\ncandidate = ['the', 'cat', 'enjoys','milk']\nscore = sentence_bleu(reference, candidate)\nprint('Example 3', score)\n\n\n#Example 4\nreference = [['je','vous','invite', 'a', 'vous', 'lever','pour', 'cette', 'minute', 'de', 'silence']]\ncandidate = ['levez','vous','svp','pour', 'cette', 'minute', 'de', 'silence']\nscore = sentence_bleu(reference, candidate)\nprint(\"without soothing score\", score)\n\nchencherry = SmoothingFunction()\nr1=list('je vous invite a vous lever pour cette minute de silence')\ncandidate=list('levez vous svp pour cette minute de silence')\n        \n#sentence_bleu([reference1, reference2, reference3], hypothesis2,smoothing_function=chencherry.method1)\nprint(\"with smoothing score\",sentence_bleu([r1], 
candidate,smoothing_function=chencherry.method1))\n\n\n"
  },
  {
    "path": "Chapter05/Trax_Translation.ipynb",
    "content": "{\n  \"nbformat\": 4,\n  \"nbformat_minor\": 0,\n  \"metadata\": {\n    \"colab\": {\n      \"name\": \"Trax_Translation.ipynb\",\n      \"provenance\": [],\n      \"collapsed_sections\": []\n    },\n    \"kernelspec\": {\n      \"name\": \"python3\",\n      \"display_name\": \"Python 3\"\n    },\n    \"accelerator\": \"GPU\"\n  },\n  \"cells\": [\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"1liQji85FAIp\"\n      },\n      \"source\": [\n        \"#Machine Translation with Trax\\n\",\n        \"\\n\",\n        \"Note by Denis Rothman: The original notebook was split into cells.\\n\",\n        \"\\n\",\n        \"[Reference Code](https://colab.research.google.com/github/google/trax/blob/master/trax/intro.ipynb)\\n\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"h0pjcihTE9fR\"\n      },\n      \"source\": [\n        \"#@title Installing Trax\\n\",\n        \"import os\\n\",\n        \"import numpy as np\\n\",\n        \"\\n\",\n        \"!pip install -q -U trax\\n\",\n        \"import trax\"\n      ],\n      \"execution_count\": 7,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"ivTjrL-BMD8i\"\n      },\n      \"source\": [\n        \"#@title Creating\\n\",\n        \"# Pre-trained model config in gs://trax-ml/models/translation/ende_wmt32k.gin\\n\",\n        \"model = trax.models.Transformer(\\n\",\n        \"    input_vocab_size=33300,\\n\",\n        \"    d_model=512, d_ff=2048,\\n\",\n        \"    n_heads=8, n_encoder_layers=6, n_decoder_layers=6,\\n\",\n        \"    max_len=2048, mode='predict')\\n\"\n      ],\n      \"execution_count\": 8,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"oJgRqlrmMKbo\"\n      },\n      \"source\": [\n        \"#@title Initializing the model using pre-trained weights\\n\",\n        
\"model.init_from_file('gs://trax-ml/models/translation/ende_wmt32k.pkl.gz',\\n\",\n        \"                     weights_only=True)\"\n      ],\n      \"execution_count\": 9,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"HvwJ5w-6MQNw\"\n      },\n      \"source\": [\n        \"#@title Tokenizing a sentence\\n\",\n        \"sentence = 'I am only a machine but I have machine intelligence.'\\n\",\n        \"\\n\",\n        \"tokenized = list(trax.data.tokenize(iter([sentence]),  # Operates on streams.\\n\",\n        \"                                    vocab_dir='gs://trax-ml/vocabs/',\\n\",\n        \"                                    vocab_file='ende_32k.subword'))[0]\\n\"\n      ],\n      \"execution_count\": 10,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"IVkBQOvmMW9A\"\n      },\n      \"source\": [\n        \"#@title Decoding from the Transformer\\n\",\n        \"tokenized = tokenized[None, :]  # Add batch dimension.\\n\",\n        \"tokenized_translation = trax.supervised.decoding.autoregressive_sample(\\n\",\n        \"    model, tokenized, temperature=0.0)  # Higher temperature: more diverse results.\\n\"\n      ],\n      \"execution_count\": 11,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"id\": \"QV2xr8_7Mc4B\",\n        \"outputId\": \"c78c12ea-84a1-4fd5-fb2e-770fadc19e8b\"\n      },\n      \"source\": [\n        \"#@title De-tokenizing and Displaying the Translation\\n\",\n        \"tokenized_translation = tokenized_translation[0][:-1]  # Remove batch and EOS.\\n\",\n        \"translation = trax.data.detokenize(tokenized_translation,\\n\",\n        \"                                   vocab_dir='gs://trax-ml/vocabs/',\\n\",\n        \"                                   
vocab_file='ende_32k.subword')\\n\",\n        \"print(\\\"The sentence:\\\",sentence)\\n\",\n        \"print(\\\"The translation:\\\",translation)\"\n      ],\n      \"execution_count\": 12,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"The sentence: I am only a machine but I have machine intelligence.\\n\",\n            \"The translation: Ich bin nur eine Maschine, aber ich habe Maschinenübersicht.\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    }\n  ]\n}"
  },
  {
    "path": "Chapter05/read.py",
    "content": "#Pre-Processing datasets for Machine Translation\n#Copyright 2020, Denis Rothman, MIT License\n#Denis Rothman modified the code for educational purposes.\n#Reference:\n#Jason Brownlee PhD, ‘How to Prepare a French-to-English Dataset for Machine Translation\n# https://machinelearningmastery.com/prepare-french-english-dataset-machine-translation/\n\nimport pickle\nfrom pickle import dump\n\n# load doc into memory\ndef load_doc(filename):\n\t# open the file as read only\n\tfile = open(filename, mode='rt', encoding='utf-8')\n\t# read all text\n\ttext = file.read()\n\t# close the file\n\tfile.close()\n\treturn text\n \n# split a loaded document into sentences\ndef to_sentences(doc):\n\treturn doc.strip().split('\\n')\n \n# shortest and longest sentence lengths\ndef sentence_lengths(sentences):\n\tlengths = [len(s.split()) for s in sentences]\n\treturn min(lengths), max(lengths)\n\n# clean lines\nimport re\nimport string\nimport unicodedata\ndef clean_lines(lines):\n\tcleaned = list()\n\t# prepare regex for char filtering\n\tre_print = re.compile('[^%s]' % re.escape(string.printable))\n\t# prepare translation table for removing punctuation\n\ttable = str.maketrans('', '', string.punctuation)\n\tfor line in lines:\n\t\t# normalize unicode characters\n\t\tline = unicodedata.normalize('NFD', line).encode('ascii', 'ignore')\n\t\tline = line.decode('UTF-8')\n\t\t# tokenize on white space\n\t\tline = line.split()\n\t\t# convert to lower case\n\t\tline = [word.lower() for word in line]\n\t\t# remove punctuation from each token\n\t\tline = [word.translate(table) for word in line]\n\t\t# remove non-printable chars form each token\n\t\tline = [re_print.sub('', w) for w in line]\n\t\t# remove tokens with numbers in them\n\t\tline = [word for word in line if word.isalpha()]\n\t\t# store as string\n\t\tcleaned.append(' '.join(line))\n\treturn cleaned\n  \n# load English data\nfilename = 'europarl-v7.fr-en.en'\ndoc = load_doc(filename)\nsentences = 
to_sentences(doc)\nminlen, maxlen = sentence_lengths(sentences)\nprint('English data: sentences=%d, min=%d, max=%d' % (len(sentences), minlen, maxlen))\ncleanf=clean_lines(sentences)\nfilename = 'English.pkl'\noutfile = open(filename,'wb')\npickle.dump(cleanf,outfile)\noutfile.close()\nprint(filename,\" saved\")\n\n# load English data\nfilename = 'europarl-v7.fr-en.fr'\ndoc = load_doc(filename)\nsentences = to_sentences(doc)\nminlen, maxlen = sentence_lengths(sentences)\nprint('French data: sentences=%d, min=%d, max=%d' % (len(sentences), minlen, maxlen))\ncleanf=clean_lines(sentences)\nfilename = 'French.pkl'\noutfile = open(filename,'wb')\npickle.dump(cleanf,outfile)\noutfile.close()\nprint(filename,\" saved\")\n\n\n"
  },
  {
    "path": "Chapter05/read_clean.py",
    "content": "#Pre-Processing datasets for Machine Translation\n#Copyright 2020, Denis Rothman, MIT License\n#Denis Rothman modified the code for educational purposes.\n#Reference:\n#Jason Brownlee PhD, ‘How to Prepare a French-to-English Dataset for Machine Translation\n# https://machinelearningmastery.com/prepare-french-english-dataset-machine-translation/\n\n\nfrom pickle import load\nfrom pickle import dump\nfrom collections import Counter\n \n# load a clean dataset\ndef load_clean_sentences(filename):\n\treturn load(open(filename, 'rb'))\n \n# save a list of clean sentences to file\ndef save_clean_sentences(sentences, filename):\n\tdump(sentences, open(filename, 'wb'))\n\tprint('Saved: %s' % filename)\n \n# create a frequency table for all words\ndef to_vocab(lines):\n\tvocab = Counter()\n\tfor line in lines:\n\t\ttokens = line.split()\n\t\tvocab.update(tokens)\n\treturn vocab\n \n# remove all words with a frequency below a threshold\ndef trim_vocab(vocab, min_occurance):\n\ttokens = [k for k,c in vocab.items() if c >= min_occurance]\n\treturn set(tokens)\n \n# mark all OOV with \"unk\" for all lines\ndef update_dataset(lines, vocab):\n\tnew_lines = list()\n\tfor line in lines:\n\t\tnew_tokens = list()\n\t\tfor token in line.split():\n\t\t\tif token in vocab:\n\t\t\t\tnew_tokens.append(token)\n\t\t\telse:\n\t\t\t\tnew_tokens.append('unk')\n\t\tnew_line = ' '.join(new_tokens)\n\t\tnew_lines.append(new_line)\n\treturn new_lines\n \n# load English dataset\nfilename = 'English.pkl'\nlines = load_clean_sentences(filename)\n# calculate vocabulary\nvocab = to_vocab(lines)\nprint('English Vocabulary: %d' % len(vocab))\n# reduce vocabulary\nvocab = trim_vocab(vocab, 5)\nprint('New English Vocabulary: %d' % len(vocab))\n# mark out of vocabulary words\nlines = update_dataset(lines, vocab)\n# save updated dataset\nfilename = 'english_vocab.pkl'\nsave_clean_sentences(lines, filename)\n# spot check\nfor i in range(20):\n\tprint(\"line\",i,\":\",lines[i])\n \n# load French 
dataset\nfilename = 'French.pkl'\nlines = load_clean_sentences(filename)\n# calculate vocabulary\nvocab = to_vocab(lines)\nprint('French Vocabulary: %d' % len(vocab))\n# reduce vocabulary\nvocab = trim_vocab(vocab, 5)\nprint('New French Vocabulary: %d' % len(vocab))\n# mark out of vocabulary words\nlines = update_dataset(lines, vocab)\n# save updated dataset\nfilename = 'french_vocab.pkl'\nsave_clean_sentences(lines, filename)\n# spot check\nfor i in range(20):\n\tprint(\"line\",i,\":\",lines[i])\n"
  },
  {
    "path": "Chapter06/OpenAI_GPT_2.ipynb",
    "content": "{\"nbformat\":4,\"nbformat_minor\":0,\"metadata\":{\"colab\":{\"name\":\"OpenAI_GPT_2_KS.ipynb\",\"provenance\":[],\"collapsed_sections\":[],\"toc_visible\":true},\"kernelspec\":{\"name\":\"python3\",\"display_name\":\"Python 3\"},\"accelerator\":\"GPU\"},\"cells\":[{\"cell_type\":\"markdown\",\"metadata\":{\"id\":\"LH2YgC7LfzJZ\"},\"source\":[\"#OpenAI GTP-2\\n\",\"Copyright 2020, Denis Rothman MIT License. Denis Rothman created the Colab notebook using the OpenAI repository, adding title steps for educational purposes only.\\n\",\"\\n\",\"It is important to note that we are running a low-level GPT-2 model \\n\",\"and not a one-line call to obtain a result. We are also\\n\",\"avoiding pre-packaged versions. We are getting our hands dirty to\\n\",\"understand the architecture of a GPT-2 from scratch. You might get\\n\",\"some deprecation messages. However, the effort is worthwhile.\\n\",\"\\n\",\"***Code Reference***\\n\",\"[Reference: OpenAI Repository](https://github.com/openai/gpt-2)\\n\",\"\\n\",\"***Model Reference***\\n\",\"[Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei, Ilya Sutskever,2019,'Language Models are Unsupervised Multitask Learners'](https://d4mucfpksywv.cloudfront.net/better-language-models/language-models.pdf)\\n\",\"\\n\",\"\\n\",\"Step 1: Pre-requisite: activate GPU in the notebook settings runTime menu\\n\",\"\\n\"]},{\"cell_type\":\"code\",\"metadata\":{\"id\":\"isqdu1fpfmqM\",\"colab\":{\"base_uri\":\"https://localhost:8080/\"},\"executionInfo\":{\"status\":\"ok\",\"timestamp\":1611121642694,\"user_tz\":-330,\"elapsed\":2122,\"user\":{\"displayName\":\"Karan Sonawane\",\"photoUrl\":\"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64\",\"userId\":\"05479461208077736330\"}},\"outputId\":\"0893439c-1785-4977-ac91-9fa8088c3b03\"},\"source\":[\"#@title Step 2: Cloning the OpenAI GPT-2 Repository \\n\",\"!git clone 
https://github.com/openai/gpt-2.git\"],\"execution_count\":1,\"outputs\":[{\"output_type\":\"stream\",\"text\":[\"Cloning into 'gpt-2'...\\n\",\"remote: Enumerating objects: 233, done.\\u001b[K\\n\",\"remote: Total 233 (delta 0), reused 0 (delta 0), pack-reused 233\\u001b[K\\n\",\"Receiving objects: 100% (233/233), 4.38 MiB | 23.47 MiB/s, done.\\n\",\"Resolving deltas: 100% (124/124), done.\\n\"],\"name\":\"stdout\"}]},{\"cell_type\":\"code\",\"metadata\":{\"id\":\"7RHOjN-TjUbj\",\"colab\":{\"base_uri\":\"https://localhost:8080/\"},\"executionInfo\":{\"status\":\"ok\",\"timestamp\":1611121666299,\"user_tz\":-330,\"elapsed\":14069,\"user\":{\"displayName\":\"Karan Sonawane\",\"photoUrl\":\"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64\",\"userId\":\"05479461208077736330\"}},\"outputId\":\"3d3312bf-e2c9-489f-9a6f-061ea6a34340\"},\"source\":[\"#@title Step 3: Installing the requirements\\n\",\"import os                     # when the VM restarts import os necessary\\n\",\"os.chdir(\\\"/content/gpt-2\\\")    \\n\",\"!pip3 install -r requirements.txt\"],\"execution_count\":2,\"outputs\":[{\"output_type\":\"stream\",\"text\":[\"Collecting fire>=0.1.3\\n\",\"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/34/a7/0e22e70778aca01a52b9c899d9c145c6396d7b613719cd63db97ffa13f2f/fire-0.3.1.tar.gz (81kB)\\n\",\"\\u001b[K     |████████████████████████████████| 81kB 7.8MB/s \\n\",\"\\u001b[?25hCollecting regex==2017.4.5\\n\",\"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/36/62/c0c0d762ffd4ffaf39f372eb8561b8d491a11ace5a7884610424a8b40f95/regex-2017.04.05.tar.gz (601kB)\\n\",\"\\u001b[K     |████████████████████████████████| 604kB 24.4MB/s \\n\",\"\\u001b[?25hCollecting requests==2.21.0\\n\",\"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/7d/e3/20f3d364d6c8e5d2353c72a67778eb189176f08e873c9900e10c0287b84b/requests-2.21.0-py2.py3-none-any.whl (57kB)\\n\",\"\\u001b[K     
|████████████████████████████████| 61kB 9.6MB/s \\n\",\"\\u001b[?25hCollecting tqdm==4.31.1\\n\",\"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/6c/4b/c38b5144cf167c4f52288517436ccafefe9dc01b8d1c190e18a6b154cd4a/tqdm-4.31.1-py2.py3-none-any.whl (48kB)\\n\",\"\\u001b[K     |████████████████████████████████| 51kB 6.1MB/s \\n\",\"\\u001b[?25hRequirement already satisfied: six in /usr/local/lib/python3.6/dist-packages (from fire>=0.1.3->-r requirements.txt (line 1)) (1.15.0)\\n\",\"Requirement already satisfied: termcolor in /usr/local/lib/python3.6/dist-packages (from fire>=0.1.3->-r requirements.txt (line 1)) (1.1.0)\\n\",\"Requirement already satisfied: chardet<3.1.0,>=3.0.2 in /usr/local/lib/python3.6/dist-packages (from requests==2.21.0->-r requirements.txt (line 3)) (3.0.4)\\n\",\"Collecting idna<2.9,>=2.5\\n\",\"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl (58kB)\\n\",\"\\u001b[K     |████████████████████████████████| 61kB 9.9MB/s \\n\",\"\\u001b[?25hRequirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.6/dist-packages (from requests==2.21.0->-r requirements.txt (line 3)) (2020.12.5)\\n\",\"Requirement already satisfied: urllib3<1.25,>=1.21.1 in /usr/local/lib/python3.6/dist-packages (from requests==2.21.0->-r requirements.txt (line 3)) (1.24.3)\\n\",\"Building wheels for collected packages: fire, regex\\n\",\"  Building wheel for fire (setup.py) ... \\u001b[?25l\\u001b[?25hdone\\n\",\"  Created wheel for fire: filename=fire-0.3.1-py2.py3-none-any.whl size=111006 sha256=84d334e01481079528fbe07f0be1143f7f49c6f454c39837521ed84d822943a1\\n\",\"  Stored in directory: /root/.cache/pip/wheels/c1/61/df/768b03527bf006b546dce284eb4249b185669e65afc5fbb2ac\\n\",\"  Building wheel for regex (setup.py) ... 
\\u001b[?25l\\u001b[?25hdone\\n\",\"  Created wheel for regex: filename=regex-2017.4.5-cp36-cp36m-linux_x86_64.whl size=533190 sha256=e6d35cedb29485199a171cead1d9904cf5a633fd8b2860c419d5f1dbdfc8567f\\n\",\"  Stored in directory: /root/.cache/pip/wheels/75/07/38/3c16b529d50cb4e0cd3dbc7b75cece8a09c132692c74450b01\\n\",\"Successfully built fire regex\\n\",\"\\u001b[31mERROR: spacy 2.2.4 has requirement tqdm<5.0.0,>=4.38.0, but you'll have tqdm 4.31.1 which is incompatible.\\u001b[0m\\n\",\"\\u001b[31mERROR: google-colab 1.0.0 has requirement requests~=2.23.0, but you'll have requests 2.21.0 which is incompatible.\\u001b[0m\\n\",\"\\u001b[31mERROR: fbprophet 0.7.1 has requirement tqdm>=4.36.1, but you'll have tqdm 4.31.1 which is incompatible.\\u001b[0m\\n\",\"\\u001b[31mERROR: datascience 0.10.6 has requirement folium==0.2.1, but you'll have folium 0.8.3 which is incompatible.\\u001b[0m\\n\",\"Installing collected packages: fire, regex, idna, requests, tqdm\\n\",\"  Found existing installation: regex 2019.12.20\\n\",\"    Uninstalling regex-2019.12.20:\\n\",\"      Successfully uninstalled regex-2019.12.20\\n\",\"  Found existing installation: idna 2.10\\n\",\"    Uninstalling idna-2.10:\\n\",\"      Successfully uninstalled idna-2.10\\n\",\"  Found existing installation: requests 2.23.0\\n\",\"    Uninstalling requests-2.23.0:\\n\",\"      Successfully uninstalled requests-2.23.0\\n\",\"  Found existing installation: tqdm 4.41.1\\n\",\"    Uninstalling tqdm-4.41.1:\\n\",\"      Successfully uninstalled tqdm-4.41.1\\n\",\"Successfully installed fire-0.3.1 idna-2.8 regex-2017.4.5 requests-2.21.0 tqdm-4.31.1\\n\"],\"name\":\"stdout\"}]},{\"cell_type\":\"code\",\"metadata\":{\"id\":\"_kpNCnh9fyYD\",\"colab\":{\"base_uri\":\"https://localhost:8080/\"},\"executionInfo\":{\"status\":\"ok\",\"timestamp\":1611121682119,\"user_tz\":-330,\"elapsed\":6103,\"user\":{\"displayName\":\"Karan 
Sonawane\",\"photoUrl\":\"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64\",\"userId\":\"05479461208077736330\"}},\"outputId\":\"828003c4-1dff-4c43-d438-4c91bc573ab1\"},\"source\":[\"#@title Step 4 Checking the Version of TensorFlow \\n\",\"#Colab has tf 1.x and tf 2.x installed\\n\",\"#Restart runtime using 'Runtime' -> 'Restart runtime...'\\n\",\"%tensorflow_version 1.x\\n\",\"import tensorflow as tf\\n\",\"print(tf.__version__)\"],\"execution_count\":3,\"outputs\":[{\"output_type\":\"stream\",\"text\":[\"TensorFlow 1.x selected.\\n\",\"1.15.2\\n\"],\"name\":\"stdout\"}]},{\"cell_type\":\"code\",\"metadata\":{\"id\":\"jvVj0cLVkaPL\",\"colab\":{\"base_uri\":\"https://localhost:8080/\"},\"executionInfo\":{\"status\":\"ok\",\"timestamp\":1611121728589,\"user_tz\":-330,\"elapsed\":30531,\"user\":{\"displayName\":\"Karan Sonawane\",\"photoUrl\":\"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64\",\"userId\":\"05479461208077736330\"}},\"outputId\":\"c3cabb3a-0dbf-40aa-d231-5de3b242baab\"},\"source\":[\"#@title Step 5: Downloading the 345M parameter GPT-2 Model\\n\",\"# run code and send argument\\n\",\"import os # after runtime is restarted\\n\",\"os.chdir(\\\"/content/gpt-2\\\")\\n\",\"!python3 download_model.py '345M' \"],\"execution_count\":4,\"outputs\":[{\"output_type\":\"stream\",\"text\":[\"Fetching checkpoint: 1.00kit [00:00, 945kit/s]                                                      \\n\",\"Fetching encoder.json: 1.04Mit [00:00, 3.97Mit/s]                                                   \\n\",\"Fetching hparams.json: 1.00kit [00:00, 944kit/s]                                                    \\n\",\"Fetching model.ckpt.data-00000-of-00001: 1.42Git [00:27, 51.2Mit/s]                                 \\n\",\"Fetching model.ckpt.index: 11.0kit [00:00, 9.49Mit/s]                                               \\n\",\"Fetching model.ckpt.meta: 927kit [00:00, 3.13Mit/s]                   
                              \\n\",\"Fetching vocab.bpe: 457kit [00:00, 2.48Mit/s]                                                       \\n\"],\"name\":\"stdout\"}]},{\"cell_type\":\"code\",\"metadata\":{\"id\":\"boCr2SydkydA\",\"executionInfo\":{\"status\":\"ok\",\"timestamp\":1611121821353,\"user_tz\":-330,\"elapsed\":1106,\"user\":{\"displayName\":\"Karan Sonawane\",\"photoUrl\":\"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64\",\"userId\":\"05479461208077736330\"}}},\"source\":[\"#@title Step 6: Printing UTF encoded text to the console\\n\",\"!export PYTHONIOENCODING=UTF-8\"],\"execution_count\":5,\"outputs\":[]},{\"cell_type\":\"code\",\"metadata\":{\"id\":\"T7C7JhElk-Lh\",\"executionInfo\":{\"status\":\"ok\",\"timestamp\":1611121828604,\"user_tz\":-330,\"elapsed\":1043,\"user\":{\"displayName\":\"Karan Sonawane\",\"photoUrl\":\"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64\",\"userId\":\"05479461208077736330\"}}},\"source\":[\"#@title Step 7: Project Source Code\\n\",\"import os # import after runtime is restarted\\n\",\"os.chdir(\\\"/content/gpt-2/src\\\")\"],\"execution_count\":6,\"outputs\":[]},{\"cell_type\":\"code\",\"metadata\":{\"id\":\"ckSsdAnblFIg\",\"executionInfo\":{\"status\":\"ok\",\"timestamp\":1611121842649,\"user_tz\":-330,\"elapsed\":1122,\"user\":{\"displayName\":\"Karan Sonawane\",\"photoUrl\":\"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64\",\"userId\":\"05479461208077736330\"}}},\"source\":[\"#@title Step 7a: Interactive Conditional Samples (src)\\n\",\"#Project Source Code for Interactive Conditional Samples:\\n\",\"# /content/gpt-2/src/interactive_conditional_samples.py file \\n\",\"import json\\n\",\"import os\\n\",\"import numpy as np\\n\",\"import tensorflow as 
tf\"],\"execution_count\":7,\"outputs\":[]},{\"cell_type\":\"code\",\"metadata\":{\"id\":\"2mtuJxl8tb_B\",\"executionInfo\":{\"status\":\"ok\",\"timestamp\":1611121856018,\"user_tz\":-330,\"elapsed\":3099,\"user\":{\"displayName\":\"Karan Sonawane\",\"photoUrl\":\"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64\",\"userId\":\"05479461208077736330\"}}},\"source\":[\"#@title Step 7b: Importing model sample encoder\\n\",\"import model, sample, encoder\\n\",\"#if following message:\\n\",\"#ModuleNotFoundError: No module named 'tensorflow.contrib'\\n\",\"#then go back and run Step 2 Checking TensorFlow version \"],\"execution_count\":8,\"outputs\":[]},{\"cell_type\":\"code\",\"metadata\":{\"id\":\"SAuHo4TilJhQ\",\"executionInfo\":{\"status\":\"ok\",\"timestamp\":1611121861066,\"user_tz\":-330,\"elapsed\":1058,\"user\":{\"displayName\":\"Karan Sonawane\",\"photoUrl\":\"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64\",\"userId\":\"05479461208077736330\"}}},\"source\":[\"#@title Step 8: Defining the model\\n\",\"def interact_model(\\n\",\"    model_name,\\n\",\"    seed,\\n\",\"    nsamples,\\n\",\"    batch_size,\\n\",\"    length,\\n\",\"    temperature,\\n\",\"    top_k,\\n\",\"    models_dir\\n\",\"):\\n\",\"    models_dir = os.path.expanduser(os.path.expandvars(models_dir))\\n\",\"    if batch_size is None:\\n\",\"        batch_size = 1\\n\",\"    assert nsamples % batch_size == 0\\n\",\"\\n\",\"    enc = encoder.get_encoder(model_name, models_dir)\\n\",\"    hparams = model.default_hparams()\\n\",\"    with open(os.path.join(models_dir, model_name, 'hparams.json')) as f:\\n\",\"        hparams.override_from_dict(json.load(f))\\n\",\"\\n\",\"    if length is None:\\n\",\"        length = hparams.n_ctx // 2\\n\",\"    elif length > hparams.n_ctx:\\n\",\"        raise ValueError(\\\"Can't get samples longer than window size: %s\\\" % hparams.n_ctx)\\n\",\"\\n\",\"    with 
tf.Session(graph=tf.Graph()) as sess:\\n\",\"        context = tf.placeholder(tf.int32, [batch_size, None])\\n\",\"        np.random.seed(seed)\\n\",\"        tf.set_random_seed(seed)\\n\",\"        output = sample.sample_sequence(\\n\",\"            hparams=hparams, length=length,\\n\",\"            context=context,\\n\",\"            batch_size=batch_size,\\n\",\"            temperature=temperature, top_k=top_k\\n\",\"        )\\n\",\"\\n\",\"        saver = tf.train.Saver()\\n\",\"        ckpt = tf.train.latest_checkpoint(os.path.join(models_dir, model_name))\\n\",\"        saver.restore(sess, ckpt)\\n\",\"\\n\",\"        while True:\\n\",\"            raw_text = input(\\\"Model prompt >>> \\\")\\n\",\"            while not raw_text:\\n\",\"                print('Prompt should not be empty!')\\n\",\"                raw_text = input(\\\"Model prompt >>> \\\")\\n\",\"            context_tokens = enc.encode(raw_text)\\n\",\"            generated = 0\\n\",\"            for _ in range(nsamples // batch_size):\\n\",\"                out = sess.run(output, feed_dict={\\n\",\"                    context: [context_tokens for _ in range(batch_size)]\\n\",\"                })[:, len(context_tokens):]\\n\",\"                for i in range(batch_size):\\n\",\"                    generated += 1\\n\",\"                    text = enc.decode(out[i])\\n\",\"                    print(\\\"=\\\" * 40 + \\\" SAMPLE \\\" + str(generated) + \\\" \\\" + \\\"=\\\" * 40)\\n\",\"                    print(text)\\n\",\"            print(\\\"=\\\" * 80)\"],\"execution_count\":9,\"outputs\":[]},{\"cell_type\":\"code\",\"metadata\":{\"id\":\"P8Prbrs-UHu3\",\"colab\":{\"base_uri\":\"https://localhost:8080/\",\"height\":976},\"executionInfo\":{\"status\":\"error\",\"timestamp\":1611127917030,\"user_tz\":-330,\"elapsed\":4045790,\"user\":{\"displayName\":\"Karan 
Sonawane\",\"photoUrl\":\"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64\",\"userId\":\"05479461208077736330\"}},\"outputId\":\"9f768ee1-75a5-499a-f7f9-be0889a29f22\"},\"source\":[\"#@title Step 9: Interacting with GPT-2 \\r\\n\",\"interact_model('345M',None,1,1,300,1,0,'/content/gpt-2/models')\"],\"execution_count\":10,\"outputs\":[{\"output_type\":\"stream\",\"text\":[\"WARNING:tensorflow:From /content/gpt-2/src/sample.py:51: The name tf.AUTO_REUSE is deprecated. Please use tf.compat.v1.AUTO_REUSE instead.\\n\",\"\\n\",\"WARNING:tensorflow:From /content/gpt-2/src/model.py:148: The name tf.variable_scope is deprecated. Please use tf.compat.v1.variable_scope instead.\\n\",\"\\n\",\"WARNING:tensorflow:From /content/gpt-2/src/model.py:152: The name tf.get_variable is deprecated. Please use tf.compat.v1.get_variable instead.\\n\",\"\\n\",\"WARNING:tensorflow:From /content/gpt-2/src/model.py:36: The name tf.rsqrt is deprecated. Please use tf.math.rsqrt instead.\\n\",\"\\n\",\"WARNING:tensorflow:From /content/gpt-2/src/sample.py:64: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.\\n\",\"Instructions for updating:\\n\",\"Use `tf.cast` instead.\\n\",\"WARNING:tensorflow:From /content/gpt-2/src/sample.py:39: where (from tensorflow.python.ops.array_ops) is deprecated and will be removed in a future version.\\n\",\"Instructions for updating:\\n\",\"Use tf.where in 2.0, which has the same broadcast rule as np.where\\n\",\"WARNING:tensorflow:From /content/gpt-2/src/sample.py:67: multinomial (from tensorflow.python.ops.random_ops) is deprecated and will be removed in a future version.\\n\",\"Instructions for updating:\\n\",\"Use `tf.random.categorical` instead.\\n\",\"INFO:tensorflow:Restoring parameters from /content/gpt-2/models/345M/model.ckpt\\n\",\"======================================== SAMPLE 1 ========================================\\n\",\" But to hold to sense alone, as to 
the only thing capable of constituting our perfection, is the very aim wherein nature herself establishes herself. This shall never be the final end of human reason, as I apprehend this to be; unless, indeed, it begins from spirit, and and passes through man to no other end: therefore intellectual ideas don't contemplate any hell, the existence of which the Saccadic demon of Illustration would require for perfection.\\n\",\"\\n\",\"Now, if you should see it thus, it will seem rather to refute the sensible traits of Plato who posited nature as an objective object, when she was anathema to his spirit. Now, by conceiving of the nature of its objects as hard, dull and insufferable objects, nature abounds in practicability to delineate every part of its external parts, and appears to furnish no more expository descriptions, than the manufacturer usually has to conform to the contents of his camera. Thus the Book of the Dead, ie. the lateral kings of Flight, which shall God Himself destroy in order to release man from mortal space, contains information with a memento verbi. Nor do human actions, or nerve-angle,! however fine, cease to move their parts towards things which lie in a strait, as instinct (baculum) says. 
But since the system always transfers itself, at, the same time to forwards and backwards, and cannot come to a stop with these reverses of\\n\",\"================================================================================\\n\"],\"name\":\"stdout\"},{\"output_type\":\"error\",\"ename\":\"KeyboardInterrupt\",\"evalue\":\"ignored\",\"traceback\":[\"\\u001b[0;31m---------------------------------------------------------------------------\\u001b[0m\",\"\\u001b[0;31mKeyboardInterrupt\\u001b[0m                         Traceback (most recent call last)\",\"\\u001b[0;32m/usr/local/lib/python3.6/dist-packages/ipykernel/kernelbase.py\\u001b[0m in \\u001b[0;36m_input_request\\u001b[0;34m(self, prompt, ident, parent, password)\\u001b[0m\\n\\u001b[1;32m    728\\u001b[0m             \\u001b[0;32mtry\\u001b[0m\\u001b[0;34m:\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0m\\n\\u001b[0;32m--> 729\\u001b[0;31m                 \\u001b[0mident\\u001b[0m\\u001b[0;34m,\\u001b[0m \\u001b[0mreply\\u001b[0m \\u001b[0;34m=\\u001b[0m \\u001b[0mself\\u001b[0m\\u001b[0;34m.\\u001b[0m\\u001b[0msession\\u001b[0m\\u001b[0;34m.\\u001b[0m\\u001b[0mrecv\\u001b[0m\\u001b[0;34m(\\u001b[0m\\u001b[0mself\\u001b[0m\\u001b[0;34m.\\u001b[0m\\u001b[0mstdin_socket\\u001b[0m\\u001b[0;34m,\\u001b[0m \\u001b[0;36m0\\u001b[0m\\u001b[0;34m)\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0m\\n\\u001b[0m\\u001b[1;32m    730\\u001b[0m             \\u001b[0;32mexcept\\u001b[0m \\u001b[0mException\\u001b[0m\\u001b[0;34m:\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0m\\n\",\"\\u001b[0;32m/usr/local/lib/python3.6/dist-packages/jupyter_client/session.py\\u001b[0m in \\u001b[0;36mrecv\\u001b[0;34m(self, socket, mode, content, copy)\\u001b[0m\\n\\u001b[1;32m    802\\u001b[0m         \\u001b[0;32mtry\\u001b[0m\\u001b[0;34m:\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0m\\n\\u001b[0;32m--> 803\\u001b[0;31m             \\u001b[0mmsg_list\\u001b[0m 
\\u001b[0;34m=\\u001b[0m \\u001b[0msocket\\u001b[0m\\u001b[0;34m.\\u001b[0m\\u001b[0mrecv_multipart\\u001b[0m\\u001b[0;34m(\\u001b[0m\\u001b[0mmode\\u001b[0m\\u001b[0;34m,\\u001b[0m \\u001b[0mcopy\\u001b[0m\\u001b[0;34m=\\u001b[0m\\u001b[0mcopy\\u001b[0m\\u001b[0;34m)\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0m\\n\\u001b[0m\\u001b[1;32m    804\\u001b[0m         \\u001b[0;32mexcept\\u001b[0m \\u001b[0mzmq\\u001b[0m\\u001b[0;34m.\\u001b[0m\\u001b[0mZMQError\\u001b[0m \\u001b[0;32mas\\u001b[0m \\u001b[0me\\u001b[0m\\u001b[0;34m:\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0m\\n\",\"\\u001b[0;32m/usr/local/lib/python3.6/dist-packages/zmq/sugar/socket.py\\u001b[0m in \\u001b[0;36mrecv_multipart\\u001b[0;34m(self, flags, copy, track)\\u001b[0m\\n\\u001b[1;32m    565\\u001b[0m         \\\"\\\"\\\"\\n\\u001b[0;32m--> 566\\u001b[0;31m         \\u001b[0mparts\\u001b[0m \\u001b[0;34m=\\u001b[0m \\u001b[0;34m[\\u001b[0m\\u001b[0mself\\u001b[0m\\u001b[0;34m.\\u001b[0m\\u001b[0mrecv\\u001b[0m\\u001b[0;34m(\\u001b[0m\\u001b[0mflags\\u001b[0m\\u001b[0;34m,\\u001b[0m \\u001b[0mcopy\\u001b[0m\\u001b[0;34m=\\u001b[0m\\u001b[0mcopy\\u001b[0m\\u001b[0;34m,\\u001b[0m \\u001b[0mtrack\\u001b[0m\\u001b[0;34m=\\u001b[0m\\u001b[0mtrack\\u001b[0m\\u001b[0;34m)\\u001b[0m\\u001b[0;34m]\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0m\\n\\u001b[0m\\u001b[1;32m    567\\u001b[0m         \\u001b[0;31m# have first part already, only loop while more to receive\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0m\\n\",\"\\u001b[0;32mzmq/backend/cython/socket.pyx\\u001b[0m in \\u001b[0;36mzmq.backend.cython.socket.Socket.recv\\u001b[0;34m()\\u001b[0m\\n\",\"\\u001b[0;32mzmq/backend/cython/socket.pyx\\u001b[0m in \\u001b[0;36mzmq.backend.cython.socket.Socket.recv\\u001b[0;34m()\\u001b[0m\\n\",\"\\u001b[0;32mzmq/backend/cython/socket.pyx\\u001b[0m in 
\\u001b[0;36mzmq.backend.cython.socket._recv_copy\\u001b[0;34m()\\u001b[0m\\n\",\"\\u001b[0;32m/usr/local/lib/python3.6/dist-packages/zmq/backend/cython/checkrc.pxd\\u001b[0m in \\u001b[0;36mzmq.backend.cython.checkrc._check_rc\\u001b[0;34m()\\u001b[0m\\n\",\"\\u001b[0;31mKeyboardInterrupt\\u001b[0m: \",\"\\nDuring handling of the above exception, another exception occurred:\\n\",\"\\u001b[0;31mKeyboardInterrupt\\u001b[0m                         Traceback (most recent call last)\",\"\\u001b[0;32m<ipython-input-10-1a68aaa30b29>\\u001b[0m in \\u001b[0;36m<module>\\u001b[0;34m()\\u001b[0m\\n\\u001b[1;32m      1\\u001b[0m \\u001b[0;31m#@title Step 9: Interacting with GPT-2\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0m\\n\\u001b[0;32m----> 2\\u001b[0;31m \\u001b[0minteract_model\\u001b[0m\\u001b[0;34m(\\u001b[0m\\u001b[0;34m'345M'\\u001b[0m\\u001b[0;34m,\\u001b[0m\\u001b[0;32mNone\\u001b[0m\\u001b[0;34m,\\u001b[0m\\u001b[0;36m1\\u001b[0m\\u001b[0;34m,\\u001b[0m\\u001b[0;36m1\\u001b[0m\\u001b[0;34m,\\u001b[0m\\u001b[0;36m300\\u001b[0m\\u001b[0;34m,\\u001b[0m\\u001b[0;36m1\\u001b[0m\\u001b[0;34m,\\u001b[0m\\u001b[0;36m0\\u001b[0m\\u001b[0;34m,\\u001b[0m\\u001b[0;34m'/content/gpt-2/models'\\u001b[0m\\u001b[0;34m)\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0m\\n\\u001b[0m\",\"\\u001b[0;32m<ipython-input-9-ad542f4b3966>\\u001b[0m in \\u001b[0;36minteract_model\\u001b[0;34m(model_name, seed, nsamples, batch_size, length, temperature, top_k, models_dir)\\u001b[0m\\n\\u001b[1;32m     41\\u001b[0m \\u001b[0;34m\\u001b[0m\\u001b[0m\\n\\u001b[1;32m     42\\u001b[0m         \\u001b[0;32mwhile\\u001b[0m \\u001b[0;32mTrue\\u001b[0m\\u001b[0;34m:\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0m\\n\\u001b[0;32m---> 43\\u001b[0;31m             \\u001b[0mraw_text\\u001b[0m \\u001b[0;34m=\\u001b[0m \\u001b[0minput\\u001b[0m\\u001b[0;34m(\\u001b[0m\\u001b[0;34m\\\"Model prompt >>> 
\\\"\\u001b[0m\\u001b[0;34m)\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0m\\n\\u001b[0m\\u001b[1;32m     44\\u001b[0m             \\u001b[0;32mwhile\\u001b[0m \\u001b[0;32mnot\\u001b[0m \\u001b[0mraw_text\\u001b[0m\\u001b[0;34m:\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0m\\n\\u001b[1;32m     45\\u001b[0m                 \\u001b[0mprint\\u001b[0m\\u001b[0;34m(\\u001b[0m\\u001b[0;34m'Prompt should not be empty!'\\u001b[0m\\u001b[0;34m)\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0m\\n\",\"\\u001b[0;32m/usr/local/lib/python3.6/dist-packages/ipykernel/kernelbase.py\\u001b[0m in \\u001b[0;36mraw_input\\u001b[0;34m(self, prompt)\\u001b[0m\\n\\u001b[1;32m    702\\u001b[0m             \\u001b[0mself\\u001b[0m\\u001b[0;34m.\\u001b[0m\\u001b[0m_parent_ident\\u001b[0m\\u001b[0;34m,\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0m\\n\\u001b[1;32m    703\\u001b[0m             \\u001b[0mself\\u001b[0m\\u001b[0;34m.\\u001b[0m\\u001b[0m_parent_header\\u001b[0m\\u001b[0;34m,\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0m\\n\\u001b[0;32m--> 704\\u001b[0;31m             \\u001b[0mpassword\\u001b[0m\\u001b[0;34m=\\u001b[0m\\u001b[0;32mFalse\\u001b[0m\\u001b[0;34m,\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0m\\n\\u001b[0m\\u001b[1;32m    705\\u001b[0m         )\\n\\u001b[1;32m    706\\u001b[0m \\u001b[0;34m\\u001b[0m\\u001b[0m\\n\",\"\\u001b[0;32m/usr/local/lib/python3.6/dist-packages/ipykernel/kernelbase.py\\u001b[0m in \\u001b[0;36m_input_request\\u001b[0;34m(self, prompt, ident, parent, password)\\u001b[0m\\n\\u001b[1;32m    732\\u001b[0m             \\u001b[0;32mexcept\\u001b[0m \\u001b[0mKeyboardInterrupt\\u001b[0m\\u001b[0;34m:\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0m\\n\\u001b[1;32m    733\\u001b[0m                 \\u001b[0;31m# re-raise KeyboardInterrupt, to truncate 
traceback\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0m\\n\\u001b[0;32m--> 734\\u001b[0;31m                 \\u001b[0;32mraise\\u001b[0m \\u001b[0mKeyboardInterrupt\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0m\\n\\u001b[0m\\u001b[1;32m    735\\u001b[0m             \\u001b[0;32melse\\u001b[0m\\u001b[0;34m:\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0m\\n\\u001b[1;32m    736\\u001b[0m                 \\u001b[0;32mbreak\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0;34m\\u001b[0m\\u001b[0m\\n\",\"\\u001b[0;31mKeyboardInterrupt\\u001b[0m: \"]}]}]}"
  },
  {
    "path": "Chapter06/Training_OpenAI_GPT_2.ipynb",
"content": "{\n  \"nbformat\": 4,\n  \"nbformat_minor\": 0,\n  \"metadata\": {\n    \"colab\": {\n      \"name\": \"Training OpenAI GPT-2.ipynb\",\n      \"provenance\": [],\n      \"collapsed_sections\": [],\n      \"toc_visible\": true\n    },\n    \"kernelspec\": {\n      \"name\": \"python3\",\n      \"display_name\": \"Python 3\"\n    },\n    \"accelerator\": \"GPU\"\n  },\n  \"cells\": [\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"LH2YgC7LfzJZ\",\n        \"colab_type\": \"text\"\n      },\n      \"source\": [\n        \"#Training OpenAI GPT-2\\n\",\n        \"Copyright 2020, Denis Rothman MIT License. Denis Rothman created the Colab notebook using the OpenAI repository, adding title steps for educational purposes only.\\n\",\n        \"\\n\",\n        \"***Code References***\\n\",\n        \"\\n\",\n        \"[Reference: OpenAI Repository](https://github.com/openai/gpt-2)\\n\",\n        \"The repository was cloned and adapted to N Shepperd's repository.\\n\",\n        \"\\n\",\n        \"[Reference: N Shepperd Repository](https://github.com/nshepperd/gpt-2)\\n\",\n        \"The repository was not cloned. N Shepperd's training programs were inserted into the OpenAI Repository. The list of N Shepperd's programs is cited in the 'N Shepperd' section of the notebook. 
Some programs were modified for educational purposes only to work with this notebook.\\n\",\n        \"\\n\",\n        \"***Model Reference Paper***\\n\",\n        \"\\n\",\n        \"[Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei, Ilya Sutskever, 2019, 'Language Models are Unsupervised Multitask Learners'](https://d4mucfpksywv.cloudfront.net/better-language-models/language-models.pdf)\\n\",\n        \"\\n\",\n        \"\\n\",\n        \"***Step 1: Pre-requisites:***\\n\",\n        \"\\n\",\n        \"a) Activate the GPU in the notebook settings Runtime menu <br>\\n\",\n        \"b) Upload the following program files and dset.txt (the dataset) with the file manager: train.py, load_dataset.py, encode.py, accumulate.py, memory_saving_gradients.py, dset.txt\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"isqdu1fpfmqM\",\n        \"colab_type\": \"code\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 102\n        },\n        \"outputId\": \"1d38b600-5f4a-4d66-a00c-f8cab8f5a158\"\n      },\n      \"source\": [\n        \"#@title Step 2: Cloning the OpenAI GPT-2 Repository \\n\",\n        \"#!git clone https://github.com/nshepperd/gpt-2.git\\n\",\n        \"!git clone https://github.com/openai/gpt-2.git\"\n      ],\n      \"execution_count\": 1,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Cloning into 'gpt-2'...\\n\",\n            \"remote: Enumerating objects: 230, done.\\u001b[K\\n\",\n            \"remote: Total 230 (delta 0), reused 0 (delta 0), pack-reused 230\\u001b[K\\n\",\n            \"Receiving objects: 100% (230/230), 4.38 MiB | 6.13 MiB/s, done.\\n\",\n            \"Resolving deltas: 100% (119/119), done.\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"7RHOjN-TjUbj\",\n        
\"colab_type\": \"code\",\n        \"colab\": {}\n      },\n      \"source\": [\n        \"#@title Step 3: Installing the requirements\\n\",\n        \"import os                     # when the VM restarts import os necessary\\n\",\n        \"os.chdir(\\\"/content/gpt-2\\\")    \\n\",\n        \"!pip3 install -r requirements.txt\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"q9vV73Opw68m\",\n        \"colab_type\": \"code\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 105\n        },\n        \"outputId\": \"482b8e2a-4e62-437c-dcf1-1046c2d28a0a\"\n      },\n      \"source\": [\n        \"!pip install toposort\"\n      ],\n      \"execution_count\": 3,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Collecting toposort\\n\",\n            \"  Downloading https://files.pythonhosted.org/packages/e9/8a/321cd8ea5f4a22a06e3ba30ef31ec33bea11a3443eeb1d89807640ee6ed4/toposort-1.5-py2.py3-none-any.whl\\n\",\n            \"Installing collected packages: toposort\\n\",\n            \"Successfully installed toposort-1.5\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"_kpNCnh9fyYD\",\n        \"colab_type\": \"code\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 51\n        },\n        \"outputId\": \"4d140c05-569d-4dc6-a2ba-faaa883aac88\"\n      },\n      \"source\": [\n        \"#@title Step 4: Checking TensorFlow version \\n\",\n        \"#Colab has tf 1.x and tf 2.x installed\\n\",\n        \"#Restart runtime using 'Runtime' -> 'Restart runtime...'\\n\",\n        \"%tensorflow_version 1.x\\n\",\n        \"import tensorflow as tf\\n\",\n        \"print(tf.__version__)\"\n      ],\n      
\"execution_count\": 1,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"TensorFlow 1.x selected.\\n\",\n            \"1.15.2\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"jvVj0cLVkaPL\",\n        \"colab_type\": \"code\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 136\n        },\n        \"outputId\": \"eb34c742-4323-45de-b5bb-bf403cbed7c8\"\n      },\n      \"source\": [\n        \"#@title Step 5: Downloading 117M parameter GPT-2 Model\\n\",\n        \"# run code and send argument\\n\",\n        \"import os # after runtime is restarted\\n\",\n        \"os.chdir(\\\"/content/gpt-2\\\")\\n\",\n        \"!python3 download_model.py '117M' #creates model directory\"\n      ],\n      \"execution_count\": 2,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\rFetching checkpoint:   0%|                                              | 0.00/77.0 [00:00<?, ?it/s]\\rFetching checkpoint: 1.00kit [00:00, 874kit/s]                                                      \\n\",\n            \"\\rFetching encoder.json:   0%|                                           | 0.00/1.04M [00:00<?, ?it/s]\\rFetching encoder.json: 1.04Mit [00:00, 45.4Mit/s]                                                   \\n\",\n            \"Fetching hparams.json: 1.00kit [00:00, 779kit/s]                                                    \\n\",\n            \"Fetching model.ckpt.data-00000-of-00001: 498Mit [00:06, 77.3Mit/s]                                  \\n\",\n            \"Fetching model.ckpt.index: 6.00kit [00:00, 4.30Mit/s]                                               \\n\",\n            \"Fetching model.ckpt.meta: 472kit [00:00, 54.3Mit/s]                                                 \\n\",\n            
\"Fetching vocab.bpe: 457kit [00:00, 60.0Mit/s]                                                       \\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"aV5K8rvD1b-r\",\n        \"colab_type\": \"code\",\n        \"colab\": {}\n      },\n      \"source\": [\n        \"#@title Step 6: Copying the Project Resources to src\\n\",\n        \"!cp /content/dset.txt /content/gpt-2/src/\\n\",\n        \"!cp -r /content/gpt-2/models/ /content/gpt-2/src/\"\n      ],\n      \"execution_count\": 4,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"dTUxDwtWlOLf\",\n        \"colab_type\": \"code\",\n        \"colab\": {}\n      },\n      \"source\": [\n        \"#@title Step 7: Copying the N Shepperd Training Files\\n\",\n        \"#Reference GitHub repository: https://github.com/nshepperd/gpt-2\\n\",\n        \"import os # import after runtime is restarted\\n\",\n        \"!cp /content/train.py /content/gpt-2/src/\\n\",\n        \"!cp /content/load_dataset.py /content/gpt-2/src/\\n\",\n        \"!cp /content/encode.py /content/gpt-2/src/\\n\",\n        \"!cp /content/accumulate.py /content/gpt-2/src/\\n\",\n        \"!cp /content/memory_saving_gradients.py /content/gpt-2/src/\"\n      ],\n      \"execution_count\": 5,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"B6T2OrWoOvG0\",\n        \"colab_type\": \"code\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 68\n        },\n        \"outputId\": \"12c9ede7-4ddc-4046-84d0-808ea04b4a81\"\n      },\n      \"source\": [\n        \"#@title Step 8: Encoding dataset\\n\",\n        \"import os # import after runtime is restarted\\n\",\n        \"os.chdir(\\\"/content/gpt-2/src/\\\")\\n\",\n        \"model_name=\\\"117M\\\"\\n\",\n        \"!python 
/content/gpt-2/src/encode.py dset.txt out.npz \"\n      ],\n      \"execution_count\": 7,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Reading files\\n\",\n            \"100% 1/1 [00:01<00:00,  1.27s/it]\\n\",\n            \"Writing out.npz\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"UzlkNGbAkDBk\",\n        \"colab_type\": \"code\",\n        \"colab\": {}\n      },\n      \"source\": [\n        \"#@title Step 9:Training the Model\\n\",\n        \"#Model saved after 1000 steps\\n\",\n        \"import os # import after runtime is restarted\\n\",\n        \"os.chdir(\\\"/content/gpt-2/src/\\\")\\n\",\n        \"!python train.py --dataset out.npz\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"z-zAFd2hLQ2V\",\n        \"colab_type\": \"code\",\n        \"colab\": {}\n      },\n      \"source\": [\n        \"#@title Step 10: Creating a Training Model directory\\n\",\n        \"#Creating a Training Model directory named 'tgmodel'\\n\",\n        \"import os\\n\",\n        \"run_dir = '/content/gpt-2/models/tgmodel'\\n\",\n        \"if not os.path.exists(run_dir):\\n\",\n        \"  os.makedirs(run_dir)\"\n      ],\n      \"execution_count\": 9,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"-POx-g1Ql76C\",\n        \"colab_type\": \"code\",\n        \"colab\": {}\n      },\n      \"source\": [\n        \"#@title Step 10A: Copying training Files\\n\",\n        \"!cp /content/gpt-2/src/checkpoint/run1/model-1000.data-00000-of-00001 /content/gpt-2/models/tgmodel\\n\",\n        \"!cp /content/gpt-2/src/checkpoint/run1/checkpoint /content/gpt-2/models/tgmodel\\n\",\n        \"!cp 
/content/gpt-2/src/checkpoint/run1/model-1000.index /content/gpt-2/models/tgmodel\\n\",\n        \"!cp /content/gpt-2/src/checkpoint/run1/model-1000.meta /content/gpt-2/models/tgmodel\"\n      ],\n      \"execution_count\": 10,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"hdE9nNH8m7VD\",\n        \"colab_type\": \"code\",\n        \"colab\": {}\n      },\n      \"source\": [\n        \"#@title Step 10B: Copying the OpenAI GPT-2 117M Model files\\n\",\n        \"!cp /content/gpt-2/models/117M/encoder.json /content/gpt-2/models/tgmodel\\n\",\n        \"!cp /content/gpt-2/models/117M/hparams.json /content/gpt-2/models/tgmodel\\n\",\n        \"!cp /content/gpt-2/models/117M/vocab.bpe /content/gpt-2/models/tgmodel\"\n      ],\n      \"execution_count\": 11,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"3G8NOUXjMq4u\",\n        \"colab_type\": \"code\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 34\n        },\n        \"outputId\": \"0a2bd956-a7d9-4c6d-dbbd-46fe493ed2a8\"\n      },\n      \"source\": [\n        \"#@title Step 10C: Renaming the model directories\\n\",\n        \"import os\\n\",\n        \"!mv /content/gpt-2/models/117M  /content/gpt-2/models/117M_OpenAI\\n\",\n        \"!mv /content/gpt-2/models/tgmodel  /content/gpt-2/models/117M\"\n      ],\n      \"execution_count\": 13,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"mv: cannot stat '/content/gpt-2/models/tgmodel': No such file or directory\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"h3uexz_e4d18\",\n        \"colab_type\": \"code\",\n        \"colab\": {}\n      },\n      \"source\": [\n        \"#@title Step 11: Generating Unconditional 
Samples\\n\",\n        \"import os # import after runtime is restarted\\n\",\n        \"os.chdir(\\\"/content/gpt-2/src\\\")\\n\",\n        \"!python generate_unconditional_samples.py --model_name '117M'\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"6HI7DuBK4iSU\",\n        \"colab_type\": \"code\",\n        \"colab\": {}\n      },\n      \"source\": [\n        \"#@title Step 12: Interactive Context and Completion Examples\\n\",\n        \"import os # import after runtime is restarted\\n\",\n        \"os.chdir(\\\"/content/gpt-2/src\\\")\\n\",\n        \"!python interactive_conditional_samples.py --temperature 0.8 --top_k 40 --model_name '117M'\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    }\n  ]\n}"
  },
  {
    "path": "Chapter06/gpt-2-train_files/accumulate.py",
    "content": "import argparse\nimport json\nimport os\nimport numpy as np\nimport tensorflow as tf\nimport time\n\n\nclass AccumulatingOptimizer(object):\n    def __init__(self, opt, var_list):\n        self.opt = opt\n        self.var_list = var_list\n        self.accum_vars = {tv : tf.Variable(tf.zeros_like(tv.initialized_value()), trainable=False)\n                           for tv in var_list}\n        self.total_loss = tf.Variable(tf.zeros(shape=[], dtype=tf.float32))\n        self.count_loss = tf.Variable(tf.zeros(shape=[], dtype=tf.float32))\n\n    def reset(self):\n        updates = [tv.assign(tf.zeros_like(tv)) for tv in self.accum_vars.values()]\n        updates.append(self.total_loss.assign(tf.zeros(shape=[], dtype=tf.float32)))\n        updates.append(self.count_loss.assign(tf.zeros(shape=[], dtype=tf.float32)))\n        with tf.control_dependencies(updates):\n            return tf.no_op()\n\n    def compute_gradients(self, loss):\n        grads = self.opt.compute_gradients(loss, self.var_list)\n        updates = [self.accum_vars[v].assign_add(g) for (g,v) in grads]\n        updates.append(self.total_loss.assign_add(loss))\n        updates.append(self.count_loss.assign_add(1.0))\n        with tf.control_dependencies(updates):\n            return tf.no_op()\n\n    def apply_gradients(self):\n        grads = [(g,v) for (v,g) in self.accum_vars.items()]\n        with tf.control_dependencies([self.opt.apply_gradients(grads)]):\n            return self.total_loss / self.count_loss\n"
  },
  {
    "path": "Chapter06/gpt-2-train_files/encode.py",
    "content": "#!/usr/bin/env python3\n# Usage:\n#  PYTHONPATH=src ./encode.py <file|directory|glob> /path/to/output.npz\n#  PYTHONPATH=src ./train --dataset /path/to/output.npz\n\nimport argparse\nimport numpy as np\n\nimport encoder\nfrom load_dataset import load_dataset\n\nparser = argparse.ArgumentParser(\n    description='Pre-encode text files into tokenized training set.',\n    formatter_class=argparse.ArgumentDefaultsHelpFormatter)\nparser.add_argument('--model_name', metavar='MODEL', type=str, default='117M', help='Pretrained model name')\nparser.add_argument('--combine', metavar='CHARS', type=int, default=50000, help='Concatenate files with <|endoftext|> separator into chunks of this minimum size')\nparser.add_argument('--encoding', type=str, default='utf-8', help='Set the encoding for reading and writing files.')\nparser.add_argument('in_text', metavar='PATH', type=str, help='Input file, directory, or glob pattern (utf-8 text).')\nparser.add_argument('out_npz', metavar='OUT.npz', type=str, help='Output file path')\n\ndef main():\n    models_dir='/content/gpt-2/src/models'\n    args = parser.parse_args()\n    enc = encoder.get_encoder(args.model_name,models_dir)\n    print('Reading files')\n    chunks = load_dataset(enc, args.in_text, args.combine, encoding=args.encoding)\n    print('Writing', args.out_npz)\n    np.savez_compressed(args.out_npz, *chunks)\n\n\nif __name__ == '__main__':\n    main()\n"
  },
  {
    "path": "Chapter06/gpt-2-train_files/load_dataset.py",
    "content": "import glob\nimport numpy as np\nimport os\nimport tensorflow as tf\nimport tqdm\n\n\ndef load_dataset(enc, path, combine, encoding=None):\n    paths = []\n    if os.path.isfile(path):\n        # Simple file\n        paths.append(path)\n    elif os.path.isdir(path):\n        # Directory\n        for (dirpath, _, fnames) in os.walk(path):\n            for fname in fnames:\n                paths.append(os.path.join(dirpath, fname))\n    else:\n        # Assume glob\n        paths = glob.glob(path)\n\n    token_chunks = []\n    raw_text = ''\n    for path in tqdm.tqdm(paths):\n        if path.endswith('.npz'):\n            # Pre-encoded\n            with np.load(path) as npz:\n                for item in npz.files:\n                    token_chunks.append(npz[item])\n        else:\n            # Plain text\n            with open(path, 'r', encoding=encoding) as fp:\n                raw_text += fp.read()\n            if len(raw_text) >= combine:\n                tokens = np.stack(enc.encode(raw_text))\n                token_chunks.append(tokens)\n                raw_text = ''\n            else:\n                raw_text += '<|endoftext|>'\n    if raw_text:\n        tokens = np.stack(enc.encode(raw_text))\n        token_chunks.append(tokens)\n    return token_chunks\n\n\ndef binary_search(f, lo, hi):\n    if f(lo) or not f(hi):\n        return None\n    while hi > lo + 1:\n        mid = (lo + hi) // 2\n        if f(mid):\n            hi = mid\n        else:\n            lo = mid\n    return hi\n\n\nclass Sampler(object):\n    \"\"\"Fairly samples a slice from a set of variable sized chunks.\n\n    'Fairly' means that the distribution is the same as sampling from one concatenated chunk,\n    but without crossing chunk boundaries.\"\"\"\n\n    def __init__(self, chunks, seed=None):\n        self.chunks = chunks\n        self.total_size = sum(chunk.shape[0] for chunk in chunks)\n        self.boundaries = [0]\n        for i in range(len(chunks)):\n           
 self.boundaries.append(self.boundaries[-1] + chunks[i].shape[0])\n        self.rs = np.random.RandomState(seed=seed)\n\n    def sample(self, length):\n        assert length < self.total_size // len(\n            self.chunks\n        ), \"Dataset files are too small to sample {} tokens at a time\".format(\n            length)\n        while True:\n            index = self.rs.randint(0, self.total_size - length - 1)\n            i = binary_search(lambda j: self.boundaries[j] > index, 0,\n                              len(self.boundaries) - 1) - 1\n            if self.boundaries[i + 1] > index + length:\n                within_chunk = index - self.boundaries[i]\n                return self.chunks[i][within_chunk:within_chunk + length]\n"
  },
  {
    "path": "Chapter06/gpt-2-train_files/memory_saving_gradients.py",
    "content": "from toposort import toposort\nimport contextlib\nimport numpy as np\nimport tensorflow as tf\nimport tensorflow.contrib.graph_editor as ge\nimport time\nimport sys\nsys.setrecursionlimit(10000)\n# refers back to current module if we decide to split helpers out\nutil = sys.modules[__name__]\n\n# getting rid of \"WARNING:tensorflow:VARIABLES collection name is deprecated\"\nsetattr(tf.GraphKeys, \"VARIABLES\", \"variables\")\n\n# save original gradients since tf.gradient could be monkey-patched to point\n# to our version\nfrom tensorflow.python.ops import gradients as tf_gradients_lib\ntf_gradients = tf_gradients_lib.gradients\n\nMIN_CHECKPOINT_NODE_SIZE=1024    # use lower value during testing\n\n# specific versions we can use to do process-wide replacement of tf.gradients\ndef gradients_speed(ys, xs, grad_ys=None, **kwargs):\n    return gradients(ys, xs, grad_ys, checkpoints='speed', **kwargs)\n\ndef gradients_memory(ys, xs, grad_ys=None, **kwargs):\n    return gradients(ys, xs, grad_ys, checkpoints='memory', **kwargs)\n\ndef gradients_collection(ys, xs, grad_ys=None, **kwargs):\n    return gradients(ys, xs, grad_ys, checkpoints='collection', **kwargs)\n\ndef gradients(ys, xs, grad_ys=None, checkpoints='collection', **kwargs):\n    '''\n    Authors: Tim Salimans & Yaroslav Bulatov\n\n    memory efficient gradient implementation inspired by \"Training Deep Nets with Sublinear Memory Cost\"\n    by Chen et al. 
2016 (https://arxiv.org/abs/1604.06174)\n\n    ys,xs,grad_ys,kwargs are the arguments to standard tensorflow tf.gradients\n    (https://www.tensorflow.org/versions/r0.12/api_docs/python/train.html#gradients)\n\n    'checkpoints' can either be\n        - a list consisting of tensors from the forward pass of the neural net\n          that we should re-use when calculating the gradients in the backward pass\n          all other tensors that do not appear in this list will be re-computed\n        - a string specifying how this list should be determined. currently we support\n            - 'speed':  checkpoint all outputs of convolutions and matmuls. these ops are usually the most expensive,\n                        so checkpointing them maximizes the running speed\n                        (this is a good option if nonlinearities, concats, batchnorms, etc are taking up a lot of memory)\n            - 'memory': try to minimize the memory usage\n                        (currently using a very simple strategy that identifies a number of bottleneck tensors in the graph to checkpoint)\n            - 'collection': look for a tensorflow collection named 'checkpoints', which holds the tensors to checkpoint\n    '''\n\n    #    print(\"Calling memsaving gradients with\", checkpoints)\n    if not isinstance(ys,list):\n        ys = [ys]\n    if not isinstance(xs,list):\n        xs = [xs]\n\n    bwd_ops = ge.get_backward_walk_ops([y.op for y in ys],\n                                       inclusive=True)\n\n    debug_print(\"bwd_ops: %s\", bwd_ops)\n\n    # forward ops are all ops that are candidates for recomputation\n    fwd_ops = ge.get_forward_walk_ops([x.op for x in xs],\n                                      inclusive=True,\n                                      within_ops=bwd_ops)\n    debug_print(\"fwd_ops: %s\", fwd_ops)\n\n    # exclude ops with no inputs\n    fwd_ops = [op for op in fwd_ops if op.inputs]\n\n    # don't recompute xs, remove variables\n    xs_ops = 
_to_ops(xs)\n    fwd_ops = [op for op in fwd_ops if not op in xs_ops]\n    fwd_ops = [op for op in fwd_ops if not '/assign' in op.name]\n    fwd_ops = [op for op in fwd_ops if not '/Assign' in op.name]\n    fwd_ops = [op for op in fwd_ops if not '/read' in op.name]\n    ts_all = ge.filter_ts(fwd_ops, True) # get the tensors\n    ts_all = [t for t in ts_all if '/read' not in t.name]\n    ts_all = set(ts_all) - set(xs) - set(ys)\n\n    # construct list of tensors to checkpoint during forward pass, if not\n    # given as input\n    if type(checkpoints) is not list:\n        if checkpoints == 'collection':\n            checkpoints = tf.get_collection('checkpoints')\n\n        elif checkpoints == 'speed':\n            # checkpoint all expensive ops to maximize running speed\n            checkpoints = ge.filter_ts_from_regex(fwd_ops, 'conv2d|Conv|MatMul')\n\n        elif checkpoints == 'memory':\n\n            # remove very small tensors and some weird ops\n            def fixdims(t): # tf.Dimension values are not compatible with int, convert manually\n                try:\n                    return [int(e if e.value is not None else 64) for e in t]\n                except:\n                    return [0]  # unknown shape\n            ts_all = [t for t in ts_all if np.prod(fixdims(t.shape)) > MIN_CHECKPOINT_NODE_SIZE]\n            ts_all = [t for t in ts_all if 'L2Loss' not in t.name]\n            ts_all = [t for t in ts_all if 'entropy' not in t.name]\n            ts_all = [t for t in ts_all if 'FusedBatchNorm' not in t.name]\n            ts_all = [t for t in ts_all if 'Switch' not in t.name]\n            ts_all = [t for t in ts_all if 'dropout' not in t.name]\n            # DV: FP16_FIX - need to add 'Cast' layer here to make it work for FP16\n            ts_all = [t for t in ts_all if 'Cast' not in t.name]\n\n            # filter out all tensors that are inputs of the backward graph\n            with util.capture_ops() as bwd_ops:\n                tf_gradients(ys, 
xs, grad_ys, **kwargs)\n\n            bwd_inputs = [t for op in bwd_ops for t in op.inputs]\n            # list of tensors in forward graph that is in input to bwd graph\n            ts_filtered = list(set(bwd_inputs).intersection(ts_all))\n            debug_print(\"Using tensors %s\", ts_filtered)\n\n            # try two slightly different ways of getting bottlenecks tensors\n            # to checkpoint\n            for ts in [ts_filtered, ts_all]:\n\n                # get all bottlenecks in the graph\n                bottleneck_ts = []\n                for t in ts:\n                    b = set(ge.get_backward_walk_ops(t.op, inclusive=True, within_ops=fwd_ops))\n                    f = set(ge.get_forward_walk_ops(t.op, inclusive=False, within_ops=fwd_ops))\n                    # check that there are not shortcuts\n                    b_inp = set([inp for op in b for inp in op.inputs]).intersection(ts_all)\n                    f_inp = set([inp for op in f for inp in op.inputs]).intersection(ts_all)\n                    if not set(b_inp).intersection(f_inp) and len(b_inp)+len(f_inp) >= len(ts_all):\n                        bottleneck_ts.append(t)  # we have a bottleneck!\n                    else:\n                        debug_print(\"Rejected bottleneck candidate and ops %s\", [t] + list(set(ts_all) - set(b_inp) - set(f_inp)))\n\n                # success? or try again without filtering?\n                if len(bottleneck_ts) >= np.sqrt(len(ts_filtered)): # yes, enough bottlenecks found!\n                    break\n\n            if not bottleneck_ts:\n                raise Exception('unable to find bottleneck tensors! 
please provide checkpoint nodes manually, or use checkpoints=\"speed\".')\n\n            # sort the bottlenecks\n            bottlenecks_sorted_lists = tf_toposort(bottleneck_ts, within_ops=fwd_ops)\n            sorted_bottlenecks = [t for ts in bottlenecks_sorted_lists for t in ts]\n\n            # save an approximately optimal number ~ sqrt(N)\n            N = len(ts_filtered)\n            if len(bottleneck_ts) <= np.ceil(np.sqrt(N)):\n                checkpoints = sorted_bottlenecks\n            else:\n                step = int(np.ceil(len(bottleneck_ts) / np.sqrt(N)))\n                checkpoints = sorted_bottlenecks[step::step]\n\n        else:\n            raise Exception('%s is unsupported input for \"checkpoints\"' % (checkpoints,))\n\n    checkpoints = list(set(checkpoints).intersection(ts_all))\n\n    # at this point automatic selection happened and checkpoints is list of nodes\n    assert isinstance(checkpoints, list)\n\n    debug_print(\"Checkpoint nodes used: %s\", checkpoints)\n    # better error handling of special cases\n    # xs are already handled as checkpoint nodes, so no need to include them\n    xs_intersect_checkpoints = set(xs).intersection(set(checkpoints))\n    if xs_intersect_checkpoints:\n        debug_print(\"Warning, some input nodes are also checkpoint nodes: %s\",\n                    xs_intersect_checkpoints)\n    ys_intersect_checkpoints = set(ys).intersection(set(checkpoints))\n    debug_print(\"ys: %s, checkpoints: %s, intersect: %s\", ys, checkpoints,\n                ys_intersect_checkpoints)\n    # saving an output node (ys) gives no benefit in memory while creating\n    # new edge cases, exclude them\n    if ys_intersect_checkpoints:\n        debug_print(\"Warning, some output nodes are also checkpoints nodes: %s\",\n              format_ops(ys_intersect_checkpoints))\n\n    # remove initial and terminal nodes from checkpoints list if present\n    checkpoints = list(set(checkpoints) - set(ys) - set(xs))\n\n    # check that 
we have some nodes to checkpoint\n    # if not checkpoints:\n    #     raise Exception('no checkpoints nodes found or given as input! ')\n\n    # disconnect dependencies between checkpointed tensors\n    checkpoints_disconnected = {}\n    for x in checkpoints:\n        if x.op and x.op.name is not None:\n            grad_node = tf.stop_gradient(x, name=x.op.name+\"_sg\")\n        else:\n            grad_node = tf.stop_gradient(x)\n        checkpoints_disconnected[x] = grad_node\n\n    # partial derivatives to the checkpointed tensors and xs\n    ops_to_copy = fast_backward_ops(seed_ops=[y.op for y in ys],\n                                    stop_at_ts=checkpoints, within_ops=fwd_ops)\n    debug_print(\"Found %s ops to copy within fwd_ops %s, seed %s, stop_at %s\",\n                    len(ops_to_copy), fwd_ops, [r.op for r in ys], checkpoints)\n    debug_print(\"ops_to_copy = %s\", ops_to_copy)\n    debug_print(\"Processing list %s\", ys)\n    copied_sgv, info = ge.copy_with_input_replacements(ge.sgv(ops_to_copy), {})\n    for origin_op, op in info._transformed_ops.items():\n        op._set_device(origin_op.node_def.device)\n    copied_ops = info._transformed_ops.values()\n    debug_print(\"Copied %s to %s\", ops_to_copy, copied_ops)\n    ge.reroute_ts(checkpoints_disconnected.values(), checkpoints_disconnected.keys(), can_modify=copied_ops)\n    debug_print(\"Rewired %s in place of %s restricted to %s\",\n                checkpoints_disconnected.values(), checkpoints_disconnected.keys(), copied_ops)\n\n    # get gradients with respect to current boundary + original x's\n    copied_ys = [info._transformed_ops[y.op]._outputs[0] for y in ys]\n    boundary = list(checkpoints_disconnected.values())\n    dv = tf_gradients(ys=copied_ys, xs=boundary+xs, grad_ys=grad_ys, **kwargs)\n    debug_print(\"Got gradients %s\", dv)\n    debug_print(\"for %s\", copied_ys)\n    debug_print(\"with respect to %s\", boundary+xs)\n\n    inputs_to_do_before = [y.op for y in ys]\n    if 
grad_ys is not None:\n        inputs_to_do_before += grad_ys\n    wait_to_do_ops = list(copied_ops) + [g.op for g in dv if g is not None]\n    my_add_control_inputs(wait_to_do_ops, inputs_to_do_before)\n\n    # partial derivatives to the checkpointed nodes\n    # dictionary of \"node: backprop\" for nodes in the boundary\n    d_checkpoints = {r: dr for r,dr in zip(checkpoints_disconnected.keys(),\n                                        dv[:len(checkpoints_disconnected)])}\n    # partial derivatives to xs (usually the params of the neural net)\n    d_xs = dv[len(checkpoints_disconnected):]\n\n    # incorporate derivatives flowing through the checkpointed nodes\n    checkpoints_sorted_lists = tf_toposort(checkpoints, within_ops=fwd_ops)\n    for ts in checkpoints_sorted_lists[::-1]:\n        debug_print(\"Processing list %s\", ts)\n        checkpoints_other = [r for r in checkpoints if r not in ts]\n        checkpoints_disconnected_other = [checkpoints_disconnected[r] for r in checkpoints_other]\n\n        # copy part of the graph below current checkpoint node, stopping at\n        # other checkpoints nodes\n        ops_to_copy = fast_backward_ops(within_ops=fwd_ops, seed_ops=[r.op for r in ts], stop_at_ts=checkpoints_other)\n        debug_print(\"Found %s ops to copy within %s, seed %s, stop_at %s\",\n                    len(ops_to_copy), fwd_ops, [r.op for r in ts],\n                    checkpoints_other)\n        debug_print(\"ops_to_copy = %s\", ops_to_copy)\n        if not ops_to_copy: # we're done!\n            break\n        copied_sgv, info = ge.copy_with_input_replacements(ge.sgv(ops_to_copy), {})\n        for origin_op, op in info._transformed_ops.items():\n            op._set_device(origin_op.node_def.device)\n        copied_ops = info._transformed_ops.values()\n        debug_print(\"Copied %s to %s\", ops_to_copy, copied_ops)\n        ge.reroute_ts(checkpoints_disconnected_other, checkpoints_other, can_modify=copied_ops)\n        debug_print(\"Rewired %s 
in place of %s restricted to %s\",\n                    checkpoints_disconnected_other, checkpoints_other, copied_ops)\n\n        # gradient flowing through the checkpointed node\n        boundary = [info._transformed_ops[r.op]._outputs[0] for r in ts]\n        substitute_backprops = [d_checkpoints[r] for r in ts]\n        dv = tf_gradients(boundary,\n                          checkpoints_disconnected_other+xs,\n                          grad_ys=substitute_backprops, **kwargs)\n        debug_print(\"Got gradients %s\", dv)\n        debug_print(\"for %s\", boundary)\n        debug_print(\"with respect to %s\", checkpoints_disconnected_other+xs)\n        debug_print(\"with boundary backprop substitutions %s\", substitute_backprops)\n\n        inputs_to_do_before = [d_checkpoints[r].op for r in ts]\n        wait_to_do_ops = list(copied_ops) + [g.op for g in dv if g is not None]\n        my_add_control_inputs(wait_to_do_ops, inputs_to_do_before)\n\n        # partial derivatives to the checkpointed nodes\n        for r, dr in zip(checkpoints_other, dv[:len(checkpoints_other)]):\n            if dr is not None:\n                if d_checkpoints[r] is None:\n                    d_checkpoints[r] = dr\n                else:\n                    d_checkpoints[r] += dr\n        def _unsparsify(x):\n            if not isinstance(x, tf.IndexedSlices):\n                return x\n            assert x.dense_shape is not None, \"memory_saving_gradients encountered sparse gradients of unknown shape\"\n            indices = x.indices\n            while indices.shape.ndims < x.values.shape.ndims:\n                indices = tf.expand_dims(indices, -1)\n            return tf.scatter_nd(indices, x.values, x.dense_shape)\n\n        # partial derivatives to xs (usually the params of the neural net)\n        d_xs_new = dv[len(checkpoints_other):]\n        for j in range(len(xs)):\n            if d_xs_new[j] is not None:\n                if d_xs[j] is None:\n                    d_xs[j] = 
_unsparsify(d_xs_new[j])\n                else:\n                    d_xs[j] += _unsparsify(d_xs_new[j])\n\n\n    return d_xs\n\ndef tf_toposort(ts, within_ops=None):\n    all_ops = ge.get_forward_walk_ops([x.op for x in ts], within_ops=within_ops)\n\n    deps = {}\n    for op in all_ops:\n        for o in op.outputs:\n            deps[o] = set(op.inputs)\n    sorted_ts = toposort(deps)\n\n    # only keep the tensors from our original list\n    ts_sorted_lists = []\n    for l in sorted_ts:\n        keep = list(set(l).intersection(ts))\n        if keep:\n            ts_sorted_lists.append(keep)\n\n    return ts_sorted_lists\n\ndef fast_backward_ops(within_ops, seed_ops, stop_at_ts):\n    bwd_ops = set(ge.get_backward_walk_ops(seed_ops, stop_at_ts=stop_at_ts))\n    ops = bwd_ops.intersection(within_ops).difference([t.op for t in stop_at_ts])\n    return list(ops)\n\n@contextlib.contextmanager\ndef capture_ops():\n  \"\"\"Decorator to capture ops created in the block.\n  with capture_ops() as ops:\n    # create some ops\n  print(ops) # => prints ops created.\n  \"\"\"\n\n  micros = int(time.time()*10**6)\n  scope_name = str(micros)\n  op_list = []\n  with tf.name_scope(scope_name):\n    yield op_list\n\n  g = tf.get_default_graph()\n  op_list.extend(ge.select_ops(scope_name+\"/.*\", graph=g))\n\ndef _to_op(tensor_or_op):\n  if hasattr(tensor_or_op, \"op\"):\n    return tensor_or_op.op\n  return tensor_or_op\n\ndef _to_ops(iterable):\n  if not _is_iterable(iterable):\n    return iterable\n  return [_to_op(i) for i in iterable]\n\ndef _is_iterable(o):\n  try:\n    _ = iter(o)\n  except Exception:\n    return False\n  return True\n\nDEBUG_LOGGING=False\ndef debug_print(s, *args):\n  \"\"\"Like logger.log, but also replaces all TensorFlow ops/tensors with their\n  names. 
Sensitive to value of DEBUG_LOGGING, see enable_debug/disable_debug\n\n  Usage:\n    debug_print(\"see tensors %s for %s\", tensorlist, [1,2,3])\n  \"\"\"\n\n  if DEBUG_LOGGING:\n    formatted_args = [format_ops(arg) for arg in args]\n    print(\"DEBUG \"+s % tuple(formatted_args))\n\ndef format_ops(ops, sort_outputs=True):\n  \"\"\"Helper method for printing ops. Converts Tensor/Operation op to op.name,\n  rest to str(op).\"\"\"\n\n  if hasattr(ops, '__iter__') and not isinstance(ops, str):\n    l = [(op.name if hasattr(op, \"name\") else str(op)) for op in ops]\n    if sort_outputs:\n      return sorted(l)\n    return l\n  else:\n    return ops.name if hasattr(ops, \"name\") else str(ops)\n\ndef my_add_control_inputs(wait_to_do_ops, inputs_to_do_before):\n    for op in wait_to_do_ops:\n        ci = [i for i in inputs_to_do_before if op.control_inputs is None or i not in op.control_inputs]\n        ge.add_control_inputs(op, ci)"
  },
  {
    "path": "Chapter06/gpt-2-train_files/train.py",
    "content": "#!/usr/bin/env python3\n# Usage:\n#  PYTHONPATH=src ./train --dataset <file|directory|glob>\n\nimport argparse\nimport json\nimport os\nimport numpy as np\nimport tensorflow as tf\nimport time\nimport tqdm\nfrom tensorflow.core.protobuf import rewriter_config_pb2\n\nimport model, sample, encoder\nfrom load_dataset import load_dataset, Sampler\nfrom accumulate import AccumulatingOptimizer\nimport memory_saving_gradients\n\nCHECKPOINT_DIR = 'checkpoint'\nSAMPLE_DIR = 'samples'\n\n\nparser = argparse.ArgumentParser(\n    description='Fine-tune GPT-2 on your custom dataset.',\n    formatter_class=argparse.ArgumentDefaultsHelpFormatter)\n\nparser.add_argument('--dataset', metavar='PATH', type=str, required=True, help='Input file, directory, or glob pattern (utf-8 text, or preencoded .npz files).')\nparser.add_argument('--model_name', metavar='MODEL', type=str, default='117M', help='Pretrained model name')\nparser.add_argument('--combine', metavar='CHARS', type=int, default=50000, help='Concatenate input files with <|endoftext|> separator into chunks of this minimum size')\nparser.add_argument('--encoding', type=str, default='utf-8', help='Set the encoding for reading and writing files.')\n\nparser.add_argument('--batch_size', metavar='SIZE', type=int, default=1, help='Batch size')\nparser.add_argument('--learning_rate', metavar='LR', type=float, default=0.00002, help='Learning rate for Adam')\nparser.add_argument('--accumulate_gradients', metavar='N', type=int, default=1, help='Accumulate gradients across N minibatches.')\nparser.add_argument('--memory_saving_gradients', default=False, action='store_true', help='Use gradient checkpointing to reduce vram usage.')\nparser.add_argument('--only_train_transformer_layers', default=False, action='store_true', help='Restrict training to the transformer blocks.')\nparser.add_argument('--optimizer', type=str, default='adam', help='Optimizer. 
<adam|sgd>.')\nparser.add_argument('--noise', type=float, default=0.0, help='Add noise to input training data to regularize against typos.')\n\nparser.add_argument('--top_k', type=int, default=40, help='K for top-k sampling.')\nparser.add_argument('--top_p', type=float, default=0.0, help='P for top-p sampling. Overrides top_k if set > 0.')\n\nparser.add_argument('--restore_from', type=str, default='latest', help='Either \"latest\", \"fresh\", or a path to a checkpoint file')\nparser.add_argument('--run_name', type=str, default='run1', help='Run id. Name of subdirectory in checkpoint/ and samples/')\nparser.add_argument('--sample_every', metavar='N', type=int, default=100, help='Generate samples every N steps')\nparser.add_argument('--sample_length', metavar='TOKENS', type=int, default=1023, help='Sample this many tokens')\nparser.add_argument('--sample_num', metavar='N', type=int, default=1, help='Generate this many samples')\nparser.add_argument('--save_every', metavar='N', type=int, default=1000, help='Write a checkpoint every N steps')\n\nparser.add_argument('--val_dataset', metavar='PATH', type=str, default=None, help='Dataset for validation loss, defaults to --dataset.')\nparser.add_argument('--val_batch_size', metavar='SIZE', type=int, default=2, help='Batch size for validation.')\nparser.add_argument('--val_batch_count', metavar='N', type=int, default=40, help='Number of batches for validation.')\nparser.add_argument('--val_every', metavar='STEPS', type=int, default=0, help='Calculate validation loss every STEPS steps.')\n\n\ndef maketree(path):\n    try:\n        os.makedirs(path)\n    except:\n        pass\n\n\ndef randomize(context, hparams, p):\n    if p > 0:\n        mask = tf.random.uniform(shape=tf.shape(context)) < p\n        noise = tf.random.uniform(shape=tf.shape(context), minval=0, maxval=hparams.n_vocab, dtype=tf.int32)\n        return tf.where(mask, noise, context)\n    else:\n        return context\n\n\ndef main():\n    args = 
parser.parse_args()\n    models_dir='/content/gpt-2/src/models'\n    enc = encoder.get_encoder(args.model_name,models_dir)\n    hparams = model.default_hparams()\n    with open(os.path.join('models', args.model_name, 'hparams.json')) as f:\n        hparams.override_from_dict(json.load(f))\n\n    if args.sample_length > hparams.n_ctx:\n        raise ValueError(\n            \"Can't get samples longer than window size: %s\" % hparams.n_ctx)\n\n    if args.model_name == '345M':\n        args.memory_saving_gradients = True\n        if args.optimizer == 'adam':\n            args.only_train_transformer_layers = True\n\n    config = tf.ConfigProto()\n    config.gpu_options.allow_growth = True\n    config.graph_options.rewrite_options.layout_optimizer = rewriter_config_pb2.RewriterConfig.OFF\n    with tf.Session(config=config) as sess:\n        context = tf.placeholder(tf.int32, [args.batch_size, None])\n        context_in = randomize(context, hparams, args.noise)\n        output = model.model(hparams=hparams, X=context_in)\n        loss = tf.reduce_mean(\n            tf.nn.sparse_softmax_cross_entropy_with_logits(\n                labels=context[:, 1:], logits=output['logits'][:, :-1]))\n\n        if args.val_every > 0:\n            val_context = tf.placeholder(tf.int32, [args.val_batch_size, None])\n            val_output = model.model(hparams=hparams, X=val_context)\n            val_loss = tf.reduce_mean(\n                tf.nn.sparse_softmax_cross_entropy_with_logits(\n                    labels=val_context[:, 1:], logits=val_output['logits'][:, :-1]))\n            val_loss_summary = tf.summary.scalar('val_loss', val_loss)\n\n\n        tf_sample = sample.sample_sequence(\n            hparams=hparams,\n            length=args.sample_length,\n            context=context,\n            batch_size=args.batch_size,\n            temperature=1.0,\n            top_k=args.top_k,\n            top_p=args.top_p)\n\n        all_vars = [v for v in tf.trainable_variables() if 'model' 
in v.name]\n        train_vars = [v for v in all_vars if '/h' in v.name] if args.only_train_transformer_layers else all_vars\n\n        if args.optimizer == 'adam':\n            opt = tf.train.AdamOptimizer(learning_rate=args.learning_rate)\n        elif args.optimizer == 'sgd':\n            opt = tf.train.GradientDescentOptimizer(learning_rate=args.learning_rate)\n        else:\n            exit('Bad optimizer: ' + args.optimizer)\n\n        if args.accumulate_gradients > 1:\n            if args.memory_saving_gradients:\n                exit(\"Memory saving gradients are not implemented for gradient accumulation yet.\")\n            opt = AccumulatingOptimizer(\n                opt=opt,\n                var_list=train_vars)\n            opt_reset = opt.reset()\n            opt_compute = opt.compute_gradients(loss)\n            opt_apply = opt.apply_gradients()\n            summary_loss = tf.summary.scalar('loss', opt_apply)\n        else:\n            if args.memory_saving_gradients:\n                opt_grads = memory_saving_gradients.gradients(loss, train_vars)\n            else:\n                opt_grads = tf.gradients(loss, train_vars)\n            opt_grads = list(zip(opt_grads, train_vars))\n            opt_apply = opt.apply_gradients(opt_grads)\n            summary_loss = tf.summary.scalar('loss', loss)\n\n        summary_lr = tf.summary.scalar('learning_rate', args.learning_rate)\n        summaries = tf.summary.merge([summary_lr, summary_loss])\n\n        summary_log = tf.summary.FileWriter(\n            os.path.join(CHECKPOINT_DIR, args.run_name))\n\n        saver = tf.train.Saver(\n            var_list=all_vars,\n            max_to_keep=5,\n            keep_checkpoint_every_n_hours=2)\n        sess.run(tf.global_variables_initializer())\n\n        if args.restore_from == 'latest':\n            ckpt = tf.train.latest_checkpoint(\n                os.path.join(CHECKPOINT_DIR, args.run_name))\n            if ckpt is None:\n                # Get fresh GPT 
weights if new run.\n                ckpt = tf.train.latest_checkpoint(\n                    os.path.join('models', args.model_name))\n        elif args.restore_from == 'fresh':\n            ckpt = tf.train.latest_checkpoint(\n                os.path.join('models', args.model_name))\n        else:\n            ckpt = tf.train.latest_checkpoint(args.restore_from)\n        print('Loading checkpoint', ckpt)\n        saver.restore(sess, ckpt)\n\n        print('Loading dataset...')\n        chunks = load_dataset(enc, args.dataset, args.combine, encoding=args.encoding)\n        data_sampler = Sampler(chunks)\n        if args.val_every > 0:\n            if args.val_dataset:\n                val_chunks = load_dataset(enc, args.val_dataset, args.combine, encoding=args.encoding)\n            else:\n                val_chunks = chunks\n        print('dataset has', data_sampler.total_size, 'tokens')\n        print('Training...')\n\n        if args.val_every > 0:\n            # Sample from validation set once with fixed seed to make\n            # it deterministic during training as well as across runs.\n            val_data_sampler = Sampler(val_chunks, seed=1)\n            val_batches = [[val_data_sampler.sample(1024) for _ in range(args.val_batch_size)]\n                           for _ in range(args.val_batch_count)]\n\n        counter = 1\n        counter_path = os.path.join(CHECKPOINT_DIR, args.run_name, 'counter')\n        if os.path.exists(counter_path):\n            # Load the step number if we're resuming a run\n            # Add 1 so we don't immediately try to save again\n            with open(counter_path, 'r') as fp:\n                counter = int(fp.read()) + 1\n\n        def save():\n            maketree(os.path.join(CHECKPOINT_DIR, args.run_name))\n            print(\n                'Saving',\n                os.path.join(CHECKPOINT_DIR, args.run_name,\n                             'model-{}').format(counter))\n            saver.save(\n                sess,\n  
              os.path.join(CHECKPOINT_DIR, args.run_name, 'model'),\n                global_step=counter)\n            with open(counter_path, 'w') as fp:\n                fp.write(str(counter) + '\\n')\n\n        def generate_samples():\n            print('Generating samples...')\n            context_tokens = data_sampler.sample(1)\n            all_text = []\n            index = 0\n            while index < args.sample_num:\n                out = sess.run(\n                    tf_sample,\n                    feed_dict={context: args.batch_size * [context_tokens]})\n                for i in range(min(args.sample_num - index, args.batch_size)):\n                    text = enc.decode(out[i])\n                    text = '======== SAMPLE {} ========\\n{}\\n'.format(\n                        index + 1, text)\n                    all_text.append(text)\n                    index += 1\n            print(text)\n            maketree(os.path.join(SAMPLE_DIR, args.run_name))\n            with open(\n                    os.path.join(SAMPLE_DIR, args.run_name,\n                                 'samples-{}').format(counter), 'w', encoding=args.encoding) as fp:\n                fp.write('\\n'.join(all_text))\n\n        def validation():\n            print('Calculating validation loss...')\n            losses = []\n            for batch in tqdm.tqdm(val_batches):\n                losses.append(sess.run(val_loss, feed_dict={val_context: batch}))\n            v_val_loss = np.mean(losses)\n            v_summary = sess.run(val_loss_summary, feed_dict={val_loss: v_val_loss})\n            summary_log.add_summary(v_summary, counter)\n            summary_log.flush()\n            print(\n                '[{counter} | {time:2.2f}] validation loss = {loss:2.2f}'\n                .format(\n                    counter=counter,\n                    time=time.time() - start_time,\n                    loss=v_val_loss))\n\n        def sample_batch():\n            return [data_sampler.sample(1024) 
for _ in range(args.batch_size)]\n\n\n        avg_loss = (0.0, 0.0)\n        start_time = time.time()\n\n        try:\n            while True:\n                if counter % args.save_every == 0:\n                    save()\n                if counter % args.sample_every == 0:\n                    generate_samples()\n                if args.val_every > 0 and (counter % args.val_every == 0 or counter == 1):\n                    validation()\n\n                if args.accumulate_gradients > 1:\n                    sess.run(opt_reset)\n                    for _ in range(args.accumulate_gradients):\n                        sess.run(\n                            opt_compute, feed_dict={context: sample_batch()})\n                    (v_loss, v_summary) = sess.run((opt_apply, summaries))\n                else:\n                    (_, v_loss, v_summary) = sess.run(\n                        (opt_apply, loss, summaries),\n                        feed_dict={context: sample_batch()})\n\n                summary_log.add_summary(v_summary, counter)\n\n                avg_loss = (avg_loss[0] * 0.99 + v_loss,\n                            avg_loss[1] * 0.99 + 1.0)\n\n                print(\n                    '[{counter} | {time:2.2f}] loss={loss:2.2f} avg={avg:2.2f}'\n                    .format(\n                        counter=counter,\n                        time=time.time() - start_time,\n                        loss=v_loss,\n                        avg=avg_loss[0] / avg_loss[1]))\n\n                counter += 1\n        except KeyboardInterrupt:\n            print('interrupted')\n            save()\n\n\nif __name__ == '__main__':\n    main()\n"
  },
  {
    "path": "Chapter06/head_view_bert.ipynb",
    "content": "{\n  \"nbformat\": 4,\n  \"nbformat_minor\": 0,\n  \"metadata\": {\n    \"colab\": {\n      \"name\": \"head_view_bert.ipynb\",\n      \"provenance\": [],\n      \"collapsed_sections\": []\n    },\n    \"kernelspec\": {\n      \"name\": \"python3\",\n      \"display_name\": \"Python 3\"\n    },\n    \"widgets\": {\n      \"application/vnd.jupyter.widget-state+json\": {\n        \"4d1bd7a205b94210ba8e1fd946d75821\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HBoxModel\",\n          \"state\": {\n            \"_view_name\": \"HBoxView\",\n            \"_dom_classes\": [],\n            \"_model_name\": \"HBoxModel\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"box_style\": \"\",\n            \"layout\": \"IPY_MODEL_1f2847673c374813ac442322e978eec7\",\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"children\": [\n              \"IPY_MODEL_f48103aee6dc4c06b176bc115be332e0\",\n              \"IPY_MODEL_c6f8b3bf7fce4c928a0db3651813347d\"\n            ]\n          }\n        },\n        \"1f2847673c374813ac442322e978eec7\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": 
\"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"f48103aee6dc4c06b176bc115be332e0\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"FloatProgressModel\",\n          \"state\": {\n            \"_view_name\": \"ProgressView\",\n            \"style\": \"IPY_MODEL_8d049af8ea834bf7b3a0fc7013fb3ff9\",\n            \"_dom_classes\": [],\n            \"description\": \"Downloading: 100%\",\n            \"_model_name\": \"FloatProgressModel\",\n            \"bar_style\": \"success\",\n            \"max\": 433,\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": 433,\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"orientation\": \"horizontal\",\n            \"min\": 0,\n            \"description_tooltip\": null,\n            
\"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_1dc9d68906594c1eb6db7c9f31876d2a\"\n          }\n        },\n        \"c6f8b3bf7fce4c928a0db3651813347d\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HTMLModel\",\n          \"state\": {\n            \"_view_name\": \"HTMLView\",\n            \"style\": \"IPY_MODEL_f9d8ea0a95924b0596ab4b7f091a94b7\",\n            \"_dom_classes\": [],\n            \"description\": \"\",\n            \"_model_name\": \"HTMLModel\",\n            \"placeholder\": \"​\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": \" 433/433 [00:00&lt;00:00, 1.34kB/s]\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_ba7f35525a7b4cb1951ed1a1e6a57ffd\"\n          }\n        },\n        \"8d049af8ea834bf7b3a0fc7013fb3ff9\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"ProgressStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"ProgressStyleModel\",\n            \"description_width\": \"initial\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"bar_color\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"1dc9d68906594c1eb6db7c9f31876d2a\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n   
         \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"f9d8ea0a95924b0596ab4b7f091a94b7\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"DescriptionStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"DescriptionStyleModel\",\n            \"description_width\": \"\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            
\"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"ba7f35525a7b4cb1951ed1a1e6a57ffd\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n     
       \"display\": null,\n            \"left\": null\n          }\n        },\n        \"d3ee7a14538244b1b64abbeb24948102\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HBoxModel\",\n          \"state\": {\n            \"_view_name\": \"HBoxView\",\n            \"_dom_classes\": [],\n            \"_model_name\": \"HBoxModel\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"box_style\": \"\",\n            \"layout\": \"IPY_MODEL_34507ce588b04412aaacea76987f27ea\",\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"children\": [\n              \"IPY_MODEL_c32d39b32a144c4480cd8d6b1d6c199e\",\n              \"IPY_MODEL_693433c2ec204437ac7878a8bee61647\"\n            ]\n          }\n        },\n        \"34507ce588b04412aaacea76987f27ea\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            
\"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"c32d39b32a144c4480cd8d6b1d6c199e\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"FloatProgressModel\",\n          \"state\": {\n            \"_view_name\": \"ProgressView\",\n            \"style\": \"IPY_MODEL_19a09d359acf496bb0bc68c63dea1e78\",\n            \"_dom_classes\": [],\n            \"description\": \"Downloading: 100%\",\n            \"_model_name\": \"FloatProgressModel\",\n            \"bar_style\": \"success\",\n            \"max\": 440473133,\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": 440473133,\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"orientation\": \"horizontal\",\n            \"min\": 0,\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_8e81da5616354ddb887419d81251096b\"\n          }\n        },\n        \"693433c2ec204437ac7878a8bee61647\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HTMLModel\",\n          
\"state\": {\n            \"_view_name\": \"HTMLView\",\n            \"style\": \"IPY_MODEL_2a084a03747d4c9984cc13136ccc4217\",\n            \"_dom_classes\": [],\n            \"description\": \"\",\n            \"_model_name\": \"HTMLModel\",\n            \"placeholder\": \"​\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": \" 440M/440M [00:07&lt;00:00, 61.0MB/s]\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_679c7f033d2940f489382161964c5c0d\"\n          }\n        },\n        \"19a09d359acf496bb0bc68c63dea1e78\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"ProgressStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"ProgressStyleModel\",\n            \"description_width\": \"initial\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"bar_color\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"8e81da5616354ddb887419d81251096b\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": 
null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"2a084a03747d4c9984cc13136ccc4217\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"DescriptionStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"DescriptionStyleModel\",\n            \"description_width\": \"\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"679c7f033d2940f489382161964c5c0d\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          
\"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"2861d6bdfed84911ab25f8175d718e2e\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HBoxModel\",\n          \"state\": {\n            \"_view_name\": \"HBoxView\",\n            
\"_dom_classes\": [],\n            \"_model_name\": \"HBoxModel\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"box_style\": \"\",\n            \"layout\": \"IPY_MODEL_df575c6406c0426ca9f222b818896439\",\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"children\": [\n              \"IPY_MODEL_5f214f2f5a964fdc97eda797248f720f\",\n              \"IPY_MODEL_40f8b46b4b8f44dcabd98d6a5e3044b3\"\n            ]\n          }\n        },\n        \"df575c6406c0426ca9f222b818896439\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": 
null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"5f214f2f5a964fdc97eda797248f720f\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"FloatProgressModel\",\n          \"state\": {\n            \"_view_name\": \"ProgressView\",\n            \"style\": \"IPY_MODEL_5a68a09e677d4118bc7e3efc18c35999\",\n            \"_dom_classes\": [],\n            \"description\": \"Downloading: 100%\",\n            \"_model_name\": \"FloatProgressModel\",\n            \"bar_style\": \"success\",\n            \"max\": 231508,\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": 231508,\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"orientation\": \"horizontal\",\n            \"min\": 0,\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_73ff50d8f1dc4f4e99062845a024a20b\"\n          }\n        },\n        \"40f8b46b4b8f44dcabd98d6a5e3044b3\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HTMLModel\",\n          \"state\": {\n            \"_view_name\": \"HTMLView\",\n            \"style\": \"IPY_MODEL_b8aa22a0efcb4f43a752e21bdeb73589\",\n            \"_dom_classes\": [],\n            \"description\": \"\",\n            \"_model_name\": \"HTMLModel\",\n            \"placeholder\": \"​\",\n            \"_view_module\": 
\"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": \" 232k/232k [00:00&lt;00:00, 621kB/s]\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_f2095ed84f644757888f5746c2a10ee4\"\n          }\n        },\n        \"5a68a09e677d4118bc7e3efc18c35999\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"ProgressStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"ProgressStyleModel\",\n            \"description_width\": \"initial\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"bar_color\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"73ff50d8f1dc4f4e99062845a024a20b\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            
\"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"b8aa22a0efcb4f43a752e21bdeb73589\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"DescriptionStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"DescriptionStyleModel\",\n            \"description_width\": \"\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"f2095ed84f644757888f5746c2a10ee4\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n        
    \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        }\n      }\n    }\n  },\n  \"cells\": [\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"lqAxyueMKgXT\"\n      },\n      \"source\": [\n        \"#BertViz\\n\",\n        \"\\n\",\n        \"Note: Denis Rothman added some titles to the sections of the reference notebook\\n\",\n        \"\\n\",\n        \"[Reference BertViz GitHub Repository by Jesse Vig](https://github.com/jessevig/bertviz)\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"zFo1IBx-x-rC\",\n        \"colab\": 
{\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"a2e29214-6b92-45d0-d072-39ac3195f63c\"\n      },\n      \"source\": [\n        \"#@title Step 1: Installing BertViz and Requirements\\n\",\n        \"import sys\\n\",\n        \"!test -d bertviz_repo && echo \\\"FYI: bertviz_repo directory already exists, to pull latest version uncomment this line: !rm -r bertviz_repo\\\"\\n\",\n        \"# !rm -r bertviz_repo # Uncomment if you need a clean pull from repo\\n\",\n        \"!test -d bertviz_repo || git clone https://github.com/jessevig/bertviz bertviz_repo\\n\",\n        \"if not 'bertviz_repo' in sys.path:\\n\",\n        \"  sys.path += ['bertviz_repo']\\n\",\n        \"!pip install regex\\n\",\n        \"!pip install transformers\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Cloning into 'bertviz_repo'...\\n\",\n            \"remote: Enumerating objects: 3, done.\\u001b[K\\n\",\n            \"remote: Counting objects: 100% (3/3), done.\\u001b[K\\n\",\n            \"remote: Compressing objects: 100% (3/3), done.\\u001b[K\\n\",\n            \"remote: Total 1077 (delta 0), reused 2 (delta 0), pack-reused 1074\\u001b[K\\n\",\n            \"Receiving objects: 100% (1077/1077), 100.00 MiB | 10.18 MiB/s, done.\\n\",\n            \"Resolving deltas: 100% (687/687), done.\\n\",\n            \"Requirement already satisfied: regex in /usr/local/lib/python3.6/dist-packages (2019.12.20)\\n\",\n            \"Collecting transformers\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/99/84/7bc03215279f603125d844bf81c3fb3f2d50fe8e511546eb4897e4be2067/transformers-4.0.0-py3-none-any.whl (1.4MB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 1.4MB 12.8MB/s \\n\",\n            \"\\u001b[?25hCollecting sacremoses\\n\",\n            \"\\u001b[?25l  Downloading 
https://files.pythonhosted.org/packages/7d/34/09d19aff26edcc8eb2a01bed8e98f13a1537005d31e95233fd48216eed10/sacremoses-0.0.43.tar.gz (883kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 890kB 50.3MB/s \\n\",\n            \"\\u001b[?25hRequirement already satisfied: packaging in /usr/local/lib/python3.6/dist-packages (from transformers) (20.4)\\n\",\n            \"Requirement already satisfied: numpy in /usr/local/lib/python3.6/dist-packages (from transformers) (1.18.5)\\n\",\n            \"Requirement already satisfied: regex!=2019.12.17 in /usr/local/lib/python3.6/dist-packages (from transformers) (2019.12.20)\\n\",\n            \"Requirement already satisfied: tqdm>=4.27 in /usr/local/lib/python3.6/dist-packages (from transformers) (4.41.1)\\n\",\n            \"Requirement already satisfied: requests in /usr/local/lib/python3.6/dist-packages (from transformers) (2.23.0)\\n\",\n            \"Requirement already satisfied: filelock in /usr/local/lib/python3.6/dist-packages (from transformers) (3.0.12)\\n\",\n            \"Requirement already satisfied: dataclasses; python_version < \\\"3.7\\\" in /usr/local/lib/python3.6/dist-packages (from transformers) (0.8)\\n\",\n            \"Collecting tokenizers==0.9.4\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/0f/1c/e789a8b12e28be5bc1ce2156cf87cb522b379be9cadc7ad8091a4cc107c4/tokenizers-0.9.4-cp36-cp36m-manylinux2010_x86_64.whl (2.9MB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 2.9MB 41.0MB/s \\n\",\n            \"\\u001b[?25hRequirement already satisfied: six in /usr/local/lib/python3.6/dist-packages (from sacremoses->transformers) (1.15.0)\\n\",\n            \"Requirement already satisfied: click in /usr/local/lib/python3.6/dist-packages (from sacremoses->transformers) (7.1.2)\\n\",\n            \"Requirement already satisfied: joblib in /usr/local/lib/python3.6/dist-packages (from sacremoses->transformers) (0.17.0)\\n\",\n      
      \"Requirement already satisfied: pyparsing>=2.0.2 in /usr/local/lib/python3.6/dist-packages (from packaging->transformers) (2.4.7)\\n\",\n            \"Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in /usr/local/lib/python3.6/dist-packages (from requests->transformers) (1.24.3)\\n\",\n            \"Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.6/dist-packages (from requests->transformers) (2020.11.8)\\n\",\n            \"Requirement already satisfied: idna<3,>=2.5 in /usr/local/lib/python3.6/dist-packages (from requests->transformers) (2.10)\\n\",\n            \"Requirement already satisfied: chardet<4,>=3.0.2 in /usr/local/lib/python3.6/dist-packages (from requests->transformers) (3.0.4)\\n\",\n            \"Building wheels for collected packages: sacremoses\\n\",\n            \"  Building wheel for sacremoses (setup.py) ... \\u001b[?25l\\u001b[?25hdone\\n\",\n            \"  Created wheel for sacremoses: filename=sacremoses-0.0.43-cp36-none-any.whl size=893257 sha256=3f95484e6bfe6dca37925ea4e4eb8f8fd8cd56a1234b54c56e663ce9d809bdc6\\n\",\n            \"  Stored in directory: /root/.cache/pip/wheels/29/3c/fd/7ce5c3f0666dab31a50123635e6fb5e19ceb42ce38d4e58f45\\n\",\n            \"Successfully built sacremoses\\n\",\n            \"Installing collected packages: sacremoses, tokenizers, transformers\\n\",\n            \"Successfully installed sacremoses-0.0.43 tokenizers-0.9.4 transformers-4.0.0\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"nCKW2hAUyK_4\"\n      },\n      \"source\": [\n        \"#@title Step 2: Import BertViz Head Views and BERT \\n\",\n        \"from bertviz import head_view\\n\",\n        \"from transformers import BertTokenizer, BertModel\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n 
       \"id\": \"Mv6H9QK9yLLe\"\n      },\n      \"source\": [\n        \"#@title Step 3: Defining the HTML Function\\n\",\n        \"def call_html():\\n\",\n        \"  import IPython\\n\",\n        \"  display(IPython.core.display.HTML('''\\n\",\n        \"        <script src=\\\"/static/components/requirejs/require.js\\\"></script>\\n\",\n        \"        <script>\\n\",\n        \"          requirejs.config({\\n\",\n        \"            paths: {\\n\",\n        \"              base: '/static/base',\\n\",\n        \"              \\\"d3\\\": \\\"https://cdnjs.cloudflare.com/ajax/libs/d3/3.5.8/d3.min\\\",\\n\",\n        \"              jquery: '//ajax.googleapis.com/ajax/libs/jquery/2.0.0/jquery.min',\\n\",\n        \"            },\\n\",\n        \"          });\\n\",\n        \"        </script>\\n\",\n        \"        '''))\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"fZAXH7hWyt58\",\n        \"colab\": {\n          \"resources\": {\n            \"http://localhost:8080/static/components/requirejs/require.js\": {\n              \"data\": \"/** vim: et:ts=4:sw=4:sts=4
 * @license RequireJS 2.1.22 Copyright (c) 2010-2015, The Dojo Foundation All Rights Reserved.
 * Available via the MIT or new BSD license.
 * see: http://github.com/jrburke/requirejs for details
 */
//Not using strict: uneven strict support in browsers, #392, and causes
//problems with requirejs.exec()/transpiler plugins that may not be strict.
/*jslint regexp: true, nomen: true, sloppy: true */
/*global window, navigator, document, importScripts, setTimeout, opera */

var requirejs, require, define;
(function (global) {
    var req, s, head, baseElement, dataMain, src,
        interactiveScript, currentlyAddingScript, mainScript, subPath,
        version = '2.1.22',
        commentRegExp = /(\/\*([\s\S]*?)\*\/|([^:]|^)\/\/(.*)$)/mg,
        cjsRequireRegExp = /[^.]\s*require\s*\(\s*["']([^'"\s]+)["']\s*\)/g,
        jsSuffixRegExp = /\.js$/,
        currDirRegExp = /^\.\//,
        op = Object.prototype,
        ostring = op.toString,
        hasOwn = op.hasOwnProperty,
        ap = Array.prototype,
        isBrowser = !!(typeof window !== 'undefined' && typeof navigator !== 'undefined' && window.document),
        isWebWorker = !isBrowser && typeof importScripts !== 'undefined',
        //PS3 indicates loaded and complete, but need to wait for complete
        //specifically. Sequence is 'loading', 'loaded', execution,
        // then 'complete'. The UA check is unfortunate, but not sure how
        //to feature test w/o causing perf issues.
        readyRegExp = isBrowser && navigator.platform === 'PLAYSTATION 3' ?
                      /^complete$/ : /^(complete|loaded)$/,
        defContextName = '_',
        //Oh the tragedy, detecting opera. See the usage of isOpera for reason.
        isOpera = typeof opera !== 'undefined' && opera.toString() === '[object Opera]',
        contexts = {},
        cfg = {},
        globalDefQueue = [],
        useInteractive = false;

    function isFunction(it) {
        return ostring.call(it) === '[object Function]';
    }

    function isArray(it) {
        return ostring.call(it) === '[object Array]';
    }

    /**
     * Helper function for iterating over an array. If the func returns
     * a true value, it will break out of the loop.
     */
    function each(ary, func) {
        if (ary) {
            var i;
            for (i = 0; i < ary.length; i += 1) {
                if (ary[i] && func(ary[i], i, ary)) {
                    break;
                }
            }
        }
    }

    /**
     * Helper function for iterating over an array backwards. If the func
     * returns a true value, it will break out of the loop.
     */
    function eachReverse(ary, func) {
        if (ary) {
            var i;
            for (i = ary.length - 1; i > -1; i -= 1) {
                if (ary[i] && func(ary[i], i, ary)) {
                    break;
                }
            }
        }
    }

    function hasProp(obj, prop) {
        return hasOwn.call(obj, prop);
    }

    function getOwn(obj, prop) {
        return hasProp(obj, prop) && obj[prop];
    }

    /**
     * Cycles over properties in an object and calls a function for each
     * property value. If the function returns a truthy value, then the
     * iteration is stopped.
     */
    function eachProp(obj, func) {
        var prop;
        for (prop in obj) {
            if (hasProp(obj, prop)) {
                if (func(obj[prop], prop)) {
                    break;
                }
            }
        }
    }

    /**
     * Simple function to mix in properties from source into target,
     * but only if target does not already have a property of the same name.
     */
    function mixin(target, source, force, deepStringMixin) {
        if (source) {
            eachProp(source, function (value, prop) {
                if (force || !hasProp(target, prop)) {
                    if (deepStringMixin && typeof value === 'object' && value &&
                        !isArray(value) && !isFunction(value) &&
                        !(value instanceof RegExp)) {

                        if (!target[prop]) {
                            target[prop] = {};
                        }
                        mixin(target[prop], value, force, deepStringMixin);
                    } else {
                        target[prop] = value;
                    }
                }
            });
        }
        return target;
    }

    //Similar to Function.prototype.bind, but the 'this' object is specified
    //first, since it is easier to read/figure out what 'this' will be.
    function bind(obj, fn) {
        return function () {
            return fn.apply(obj, arguments);
        };
    }

    function scripts() {
        return document.getElementsByTagName('script');
    }

    function defaultOnError(err) {
        throw err;
    }

    //Allow getting a global that is expressed in
    //dot notation, like 'a.b.c'.
    function getGlobal(value) {
        if (!value) {
            return value;
        }
        var g = global;
        each(value.split('.'), function (part) {
            g = g[part];
        });
        return g;
    }

    /**
     * Constructs an error with a pointer to an URL with more information.
     * @param {String} id the error ID that maps to an ID on a web page.
     * @param {String} message human readable error.
     * @param {Error} [err] the original error, if there is one.
     *
     * @returns {Error}
     */
    function makeError(id, msg, err, requireModules) {
        var e = new Error(msg + '\nhttp://requirejs.org/docs/errors.html#' + id);
        e.requireType = id;
        e.requireModules = requireModules;
        if (err) {
            e.originalError = err;
        }
        return e;
    }

    if (typeof define !== 'undefined') {
        //If a define is already in play via another AMD loader,
        //do not overwrite.
        return;
    }

    if (typeof requirejs !== 'undefined') {
        if (isFunction(requirejs)) {
            //Do not overwrite an existing requirejs instance.
            return;
        }
        cfg = requirejs;
        requirejs = undefined;
    }

    //Allow for a require config object
    if (typeof require !== 'undefined' && !isFunction(require)) {
        //assume it is a config object.
        cfg = require;
        require = undefined;
    }

    function newContext(contextName) {
        var inCheckLoaded, Module, context, handlers,
            checkLoadedTimeoutId,
            config = {
                //Defaults. Do not set a default for map
                //config to speed up normalize(), which
                //will run faster if there is no default.
                waitSeconds: 7,
                baseUrl: './',
                paths: {},
                bundles: {},
                pkgs: {},
                shim: {},
                config: {}
            },
            registry = {},
            //registry of just enabled modules, to speed
            //cycle breaking code when lots of modules
            //are registered, but not activated.
            enabledRegistry = {},
            undefEvents = {},
            defQueue = [],
            defined = {},
            urlFetched = {},
            bundlesMap = {},
            requireCounter = 1,
            unnormalizedCounter = 1;

        /**
         * Trims the . and .. from an array of path segments.
         * It will keep a leading path segment if a .. will become
         * the first path segment, to help with module name lookups,
         * which act like paths, but can be remapped. But the end result,
         * all paths that use this function should look normalized.
         * NOTE: this method MODIFIES the input array.
         * @param {Array} ary the array of path segments.
         */
        function trimDots(ary) {
            var i, part;
            for (i = 0; i < ary.length; i++) {
                part = ary[i];
                if (part === '.') {
                    ary.splice(i, 1);
                    i -= 1;
                } else if (part === '..') {
                    // If at the start, or previous value is still ..,
                    // keep them so that when converted to a path it may
                    // still work when converted to a path, even though
                    // as an ID it is less than ideal. In larger point
                    // releases, may be better to just kick out an error.
                    if (i === 0 || (i === 1 && ary[2] === '..') || ary[i - 1] === '..') {
                        continue;
                    } else if (i > 0) {
                        ary.splice(i - 1, 2);
                        i -= 2;
                    }
                }
            }
        }

        /**
         * Given a relative module name, like ./something, normalize it to
         * a real name that can be mapped to a path.
         * @param {String} name the relative name
         * @param {String} baseName a real name that the name arg is relative
         * to.
         * @param {Boolean} applyMap apply the map config to the value. Should
         * only be done if this normalization is for a dependency ID.
         * @returns {String} normalized name
         */
        function normalize(name, baseName, applyMap) {
            var pkgMain, mapValue, nameParts, i, j, nameSegment, lastIndex,
                foundMap, foundI, foundStarMap, starI, normalizedBaseParts,
                baseParts = (baseName && baseName.split('/')),
                map = config.map,
                starMap = map && map['*'];

            //Adjust any relative paths.
            if (name) {
                name = name.split('/');
                lastIndex = name.length - 1;

                // If wanting node ID compatibility, strip .js from end
                // of IDs. Have to do this here, and not in nameToUrl
                // because node allows either .js or non .js to map
                // to same file.
                if (config.nodeIdCompat && jsSuffixRegExp.test(name[lastIndex])) {
                    name[lastIndex] = name[lastIndex].replace(jsSuffixRegExp, '');
                }

                // Starts with a '.' so need the baseName
                if (name[0].charAt(0) === '.' && baseParts) {
                    //Convert baseName to array, and lop off the last part,
                    //so that . matches that 'directory' and not name of the baseName's
                    //module. For instance, baseName of 'one/two/three', maps to
                    //'one/two/three.js', but we want the directory, 'one/two' for
                    //this normalization.
                    normalizedBaseParts = baseParts.slice(0, baseParts.length - 1);
                    name = normalizedBaseParts.concat(name);
                }

                trimDots(name);
                name = name.join('/');
            }

            //Apply map config if available.
            if (applyMap && map && (baseParts || starMap)) {
                nameParts = name.split('/');

                outerLoop: for (i = nameParts.length; i > 0; i -= 1) {
                    nameSegment = nameParts.slice(0, i).join('/');

                    if (baseParts) {
                        //Find the longest baseName segment match in the config.
                        //So, do joins on the biggest to smallest lengths of baseParts.
                        for (j = baseParts.length; j > 0; j -= 1) {
                            mapValue = getOwn(map, baseParts.slice(0, j).join('/'));

                            //baseName segment has config, find if it has one for
                            //this name.
                            if (mapValue) {
                                mapValue = getOwn(mapValue, nameSegment);
                                if (mapValue) {
                                    //Match, update name to the new value.
                                    foundMap = mapValue;
                                    foundI = i;
                                    break outerLoop;
                                }
                            }
                        }
                    }

                    //Check for a star map match, but just hold on to it,
                    //if there is a shorter segment match later in a matching
                    //config, then favor over this star map.
                    if (!foundStarMap && starMap && getOwn(starMap, nameSegment)) {
                        foundStarMap = getOwn(starMap, nameSegment);
                        starI = i;
                    }
                }

                if (!foundMap && foundStarMap) {
                    foundMap = foundStarMap;
                    foundI = starI;
                }

                if (foundMap) {
                    nameParts.splice(0, foundI, foundMap);
                    name = nameParts.join('/');
                }
            }

            // If the name points to a package's name, use
            // the package main instead.
            pkgMain = getOwn(config.pkgs, name);

            return pkgMain ? pkgMain : name;
        }

        function removeScript(name) {
            if (isBrowser) {
                each(scripts(), function (scriptNode) {
                    if (scriptNode.getAttribute('data-requiremodule') === name &&
                            scriptNode.getAttribute('data-requirecontext') === context.contextName) {
                        scriptNode.parentNode.removeChild(scriptNode);
                        return true;
                    }
                });
            }
        }

        function hasPathFallback(id) {
            var pathConfig = getOwn(config.paths, id);
            if (pathConfig && isArray(pathConfig) && pathConfig.length > 1) {
                //Pop off the first array value, since it failed, and
                //retry
                pathConfig.shift();
                context.require.undef(id);

                //Custom require that does not do map translation, since
                //ID is "absolute", already mapped/resolved.
                context.makeRequire(null, {
                    skipMap: true
                })([id]);

                return true;
            }
        }

        //Turns a plugin!resource to [plugin, resource]
        //with the plugin being undefined if the name
        //did not have a plugin prefix.
        function splitPrefix(name) {
            var prefix,
                index = name ? name.indexOf('!') : -1;
            if (index > -1) {
                prefix = name.substring(0, index);
                name = name.substring(index + 1, name.length);
            }
            return [prefix, name];
        }

        /**
         * Creates a module mapping that includes plugin prefix, module
         * name, and path. If parentModuleMap is provided it will
         * also normalize the name via require.normalize()
         *
         * @param {String} name the module name
         * @param {String} [parentModuleMap] parent module map
         * for the module name, used to resolve relative names.
         * @param {Boolean} isNormalized: is the ID already normalized.
         * This is true if this call is done for a define() module ID.
         * @param {Boolean} applyMap: apply the map config to the ID.
         * Should only be true if this map is for a dependency.
         *
         * @returns {Object}
         */
        function makeModuleMap(name, parentModuleMap, isNormalized, applyMap) {
            var url, pluginModule, suffix, nameParts,
                prefix = null,
                parentName = parentModuleMap ? parentModuleMap.name : null,
                originalName = name,
                isDefine = true,
                normalizedName = '';

            //If no name, then it means it is a require call, generate an
            //internal name.
            if (!name) {
                isDefine = false;
                name = '_@r' + (requireCounter += 1);
            }

            nameParts = splitPrefix(name);
            prefix = nameParts[0];
            name = nameParts[1];

            if (prefix) {
                prefix = normalize(prefix, parentName, applyMap);
                pluginModule = getOwn(defined, prefix);
            }

            //Account for relative paths if there is a base name.
            if (name) {
                if (prefix) {
                    if (pluginModule && pluginModule.normalize) {
                        //Plugin is loaded, use its normalize method.
                        normalizedName = pluginModule.normalize(name, function (name) {
                            return normalize(name, parentName, applyMap);
                        });
                    } else {
                        // If nested plugin references, then do not try to
                        // normalize, as it will not normalize correctly. This
                        // places a restriction on resourceIds, and the longer
                        // term solution is not to normalize until plugins are
                        // loaded and all normalizations to allow for async
                        // loading of a loader plugin. But for now, fixes the
                        // common uses. Details in #1131
                        normalizedName = name.indexOf('!') === -1 ?
                                         normalize(name, parentName, applyMap) :
                                         name;
                    }
                } else {
                    //A regular module.
                    normalizedName = normalize(name, parentName, applyMap);

                    //Normalized name may be a plugin ID due to map config
                    //application in normalize. The map config values must
                    //already be normalized, so do not need to redo that part.
                    nameParts = splitPrefix(normalizedName);
                    prefix = nameParts[0];
                    normalizedName = nameParts[1];
                    isNormalized = true;

                    url = context.nameToUrl(normalizedName);
                }
            }

            //If the id is a plugin id that cannot be determined if it needs
            //normalization, stamp it with a unique ID so two matching relative
            //ids that may conflict can be separate.
            suffix = prefix && !pluginModule && !isNormalized ?
                     '_unnormalized' + (unnormalizedCounter += 1) :
                     '';

            return {
                prefix: prefix,
                name: normalizedName,
                parentMap: parentModuleMap,
                unnormalized: !!suffix,
                url: url,
                originalName: originalName,
                isDefine: isDefine,
                id: (prefix ?
                        prefix + '!' + normalizedName :
                        normalizedName) + suffix
            };
        }

        function getModule(depMap) {
            var id = depMap.id,
                mod = getOwn(registry, id);

            if (!mod) {
                mod = registry[id] = new context.Module(depMap);
            }

            return mod;
        }

        function on(depMap, name, fn) {
            var id = depMap.id,
                mod = getOwn(registry, id);

            if (hasProp(defined, id) &&
                    (!mod || mod.defineEmitComplete)) {
                if (name === 'defined') {
                    fn(defined[id]);
                }
            } else {
                mod = getModule(depMap);
                if (mod.error && name === 'error') {
                    fn(mod.error);
                } else {
                    mod.on(name, fn);
                }
            }
        }

        function onError(err, errback) {
            var ids = err.requireModules,
                notified = false;

            if (errback) {
                errback(err);
            } else {
                each(ids, function (id) {
                    var mod = getOwn(registry, id);
                    if (mod) {
                        //Set error on module, so it skips timeout checks.
                        mod.error = err;
                        if (mod.events.error) {
                            notified = true;
                            mod.emit('error', err);
                        }
                    }
                });

                if (!notified) {
                    req.onError(err);
                }
            }
        }

        /**
         * Internal method to transfer globalQueue items to this context's
         * defQueue.
         */
        function takeGlobalQueue() {
            //Push all the globalDefQueue items into the context's defQueue
            if (globalDefQueue.length) {
                each(globalDefQueue, function(queueItem) {
                    var id = queueItem[0];
                    if (typeof id === 'string') {
                        context.defQueueMap[id] = true;
                    }
                    defQueue.push(queueItem);
                });
                globalDefQueue = [];
            }
        }

        handlers = {
            'require': function (mod) {
                if (mod.require) {
                    return mod.require;
                } else {
                    return (mod.require = context.makeRequire(mod.map));
                }
            },
            'exports': function (mod) {
                mod.usingExports = true;
                if (mod.map.isDefine) {
                    if (mod.exports) {
                        return (defined[mod.map.id] = mod.exports);
                    } else {
                        return (mod.exports = defined[mod.map.id] = {});
                    }
                }
            },
            'module': function (mod) {
                if (mod.module) {
                    return mod.module;
                } else {
                    return (mod.module = {
                        id: mod.map.id,
                        uri: mod.map.url,
                        config: function () {
                            return getOwn(config.config, mod.map.id) || {};
                        },
                        exports: mod.exports || (mod.exports = {})
                    });
                }
            }
        };

        function cleanRegistry(id) {
            //Clean up machinery used for waiting modules.
            delete registry[id];
            delete enabledRegistry[id];
        }

        function breakCycle(mod, traced, processed) {
            var id = mod.map.id;

            if (mod.error) {
                mod.emit('error', mod.error);
            } else {
                traced[id] = true;
                each(mod.depMaps, function (depMap, i) {
                    var depId = depMap.id,
                        dep = getOwn(registry, depId);

                    //Only force things that have not completed
                    //being defined, so still in the registry,
                    //and only if it has not been matched up
                    //in the module already.
                    if (dep && !mod.depMatched[i] && !processed[depId]) {
                        if (getOwn(traced, depId)) {
                            mod.defineDep(i, defined[depId]);
                            mod.check(); //pass false?
                        } else {
                            breakCycle(dep, traced, processed);
                        }
                    }
                });
                processed[id] = true;
            }
        }

        function checkLoaded() {
            var err, usingPathFallback,
                waitInterval = config.waitSeconds * 1000,
                //It is possible to disable the wait interval by using waitSeconds of 0.
                expired = waitInterval && (context.startTime + waitInterval) < new Date().getTime(),
                noLoads = [],
                reqCalls = [],
                stillLoading = false,
                needCycleCheck = true;

            //Do not bother if this call was a result of a cycle break.
            if (inCheckLoaded) {
                return;
            }

            inCheckLoaded = true;

            //Figure out the state of all the modules.
            eachProp(enabledRegistry, function (mod) {
                var map = mod.map,
                    modId = map.id;

                //Skip things that are not enabled or in error state.
                if (!mod.enabled) {
                    return;
                }

                if (!map.isDefine) {
                    reqCalls.push(mod);
                }

                if (!mod.error) {
                    //If the module should be executed, and it has not
                    //been inited and time is up, remember it.
                    if (!mod.inited && expired) {
                        if (hasPathFallback(modId)) {
                            usingPathFallback = true;
                            stillLoading = true;
                        } else {
                            noLoads.push(modId);
                            removeScript(modId);
                        }
                    } else if (!mod.inited && mod.fetched && map.isDefine) {
                        stillLoading = true;
                        if (!map.prefix) {
                            //No reason to keep looking for unfinished
                            //loading. If the only stillLoading is a
                            //plugin resource though, keep going,
                            //because it may be that a plugin resource
                            //is waiting on a non-plugin cycle.
                            return (needCycleCheck = false);
                        }
                    }
                }
            });

            if (expired && noLoads.length) {
                //If wait time expired, throw error of unloaded modules.
                err = makeError('timeout', 'Load timeout for modules: ' + noLoads, null, noLoads);
                err.contextName = context.contextName;
                return onError(err);
            }

            //Not expired, check for a cycle.
            if (needCycleCheck) {
                each(reqCalls, function (mod) {
                    breakCycle(mod, {}, {});
                });
            }

            //If still waiting on loads, and the waiting load is something
            //other than a plugin resource, or there are still outstanding
            //scripts, then just try back later.
            if ((!expired || usingPathFallback) && stillLoading) {
                //Something is still waiting to load. Wait for it, but only
                //if a timeout is not already in effect.
                if ((isBrowser || isWebWorker) && !checkLoadedTimeoutId) {
                    checkLoadedTimeoutId = setTimeout(function () {
                        checkLoadedTimeoutId = 0;
                        checkLoaded();
                    }, 50);
                }
            }

            inCheckLoaded = false;
        }

        Module = function (map) {
            this.events = getOwn(undefEvents, map.id) || {};
            this.map = map;
            this.shim = getOwn(config.shim, map.id);
            this.depExports = [];
            this.depMaps = [];
            this.depMatched = [];
            this.pluginMaps = {};
            this.depCount = 0;

            /* this.exports this.factory
               this.depMaps = [],
               this.enabled, this.fetched
            */
        };

        Module.prototype = {
            init: function (depMaps, factory, errback, options) {
                options = options || {};

                //Do not do more inits if already done. Can happen if there
                //are multiple define calls for the same module. That is not
                //a normal, common case, but it is also not unexpected.
                if (this.inited) {
                    return;
                }

                this.factory = factory;

                if (errback) {
                    //Register for errors on this module.
                    this.on('error', errback);
                } else if (this.events.error) {
                    //If no errback already, but there are error listeners
                    //on this module, set up an errback to pass to the deps.
                    errback = bind(this, function (err) {
                        this.emit('error', err);
                    });
                }

                //Do a copy of the dependency array, so that
                //source inputs are not modified. For example
                //"shim" deps are passed in here directly, and
                //doing a direct modification of the depMaps array
                //would affect that config.
                this.depMaps = depMaps && depMaps.slice(0);

                this.errback = errback;

                //Indicate this module has been initialized
                this.inited = true;

                this.ignore = options.ignore;

                //Could have option to init this module in enabled mode,
                //or could have been previously marked as enabled. However,
                //the dependencies are not known until init is called. So
                //if enabled previously, now trigger dependencies as enabled.
                if (options.enabled || this.enabled) {
                    //Enable this module and dependencies.
                    //Will call this.check()
                    this.enable();
                } else {
                    this.check();
                }
            },

            defineDep: function (i, depExports) {
                //Because of cycles, defined callback for a given
                //export can be called more than once.
                if (!this.depMatched[i]) {
                    this.depMatched[i] = true;
                    this.depCount -= 1;
                    this.depExports[i] = depExports;
                }
            },

            fetch: function () {
                if (this.fetched) {
                    return;
                }
                this.fetched = true;

                context.startTime = (new Date()).getTime();

                var map = this.map;

                //If the module is shimmed, load its shim dependencies
                //first, then load the module itself (through its plugin
                //if it has a prefix).
                if (this.shim) {
                    context.makeRequire(this.map, {
                        enableBuildCallback: true
                    })(this.shim.deps || [], bind(this, function () {
                        return map.prefix ? this.callPlugin() : this.load();
                    }));
                } else {
                    //Regular dependency.
                    return map.prefix ? this.callPlugin() : this.load();
                }
            },

            load: function () {
                var url = this.map.url;

                //Regular dependency.
                if (!urlFetched[url]) {
                    urlFetched[url] = true;
                    context.load(this.map.id, url);
                }
            },

            /**
             * Checks if the module is ready to define itself, and if so,
             * define it.
             */
            check: function () {
                if (!this.enabled || this.enabling) {
                    return;
                }

                var err, cjsModule,
                    id = this.map.id,
                    depExports = this.depExports,
                    exports = this.exports,
                    factory = this.factory;

                if (!this.inited) {
                    // Only fetch if not already in the defQueue.
                    if (!hasProp(context.defQueueMap, id)) {
                        this.fetch();
                    }
                } else if (this.error) {
                    this.emit('error', this.error);
                } else if (!this.defining) {
                    //The factory could trigger another require call
                    //that would result in checking this module to
                    //define itself again. If already in the process
                    //of doing that, skip this work.
                    this.defining = true;

                    if (this.depCount < 1 && !this.defined) {
                        if (isFunction(factory)) {
                            try {
                                exports = context.execCb(id, factory, depExports, exports);
                            } catch (e) {
                                err = e;
                            }

                            // Favor return value over exports. If node/cjs in play,
                            // then will not have a return value anyway. Favor
                            // module.exports assignment over exports object.
                            if (this.map.isDefine && exports === undefined) {
                                cjsModule = this.module;
                                if (cjsModule) {
                                    exports = cjsModule.exports;
                                } else if (this.usingExports) {
                                    //exports already set the defined value.
                                    exports = this.exports;
                                }
                            }

                            if (err) {
                                // If there is an error listener, favor passing
                                // to that instead of throwing an error. However,
                                // only do it for define()'d modules. require
                                // errbacks should not be called for failures in
                                // their callbacks (#699). However if a global
                                // onError is set, use that.
                                if ((this.events.error && this.map.isDefine) ||
                                    req.onError !== defaultOnError) {
                                    err.requireMap = this.map;
                                    err.requireModules = this.map.isDefine ? [this.map.id] : null;
                                    err.requireType = this.map.isDefine ? 'define' : 'require';
                                    return onError((this.error = err));
                                } else if (typeof console !== 'undefined' &&
                                           console.error) {
                                    // Log the error for debugging. If promises could be
                                    // used, this would be different, but making do.
                                    console.error(err);
                                } else {
                                    // Do not want to completely lose the error. While this
                                    // will mess up processing and lead to similar results
                                    // as bug 1440, it at least surfaces the error.
                                    req.onError(err);
                                }
                            }
                        } else {
                            //Just a literal value
                            exports = factory;
                        }

                        this.exports = exports;

                        if (this.map.isDefine && !this.ignore) {
                            defined[id] = exports;

                            if (req.onResourceLoad) {
                                var resLoadMaps = [];
                                each(this.depMaps, function (depMap) {
                                    resLoadMaps.push(depMap.normalizedMap || depMap);
                                });
                                req.onResourceLoad(context, this.map, resLoadMaps);
                            }
                        }

                        //Clean up
                        cleanRegistry(id);

                        this.defined = true;
                    }

                    //Finished the define stage. Allow calling check again
                    //to allow define notifications below in the case of a
                    //cycle.
                    this.defining = false;

                    if (this.defined && !this.defineEmitted) {
                        this.defineEmitted = true;
                        this.emit('defined', this.exports);
                        this.defineEmitComplete = true;
                    }

                }
            },

            callPlugin: function () {
                var map = this.map,
                    id = map.id,
                    //Map already normalized the prefix.
                    pluginMap = makeModuleMap(map.prefix);

                //Mark this as a dependency for this plugin, so it
                //can be traced for cycles.
                this.depMaps.push(pluginMap);

                on(pluginMap, 'defined', bind(this, function (plugin) {
                    var load, normalizedMap, normalizedMod,
                        bundleId = getOwn(bundlesMap, this.map.id),
                        name = this.map.name,
                        parentName = this.map.parentMap ? this.map.parentMap.name : null,
                        localRequire = context.makeRequire(map.parentMap, {
                            enableBuildCallback: true
                        });

                    //If current map is not normalized, wait for that
                    //normalized name to load instead of continuing.
                    if (this.map.unnormalized) {
                        //Normalize the ID if the plugin allows it.
                        if (plugin.normalize) {
                            name = plugin.normalize(name, function (name) {
                                return normalize(name, parentName, true);
                            }) || '';
                        }

                        //prefix and name should already be normalized, no need
                        //for applying map config again either.
                        normalizedMap = makeModuleMap(map.prefix + '!' + name,
                                                      this.map.parentMap);
                        on(normalizedMap,
                            'defined', bind(this, function (value) {
                                this.map.normalizedMap = normalizedMap;
                                this.init([], function () { return value; }, null, {
                                    enabled: true,
                                    ignore: true
                                });
                            }));

                        normalizedMod = getOwn(registry, normalizedMap.id);
                        if (normalizedMod) {
                            //Mark this as a dependency for this plugin, so it
                            //can be traced for cycles.
                            this.depMaps.push(normalizedMap);

                            if (this.events.error) {
                                normalizedMod.on('error', bind(this, function (err) {
                                    this.emit('error', err);
                                }));
                            }
                            normalizedMod.enable();
                        }

                        return;
                    }

                    //If a bundles config maps this id, just load that bundle
                    //file instead to resolve the plugin, as it is built into
                    //that bundle layer.
                    if (bundleId) {
                        this.map.url = context.nameToUrl(bundleId);
                        this.load();
                        return;
                    }

                    load = bind(this, function (value) {
                        this.init([], function () { return value; }, null, {
                            enabled: true
                        });
                    });

                    load.error = bind(this, function (err) {
                        this.inited = true;
                        this.error = err;
                        err.requireModules = [id];

                        //Remove temp unnormalized modules for this module,
                        //since they will never be resolved otherwise now.
                        eachProp(registry, function (mod) {
                            if (mod.map.id.indexOf(id + '_unnormalized') === 0) {
                                cleanRegistry(mod.map.id);
                            }
                        });

                        onError(err);
                    });

                    //Allow plugins to load other code without having to know the
                    //context or how to 'complete' the load.
                    load.fromText = bind(this, function (text, textAlt) {
                        /*jslint evil: true */
                        var moduleName = map.name,
                            moduleMap = makeModuleMap(moduleName),
                            hasInteractive = useInteractive;

                        //As of 2.1.0, support just passing the text, to reinforce
                        //fromText only being called once per resource. Still
                        //support old style of passing moduleName but discard
                        //that moduleName in favor of the internal ref.
                        if (textAlt) {
                            text = textAlt;
                        }

                        //Turn off interactive script matching for IE for any define
                        //calls in the text, then turn it back on at the end.
                        if (hasInteractive) {
                            useInteractive = false;
                        }

                        //Prime the system by creating a module instance for
                        //it.
                        getModule(moduleMap);

                        //Transfer any config to this other module.
                        if (hasProp(config.config, id)) {
                            config.config[moduleName] = config.config[id];
                        }

                        try {
                            req.exec(text);
                        } catch (e) {
                            return onError(makeError('fromtexteval',
                                             'fromText eval for ' + id +
                                            ' failed: ' + e,
                                             e,
                                             [id]));
                        }

                        if (hasInteractive) {
                            useInteractive = true;
                        }

                        //Mark this as a dependency for the plugin
                        //resource
                        this.depMaps.push(moduleMap);

                        //Support anonymous modules.
                        context.completeLoad(moduleName);

                        //Bind the value of that module to the value for this
                        //resource ID.
                        localRequire([moduleName], load);
                    });

                    //Use parentName here since the plugin's name is not reliable,
                    //could be some weird string with no path that actually wants to
                    //reference the parentName's path.
                    plugin.load(map.name, localRequire, load, config);
                }));

                context.enable(pluginMap, this);
                this.pluginMaps[pluginMap.id] = pluginMap;
            },

            enable: function () {
                enabledRegistry[this.map.id] = this;
                this.enabled = true;

                //Set flag mentioning that the module is enabling,
                //so that immediate calls to the defined callbacks
                //for dependencies do not trigger inadvertent load
                //with the depCount still being zero.
                this.enabling = true;

                //Enable each dependency
                each(this.depMaps, bind(this, function (depMap, i) {
                    var id, mod, handler;

                    if (typeof depMap === 'string') {
                        //Dependency needs to be converted to a depMap
                        //and wired up to this module.
                        depMap = makeModuleMap(depMap,
                                               (this.map.isDefine ? this.map : this.map.parentMap),
                                               false,
                                               !this.skipMap);
                        this.depMaps[i] = depMap;

                        handler = getOwn(handlers, depMap.id);

                        if (handler) {
                            this.depExports[i] = handler(this);
                            return;
                        }

                        this.depCount += 1;

                        on(depMap, 'defined', bind(this, function (depExports) {
                            if (this.undefed) {
                                return;
                            }
                            this.defineDep(i, depExports);
                            this.check();
                        }));

                        if (this.errback) {
                            on(depMap, 'error', bind(this, this.errback));
                        } else if (this.events.error) {
                            // No direct errback on this module, but something
                            // else is listening for errors, so be sure to
                            // propagate the error correctly.
                            on(depMap, 'error', bind(this, function(err) {
                                this.emit('error', err);
                            }));
                        }
                    }

                    id = depMap.id;
                    mod = registry[id];

                    //Skip special modules like 'require', 'exports', 'module'
                    //Also, don't call enable if it is already enabled,
                    //important in circular dependency cases.
                    if (!hasProp(handlers, id) && mod && !mod.enabled) {
                        context.enable(depMap, this);
                    }
                }));

                //Enable each plugin that is used in
                //a dependency
                eachProp(this.pluginMaps, bind(this, function (pluginMap) {
                    var mod = getOwn(registry, pluginMap.id);
                    if (mod && !mod.enabled) {
                        context.enable(pluginMap, this);
                    }
                }));

                this.enabling = false;

                this.check();
            },

            on: function (name, cb) {
                var cbs = this.events[name];
                if (!cbs) {
                    cbs = this.events[name] = [];
                }
                cbs.push(cb);
            },

            emit: function (name, evt) {
                each(this.events[name], function (cb) {
                    cb(evt);
                });
                if (name === 'error') {
                    //Now that the error handler was triggered, remove
                    //the listeners, since this broken Module instance
                    //can stay around for a while in the registry.
                    delete this.events[name];
                }
            }
        };

        function callGetModule(args) {
            //Skip modules already defined.
            if (!hasProp(defined, args[0])) {
                getModule(makeModuleMap(args[0], null, true)).init(args[1], args[2]);
            }
        }

        function removeListener(node, func, name, ieName) {
            //Favor detachEvent because of IE9
            //issue, see attachEvent/addEventListener comment elsewhere
            //in this file.
            if (node.detachEvent && !isOpera) {
                //Probably IE. If not it will throw an error, which will be
                //useful to know.
                if (ieName) {
                    node.detachEvent(ieName, func);
                }
            } else {
                node.removeEventListener(name, func, false);
            }
        }

        /**
         * Given an event from a script node, gets the requirejs info from it,
         * and removes the event listeners on the node.
         * @param {Event} evt
         * @returns {Object}
         */
        function getScriptData(evt) {
            //Using currentTarget instead of target for Firefox 2.0's sake. Not
            //all old browsers will be supported, but this one was easy enough
            //to support and still makes sense.
            var node = evt.currentTarget || evt.srcElement;

            //Remove the listeners once here.
            removeListener(node, context.onScriptLoad, 'load', 'onreadystatechange');
            removeListener(node, context.onScriptError, 'error');

            return {
                node: node,
                id: node && node.getAttribute('data-requiremodule')
            };
        }

        function intakeDefines() {
            var args;

            //Any defined modules in the global queue, intake them now.
            takeGlobalQueue();

            //Make sure any remaining defQueue items get properly processed.
            while (defQueue.length) {
                args = defQueue.shift();
                if (args[0] === null) {
                    return onError(makeError('mismatch', 'Mismatched anonymous define() module: ' +
                        args[args.length - 1]));
                } else {
                    //args are id, deps, factory. Should be normalized by the
                    //define() function.
                    callGetModule(args);
                }
            }
            context.defQueueMap = {};
        }

        context = {
            config: config,
            contextName: contextName,
            registry: registry,
            defined: defined,
            urlFetched: urlFetched,
            defQueue: defQueue,
            defQueueMap: {},
            Module: Module,
            makeModuleMap: makeModuleMap,
            nextTick: req.nextTick,
            onError: onError,

            /**
             * Set a configuration for the context.
             * @param {Object} cfg config object to integrate.
             */
            configure: function (cfg) {
                //Make sure the baseUrl ends in a slash.
                if (cfg.baseUrl) {
                    if (cfg.baseUrl.charAt(cfg.baseUrl.length - 1) !== '/') {
                        cfg.baseUrl += '/';
                    }
                }

                //Save off the paths since they require special processing;
                //they are additive.
                var shim = config.shim,
                    objs = {
                        paths: true,
                        bundles: true,
                        config: true,
                        map: true
                    };

                eachProp(cfg, function (value, prop) {
                    if (objs[prop]) {
                        if (!config[prop]) {
                            config[prop] = {};
                        }
                        mixin(config[prop], value, true, true);
                    } else {
                        config[prop] = value;
                    }
                });

                //Reverse map the bundles
                if (cfg.bundles) {
                    eachProp(cfg.bundles, function (value, prop) {
                        each(value, function (v) {
                            if (v !== prop) {
                                bundlesMap[v] = prop;
                            }
                        });
                    });
                }

                //Merge shim
                if (cfg.shim) {
                    eachProp(cfg.shim, function (value, id) {
                        //Normalize the structure
                        if (isArray(value)) {
                            value = {
                                deps: value
                            };
                        }
                        if ((value.exports || value.init) && !value.exportsFn) {
                            value.exportsFn = context.makeShimExports(value);
                        }
                        shim[id] = value;
                    });
                    config.shim = shim;
                }

                //Adjust packages if necessary.
                if (cfg.packages) {
                    each(cfg.packages, function (pkgObj) {
                        var location, name;

                        pkgObj = typeof pkgObj === 'string' ? {name: pkgObj} : pkgObj;

                        name = pkgObj.name;
                        location = pkgObj.location;
                        if (location) {
                            config.paths[name] = pkgObj.location;
                        }

                        //Save pointer to main module ID for pkg name.
                        //Remove leading dot in main, so main paths are normalized,
                        //and remove any trailing .js, since different package
                        //envs have different conventions: some use a module name,
                        //some use a file name.
                        config.pkgs[name] = pkgObj.name + '/' + (pkgObj.main || 'main')
                                     .replace(currDirRegExp, '')
                                     .replace(jsSuffixRegExp, '');
                    });
                }

                //If there are any "waiting to execute" modules in the registry,
                //update the maps for them, since their info, like URLs to load,
                //may have changed.
                eachProp(registry, function (mod, id) {
                    //Skip modules that already have init called, since it
                    //is too late to modify them, and ignore unnormalized
                    //ones since they are transient.
                    if (!mod.inited && !mod.map.unnormalized) {
                        mod.map = makeModuleMap(id, null, true);
                    }
                });

                //If a deps array or a config callback is specified, then call
                //require with those args. This is useful when require is defined as a
                //config object before require.js is loaded.
                if (cfg.deps || cfg.callback) {
                    context.require(cfg.deps || [], cfg.callback);
                }
            },

            makeShimExports: function (value) {
                function fn() {
                    var ret;
                    if (value.init) {
                        ret = value.init.apply(global, arguments);
                    }
                    return ret || (value.exports && getGlobal(value.exports));
                }
                return fn;
            },

            makeRequire: function (relMap, options) {
                options = options || {};

                function localRequire(deps, callback, errback) {
                    var id, map, requireMod;

                    if (options.enableBuildCallback && callback && isFunction(callback)) {
                        callback.__requireJsBuild = true;
                    }

                    if (typeof deps === 'string') {
                        if (isFunction(callback)) {
                            //Invalid call
                            return onError(makeError('requireargs', 'Invalid require call'), errback);
                        }

                        //If require|exports|module are requested, get the
                        //value for them from the special handlers. Caveat:
                        //this only works while module is being defined.
                        if (relMap && hasProp(handlers, deps)) {
                            return handlers[deps](registry[relMap.id]);
                        }

                        //Synchronous access to one module. If require.get is
                        //available (as in the Node adapter), prefer that.
                        if (req.get) {
                            return req.get(context, deps, relMap, localRequire);
                        }

                        //Normalize module name, if it contains . or ..
                        map = makeModuleMap(deps, relMap, false, true);
                        id = map.id;

                        if (!hasProp(defined, id)) {
                            return onError(makeError('notloaded', 'Module name "' +
                                        id +
                                        '" has not been loaded yet for context: ' +
                                        contextName +
                                        (relMap ? '' : '. Use require([])')));
                        }
                        return defined[id];
                    }

                    //Grab defines waiting in the global queue.
                    intakeDefines();

                    //Mark all the dependencies as needing to be loaded.
                    context.nextTick(function () {
                        //Some defines could have been added since the
                        //require call, collect them.
                        intakeDefines();

                        requireMod = getModule(makeModuleMap(null, relMap));

                        //Store if map config should be applied to this require
                        //call for dependencies.
                        requireMod.skipMap = options.skipMap;

                        requireMod.init(deps, callback, errback, {
                            enabled: true
                        });

                        checkLoaded();
                    });

                    return localRequire;
                }

                mixin(localRequire, {
                    isBrowser: isBrowser,

                    /**
                     * Converts a module name + .extension into a URL path.
                     * *Requires* the use of a module name. It does not support using
                     * plain URLs like nameToUrl.
                     */
                    toUrl: function (moduleNamePlusExt) {
                        var ext,
                            index = moduleNamePlusExt.lastIndexOf('.'),
                            segment = moduleNamePlusExt.split('/')[0],
                            isRelative = segment === '.' || segment === '..';

                        //There is a file extension, and the dot is not
                        //just part of a relative path prefix.
                        if (index !== -1 && (!isRelative || index > 1)) {
                            ext = moduleNamePlusExt.substring(index, moduleNamePlusExt.length);
                            moduleNamePlusExt = moduleNamePlusExt.substring(0, index);
                        }

                        return context.nameToUrl(normalize(moduleNamePlusExt,
                                                relMap && relMap.id, true), ext,  true);
                    },

                    defined: function (id) {
                        return hasProp(defined, makeModuleMap(id, relMap, false, true).id);
                    },

                    specified: function (id) {
                        id = makeModuleMap(id, relMap, false, true).id;
                        return hasProp(defined, id) || hasProp(registry, id);
                    }
                });

                //Only allow undef on top level require calls
                if (!relMap) {
                    localRequire.undef = function (id) {
                        //Bind any waiting define() calls to this context,
                        //fix for #408
                        takeGlobalQueue();

                        var map = makeModuleMap(id, relMap, true),
                            mod = getOwn(registry, id);

                        //mod may be undefined if the module was never
                        //registered; guard before flagging it as undefed.
                        if (mod) {
                            mod.undefed = true;
                        }
                        removeScript(id);

                        delete defined[id];
                        delete urlFetched[map.url];
                        delete undefEvents[id];

                        //Clean queued defines too. Go backwards
                        //in array so that the splices do not
                        //mess up the iteration.
                        eachReverse(defQueue, function(args, i) {
                            if (args[0] === id) {
                                defQueue.splice(i, 1);
                            }
                        });
                        delete context.defQueueMap[id];

                        if (mod) {
                            //Hold on to listeners in case the
                            //module will be attempted to be reloaded
                            //using a different config.
                            if (mod.events.defined) {
                                undefEvents[id] = mod.events;
                            }

                            cleanRegistry(id);
                        }
                    };
                }

                return localRequire;
            },
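The extension handling in `localRequire.toUrl` above is easy to misread: a trailing `.ext` is split off the id, but the dots of a leading `./` or `../` must not be mistaken for an extension. A minimal standalone sketch of just that split (the `splitExt` helper name is hypothetical, not part of RequireJS's API):

```javascript
// Split a trailing ".ext" off a module id, unless the only dot(s)
// belong to a leading './' or '../' relative prefix.
function splitExt(moduleNamePlusExt) {
    var index = moduleNamePlusExt.lastIndexOf('.'),
        segment = moduleNamePlusExt.split('/')[0],
        isRelative = segment === '.' || segment === '..',
        ext = null;

    // Only treat the dot as an extension separator when it is not
    // part of the relative-path prefix (index > 1 skips './' and '..').
    if (index !== -1 && (!isRelative || index > 1)) {
        ext = moduleNamePlusExt.substring(index);
        moduleNamePlusExt = moduleNamePlusExt.substring(0, index);
    }
    return { id: moduleNamePlusExt, ext: ext };
}
```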

            /**
             * Called to enable a module if it is still in the registry
             * awaiting enablement. A second arg, parent, the parent module,
             * is passed in for context, when this method is overridden by
             * the optimizer. Not shown here to keep code compact.
             */
            enable: function (depMap) {
                var mod = getOwn(registry, depMap.id);
                if (mod) {
                    getModule(depMap).enable();
                }
            },

            /**
             * Internal method used by environment adapters to complete a load event.
             * A load event could be a script load or just a load pass from a synchronous
             * load call.
             * @param {String} moduleName the name of the module to potentially complete.
             */
            completeLoad: function (moduleName) {
                var found, args, mod,
                    shim = getOwn(config.shim, moduleName) || {},
                    shExports = shim.exports;

                takeGlobalQueue();

                while (defQueue.length) {
                    args = defQueue.shift();
                    if (args[0] === null) {
                        args[0] = moduleName;
                        //If already found an anonymous module and bound it
                        //to this name, then this is some other anon module
                        //waiting for its completeLoad to fire.
                        if (found) {
                            break;
                        }
                        found = true;
                    } else if (args[0] === moduleName) {
                        //Found matching define call for this script!
                        found = true;
                    }

                    callGetModule(args);
                }
                context.defQueueMap = {};

                //Do this after the cycle of callGetModule in case the result
                //of those calls/init calls changes the registry.
                mod = getOwn(registry, moduleName);

                if (!found && !hasProp(defined, moduleName) && mod && !mod.inited) {
                    if (config.enforceDefine && (!shExports || !getGlobal(shExports))) {
                        if (hasPathFallback(moduleName)) {
                            return;
                        } else {
                            return onError(makeError('nodefine',
                                             'No define call for ' + moduleName,
                                             null,
                                             [moduleName]));
                        }
                    } else {
                        //A script that does not call define(), so just simulate
                        //the call for it.
                        callGetModule([moduleName, (shim.deps || []), shim.exportsFn]);
                    }
                }

                checkLoaded();
            },

            /**
             * Converts a module name to a file path. Supports cases where
             * moduleName may actually be just a URL.
             * Note that it **does not** call normalize on the moduleName,
             * it is assumed to have already been normalized. This is an
             * internal API, not a public one. Use toUrl for the public API.
             */
            nameToUrl: function (moduleName, ext, skipExt) {
                var paths, syms, i, parentModule, url,
                    parentPath, bundleId,
                    pkgMain = getOwn(config.pkgs, moduleName);

                if (pkgMain) {
                    moduleName = pkgMain;
                }

                bundleId = getOwn(bundlesMap, moduleName);

                if (bundleId) {
                    return context.nameToUrl(bundleId, ext, skipExt);
                }

                //If a colon is in the URL, it indicates a protocol is used and it is just
                //a URL to a file, or if it starts with a slash, contains a query arg (i.e. ?)
                //or ends with .js, then assume the user meant a URL and not a module id.
                //The slash is important for protocol-less URLs as well as full paths.
                if (req.jsExtRegExp.test(moduleName)) {
                    //Just a plain path, not a module name lookup, so just return it.
                    //Append the extension if one was provided. This is a bit wonky: only
                    //non-.js callers pass an extension, so this method probably needs to be reworked.
                    url = moduleName + (ext || '');
                } else {
                    //A module that needs to be converted to a path.
                    paths = config.paths;

                    syms = moduleName.split('/');
                    //For each module name segment, see if there is a path
                    //registered for it. Start with most specific name
                    //and work up from it.
                    for (i = syms.length; i > 0; i -= 1) {
                        parentModule = syms.slice(0, i).join('/');

                        parentPath = getOwn(paths, parentModule);
                        if (parentPath) {
                            //If an array, it means there are fallback choices;
                            //use the first one.
                            if (isArray(parentPath)) {
                                parentPath = parentPath[0];
                            }
                            syms.splice(0, i, parentPath);
                            break;
                        }
                    }

                    //Join the path parts together, then figure out if baseUrl is needed.
                    url = syms.join('/');
                    url += (ext || (/^data\:|\?/.test(url) || skipExt ? '' : '.js'));
                    url = (url.charAt(0) === '/' || url.match(/^[\w\+\.\-]+:/) ? '' : config.baseUrl) + url;
                }

                return config.urlArgs ? url +
                                        ((url.indexOf('?') === -1 ? '?' : '&') +
                                         config.urlArgs) : url;
            },
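The path resolution in `nameToUrl` boils down to: substitute the most specific `paths` prefix, append `.js`, then prepend `baseUrl` unless the result is already absolute. A minimal sketch under those assumptions (the `resolveUrl` name is hypothetical; bundle, plugin, and `urlArgs` handling from the real method are omitted):

```javascript
// Map a module id to a URL: most-specific `paths` prefix first,
// then baseUrl, skipping baseUrl for absolute URLs and full paths.
function resolveUrl(moduleName, config) {
    var syms = moduleName.split('/'),
        i, parent, mapped;

    // Walk prefixes from most specific ('a/b/c', 'a/b', 'a') to least,
    // stopping at the first one registered in config.paths.
    for (i = syms.length; i > 0; i -= 1) {
        parent = syms.slice(0, i).join('/');
        mapped = config.paths && config.paths[parent];
        if (mapped) {
            // An array means fallback choices; try the first one.
            if (Array.isArray(mapped)) {
                mapped = mapped[0];
            }
            syms.splice(0, i, mapped);
            break;
        }
    }

    var url = syms.join('/') + '.js';
    // Leading slash or protocol prefix means the URL is already complete.
    if (url.charAt(0) !== '/' && !/^[\w\+\.\-]+:/.test(url)) {
        url = (config.baseUrl || './') + url;
    }
    return url;
}
```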

            //Delegates to req.load. Broken out as a separate function to
            //allow overriding in the optimizer.
            load: function (id, url) {
                req.load(context, id, url);
            },

            /**
             * Executes a module callback function. Broken out as a separate function
             * solely to allow the build system to sequence the files in the built
             * layer in the right sequence.
             *
             * @private
             */
            execCb: function (name, callback, args, exports) {
                return callback.apply(exports, args);
            },

            /**
             * callback for script loads, used to check status of loading.
             *
             * @param {Event} evt the event from the browser for the script
             * that was loaded.
             */
            onScriptLoad: function (evt) {
                //Using currentTarget instead of target for Firefox 2.0's sake. Not
                //all old browsers will be supported, but this one was easy enough
                //to support and still makes sense.
                if (evt.type === 'load' ||
                        (readyRegExp.test((evt.currentTarget || evt.srcElement).readyState))) {
                    //Reset interactive script so a script node is not held onto
                    //for too long.
                    interactiveScript = null;

                    //Pull out the name of the module and the context.
                    var data = getScriptData(evt);
                    context.completeLoad(data.id);
                }
            },

            /**
             * Callback for script errors.
             */
            onScriptError: function (evt) {
                var data = getScriptData(evt);
                if (!hasPathFallback(data.id)) {
                    var parents = [];
                    eachProp(registry, function(value, key) {
                        if (key.indexOf('_@r') !== 0) {
                            each(value.depMaps, function(depMap) {
                                if (depMap.id === data.id) {
                                    parents.push(key);
                                }
                                return true;
                            });
                        }
                    });
                    return onError(makeError('scripterror', 'Script error for "' + data.id +
                                             (parents.length ?
                                             '", needed by: ' + parents.join(', ') :
                                             '"'), evt, [data.id]));
                }
            }
        };

        context.require = context.makeRequire();
        return context;
    }

    /**
     * Main entry point.
     *
     * If the only argument to require is a string, then the module that
     * is represented by that string is fetched for the appropriate context.
     *
     * If the first argument is an array, then it will be treated as an array
     * of dependency string names to fetch. An optional function callback can
     * be specified to execute when all of those dependencies are available.
     *
     * Make a local req variable to help Caja compliance (it assumes things
     * on a require that are not standardized), and to give a short
     * name for minification/local scope use.
     */
    req = requirejs = function (deps, callback, errback, optional) {

        //Find the right context, use default
        var context, config,
            contextName = defContextName;

        // Determine if have config object in the call.
        if (!isArray(deps) && typeof deps !== 'string') {
            // deps is a config object
            config = deps;
            if (isArray(callback)) {
                // Adjust args if there are dependencies
                deps = callback;
                callback = errback;
                errback = optional;
            } else {
                deps = [];
            }
        }

        if (config && config.context) {
            contextName = config.context;
        }

        context = getOwn(contexts, contextName);
        if (!context) {
            context = contexts[contextName] = req.s.newContext(contextName);
        }

        if (config) {
            context.configure(config);
        }

        return context.require(deps, callback, errback);
    };
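The argument shuffling at the top of `require()` is worth isolating: a leading argument that is neither an array nor a string is treated as a config object, and the remaining arguments shift left by one. A minimal sketch of that overload resolution (the `normalizeRequireArgs` helper is hypothetical, not part of RequireJS's API):

```javascript
// Resolve the require(deps, callback, errback, optional) overloads:
// require([...], cb), require(config, [...], cb), require(config).
function normalizeRequireArgs(deps, callback, errback, optional) {
    var config = null;

    if (!Array.isArray(deps) && typeof deps !== 'string') {
        // First argument is a config object.
        config = deps;
        if (Array.isArray(callback)) {
            // Dependencies follow the config; shift the rest left.
            deps = callback;
            callback = errback;
            errback = optional;
        } else {
            // Config-only call: no dependencies.
            deps = [];
        }
    }
    return { config: config, deps: deps, callback: callback, errback: errback };
}
```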

    /**
     * Support require.config() to make it easier to cooperate with other
     * AMD loaders on globally agreed names.
     */
    req.config = function (config) {
        return req(config);
    };

    /**
     * Execute something after the current tick
     * of the event loop. Override for other envs
     * that have a better solution than setTimeout.
     * @param  {Function} fn function to execute later.
     */
    req.nextTick = typeof setTimeout !== 'undefined' ? function (fn) {
        setTimeout(fn, 4);
    } : function (fn) { fn(); };

    /**
     * Export require as a global, but only if it does not already exist.
     */
    if (!require) {
        require = req;
    }

    req.version = version;

    //Used to filter out dependencies that are already paths.
    req.jsExtRegExp = /^\/|:|\?|\.js$/;
    req.isBrowser = isBrowser;
    s = req.s = {
        contexts: contexts,
        newContext: newContext
    };

    //Create default context.
    req({});

    //Exports some context-sensitive methods on global require.
    each([
        'toUrl',
        'undef',
        'defined',
        'specified'
    ], function (prop) {
        //Reference from contexts instead of early binding to default context,
        //so that during builds, the latest instance of the default context
        //with its config gets used.
        req[prop] = function () {
            var ctx = contexts[defContextName];
            return ctx.require[prop].apply(ctx, arguments);
        };
    });

    if (isBrowser) {
        head = s.head = document.getElementsByTagName('head')[0];
        //If BASE tag is in play, using appendChild is a problem for IE6.
        //When that browser dies, this can be removed. Details in this jQuery bug:
        //http://dev.jquery.com/ticket/2709
        baseElement = document.getElementsByTagName('base')[0];
        if (baseElement) {
            head = s.head = baseElement.parentNode;
        }
    }

    /**
     * Any errors that require explicitly generates will be passed to this
     * function. Intercept/override it if you want custom error handling.
     * @param {Error} err the error object.
     */
    req.onError = defaultOnError;

    /**
     * Creates the node for the load command. Only used in browser envs.
     */
    req.createNode = function (config, moduleName, url) {
        var node = config.xhtml ?
                document.createElementNS('http://www.w3.org/1999/xhtml', 'html:script') :
                document.createElement('script');
        node.type = config.scriptType || 'text/javascript';
        node.charset = 'utf-8';
        node.async = true;
        return node;
    };

    /**
     * Does the request to load a module for the browser case.
     * Make this a separate function to allow other environments
     * to override it.
     *
     * @param {Object} context the require context to find state.
     * @param {String} moduleName the name of the module.
     * @param {Object} url the URL to the module.
     */
    req.load = function (context, moduleName, url) {
        var config = (context && context.config) || {},
            node;
        if (isBrowser) {
            //In the browser so use a script tag
            node = req.createNode(config, moduleName, url);
            if (config.onNodeCreated) {
                config.onNodeCreated(node, config, moduleName, url);
            }

            node.setAttribute('data-requirecontext', context.contextName);
            node.setAttribute('data-requiremodule', moduleName);

            //Set up load listener. Test attachEvent first because IE9 has
            //a subtle issue in its addEventListener and script onload firings
            //that do not match the behavior of all other browsers with
            //addEventListener support, which fire the onload event for a
            //script right after the script execution. See:
            //https://connect.microsoft.com/IE/feedback/details/648057/script-onload-event-is-not-fired-immediately-after-script-execution
            //UNFORTUNATELY Opera implements attachEvent but does not follow
            //the same script execution behavior, so it is excluded below.
            if (node.attachEvent &&
                    //Check if node.attachEvent is artificially added by custom script or
                    //natively supported by browser
                    //read https://github.com/jrburke/requirejs/issues/187
                    //if we can NOT find [native code] then it must NOT be natively supported.
                    //in IE8, node.attachEvent does not have toString()
                    //Note the test for "[native code" with no closing brace, see:
                    //https://github.com/jrburke/requirejs/issues/273
                    !(node.attachEvent.toString && node.attachEvent.toString().indexOf('[native code') < 0) &&
                    !isOpera) {
                //Probably IE. IE (at least 6-8) do not fire
                //script onload right after executing the script, so
                //we cannot tie the anonymous define call to a name.
                //However, IE reports the script as being in 'interactive'
                //readyState at the time of the define call.
                useInteractive = true;

                node.attachEvent('onreadystatechange', context.onScriptLoad);
                //It would be great to add an error handler here to catch
                //404s in IE9+. However, onreadystatechange will fire before
                //the error handler, so that does not help. If addEventListener
                //is used, then IE will fire error before load, but we cannot
                //use that pathway given the connect.microsoft.com issue
                //mentioned above about not doing the 'script execute,
                //then fire the script load event listener before execute
                //next script' that other browsers do.
                //Best hope: IE10 fixes the issues,
                //and then destroys all installs of IE 6-9.
                //node.attachEvent('onerror', context.onScriptError);
            } else {
                node.addEventListener('load', context.onScriptLoad, false);
                node.addEventListener('error', context.onScriptError, false);
            }
            node.src = url;

            //For some cache cases in IE 6-8, the script executes before the end
            //of the appendChild execution, so to tie an anonymous define
            //call to the module name (which is stored on the node), hold on
            //to a reference to this node, but clear after the DOM insertion.
            currentlyAddingScript = node;
            if (baseElement) {
                head.insertBefore(node, baseElement);
            } else {
                head.appendChild(node);
            }
            currentlyAddingScript = null;

            return node;
        } else if (isWebWorker) {
            try {
                //In a web worker, use importScripts. This is not a very
                //efficient use of importScripts; it will block until
                //its script is downloaded and evaluated. However, if web workers
                //are in play, the expectation is that a build has been done so
                //that only one script needs to be loaded anyway. This may need
                //to be reevaluated if other use cases become common.
                importScripts(url);

                //Account for anonymous modules
                context.completeLoad(moduleName);
            } catch (e) {
                context.onError(makeError('importscripts',
                                'importScripts failed for ' +
                                    moduleName + ' at ' + url,
                                e,
                                [moduleName]));
            }
        }
    };

    function getInteractiveScript() {
        if (interactiveScript && interactiveScript.readyState === 'interactive') {
            return interactiveScript;
        }

        eachReverse(scripts(), function (script) {
            if (script.readyState === 'interactive') {
                return (interactiveScript = script);
            }
        });
        return interactiveScript;
    }

    //Look for a data-main script attribute, which could also adjust the baseUrl.
    if (isBrowser && !cfg.skipDataMain) {
        //Figure out baseUrl. Get it from the script tag with require.js in it.
        eachReverse(scripts(), function (script) {
            //Set the 'head' where we can append children by
            //using the script's parent.
            if (!head) {
                head = script.parentNode;
            }

            //Look for a data-main attribute to set main script for the page
            //to load. If it is there, the path to data main becomes the
            //baseUrl, if it is not already set.
            dataMain = script.getAttribute('data-main');
            if (dataMain) {
                //Preserve dataMain in case it is a path (i.e. contains '?')
                mainScript = dataMain;

                //Set final baseUrl if there is not already an explicit one.
                if (!cfg.baseUrl) {
                    //Pull off the directory of data-main for use as the
                    //baseUrl.
                    src = mainScript.split('/');
                    mainScript = src.pop();
                    subPath = src.length ? src.join('/')  + '/' : './';

                    cfg.baseUrl = subPath;
                }

                //Strip off any trailing .js since mainScript is now
                //like a module name.
                mainScript = mainScript.replace(jsSuffixRegExp, '');

                //If mainScript is still a path, fall back to dataMain
                if (req.jsExtRegExp.test(mainScript)) {
                    mainScript = dataMain;
                }

                //Put the data-main script in the files to load.
                cfg.deps = cfg.deps ? cfg.deps.concat(mainScript) : [mainScript];

                return true;
            }
        });
    }
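The `data-main` handling above derives two things from one attribute: the directory portion becomes `baseUrl` (when none was configured), and the file name, minus any trailing `.js`, becomes the first module to load. A minimal sketch of that derivation (the `parseDataMain` helper is hypothetical; the real code also falls back to the full path when the stripped name still looks like a URL):

```javascript
// Derive baseUrl and the initial dependency from a data-main attribute,
// e.g. <script data-main="scripts/main.js" src="require.js">.
function parseDataMain(dataMain) {
    var src = dataMain.split('/'),
        mainScript = src.pop(),
        // Directory part becomes baseUrl; a bare file name means './'.
        baseUrl = src.length ? src.join('/') + '/' : './';

    return {
        baseUrl: baseUrl,
        // Strip a trailing .js so the value works as a module id.
        deps: [mainScript.replace(/\.js$/, '')]
    };
}
```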

    /**
     * The function that handles definitions of modules. Differs from
     * require() in that a string for the module should be the first argument,
     * and the function to execute after dependencies are loaded should
     * return a value to define the module corresponding to the first argument's
     * name.
     */
    define = function (name, deps, callback) {
        var node, context;

        //Allow for anonymous modules
        if (typeof name !== 'string') {
            //Adjust args appropriately
            callback = deps;
            deps = name;
            name = null;
        }

        //This module may not have dependencies
        if (!isArray(deps)) {
            callback = deps;
            deps = null;
        }

        //If no name, and callback is a function, then figure out if it is a
        //CommonJS thing with dependencies.
        if (!deps && isFunction(callback)) {
            deps = [];
            //Remove comments from the callback string,
            //look for require calls, and pull them into the dependencies,
            //but only if there are function args.
            if (callback.length) {
                callback
                    .toString()
                    .replace(commentRegExp, '')
                    .replace(cjsRequireRegExp, function (match, dep) {
                        deps.push(dep);
                    });

                //May be a CommonJS thing even without require calls, but still
                //could use exports, and module. Avoid doing exports and module
                //work though if it just needs require.
                //REQUIRES the function to expect the CommonJS variables in the
                //order listed below.
                deps = (callback.length === 1 ? ['require'] : ['require', 'exports', 'module']).concat(deps);
            }
        }

        //If in IE 6-8 and hit an anonymous define() call, do the interactive
        //work.
        if (useInteractive) {
            node = currentlyAddingScript || getInteractiveScript();
            if (node) {
                if (!name) {
                    name = node.getAttribute('data-requiremodule');
                }
                context = contexts[node.getAttribute('data-requirecontext')];
            }
        }

        //Always save off evaluating the def call until the script onload handler.
        //This allows multiple modules to be in a file without prematurely
        //tracing dependencies, and allows for anonymous module support,
        //where the module name is not known until the script onload event
        //occurs. If no context, use the global queue, and get it processed
        //in the onscript load callback.
        if (context) {
            context.defQueue.push([name, deps, callback]);
            context.defQueueMap[name] = true;
        } else {
            globalDefQueue.push([name, deps, callback]);
        }
    };
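The CommonJS sugar in `define()` above works by stringifying the factory, stripping comments, and collecting every `require('...')` literal as a dependency. A minimal sketch of that scan (the two regexes here are simplified stand-ins for the `commentRegExp`/`cjsRequireRegExp` defined earlier in the real file):

```javascript
// Simplified stand-ins for RequireJS's commentRegExp and cjsRequireRegExp.
var commentPattern = /\/\*[\s\S]*?\*\/|\/\/.*$/mg,
    cjsRequirePattern = /require\s*\(\s*["']([^'"\s]+)["']\s*\)/g;

// Scan a CommonJS-style factory for its require('...') dependencies.
function scanCjsDeps(factory) {
    var deps = [];
    // Only factories that declare arguments get the CommonJS treatment.
    if (factory.length) {
        factory.toString()
            .replace(commentPattern, '')
            .replace(cjsRequirePattern, function (match, dep) {
                deps.push(dep);
            });

        // 'require' always comes first; 'exports' and 'module' are added
        // only when the factory declares more than one argument.
        deps = (factory.length === 1 ?
                ['require'] :
                ['require', 'exports', 'module']).concat(deps);
    }
    return deps;
}
```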

    define.amd = {
        jQuery: true
    };

    /**
     * Executes the text. Normally just uses eval, but can be modified
     * to use a better, environment-specific call. Only used for transpiling
     * loader plugins, not for plain JS modules.
     * @param {String} text the text to execute/evaluate.
     */
    req.exec = function (text) {
        /*jslint evil: true */
        return eval(text);
    };

    //Set up with config info.
    req(cfg);
}(this));
#@title Step 4: Processing and Displaying Attention Heads
model_version = 'bert-base-uncased'
do_lower_case = True
model = BertModel.from_pretrained(model_version, output_attentions=True)
tokenizer = BertTokenizer.from_pretrained(model_version, do_lower_case=do_lower_case)

sentence_a = "The cat sleeps on the mat"
sentence_b = "Le chat dors sur le tapis"
inputs = tokenizer.encode_plus(sentence_a, sentence_b, return_tensors='pt', add_special_tokens=True)

token_type_ids = inputs['token_type_ids']
input_ids = inputs['input_ids']
attention = model(input_ids, token_type_ids=token_type_ids)[-1]
input_id_list = input_ids[0].tolist()  # Batch index 0
tokens = tokenizer.convert_ids_to_tokens(input_id_list)
call_html()

head_view(attention, tokens)
0.0032536208163946867, 0.003330453997477889, 0.00017095707880798727, 0.00039548988570459187, 0.00199083611369133, 0.0016490630805492401, 0.167580708861351, 0.09408379346132278, 0.06544903665781021, 0.04623166471719742, 0.15448689460754395, 0.14431609213352203, 0.11098875850439072, 0.12445829063653946, 0.055577896535396576], [0.07014763355255127, 0.0005437441286630929, 0.003980056382715702, 0.0026207391638308764, 0.0005810425500385463, 0.000526122807059437, 0.0023223876487463713, 0.00579214608296752, 0.15124566853046417, 0.08132249861955643, 0.050969529896974564, 0.05689125508069992, 0.09932894259691238, 0.1200522631406784, 0.10018663853406906, 0.0984487235546112, 0.15504062175750732], [0.12969426810741425, 0.0007159645901992917, 0.002899829763919115, 0.0034432087559252977, 0.001374734565615654, 0.0007553455070592463, 0.0019275349332019687, 0.0035819439217448235, 0.11694613099098206, 0.11251003295183182, 0.061726413667201996, 0.061719123274087906, 0.14434251189231873, 0.0918242484331131, 0.05241181328892708, 0.10344237089157104, 0.11068445444107056], [0.08860807865858078, 0.00033340323716402054, 0.005995309446007013, 0.009146761149168015, 0.00066465261625126, 0.0003436091938056052, 0.0025671101175248623, 0.003198149148374796, 0.09682898223400116, 0.20771238207817078, 0.07781518250703812, 0.045778073370456696, 0.10230869054794312, 0.08032435178756714, 0.11268480122089386, 0.08862999826669693, 0.07706047594547272], [0.01774413511157036, 0.0004971520393155515, 0.005784652661532164, 0.005453730933368206, 0.0004703355662059039, 0.00048706456436775625, 0.0016657375963404775, 0.002886575646698475, 0.27312925457954407, 0.09089815616607666, 0.044461674988269806, 0.0464358851313591, 0.049647387117147446, 0.2217429131269455, 0.04777606204152107, 0.11602862924337387, 0.0748906061053276], [0.02785409986972809, 0.00048729090485721827, 0.003965858370065689, 0.00443654228001833, 0.00019893207354471087, 0.00046309587196446955, 0.0025553046725690365, 0.0016836462309584022, 
0.1612589806318283, 0.09625481069087982, 0.06796625256538391, 0.04566244035959244, 0.1552688181400299, 0.13440664112567902, 0.12328854948282242, 0.12604457139968872, 0.048204127699136734], [0.07027672976255417, 0.00272547360509634, 0.006296050269156694, 0.012220696546137333, 0.002014654455706477, 0.0024726318661123514, 0.007371077314019203, 0.011058925651013851, 0.10080023854970932, 0.11292599141597748, 0.04508043825626373, 0.06322207301855087, 0.1674046814441681, 0.08792895823717117, 0.02495424449443817, 0.17958371341228485, 0.10366351157426834], [0.041481614112854004, 0.0005970805650576949, 0.004914623219519854, 0.006723812781274319, 0.00038710434455424547, 0.0006574143772013485, 0.0038090385496616364, 0.0036365508567541838, 0.11489778012037277, 0.1283486783504486, 0.08074445277452469, 0.06326085329055786, 0.11468322575092316, 0.10017228871583939, 0.13928304612636566, 0.09604785591363907, 0.10035452991724014], [0.06458704173564911, 0.0027690366841852665, 0.0016191770555451512, 0.000801018497440964, 0.0020696024876087904, 0.002871264237910509, 0.00262609519995749, 0.006281935144215822, 0.06894499808549881, 0.06760973483324051, 0.0733465924859047, 0.08226058632135391, 0.1185879334807396, 0.0593111515045166, 0.057768288999795914, 0.10335753858089447, 0.28518804907798767]], [[0.6899476051330566, 0.01711839810013771, 0.02227495238184929, 0.005241985898464918, 0.12664860486984253, 0.017489157617092133, 0.0050677284598350525, 0.00617841025814414, 0.0045233434066176414, 0.013983801938593388, 0.012540050782263279, 0.027683710679411888, 0.025330452248454094, 0.004315356723964214, 0.005250631365925074, 0.008778310380876064, 0.007627550046890974], [0.08791413903236389, 0.11619292199611664, 0.061934828758239746, 0.13592441380023956, 0.15491411089897156, 0.1203397586941719, 0.10690971463918686, 0.0958336591720581, 0.01591033674776554, 0.013104955665767193, 0.017140839248895645, 0.010103058069944382, 0.011109541170299053, 0.013218441978096962, 0.02168606035411358, 
0.006693120580166578, 0.01107019279152155], [0.267109751701355, 0.0875839814543724, 0.038375139236450195, 0.14138434827327728, 0.0922815129160881, 0.08515460044145584, 0.06615043431520462, 0.09343228489160538, 0.009366502985358238, 0.04650744050741196, 0.006590259727090597, 0.009325448423624039, 0.011282031424343586, 0.00884530134499073, 0.021667787805199623, 0.004750333726406097, 0.010192859917879105], [0.12299053370952606, 0.03121461533010006, 0.12526974081993103, 0.1384095698595047, 0.042847469449043274, 0.03798247501254082, 0.08502328395843506, 0.28926846385002136, 0.008238401263952255, 0.022968605160713196, 0.004673621151596308, 0.009429254569113255, 0.020892977714538574, 0.006791743915528059, 0.0230410136282444, 0.007374530658125877, 0.02358374372124672], [0.019771628081798553, 0.042925987392663956, 0.10725907981395721, 0.48143431544303894, 0.019195804372429848, 0.05173555016517639, 0.10353858768939972, 0.07466888427734375, 0.00993169192224741, 0.02089563198387623, 0.005202791187912226, 0.005305493250489235, 0.010906443931162357, 0.008418725803494453, 0.028224799782037735, 0.004674621392041445, 0.005909830331802368], [0.09123093634843826, 0.10900420695543289, 0.08921576291322708, 0.1496429443359375, 0.13611900806427002, 0.10640604794025421, 0.11674714088439941, 0.10638976097106934, 0.011168643832206726, 0.011797369457781315, 0.011537309736013412, 0.0073274956084787846, 0.00945147406309843, 0.00891185738146305, 0.020916972309350967, 0.004690505098551512, 0.009442619979381561], [0.35553979873657227, 0.025596454739570618, 0.06601442396640778, 0.1842540055513382, 0.07079135626554489, 0.03259963542222977, 0.022814271971583366, 0.17761483788490295, 0.005418842658400536, 0.015257677994668484, 0.0014392860466614366, 0.006486260332167149, 0.0061572943814098835, 0.004111186135560274, 0.013198990374803543, 0.002174907363951206, 0.010530714876949787], [0.06799765676259995, 0.17805378139019012, 0.0691291019320488, 0.06600046157836914, 0.21040737628936768, 
0.14877986907958984, 0.0630408450961113, 0.0686224177479744, 0.017828356474637985, 0.015499861910939217, 0.018729453906416893, 0.012192045338451862, 0.01176519226282835, 0.016981029883027077, 0.017346838489174843, 0.0073031471110880375, 0.010322626680135727], [0.6129382848739624, 0.004320988431572914, 0.01258255448192358, 0.015285599045455456, 0.0030379530508071184, 0.004460067022591829, 0.0025749183259904385, 0.01666790433228016, 0.008098828606307507, 0.058936167508363724, 0.016184460371732712, 0.029880866408348083, 0.06292136013507843, 0.006277794949710369, 0.043976105749607086, 0.008016981184482574, 0.09383922815322876], [0.14908069372177124, 0.0024843106511980295, 0.023524150252342224, 0.01869337633252144, 0.004766706842929125, 0.002341205021366477, 0.009652119129896164, 0.04065156355500221, 0.07159541547298431, 0.06913284212350845, 0.028304854407906532, 0.041655827313661575, 0.10864190757274628, 0.05730421468615532, 0.126399427652359, 0.023038694635033607, 0.22273264825344086], [0.08271943777799606, 0.004906867630779743, 0.041981372982263565, 0.06709666550159454, 0.007911065593361855, 0.005610652733594179, 0.015105141326785088, 0.022526629269123077, 0.04096611961722374, 0.32016775012016296, 0.02625729702413082, 0.054550983011722565, 0.07894545048475266, 0.03263111412525177, 0.08251748234033585, 0.023641759529709816, 0.09246420860290527], [0.551002025604248, 0.0070753698237240314, 0.021293554455041885, 0.018289392814040184, 0.005553947761654854, 0.0065783425234258175, 0.012154323048889637, 0.023979654535651207, 0.012174108996987343, 0.08879949152469635, 0.010085527785122395, 0.01277348306030035, 0.06179787218570709, 0.009651558473706245, 0.05894025042653084, 0.012710676528513432, 0.08714031428098679], [0.24295495450496674, 0.008797545917332172, 0.01897348091006279, 0.02240259200334549, 0.005888194777071476, 0.0081903962418437, 0.009061544202268124, 0.04251229017972946, 0.04628982022404671, 0.07775620371103287, 0.03597036749124527, 0.046389371156692505, 
0.03849257156252861, 0.032815173268318176, 0.10736890137195587, 0.018913190811872482, 0.23722338676452637], [0.5907400846481323, 0.005485023837536573, 0.012210480868816376, 0.013892722316086292, 0.0033475093077868223, 0.005643690470606089, 0.002536748768761754, 0.01521963719278574, 0.008581125177443027, 0.06732795387506485, 0.017745228484272957, 0.03273386135697365, 0.06318624317646027, 0.007224984932690859, 0.04996907338500023, 0.00941492896527052, 0.0947408378124237], [0.16156518459320068, 0.02562759444117546, 0.04496842995285988, 0.03754839301109314, 0.026267273351550102, 0.025432039052248, 0.05358785018324852, 0.04366692155599594, 0.03720178082585335, 0.2182350754737854, 0.019072847440838814, 0.03881412371993065, 0.08296588063240051, 0.037006035447120667, 0.04472416266798973, 0.022943798452615738, 0.0803726390004158], [0.7009365558624268, 0.017248960211873055, 0.007276283577084541, 0.007549286354333162, 0.007020256016403437, 0.012982342392206192, 0.0027963262982666492, 0.020802771672606468, 0.012614535167813301, 0.023595063015818596, 0.007564424071460962, 0.018587982282042503, 0.03691153973340988, 0.01124848984181881, 0.03711971640586853, 0.0020862880628556013, 0.07365916669368744], [0.04765614867210388, 0.02357564866542816, 0.0076897325925529, 0.006844497285783291, 0.023701030761003494, 0.018322352319955826, 0.006876892875880003, 0.011391970328986645, 0.09617432951927185, 0.10392188280820847, 0.128093883395195, 0.08617661893367767, 0.08871164917945862, 0.093619704246521, 0.08727706968784332, 0.04504602402448654, 0.1249205619096756]], [[0.6999444961547852, 0.033271368592977524, 0.013909603469073772, 0.006980339530855417, 0.022110717371106148, 0.02150537818670273, 0.02339259162545204, 0.053715016692876816, 0.007926936261355877, 0.01761786825954914, 0.008515228517353535, 0.010299251414835453, 0.014730553142726421, 0.008134471252560616, 0.010132250376045704, 0.008362770080566406, 0.039451174437999725], [0.7096490859985352, 0.1286257803440094, 0.01919744350016117, 
0.009776546619832516, 0.02203425206243992, 0.029362838715314865, 0.006909705698490143, 0.007599890232086182, 0.0020745599176734686, 0.00481249438598752, 0.006637603975832462, 0.00887293554842472, 0.003468174487352371, 0.003521648235619068, 0.008425338193774223, 0.007191610522568226, 0.021840089932084084], [0.27386799454689026, 0.46732890605926514, 0.01999024860560894, 0.013360227458178997, 0.06342943012714386, 0.007877282798290253, 0.040805548429489136, 0.010506179183721542, 0.005935221444815397, 0.0036369431763887405, 0.0033280719071626663, 0.004359117709100246, 0.005070291925221682, 0.017964519560337067, 0.04368644952774048, 0.014768614433705807, 0.004084874410182238], [0.6985294222831726, 0.051490768790245056, 0.07849828898906708, 0.012298560701310635, 0.043022263795137405, 0.01631149835884571, 0.006221814081072807, 0.045159924775362015, 0.0003402868751436472, 0.000611428520642221, 0.00032320714672096074, 0.0009421741706319153, 0.0026654524262994528, 0.000793307670392096, 0.0019224915886297822, 0.015949413180351257, 0.024919643998146057], [0.13385166227817535, 0.13671085238456726, 0.021839935332536697, 0.2342349886894226, 0.15750010311603546, 0.10865804553031921, 0.008402155712246895, 0.09069671481847763, 0.002944386564195156, 0.01758934184908867, 0.00010379388550063595, 0.0028518179897218943, 0.0009350733016617596, 0.0008568796911276877, 0.04928211122751236, 0.0006318472442217171, 0.032910313457250595], [0.26596003770828247, 0.02734042890369892, 0.006950710900127888, 0.01500026136636734, 0.5170687437057495, 0.06982631981372833, 0.010510291904211044, 0.07074353098869324, 0.002468053251504898, 0.0052584083750844, 0.001481827930547297, 9.464619506616145e-05, 0.0008707857341505587, 0.0007765362970530987, 0.0012102317996323109, 0.0026651860680431128, 0.0017740766052156687], [0.25579598546028137, 0.009068278595805168, 0.04568634554743767, 0.027461836114525795, 0.17888212203979492, 0.3413448929786682, 0.025524647906422615, 0.05428246408700943, 0.014690759591758251, 
0.009647219441831112, 0.006497113034129143, 0.005052521359175444, 0.0011131246574223042, 0.001288561150431633, 0.004314302001148462, 0.008577005006372929, 0.01077277585864067], [0.3748631477355957, 0.021395862102508545, 0.002089103450998664, 0.005836137570440769, 0.013999508693814278, 0.04796065390110016, 0.3704995810985565, 0.12290674448013306, 0.008339639753103256, 0.008209849707782269, 0.002514739753678441, 0.010776760056614876, 0.0015108921797946095, 0.00013728003250434995, 0.0008048153249546885, 0.0006206078687682748, 0.00753468181937933], [0.24807600677013397, 0.0021809223107993603, 0.010174860246479511, 0.001623473595827818, 0.009058342315256596, 0.003580352058634162, 0.010590552352368832, 0.6100813150405884, 0.018959587439894676, 0.014104754664003849, 0.013513625599443913, 0.007730433717370033, 0.044733162969350815, 0.0009516052668914199, 0.000454386550700292, 0.0014315351145341992, 0.002755087101832032], [0.22682400047779083, 0.0006895898841321468, 0.0018355300417169929, 0.0032065254636108875, 0.0046776942908763885, 0.006206498946994543, 0.0023423905950039625, 0.08675538003444672, 0.5937461256980896, 0.006382175721228123, 0.012460564263164997, 0.014698871411383152, 0.004481582436710596, 0.025256581604480743, 0.0049661388620734215, 0.00046906445641070604, 0.005001252982765436], [0.28856441378593445, 0.011897936463356018, 0.0030935450922697783, 0.0022680729161947966, 0.04757849499583244, 0.010350900702178478, 0.0010078982450067997, 0.12768571078777313, 0.07826422154903412, 0.22995775938034058, 0.04328764230012894, 0.09640689939260483, 0.02851683832705021, 0.0080885523930192, 0.01143547985702753, 0.006880860775709152, 0.004714652895927429], [0.008578760549426079, 0.00121406523976475, 0.00010369140363764018, 0.0002546988253016025, 0.0010680478299036622, 0.001788561581633985, 0.0028198116924613714, 0.01274645421653986, 0.009919047355651855, 0.021305445581674576, 0.9207978248596191, 0.000465020741103217, 0.0006918059079907835, 0.008799072355031967, 
0.0004177912778686732, 0.006095185875892639, 0.002934586489573121], [0.1607099175453186, 0.004049964249134064, 0.0008615506230853498, 0.0003936364664696157, 0.00018070013902615756, 0.0014086180599406362, 0.007494746707379818, 0.006544463336467743, 0.11627081036567688, 0.02389046736061573, 0.05305321142077446, 0.3347630202770233, 0.01155440229922533, 0.1400853544473648, 0.09302914887666702, 0.01309790089726448, 0.032612092792987823], [0.14028777182102203, 0.003955664113163948, 0.0034131580032408237, 0.0005156900151632726, 0.0006142694037407637, 4.92862964165397e-05, 0.0004105033876840025, 0.008855744265019894, 0.0012289606966078281, 0.005256796255707741, 0.006625024601817131, 0.027821024879813194, 0.7619102001190186, 0.014636924490332603, 0.010180609300732613, 0.009264778345823288, 0.004973613657057285], [0.31704413890838623, 0.006152885966002941, 0.008545051328837872, 0.004672444891184568, 0.0029587983153760433, 0.0029744692146778107, 0.00016444448556285352, 0.009892044588923454, 0.011383824050426483, 0.0017570228083059192, 0.009875562973320484, 0.01864154264330864, 0.1480650156736374, 0.2796936333179474, 0.04994041845202446, 0.050975698977708817, 0.07726306468248367], [0.03042708896100521, 0.006150448229163885, 0.014029327780008316, 0.01415711734443903, 0.016464460641145706, 0.001985558308660984, 0.007453904952853918, 0.00198889197781682, 0.005330587271600962, 0.07505982369184494, 0.008630000054836273, 0.25703197717666626, 0.04948176071047783, 0.0745435282588005, 0.3545231819152832, 0.016538724303245544, 0.06620363891124725], [0.5550248026847839, 0.01825815811753273, 0.002225968288257718, 0.000661232101265341, 0.006191920023411512, 0.00458022765815258, 0.001469333190470934, 0.003970153629779816, 0.00028877961449325085, 0.0017874451586976647, 0.00491069070994854, 0.004161215387284756, 0.013822426088154316, 0.012160233221948147, 0.02345888502895832, 0.13453733921051025, 0.2124912291765213]], [[0.5531274676322937, 0.038948748260736465, 0.03963252902030945, 
0.022325852885842323, 0.045458946377038956, 0.018582282587885857, 0.04739651083946228, 0.030218904837965965, 0.01876020058989525, 0.024210235103964806, 0.014616807922720909, 0.014615807682275772, 0.038727737963199615, 0.016958223655819893, 0.027495944872498512, 0.0132807157933712, 0.03564314916729927], [0.23534129559993744, 0.25391167402267456, 0.20634758472442627, 0.07569573074579239, 0.016307909041643143, 0.022128622978925705, 0.02104955166578293, 0.010060575790703297, 0.008644415996968746, 0.02620367892086506, 0.011110111139714718, 0.005413709208369255, 0.015157378278672695, 0.01644430309534073, 0.05729871243238449, 0.008592666126787663, 0.010292124934494495], [0.28543907403945923, 0.5129810571670532, 0.12388613820075989, 0.014964940957725048, 0.006519661284983158, 0.0051577468402683735, 0.01266274694353342, 0.0031568193808197975, 0.00118518085218966, 0.002923388034105301, 0.001694236765615642, 0.002124629681929946, 0.00967483688145876, 0.002205616096034646, 0.007514026947319508, 0.005517646204680204, 0.002392255235463381], [0.0470183901488781, 0.24743255972862244, 0.6383838057518005, 0.02836628630757332, 0.005363557953387499, 0.010781673714518547, 0.005996208172291517, 0.003201662329956889, 0.0011405611876398325, 0.0009133138228207827, 0.0010775269474834204, 0.0008611854282207787, 0.002654226031154394, 0.00156002352014184, 0.0015406447928398848, 0.000864411354996264, 0.002843990456312895], [0.0033526041079312563, 0.11373357474803925, 0.2981511056423187, 0.5266714096069336, 0.018684813752770424, 0.014041881076991558, 0.011216058395802975, 0.004145070910453796, 0.0026330705732107162, 0.0016260731499642134, 0.00015214589075185359, 0.000457542686490342, 0.0008057541563175619, 0.0012821558630093932, 0.0016656132647767663, 0.0002884374698624015, 0.0010927255498245358], [0.012172370217740536, 0.038806330412626266, 0.10904736071825027, 0.4352467954158783, 0.3111500144004822, 0.05555146560072899, 0.016090288758277893, 0.011615954339504242, 0.0012498828582465649, 
0.0035074332263320684, 0.0004545902193058282, 0.00010934298188658431, 0.0016188444569706917, 0.0004899620544165373, 0.0013882080093026161, 0.0007956181070767343, 0.0007056964677758515], [0.34951213002204895, 0.012486843392252922, 0.04216228425502777, 0.037645675241947174, 0.23600329458713531, 0.24226385354995728, 0.04729386046528816, 0.018154671415686607, 0.004687127191573381, 0.0027336678467690945, 0.0014662212925031781, 0.0006396231474354863, 0.0010707535548135638, 0.0010869363322854042, 0.0008611147059127688, 0.0005958595429547131, 0.0013360094744712114], [0.14323300123214722, 0.022829772904515266, 0.02651551365852356, 0.02545176073908806, 0.030542083084583282, 0.29597872495651245, 0.2913847267627716, 0.07135308533906937, 0.056868597865104675, 0.010234376415610313, 0.002936959732323885, 0.0041832937858998775, 0.009259468875825405, 0.002136248629540205, 0.0019678110256791115, 0.0012895981781184673, 0.0038349893875420094], [0.14364652335643768, 0.00447213975712657, 0.015378237701952457, 0.006198503077030182, 0.007982158102095127, 0.017818354070186615, 0.13508988916873932, 0.5409282445907593, 0.07128128409385681, 0.022254234179854393, 0.007588529493659735, 0.002043461659923196, 0.019120583310723305, 0.0023534067440778017, 0.0008825342520140111, 0.0005810395232401788, 0.002380817197263241], [0.09853097051382065, 0.0030160625465214252, 0.02138899452984333, 0.0075078485533595085, 0.001676246291026473, 0.018666837364435196, 0.0309753455221653, 0.36838388442993164, 0.3980045020580292, 0.026666738092899323, 0.008063904009759426, 0.0015151504194363952, 0.002796384273096919, 0.005472081713378429, 0.0017908450681716204, 0.00035887863487005234, 0.005185370799154043], [0.004498024936765432, 0.0007459388580173254, 0.0005258452729322016, 0.002558627165853977, 0.003943223040550947, 0.0035117941442877054, 0.00734774861484766, 0.0976485013961792, 0.47000575065612793, 0.3124501705169678, 0.06954536586999893, 0.00688760494813323, 0.00824044831097126, 0.0060750218108296394, 
0.0029516934882849455, 0.001455658464692533, 0.0016086554387584329], [0.008639084175229073, 0.0019752781372517347, 0.006826938595622778, 0.000864996574819088, 0.0012613479048013687, 0.00697448942810297, 0.007796809542924166, 0.0354609340429306, 0.1621769815683365, 0.34291064739227295, 0.3599295914173126, 0.013132589869201183, 0.023393215611577034, 0.01077071763575077, 0.010074962861835957, 0.00524973263964057, 0.0025617198552936316], [0.19040097296237946, 0.001692387042567134, 0.00625053932890296, 0.0023625281173735857, 0.0007383263437077403, 0.0033193824347108603, 0.016812890768051147, 0.024481041356921196, 0.058065395802259445, 0.13811634480953217, 0.30562397837638855, 0.12837380170822144, 0.07123146206140518, 0.018925435841083527, 0.011775652877986431, 0.0027958799619227648, 0.019033970311284065], [0.06366129219532013, 0.0012058063875883818, 0.001200215658172965, 0.0004669454356189817, 0.00026555178919807076, 0.000212031343835406, 0.0034179112408310175, 0.009772730059921741, 0.008944135159254074, 0.03221636265516281, 0.06514137238264084, 0.07418863475322723, 0.6800320744514465, 0.04294847697019577, 0.010578269138932228, 0.0018520053708925843, 0.0038962122052907944], [0.047643065452575684, 0.0009506919304840267, 0.002307700924575329, 0.0009917699499055743, 0.002371498616412282, 0.0009200986823998392, 0.001367571298032999, 0.0159171000123024, 0.03498252481222153, 0.007331242319196463, 0.04356037825345993, 0.025966167449951172, 0.35480257868766785, 0.41760215163230896, 0.029087938368320465, 0.003710862947627902, 0.010486691258847713], [0.002624447690322995, 0.0011480419198051095, 0.0025820760056376457, 0.0015298562357202172, 0.0012255767360329628, 0.0006368904723785818, 0.0020696495193988085, 0.003914456348866224, 0.02860347181558609, 0.02604725770652294, 0.014786211773753166, 0.007875815033912659, 0.12581123411655426, 0.49345624446868896, 0.2483048290014267, 0.01538606733083725, 0.02399783954024315], [0.01755857840180397, 0.0011365225072950125, 
0.0005246268701739609, 0.00020906470308545977, 0.00018786005966831, 0.00017037492943927646, 0.00012175613665021956, 0.0006612560828216374, 0.009288708679378033, 0.008440044708549976, 0.01343101728707552, 0.00490582687780261, 0.08473724871873856, 0.2772836983203888, 0.16869381070137024, 0.24652057886123657, 0.1661289930343628]], [[0.3624141812324524, 0.012535901740193367, 0.02622339129447937, 0.023359887301921844, 0.02091851457953453, 0.012288011610507965, 0.024015987291932106, 0.041488368064165115, 0.08691833168268204, 0.04299449920654297, 0.051547423005104065, 0.020742516964673996, 0.04904649034142494, 0.07918290793895721, 0.05141822621226311, 0.03543340787291527, 0.059471938759088516], [0.035816438496112823, 0.11800350248813629, 0.044975053519010544, 0.13583621382713318, 0.11961828917264938, 0.12800370156764984, 0.06930771470069885, 0.091176338493824, 0.022737184539437294, 0.026629827916622162, 0.02172294445335865, 0.024881193414330482, 0.03354039415717125, 0.02479497157037258, 0.03846859186887741, 0.031183989718556404, 0.03330357372760773], [0.1201598048210144, 0.04021308198571205, 0.021064747124910355, 0.09030243009328842, 0.09674698859453201, 0.03280641511082649, 0.06403307616710663, 0.02576272003352642, 0.03967718407511711, 0.04187723249197006, 0.020523540675640106, 0.070872001349926, 0.13243107497692108, 0.04611702635884285, 0.054625749588012695, 0.08023565262556076, 0.022551316767930984], [0.2558104395866394, 0.03588450327515602, 0.07239478081464767, 0.027127787470817566, 0.07910799235105515, 0.03960889205336571, 0.038419533520936966, 0.06509558856487274, 0.03172118589282036, 0.04581403359770775, 0.04239774867892265, 0.04441169276833534, 0.056603655219078064, 0.025636622682213783, 0.05578400939702988, 0.04242825508117676, 0.04175323247909546], [0.07175973802804947, 0.12695001065731049, 0.06962516903877258, 0.08992763608694077, 0.061048269271850586, 0.11724483221769333, 0.07279219478368759, 0.14686954021453857, 0.015917090699076653, 0.030525023117661476, 
0.017779873684048653, 0.03806799650192261, 0.023232068866491318, 0.01768968440592289, 0.032533157616853714, 0.026374034583568573, 0.04166368395090103], [0.04598035663366318, 0.15256546437740326, 0.04135409742593765, 0.12453802675008774, 0.08941135555505753, 0.14862608909606934, 0.09383881837129593, 0.0897386372089386, 0.015865273773670197, 0.022009145468473434, 0.01312766782939434, 0.023622557520866394, 0.03047340363264084, 0.017952879890799522, 0.03219271078705788, 0.027798693627119064, 0.03090481460094452], [0.42808085680007935, 0.042736466974020004, 0.055460087954998016, 0.08529406040906906, 0.08022328466176987, 0.033800870180130005, 0.015010504983365536, 0.019122624769806862, 0.02632264606654644, 0.03478415310382843, 0.012839571572840214, 0.02776345983147621, 0.04488595947623253, 0.029277237132191658, 0.017974290996789932, 0.035526975989341736, 0.010896888561546803], [0.002761203097179532, 0.00048391782911494374, 0.0004983697435818613, 0.0005490362527780235, 0.0006300605600699782, 0.0009331773035228252, 0.0014834677567705512, 0.0776248574256897, 0.019254738464951515, 0.026327263563871384, 0.03957133740186691, 0.016347553580999374, 0.007111303508281708, 0.009307087399065495, 0.005713935010135174, 0.01732020266354084, 0.7740825414657593], [0.19387076795101166, 0.057324331253767014, 0.027362104505300522, 0.05215953290462494, 0.021029070019721985, 0.03585261479020119, 0.06193320080637932, 0.036798395216464996, 0.010659974068403244, 0.030238911509513855, 0.018659252673387527, 0.14139221608638763, 0.06656724959611893, 0.011400205083191395, 0.08376550674438477, 0.11414360255002975, 0.03684304282069206], [0.3142881691455841, 0.05663929134607315, 0.030714789405465126, 0.03112647496163845, 0.03478352725505829, 0.05229886248707771, 0.04520084708929062, 0.029717465862631798, 0.03471558168530464, 0.008498922921717167, 0.03623148426413536, 0.07341838628053665, 0.03728731349110603, 0.03837336227297783, 0.06595592945814133, 0.07795974612236023, 0.032789766788482666], 
[0.1515861302614212, 0.03841036558151245, 0.023936990648508072, 0.044587407261133194, 0.04884861037135124, 0.032511718571186066, 0.04029145464301109, 0.029132168740034103, 0.03469783440232277, 0.04939604923129082, 0.024633992463350296, 0.08683266490697861, 0.143334299325943, 0.0400252602994442, 0.07077339291572571, 0.09086612612009048, 0.050135575234889984], [0.1999911516904831, 0.05390523001551628, 0.05684918910264969, 0.06082169711589813, 0.02921750582754612, 0.04428960755467415, 0.0344964861869812, 0.023224303498864174, 0.0617038831114769, 0.05746985226869583, 0.06218429282307625, 0.04865669459104538, 0.05555034056305885, 0.06881751120090485, 0.05659811943769455, 0.057892169803380966, 0.02833206206560135], [0.39399954676628113, 0.025697950273752213, 0.0700189620256424, 0.06406722217798233, 0.0291362963616848, 0.022897573187947273, 0.026051117107272148, 0.018416333943605423, 0.02285711281001568, 0.04074666649103165, 0.04980146884918213, 0.06629952043294907, 0.016941891983151436, 0.027402490377426147, 0.05012977495789528, 0.052527423948049545, 0.023008637130260468], [0.16595052182674408, 0.06346289068460464, 0.030761510133743286, 0.039006561040878296, 0.019854631274938583, 0.03930297866463661, 0.06337015330791473, 0.045221034437417984, 0.010037682950496674, 0.0325680673122406, 0.020191669464111328, 0.15027892589569092, 0.059525396674871445, 0.010879214853048325, 0.0817238986492157, 0.11869318783283234, 0.049171701073646545], [0.09889552742242813, 0.06973684579133987, 0.069208525121212, 0.1107822135090828, 0.05574621632695198, 0.049872152507305145, 0.024334967136383057, 0.022679755464196205, 0.053051188588142395, 0.054932352155447006, 0.05127495899796486, 0.058714669197797775, 0.06867963820695877, 0.06266015768051147, 0.016280340030789375, 0.08117597550153732, 0.05197448655962944], [0.27592384815216064, 0.036020610481500626, 0.025736989453434944, 0.023358013480901718, 0.00982001330703497, 0.0292969923466444, 0.031171463429927826, 0.025641735643148422, 
[... lengthy printed output elided: nested lists of attention weights, each row a 17-element probability distribution (one per token, per attention head) ...]
0.017050648108124733, 0.005419469904154539, 0.006595875136554241, 0.008183586411178112, 0.05790412798523903, 0.10048915445804596, 0.3671805262565613, 0.049166664481163025, 0.0824677050113678, 0.04933436959981918, 0.04703320190310478, 0.013704411685466766, 0.03412412479519844, 0.053653474897146225], [0.2898176312446594, 0.006054127123206854, 0.0036440405528992414, 0.0148227633908391, 0.0055983830243349075, 0.0041818018071353436, 0.02088664285838604, 0.16474846005439758, 0.06509563326835632, 0.07743409276008606, 0.04931618645787239, 0.0034628785215318203, 0.010789560154080391, 0.07114007323980331, 0.043165579438209534, 0.029482915997505188, 0.14035911858081818], [0.031310074031353, 0.0015320440288633108, 0.001122340327128768, 0.01144096814095974, 0.00019982451340183616, 0.0020434160251170397, 0.005857226438820362, 0.02850714884698391, 0.18676814436912537, 0.021792737767100334, 0.017353367060422897, 0.09742826968431473, 0.014666317962110043, 0.23537909984588623, 0.16398537158966064, 0.05463998019695282, 0.125973641872406], [0.047279637306928635, 0.0026445304974913597, 0.004015921615064144, 0.01847054250538349, 0.0007131362217478454, 0.0010796872666105628, 0.014234290458261967, 0.03782493621110916, 0.0033129299990832806, 0.14728915691375732, 0.017355022951960564, 0.06086990237236023, 0.3371700942516327, 0.006533367559313774, 0.06553712487220764, 0.08295073360204697, 0.15271908044815063], [0.21096786856651306, 0.008210297673940659, 0.030143508687615395, 0.05999598652124405, 0.003010110929608345, 0.007182965520769358, 0.003914268221706152, 0.09623613208532333, 0.004739530850201845, 0.1234077662229538, 0.013820938766002655, 0.04265225678682327, 0.021119462326169014, 0.01205094251781702, 0.06651522219181061, 0.008013414219021797, 0.28801923990249634], [0.04040185734629631, 0.0029071501921862364, 0.017151542007923126, 0.031191686168313026, 0.000608003931120038, 0.0015823027351871133, 0.016976885497570038, 0.025271734222769737, 0.013457790948450565, 0.10208696126937866, 
0.005976181477308273, 0.04973730817437172, 0.0627124160528183, 0.03763953223824501, 0.30612167716026306, 0.008525248616933823, 0.27765169739723206], [0.5110345482826233, 0.021214107051491737, 0.0031484689097851515, 0.006975341122597456, 0.005062417592853308, 0.008420225232839584, 0.0024418008979409933, 0.01910398341715336, 0.00479544885456562, 0.025157257914543152, 0.010209484957158566, 0.012089238502085209, 0.015126225538551807, 0.016794148832559586, 0.020974386483430862, 0.01814226433634758, 0.299310564994812]]], [[[0.2942064106464386, 0.039160460233688354, 0.03317175433039665, 0.016283294185996056, 0.1586073637008667, 0.05797572433948517, 0.06269232928752899, 0.04130171984434128, 0.039460714906454086, 0.02741401083767414, 0.042905692011117935, 0.023507410660386086, 0.03312396630644798, 0.03594551607966423, 0.01842297986149788, 0.030025390908122063, 0.04579522833228111], [0.11234013736248016, 0.05400193855166435, 0.22000369429588318, 0.0892023965716362, 0.04441475495696068, 0.04633288457989693, 0.12221872061491013, 0.13413378596305847, 0.009912466630339622, 0.028676168993115425, 0.01018514670431614, 0.013805604539811611, 0.012227486819028854, 0.009516485035419464, 0.012937908060848713, 0.019573643803596497, 0.060516759753227234], [0.07918878644704819, 0.13633184134960175, 0.07489942014217377, 0.010963517241179943, 0.02356603369116783, 0.06768477708101273, 0.009014979004859924, 0.4256519377231598, 0.00580556457862258, 0.007527557667344809, 0.002217942615970969, 0.00226124981418252, 0.00528138130903244, 0.0034260950051248074, 0.003916064742952585, 0.007136075291782618, 0.1351267695426941], [0.1494211107492447, 0.04864032566547394, 0.04137279465794563, 0.18501988053321838, 0.1036091148853302, 0.027430381625890732, 0.08121100813150406, 0.20665158331394196, 0.004257259424775839, 0.003457094542682171, 0.022281158715486526, 0.0232387762516737, 0.010450298897922039, 0.0038015139289200306, 0.008541776798665524, 0.017915673553943634, 0.06270021200180054], 
[0.16649559140205383, 0.15137293934822083, 0.04484995827078819, 0.04200722277164459, 0.05168299004435539, 0.09953606128692627, 0.05699102580547333, 0.2679547071456909, 0.005656800698488951, 0.006703716702759266, 0.002015704521909356, 0.00437629921361804, 0.007285885978490114, 0.0030511224176734686, 0.009616243652999401, 0.004954732954502106, 0.07544898241758347], [0.18440648913383484, 0.06000363826751709, 0.16277562081813812, 0.040696606040000916, 0.046857479959726334, 0.053631287068128586, 0.08474541455507278, 0.14826610684394836, 0.015691353008151054, 0.019942766055464745, 0.012469463050365448, 0.015263124369084835, 0.02123332768678665, 0.013269676826894283, 0.016003357246518135, 0.023385388776659966, 0.08135880529880524], [0.10291789472103119, 0.1424521803855896, 0.01512816920876503, 0.010362650267779827, 0.024956541135907173, 0.11401620507240295, 0.014625986106693745, 0.39091184735298157, 0.02003214880824089, 0.0047315312549471855, 0.0016476793680340052, 0.0016453261487185955, 0.005502951797097921, 0.007993718609213829, 0.0084117716178298, 0.007989581674337387, 0.12667381763458252], [0.2583490014076233, 0.051390502601861954, 0.03961044177412987, 0.03366640955209732, 0.18128560483455658, 0.0538775697350502, 0.08490049093961716, 0.03567209839820862, 0.024326015263795853, 0.01971268467605114, 0.06535258889198303, 0.023481445387005806, 0.02565721981227398, 0.022959930822253227, 0.029794815927743912, 0.021767769008874893, 0.028195347636938095], [0.048217400908470154, 0.031461894512176514, 0.0073286741971969604, 0.01988249458372593, 0.00930216908454895, 0.029779816046357155, 0.014332936145365238, 0.2404378205537796, 0.06219132989645004, 0.01606019213795662, 0.0033377560321241617, 0.030328329652547836, 0.13096046447753906, 0.029362710192799568, 0.00808022078126669, 0.0675598531961441, 0.2513759732246399], [0.45828524231910706, 0.026813596487045288, 0.00527245132252574, 0.0036671084817498922, 0.0740370973944664, 0.02479485049843788, 0.014194597490131855, 
0.16702677309513092, 0.016573546454310417, 0.004108494613319635, 0.004460279364138842, 0.010082172229886055, 0.019173365086317062, 0.010918173007667065, 0.0027174027636647224, 0.009232476353645325, 0.1486424207687378], [0.24192474782466888, 0.02460918016731739, 0.0049170455895364285, 0.053803227841854095, 0.010416906327009201, 0.01665031537413597, 0.013680667616426945, 0.12323537468910217, 0.18243002891540527, 0.01017761044204235, 0.01942775398492813, 0.009469333104789257, 0.025349965319037437, 0.11039222031831741, 0.006360658910125494, 0.029608098790049553, 0.11754690110683441], [0.19179785251617432, 0.03607403114438057, 0.007599355187267065, 0.05878721550107002, 0.02901655063033104, 0.022668182849884033, 0.009196601808071136, 0.25340649485588074, 0.013587312772870064, 0.007168347481638193, 0.0202492605894804, 0.015386853367090225, 0.0069312360137701035, 0.01162862591445446, 0.007474367041140795, 0.026734009385108948, 0.28229376673698425], [0.25642696022987366, 0.016078144311904907, 0.005840683821588755, 0.00486754160374403, 0.016855480149388313, 0.021550389006733894, 0.022262882441282272, 0.11863991618156433, 0.1705816686153412, 0.058954812586307526, 0.011266568675637245, 0.010386349633336067, 0.02374374121427536, 0.12205943465232849, 0.013529784977436066, 0.022475330159068108, 0.10448024421930313], [0.10627448558807373, 0.029222659766674042, 0.005022116936743259, 0.01839040406048298, 0.010139775462448597, 0.02755764126777649, 0.009408225305378437, 0.21240746974945068, 0.07160402834415436, 0.009935073554515839, 0.0032641200814396143, 0.021732883527874947, 0.1268530637025833, 0.030779622495174408, 0.004728104919195175, 0.08964622020721436, 0.22303417325019836], [0.37329256534576416, 0.05806131660938263, 0.00556525494903326, 0.006698992568999529, 0.015171640552580357, 0.03911692649126053, 0.057422224432229996, 0.22861680388450623, 0.02002881094813347, 0.003992111887782812, 0.0037069320678710938, 0.007012305781245232, 0.00407114252448082, 0.009052271023392677, 
0.003640666836872697, 0.015430751256644726, 0.14911936223506927], [0.09488152712583542, 0.034952495247125626, 0.008762393146753311, 0.02270941622555256, 0.014650014229118824, 0.03508550301194191, 0.01500701904296875, 0.15862365067005157, 0.1304904669523239, 0.054211974143981934, 0.007967072539031506, 0.01394654717296362, 0.04352014884352684, 0.10877382010221481, 0.024538293480873108, 0.01472149882465601, 0.21715816855430603], [0.601395308971405, 0.015973228961229324, 0.012312193401157856, 0.009063299745321274, 0.05247541889548302, 0.021258752793073654, 0.019890563562512398, 0.023481935262680054, 0.02930409647524357, 0.016061773523688316, 0.03538122400641441, 0.01889180764555931, 0.032414760440588, 0.027522653341293335, 0.02159687504172325, 0.026658568531274796, 0.03631753474473953]], [[0.5807123780250549, 0.04120909795165062, 0.008867739699780941, 0.01733371801674366, 0.014305485412478447, 0.029534505680203438, 0.01442652102559805, 0.09234919399023056, 0.026837516576051712, 0.011541194282472134, 0.012685221619904041, 0.028400318697094917, 0.018343161791563034, 0.020003899931907654, 0.007370438892394304, 0.031067311763763428, 0.04501226171851158], [0.37534400820732117, 0.024888383224606514, 0.499559611082077, 0.006246662698686123, 0.009760579094290733, 0.0105689512565732, 0.0017580765997990966, 0.041352529078722, 0.0001288901548832655, 1.9783539755735546e-05, 7.075294433889212e-06, 0.0015742983669042587, 0.00016656685329508036, 0.0002697305171750486, 3.338929309393279e-05, 0.0007923780358396471, 0.027528982609510422], [0.5255334377288818, 0.06502868980169296, 0.04805760830640793, 0.03629973530769348, 0.047795362770557404, 0.10859572887420654, 0.015880245715379715, 0.08095794916152954, 0.013673626817762852, 0.0020732588600367308, 1.0221308912150562e-05, 0.004519252106547356, 0.0001549973094370216, 0.00304097100161016, 0.004427528940141201, 0.0001912375882966444, 0.04376015439629555], [0.5188033580780029, 0.010038449428975582, 0.002708421554416418, 
0.013112924993038177, 0.36794209480285645, 0.04042849689722061, 0.0024678409099578857, 0.03982872888445854, 0.00046830365317873657, 0.00026383629301562905, 1.0797327377076726e-05, 3.960409958381206e-05, 0.000123862293548882, 0.00010555232438491657, 0.00015543651534244418, 0.002345107262954116, 0.0011572744697332382], [0.10290932655334473, 0.0029478082433342934, 0.002373269060626626, 0.0008280323818325996, 0.02518332377076149, 0.8410524129867554, 0.0053685675375163555, 0.013445304706692696, 8.01453206804581e-05, 4.956334669259377e-05, 0.0004158160008955747, 2.610838600958232e-05, 5.9792532738356385e-06, 5.7897059377864935e-06, 0.0003356564266141504, 0.0023850970901548862, 0.002587782684713602], [0.14715026319026947, 0.004375548101961613, 0.006219474598765373, 0.00407334603369236, 0.010198993608355522, 0.05666745454072952, 0.6661040186882019, 0.09151221066713333, 0.0005084021249786019, 0.002335856668651104, 6.258254870772362e-05, 0.0056824227795004845, 0.00014237957657314837, 2.4906428279791726e-06, 7.442053902195767e-05, 0.0008465154096484184, 0.004043702036142349], [0.1569405198097229, 0.0031464421190321445, 0.009687644429504871, 0.0004722019948530942, 0.0017986721359193325, 0.00691115390509367, 0.006497683469206095, 0.8090660572052002, 0.0016536825569346547, 0.0016000033356249332, 0.00013849970127921551, 0.0008609144133515656, 1.6799260265543126e-05, 2.2572765374206938e-05, 6.893676527397474e-07, 0.0002916282683145255, 0.0008948236354626715], [0.8889017701148987, 4.895301754004322e-05, 0.0002072897186735645, 0.0006039748550392687, 0.0006328316521830857, 0.0016732490621507168, 0.0008611673838458955, 0.0894637256860733, 0.013337576761841774, 0.0007119080983102322, 5.4896667279535905e-05, 0.0020501252729445696, 9.651265281718224e-05, 0.0001773969706846401, 1.278231866308488e-05, 2.30489549721824e-05, 0.0011427255813032389], [0.056153230369091034, 0.0002748226106632501, 0.00010805700730998069, 1.051290655595949e-05, 0.0008257327717728913, 0.0009225833928212523, 
0.003326368983834982, 0.025836175307631493, 0.003687663469463587, 0.8992720246315002, 0.0010510360589250922, 0.0024148609954863787, 0.000846002425532788, 0.00010444538202136755, 0.0046789576299488544, 6.968784873606637e-05, 0.0004179610696155578], [0.11362070590257645, 0.010236592032015324, 0.0005324642406776547, 4.938308393320767e-06, 0.002385772531852126, 0.03629636764526367, 0.0027964620385318995, 0.09169431030750275, 0.003992430865764618, 0.013213243335485458, 0.7041981220245361, 0.007061961572617292, 0.00031745037995278835, 0.0012300860835239291, 0.0007365556666627526, 0.0007005892693996429, 0.010981929488480091], [0.0016391429817304015, 0.0003202382358722389, 1.8410186385153793e-05, 6.494987019323162e-07, 5.893495540476579e-07, 0.00023699835583101958, 1.5843927030800842e-05, 0.0011491361074149609, 0.00010579307854641229, 0.00010398239828646183, 0.00030339680961333215, 0.9921616911888123, 0.0006543122581206262, 6.531050166813657e-05, 7.265472231665626e-05, 2.4089669750537723e-05, 0.0031278685200959444], [0.6928556561470032, 0.05536835268139839, 0.0010622063418850303, 0.00011805184476543218, 3.433192614465952e-05, 0.00027247238904237747, 0.0004102397651877254, 0.11748170852661133, 0.0007884047809056938, 0.0011498411186039448, 0.0009269376751035452, 0.004287737421691418, 0.08373153954744339, 0.005113203544169664, 0.0016939418856054544, 0.004109365865588188, 0.030596064403653145], [0.3296129107475281, 0.0017103067366406322, 0.001048304489813745, 0.0005151937948539853, 0.0002531539066694677, 0.0010566882556304336, 5.435044840851333e-06, 0.04597603902220726, 0.0064400359988212585, 0.00016019880422390997, 0.0004900129279121757, 0.002726091304793954, 0.0020133943762630224, 0.5553218126296997, 0.0010315522085875273, 0.004665195941925049, 0.04697369039058685], [0.10348385572433472, 0.0016293178778141737, 0.0014750288100913167, 6.713111361023039e-05, 0.0007799062877893448, 0.0009861844591796398, 0.0007190742180682719, 0.0007262665894813836, 0.0002717929019127041, 
0.04833551496267319, 0.00011137499677715823, 0.0038250633515417576, 0.003235937561839819, 0.009693200699985027, 0.800777018070221, 0.0027022138237953186, 0.021181104704737663], [0.11845811456441879, 0.03623312711715698, 0.0005437781801447272, 3.384737283340655e-05, 0.10683819651603699, 0.010278448462486267, 3.426504190429114e-05, 0.004006266128271818, 1.1007806278939825e-05, 0.00017413990281056613, 0.0003026370541192591, 0.0008250735700130463, 0.0021501961164176464, 0.0004862987552769482, 0.00499759241938591, 0.6025999784469604, 0.11202705651521683], [0.4829951822757721, 0.018130507320165634, 0.0010809306986629963, 0.0004454187292139977, 1.4198865756043233e-05, 0.009204886853694916, 0.0008735992014408112, 0.007062417455017567, 0.00018919719150289893, 1.1025390449503902e-05, 9.030234650708735e-05, 0.0023876202758401632, 0.0003625082899816334, 0.001364001422189176, 0.00043839000863954425, 0.005595126189291477, 0.46975481510162354], [0.9607875347137451, 0.001442094799131155, 0.00026001298101618886, 0.0010312420781701803, 0.0003748074232134968, 0.0001437041792087257, 0.00023468966537620872, 0.005791218485683203, 8.685141074238345e-05, 6.448047497542575e-05, 1.6306505585816922e-06, 0.0007040995405986905, 0.0006305401329882443, 9.136833250522614e-05, 0.00026675942353904247, 0.0008705088985152543, 0.02721867524087429]], [[0.7185878157615662, 0.02183319441974163, 0.01043851301074028, 0.009065371006727219, 0.013156350702047348, 0.019503843039274216, 0.00973360612988472, 0.02891448512673378, 0.026824763044714928, 0.011336914263665676, 0.016757961362600327, 0.012981743551790714, 0.02045232430100441, 0.01836247742176056, 0.012017393484711647, 0.023685093969106674, 0.026348087936639786], [0.20503754913806915, 0.045343562960624695, 0.013054374605417252, 0.12613950669765472, 0.06906301528215408, 0.03549940884113312, 0.011455625295639038, 0.1908731609582901, 0.02940906397998333, 0.03421906754374504, 0.04253161698579788, 0.023407593369483948, 0.029890717938542366, 
0.016298439353704453, 0.01451877225190401, 0.029133891686797142, 0.08412458747625351], [0.16809307038784027, 0.03815235570073128, 0.007063749246299267, 0.06280918419361115, 0.04072794318199158, 0.04073070362210274, 0.031230933964252472, 0.15413278341293335, 0.07205695658922195, 0.031824029982089996, 0.04444906488060951, 0.01685229130089283, 0.09054188430309296, 0.07142390310764313, 0.033193450421094894, 0.01541399210691452, 0.08130370080471039], [0.16384652256965637, 0.039215974509716034, 0.03818850219249725, 0.015100257471203804, 0.0674314945936203, 0.03755272552371025, 0.029478436335921288, 0.1745021939277649, 0.03720156103372574, 0.02902248688042164, 0.1436021775007248, 0.03061773255467415, 0.03010174073278904, 0.030388785526156425, 0.014857119880616665, 0.029757888987660408, 0.08913430571556091], [0.2911185026168823, 0.05079743266105652, 0.019740605726838112, 0.06522077322006226, 0.06794581562280655, 0.052945900708436966, 0.012776588089764118, 0.1323663592338562, 0.035251107066869736, 0.02701755054295063, 0.029728448018431664, 0.010967324487864971, 0.05439675971865654, 0.021521709859371185, 0.03704003989696503, 0.03818592056632042, 0.05297906696796417], [0.2500564455986023, 0.05407721921801567, 0.014342760667204857, 0.13691078126430511, 0.07112404704093933, 0.04487422853708267, 0.011892417445778847, 0.1402219533920288, 0.028884680941700935, 0.02508612722158432, 0.04562627896666527, 0.016080211848020554, 0.025460783392190933, 0.01715720258653164, 0.019535791128873825, 0.02987912856042385, 0.06878980994224548], [0.2926269769668579, 0.04051666334271431, 0.006198144983500242, 0.17157699167728424, 0.06531418859958649, 0.05105903372168541, 0.003978185821324587, 0.1036854013800621, 0.015256752260029316, 0.001727095223031938, 0.018350234255194664, 0.0031101771164685488, 0.11367517709732056, 0.014649041928350925, 0.013706203550100327, 0.031246080994606018, 0.05332352593541145], [0.37957513332366943, 0.019051088020205498, 0.016184326261281967, 0.027420559898018837, 
0.012990938499569893, 0.017659984529018402, 0.00643672002479434, 0.1412927210330963, 0.04876267910003662, 0.023993901908397675, 0.11988711357116699, 0.019860200583934784, 0.03006940893828869, 0.027107445523142815, 0.01870284415781498, 0.02353781834244728, 0.06746706366539001], [0.15483614802360535, 0.03448833152651787, 0.021084820851683617, 0.02340063266456127, 0.03486000746488571, 0.029577482491731644, 0.015768539160490036, 0.019442999735474586, 0.08799503743648529, 0.05374117195606232, 0.07713525742292404, 0.006903733126819134, 0.3438015878200531, 0.04599263146519661, 0.009979521855711937, 0.029269399121403694, 0.011722663417458534], [0.29459184408187866, 0.02919173799455166, 0.0029196240939199924, 0.010601839981973171, 0.035028330981731415, 0.025251595303416252, 0.006742431782186031, 0.02819422446191311, 0.08141349256038666, 0.004742322489619255, 0.037918973714113235, 0.003895762376487255, 0.3174453675746918, 0.05936523154377937, 0.010600786656141281, 0.036253657191991806, 0.015842894092202187], [0.42960917949676514, 0.0707779973745346, 0.014211025089025497, 0.057334061712026596, 0.024310622364282608, 0.06716123223304749, 0.003025626763701439, 0.01715385727584362, 0.016845136880874634, 0.009447413496673107, 0.06951411068439484, 0.015528563410043716, 0.03785258159041405, 0.014569601975381374, 0.047628991305828094, 0.09305128455162048, 0.011978820897638798], [0.5906081199645996, 0.05667980760335922, 0.01000857912003994, 0.02584378980100155, 0.037622712552547455, 0.047004155814647675, 0.005465665832161903, 0.026525944471359253, 0.02192653715610504, 0.007482872344553471, 0.04053524509072304, 0.022266842424869537, 0.03644001483917236, 0.013884156011044979, 0.016780512407422066, 0.02253296598792076, 0.018392154946923256], [0.43626952171325684, 0.05828136205673218, 0.010051964782178402, 0.026334578171372414, 0.043492771685123444, 0.038921479135751724, 0.016574105247855186, 0.01216467097401619, 0.03822875767946243, 0.04473632574081421, 0.07633769512176514, 
0.0048355706967413425, 0.06783358007669449, 0.03637557849287987, 0.02087695151567459, 0.05562075600028038, 0.013064419850707054], [0.25771477818489075, 0.02758728340268135, 0.01262574177235365, 0.013424351811408997, 0.02330782450735569, 0.021995695307850838, 0.008802018128335476, 0.010746505111455917, 0.08095359057188034, 0.02560603618621826, 0.060701634734869, 0.004561653360724449, 0.35815349221229553, 0.045421190559864044, 0.014096408151090145, 0.026111792773008347, 0.008190049789845943], [0.3968939781188965, 0.05589500814676285, 0.015484144911170006, 0.024567078799009323, 0.04629111662507057, 0.03775835037231445, 0.003694878425449133, 0.03292253240942955, 0.025954164564609528, 0.005798643920570612, 0.06226382404565811, 0.011265062727034092, 0.05703730508685112, 0.020007332786917686, 0.0383986234664917, 0.13976548612117767, 0.026002492755651474], [0.21753010153770447, 0.05323253944516182, 0.010753534734249115, 0.04870949313044548, 0.054636772722005844, 0.04739021137356758, 0.006550057325512171, 0.030185826122760773, 0.11537887156009674, 0.04335810989141464, 0.04652369022369385, 0.011112015694379807, 0.16087791323661804, 0.0689692348241806, 0.03168438374996185, 0.03546340763568878, 0.017643921077251434], [0.504054069519043, 0.029357917606830597, 0.013304616324603558, 0.01969185657799244, 0.01639614626765251, 0.02725977450609207, 0.008326550014317036, 0.06166190281510353, 0.0350794643163681, 0.021031586453318596, 0.10826375335454941, 0.02171511948108673, 0.017729321494698524, 0.0188139621168375, 0.030704084783792496, 0.023235877975821495, 0.0433739572763443]], [[0.6482798457145691, 0.056268446147441864, 0.003863082267343998, 0.0026292616967111826, 0.039393048733472824, 0.08332168310880661, 0.012232718989253044, 0.036429259926080704, 0.01495497114956379, 0.0034526928793638945, 0.022732499986886978, 0.006106847431510687, 0.008678500540554523, 0.010644311085343361, 0.007000730838626623, 0.014619525521993637, 0.029392564669251442], [0.5142877101898193, 
0.07849133014678955, 0.02068483643233776, 0.012633571401238441, 0.05599180981516838, 0.10365787148475647, 0.0544997900724411, 0.07132288813591003, 0.014599810354411602, 0.005192063748836517, 0.004522743634879589, 0.005961945280432701, 0.011081777513027191, 0.008508051745593548, 0.004094616509974003, 0.016000935807824135, 0.018468184396624565], [0.10855378955602646, 0.03128325566649437, 0.058896731585264206, 0.15028676390647888, 0.03277243301272392, 0.023637644946575165, 0.26459750533103943, 0.09785537421703339, 0.04340459033846855, 0.008874040096998215, 0.008821253664791584, 0.023001469671726227, 0.035862091928720474, 0.026109417900443077, 0.012029734440147877, 0.044511765241622925, 0.02950209006667137], [0.24740450084209442, 0.08083457499742508, 0.1066495031118393, 0.019950374960899353, 0.08000475913286209, 0.06995340436697006, 0.19463695585727692, 0.06610240042209625, 0.015075323171913624, 0.013580495491623878, 0.005866491701453924, 0.008991090580821037, 0.0201895572245121, 0.010048913769423962, 0.01784593053162098, 0.019156377762556076, 0.023709291592240334], [0.45973992347717285, 0.0838560238480568, 0.03167903050780296, 0.019002815708518028, 0.028639955446124077, 0.08407121151685715, 0.08578956872224808, 0.09881563484668732, 0.015408722683787346, 0.005072189494967461, 0.003282167948782444, 0.013174903579056263, 0.01473909243941307, 0.008775072172284126, 0.005173812620341778, 0.01753339171409607, 0.02524654008448124], [0.5217321515083313, 0.09265062212944031, 0.017699306830763817, 0.007632432971149683, 0.0415399968624115, 0.1067304015159607, 0.03292735666036606, 0.08387015759944916, 0.01637311466038227, 0.004499124363064766, 0.004953694995492697, 0.0064506810158491135, 0.011463688686490059, 0.009908732026815414, 0.003970184829086065, 0.013669596053659916, 0.023928778246045113], [0.21077480912208557, 0.02199426479637623, 0.2693169414997101, 0.05162999406456947, 0.07286183536052704, 0.020699888467788696, 0.18559083342552185, 0.03266936540603638, 
0.030355406925082207, 0.016375990584492683, 0.008912145160138607, 0.012442930601537228, 0.014205786399543285, 0.014281070791184902, 0.013571527786552906, 0.01751352660357952, 0.006803711876273155], [0.738476037979126, 0.028772180899977684, 0.0059258416295051575, 0.005328870378434658, 0.06409876048564911, 0.04519809037446976, 0.016800470650196075, 0.02020532265305519, 0.008163309656083584, 0.0025025801733136177, 0.015291946940124035, 0.004845321178436279, 0.005921830423176289, 0.007073286455124617, 0.004532909952104092, 0.016456278041005135, 0.010407062247395515], [0.2738190293312073, 0.0019465177319943905, 0.021412242203950882, 0.018045809119939804, 0.011627238243818283, 0.0013136413181200624, 0.06891851872205734, 0.007880487479269505, 0.03933050110936165, 0.020788244903087616, 0.04275975376367569, 0.04082588106393814, 0.17708967626094818, 0.029133278876543045, 0.02443244494497776, 0.21165108680725098, 0.009025638923048973], [0.7809253334999084, 0.0022239754907786846, 0.0026762064080685377, 0.0025304199662059546, 0.021669309586286545, 0.001493410556577146, 0.015894845128059387, 0.005270833615213633, 0.028719596564769745, 0.0018381065456196666, 0.020441675558686256, 0.014622879214584827, 0.04177571088075638, 0.019133802503347397, 0.008058125153183937, 0.026967113837599754, 0.005758552812039852], [0.5344483852386475, 0.0030383518896996975, 0.016202306374907494, 0.004435799550265074, 0.007209454197436571, 0.001891518710181117, 0.02948477864265442, 0.013029280118644238, 0.052405521273612976, 0.007596147246658802, 0.029785839840769768, 0.07504831254482269, 0.09318602085113525, 0.035661567002534866, 0.01922645792365074, 0.05800577253103256, 0.019344469532370567], [0.7889672517776489, 0.0016604720149189234, 0.005169859621673822, 0.002659570425748825, 0.007253003306686878, 0.001258523785509169, 0.010985956527292728, 0.004596759099513292, 0.020941907539963722, 0.004492275416851044, 0.027013882994651794, 0.0073482622392475605, 0.035710811614990234, 0.012800206430256367, 
0.00317259319126606, 0.002401125617325306, 0.0014812910230830312, 0.00213058665394783, 0.005159215070307255, 0.0050051165744662285], [0.16291791200637817, 0.2019491344690323, 0.07593633234500885, 0.13697446882724762, 0.043486788868904114, 0.08810041844844818, 0.02907589077949524, 0.21042227745056152, 0.028409093618392944, 0.0030243671499192715, 0.002265175571665168, 0.0020514256320893764, 0.002718042116612196, 0.004621841479092836, 0.0014017298817634583, 0.002168684732168913, 0.004476376809179783], [0.3419612646102905, 0.06726314127445221, 0.027406899258494377, 0.2288077175617218, 0.08786772191524506, 0.1352757066488266, 0.015128012746572495, 0.04535001516342163, 0.014032970182597637, 0.00563087360933423, 0.0009072761749848723, 0.014148584567010403, 0.002122335135936737, 0.004114200826734304, 0.002215046901255846, 0.005772580858319998, 0.001995665952563286], [0.8622167706489563, 0.013754216954112053, 0.004114906303584576, 0.007589736022055149, 0.011954630725085735, 0.031024159863591194, 0.006212920416146517, 0.045845817774534225, 0.0017793033039197326, 0.0013376649003475904, 0.002112209564074874, 0.004218058194965124, 0.0007563874823972583, 0.0004498552007135004, 0.0003593052679207176, 0.0019464321667328477, 0.00432771909981966], [0.6874514222145081, 0.019140642136335373, 0.004409873392432928, 0.007162123452872038, 0.007483558729290962, 0.042244695127010345, 0.004043205175548792, 0.1125873550772667, 0.0024270396679639816, 0.0065034315921366215, 0.005022839177399874, 0.05935463309288025, 0.006321829743683338, 0.0009106681682169437, 0.0017249510856345296, 0.015043573454022408, 0.01816817745566368], [0.7241984605789185, 0.01891995407640934, 0.006214221939444542, 0.013416610658168793, 0.013520952314138412, 0.03287103772163391, 0.005572477821260691, 0.12138576805591583, 0.009694283828139305, 0.0005665191565640271, 0.004584840033203363, 0.01580716483294964, 0.0030038179829716682, 0.0033317464403808117, 0.0007722878362983465, 0.005620711017400026, 0.02051912620663643], 
[0.6256803870201111, 0.01544839609414339, 0.004969930276274681, 0.0071341427974402905, 0.01677573285996914, 0.03486441448330879, 0.009676879271864891, 0.21265809237957, 0.00704914191737771, 0.0015971852699294686, 0.0010824110358953476, 0.016238976269960403, 0.002351901028305292, 0.0026962580159306526, 0.0017549839103594422, 0.014265239238739014, 0.025755880400538445], [0.5370416641235352, 0.04764509201049805, 0.014612805098295212, 0.006854006554931402, 0.023454369977116585, 0.05693439021706581, 0.009042345918715, 0.16389532387256622, 0.03377465531229973, 0.007931066676974297, 0.037468332797288895, 0.004621597006917, 0.004921542014926672, 0.009726802818477154, 0.007861110381782055, 0.003718385938555002, 0.03049648553133011], [0.5746358633041382, 0.03487038239836693, 0.0035669011995196342, 0.013662824407219887, 0.010050659999251366, 0.03838569298386574, 0.012951690703630447, 0.14810103178024292, 0.01723843440413475, 0.0076309689320623875, 0.010396546684205532, 0.019433971494436264, 0.0013526052935048938, 0.008914723992347717, 0.005154790822416544, 0.0383501872420311, 0.05530280992388725], [0.6587098240852356, 0.011414572596549988, 0.002196417422965169, 0.0038269194774329662, 0.0033770480658859015, 0.022111741825938225, 0.0022246174048632383, 0.14017727971076965, 0.002240057336166501, 0.0038571900222450495, 0.00511352252215147, 0.030566321685910225, 0.007710849866271019, 0.0014465353451669216, 0.0033194960560649633, 0.02675457112491131, 0.07495308667421341], [0.863104522228241, 0.005362910684198141, 0.0018485154723748565, 0.0034691973123699427, 0.010621408931910992, 0.01624123938381672, 0.0020722737535834312, 0.04495903104543686, 0.0032996542286127806, 0.00044658457045443356, 0.0011551921488717198, 0.008624700829386711, 0.00197734241373837, 0.0024578964803367853, 0.0003257961943745613, 0.003830981906503439, 0.030202768743038177], [0.34848809242248535, 0.018607933074235916, 0.0037660030648112297, 0.008738338015973568, 0.015155485831201077, 0.030680716037750244, 
0.01632816158235073, 0.12528109550476074, 0.09329309314489365, 0.013585846871137619, 0.010476295836269855, 0.014774148352444172, 0.08450304716825485, 0.15278802812099457, 0.01722089946269989, 0.006792874541133642, 0.03951994702219963], [0.8442959785461426, 0.003645744640380144, 0.0007190723554231226, 0.001659274217672646, 0.002519639441743493, 0.0075706737115979195, 0.0011258405866101384, 0.057927630841732025, 0.0027019220869988203, 0.001275690970942378, 0.0027634426951408386, 0.004853555932641029, 0.0022383565083146095, 0.0030044398736208677, 0.002414053538814187, 0.004696938209235668, 0.056587785482406616]], [[0.38903576135635376, 0.07133189588785172, 0.01173278596252203, 0.011611643247306347, 0.03593862056732178, 0.04239055886864662, 0.009422672912478447, 0.17532263696193695, 0.007530996110290289, 0.01816808432340622, 0.009301709942519665, 0.01393071562051773, 0.030338410288095474, 0.007863985374569893, 0.011107753030955791, 0.015544839203357697, 0.13942685723304749], [0.24447773396968842, 0.024418605491518974, 0.005231281742453575, 0.009238220751285553, 0.06914270669221878, 0.13657811284065247, 0.008181962184607983, 0.18644508719444275, 0.0212774109095335, 0.031458813697099686, 0.021971862763166428, 0.02044578455388546, 0.02386799454689026, 0.014258131384849548, 0.013974858447909355, 0.007653276436030865, 0.1613781601190567], [0.2509060204029083, 0.02114516869187355, 0.41429293155670166, 0.0007701053400523961, 0.004686474800109863, 0.02570347674190998, 0.002771765924990177, 0.09230562299489975, 0.005403014365583658, 0.049270082265138626, 0.00910914409905672, 0.007360628806054592, 0.0007370669627562165, 0.0032858906779438257, 0.007006220519542694, 0.004124969244003296, 0.10112148523330688], [0.295585960149765, 0.06220528110861778, 0.0041531529277563095, 0.26937955617904663, 0.00662674754858017, 0.022593991830945015, 0.006198883056640625, 0.12951865792274475, 0.012682843022048473, 0.019845308735966682, 0.020230544731020927, 0.003786548273637891, 
0.007568683009594679, 0.014782285317778587, 0.019013414159417152, 0.003552051493898034, 0.10227608680725098], [0.12344686686992645, 0.04357608035206795, 0.00291255721822381, 0.0010759409051388502, 0.4924272298812866, 0.0130477175116539, 0.002582717686891556, 0.12221682816743851, 0.026410656049847603, 0.0045435745269060135, 0.004750614520162344, 0.0027752534952014685, 0.013904971070587635, 0.015176290646195412, 0.007841837592422962, 0.0011693054111674428, 0.12214155495166779], [0.2540737986564636, 0.30792585015296936, 0.008542569354176521, 0.010614048689603806, 0.03832758963108063, 0.02322378195822239, 0.002646507928147912, 0.11939003318548203, 0.01744941622018814, 0.02782294526696205, 0.007238340564072132, 0.01058943010866642, 0.02251370996236801, 0.01374638732522726, 0.008555963635444641, 0.006178680341690779, 0.12116090953350067], [0.3064430058002472, 0.07677601277828217, 0.018216054886579514, 0.037853650748729706, 0.00851031020283699, 0.008972520008683205, 0.3541720509529114, 0.06866519898176193, 0.0051787071861326694, 0.011201721616089344, 0.003618421498686075, 0.0007962197414599359, 0.004803159274160862, 0.007356555201113224, 0.008031015284359455, 0.0015090003143996, 0.07789646834135056], [0.2596839666366577, 0.0846216008067131, 0.04221224784851074, 0.06803110241889954, 0.03722790256142616, 0.03896056115627289, 0.014936673454940319, 0.041980061680078506, 0.01769174076616764, 0.04076812043786049, 0.02048587240278721, 0.021915532648563385, 0.03866751492023468, 0.02959289588034153, 0.04400334879755974, 0.04380278289318085, 0.15541806817054749], [0.03808952495455742, 0.015375161543488503, 0.0007993357139639556, 0.0007905364618636668, 0.003858056152239442, 0.006685994099825621, 0.00027509077335707843, 0.009094920009374619, 0.3042495846748352, 0.0004373933479655534, 0.0013684971490874887, 0.0008673985139466822, 0.00312657724134624, 0.6027000546455383, 0.00045409484300762415, 0.00035374233266338706, 0.01147408876568079], [0.2508527934551239, 0.02940240129828453, 
0.06534533202648163, 0.008422117680311203, 0.006698856130242348, 0.0397794209420681, 0.011135798878967762, 0.2847210764884949, 0.002817385597154498, 0.1251867413520813, 0.0036841267719864845, 0.0011069232132285833, 0.006826427299529314, 0.006079099606722593, 0.017484400421380997, 0.0008016785141080618, 0.1396554410457611], [0.21936455368995667, 0.04131676256656647, 0.011005657725036144, 0.015474973246455193, 0.008186543360352516, 0.01935935765504837, 0.0036965063773095608, 0.13971573114395142, 0.01750980317592621, 0.009601340629160404, 0.4141910970211029, 0.001424020854756236, 0.005060848314315081, 0.016052646562457085, 0.017519867047667503, 0.0016003827331587672, 0.05891992524266243], [0.20305441319942474, 0.06034668907523155, 0.010924513451755047, 0.00515152420848608, 0.007079590577632189, 0.0265373345464468, 0.0019292245851829648, 0.08756648749113083, 0.01015008706599474, 0.002419837052002549, 0.002502580638974905, 0.5255581736564636, 0.0014382272493094206, 0.003980069886893034, 0.002235535765066743, 0.018066557124257088, 0.03105914033949375], [0.2183578759431839, 0.0287260003387928, 0.0027343754190951586, 0.00624172855168581, 0.031237689778208733, 0.01669887639582157, 0.003237685654312372, 0.15117305517196655, 0.032024599611759186, 0.0254222322255373, 0.010399964638054371, 0.0009532354888506234, 0.32737693190574646, 0.007788069546222687, 0.01549386978149414, 0.0024165955837816, 0.1197172999382019], [0.03817454352974892, 0.012452982366085052, 0.000669431930873543, 0.0006629990530200303, 0.0024828994646668434, 0.0058865416795015335, 0.000513443083036691, 0.012288613244891167, 0.8036785125732422, 0.0012242600787431002, 0.0017775868764147162, 0.0007082129595801234, 0.0013355595292523503, 0.10647126287221909, 0.00018847138562705368, 0.000123789650388062, 0.011360909789800644], [0.2685365676879883, 0.02049204334616661, 0.004818696994334459, 0.005789389368146658, 0.012365080416202545, 0.05163702368736267, 0.020101044327020645, 0.2846330404281616, 0.002266187686473131, 
0.011631825007498264, 0.018799886107444763, 0.0033910851925611496, 0.03144542872905731, 0.0012760047102347016, 0.08715970814228058, 0.0005714880535379052, 0.17508552968502045], [0.37668177485466003, 0.013363177888095379, 0.0018170825205743313, 0.0016844520578160882, 0.009707030840218067, 0.013781100511550903, 0.0018701067892834544, 0.12596026062965393, 0.0014569963095709682, 0.002299179555848241, 0.003745058784261346, 0.017324060201644897, 0.004923918750137091, 0.0009610761771909893, 0.0003490169474389404, 0.39674392342567444, 0.027331868186593056], [0.24386854469776154, 0.0739862322807312, 0.023206090554594994, 0.050268206745386124, 0.04014477878808975, 0.05262504890561104, 0.050710152834653854, 0.21451424062252045, 0.021700013428926468, 0.04436247795820236, 0.018844349309802055, 0.013754173181951046, 0.036516059190034866, 0.017738400027155876, 0.02154546231031418, 0.012895023450255394, 0.0633205994963646]]], [[[0.9355422854423523, 0.0006494321278296411, 0.0004744344623759389, 0.001345394761301577, 0.001354434061795473, 0.0006321360124275088, 0.00012200047785881907, 0.005767859984189272, 0.013485876843333244, 0.003417539643123746, 0.0013144105905666947, 0.002121343044564128, 0.00269438442774117, 0.004497264511883259, 0.0006851769867353141, 0.0022490795236080885, 0.023646948859095573], [2.4183942514355294e-05, 2.663561826921068e-05, 0.9999362230300903, 6.729634151270147e-06, 2.4651697572153353e-08, 7.71648203112818e-08, 2.4117517227750795e-07, 1.3230322792878724e-06, 2.2574633362637542e-07, 6.688897352669088e-11, 8.345146795818437e-08, 6.750530445742697e-08, 3.5828691125061596e-08, 2.093240254907869e-06, 7.551505254443924e-11, 3.291590289222768e-08, 2.0395289084262913e-06], [4.9522532208357006e-05, 1.2538701810171915e-07, 5.1543138397391886e-05, 0.9998676776885986, 2.0876732378383167e-05, 5.3843027814082234e-08, 1.6032204896987423e-08, 1.3356617500903667e-06, 8.168541171471588e-06, 5.034894723365824e-09, 1.3684407339925597e-12, 1.7777360694637423e-09, 
3.9445861688136574e-08, 3.6639386280512554e-07, 1.20783084867071e-07, 9.111669752037699e-11, 1.651739012231701e-07], [2.3706088541075587e-05, 3.999061846116092e-07, 2.9908974852332904e-07, 4.186195656075142e-05, 0.999927282333374, 4.991550667909905e-06, 1.1205658623225645e-08, 1.2119709253965993e-06, 3.243880053460657e-09, 1.1853795456318039e-07, 8.557883290905011e-10, 9.03196313854944e-14, 4.379467899440215e-10, 5.088097165817373e-10, 1.1983359859968346e-09, 1.484856397837575e-07, 4.890620797226575e-09], [4.1800649341894314e-05, 5.8263367463951e-07, 4.6524004915227124e-07, 1.768398476542643e-07, 3.5129418392898515e-05, 0.9998550415039062, 5.951595085207373e-05, 3.5462303458189126e-06, 3.824240124572498e-08, 2.533420584427404e-09, 9.89551040220249e-07, 1.4470968867641432e-09, 4.817335224303887e-13, 6.062939039708226e-11, 2.3247517799696027e-10, 1.3970891643566574e-07, 2.533301312723779e-06], [5.8842222642851993e-05, 1.7936281437869184e-07, 6.611982428239571e-08, 3.15730488864574e-07, 2.0993208238451189e-07, 3.707683572429232e-05, 0.9998800754547119, 1.8106306015397422e-05, 3.504309518120863e-07, 2.589069936220767e-06, 3.130722703303945e-08, 1.0975093118759105e-06, 2.365457518749281e-08, 4.988917807514925e-13, 2.697697620845929e-09, 2.181169023174334e-09, 9.959522913050023e-07], [0.006497211288660765, 1.4868014375224448e-07, 6.426125764846802e-06, 1.6409568814879094e-08, 1.6087889775917574e-07, 5.386507240245919e-08, 1.3310499298313516e-06, 0.9932237863540649, 1.216377040691441e-05, 4.672782716319546e-10, 1.3840391943631403e-07, 1.2232224122499247e-08, 6.989983830862911e-06, 5.921562351574039e-09, 9.378612431004446e-16, 2.530046172566358e-09, 0.0002515384112484753], [0.38218584656715393, 2.7172676553277597e-08, 6.39225181657821e-05, 0.0003378564724698663, 2.509233854652848e-05, 8.042449735512491e-06, 1.3365769291340257e-07, 0.004005192779004574, 0.612531304359436, 0.00019031904230359942, 1.0191830369876698e-05, 2.3659126782149542e-06, 4.838503627979662e-06, 
0.00015785740106366575, 3.517355253279675e-06, 1.08113181340741e-06, 0.0004723687598016113], [2.41368326214797e-07, 3.6208297644890752e-12, 1.0343443766422378e-12, 6.947926678435579e-10, 2.9276256974242187e-09, 1.212914577802815e-11, 1.392966852975519e-09, 1.6996103413546848e-09, 8.832465141495049e-07, 0.9999966621398926, 2.2813285340816947e-06, 2.443766577986395e-10, 4.216922810940105e-09, 1.0999680983767024e-13, 4.531743158509016e-08, 4.631131589327708e-11, 3.0618947047950096e-12], [2.1098087472637417e-06, 1.0241648640274548e-09, 1.32145974718334e-10, 7.450339173722953e-14, 1.234571378461169e-08, 3.8343351604908094e-08, 9.489530689021919e-11, 1.854712365911837e-07, 1.0038334607997967e-08, 4.434824916188518e-07, 0.9999945163726807, 1.0689757345971884e-06, 8.551377383980707e-10, 2.644171270826945e-10, 1.4889549012145342e-13, 1.6551505268580513e-06, 1.0440869857575308e-07], [2.2537390620414044e-08, 6.81242345521027e-12, 3.287557726050494e-11, 9.86782880342714e-14, 6.149711189368029e-16, 5.5573518009666145e-11, 8.634102877103089e-10, 2.0973868475326896e-10, 6.167482524688239e-09, 1.0174643350069346e-09, 4.445814454356878e-07, 0.9999994039535522, 9.945665624400135e-09, 8.234613685376146e-11, 6.620368316057057e-12, 2.4458145578276635e-11, 6.345809566710159e-08], [4.715019713330548e-06, 5.2157865582103113e-08, 1.8168208271163167e-08, 1.9601325007556625e-08, 8.170213811053983e-11, 1.1775035970145592e-13, 9.797312294779204e-09, 2.802044036798179e-06, 6.428618082310322e-09, 3.233865868423891e-07, 2.5456442287463688e-08, 4.73509658149851e-07, 0.9999853372573853, 5.9189414969296195e-06, 9.494866226589238e-09, 1.2566689022719402e-08, 1.6351934561953385e-07], [3.981900817962014e-07, 6.746429109805163e-11, 9.87180783340591e-07, 2.4476123905436964e-10, 3.1708488612558483e-10, 2.6177122969262e-11, 2.225533752647996e-15, 4.338565062766975e-08, 1.8450238314926537e-07, 4.650263507599561e-11, 6.29014778041892e-08, 7.823311998222948e-10, 8.268973488156917e-07, 0.9999974966049194, 
3.1792581012268784e-08, 8.2652276134354e-09, 3.105888879417762e-08], [1.927006287871791e-08, 1.1122814475017506e-11, 5.089676510111607e-12, 8.135398466002641e-10, 5.365475425067601e-12, 4.1413338145064593e-13, 2.452288318157553e-13, 9.42025429016835e-13, 7.460297028576146e-13, 4.16874712527715e-09, 6.561941094662682e-12, 1.1381785271213918e-11, 1.2697748408285747e-09, 3.3190685826411936e-06, 0.9999964237213135, 2.0666004729719134e-07, 2.7829707427429184e-09], [3.7666080743292696e-07, 1.1337527894283994e-07, 9.196482117501681e-12, 2.5277460788571127e-13, 2.394596521071435e-09, 2.2095480826933578e-12, 5.933642032579511e-12, 7.22328419300311e-10, 1.5305704038894936e-16, 4.35294827960675e-10, 4.5447310981217015e-07, 7.224255048343675e-12, 1.161479357136841e-09, 1.5278165399479349e-09, 9.749639957590261e-07, 0.9999970197677612, 9.957230986401555e-07], [0.0015373423229902983, 3.1003571621113224e-06, 8.651769007883559e-07, 8.01367860958635e-08, 1.0216561818765513e-08, 4.962153980159201e-06, 1.823811714984913e-07, 2.5269819161621854e-06, 1.1550569389839893e-08, 3.0552460561494854e-10, 5.582612175203394e-06, 3.791042399825528e-05, 3.9955369857125334e-07, 2.3942457119119354e-05, 5.23545440955786e-06, 0.0009931641397997737, 0.9973847270011902], [0.9593814611434937, 9.632896080802311e-07, 7.254982847371139e-06, 0.00018230268324259669, 5.615074201159587e-07, 6.559254472904286e-09, 2.4137492005138483e-07, 0.00032646520412527025, 5.8050823099620175e-06, 1.783136553967779e-06, 1.0075653200658508e-08, 1.8380191022515646e-06, 0.002660952275618911, 2.3716840587439947e-05, 5.146314833837096e-06, 1.774206612026319e-05, 0.03738361969590187]], [[0.5949885845184326, 0.04145917296409607, 0.014269188977777958, 0.011253397911787033, 0.016904832795262337, 0.02736825868487358, 0.008986677043139935, 0.1156281977891922, 0.013131280429661274, 0.007402143441140652, 0.008258018642663956, 0.004602531902492046, 0.011755825020372868, 0.006515786983072758, 0.008498439565300941, 0.009520783089101315, 
0.09945692121982574], [0.8546807765960693, 0.0226163100451231, 0.018080921843647957, 0.012616914696991444, 0.009288068860769272, 0.0024607141967862844, 0.0013012217823415995, 0.05018533021211624, 9.382356074638665e-05, 0.0009396235109306872, 8.55288963066414e-05, 0.000521920039318502, 0.0005513743381015956, 7.170735625550151e-05, 0.0011948422761633992, 0.003039079252630472, 0.02227189764380455], [0.25310471653938293, 0.4599505662918091, 0.003079465590417385, 0.005965010728687048, 0.08530064672231674, 0.026872683316469193, 0.0030082762241363525, 0.10711006820201874, 0.00019498378969728947, 0.002360929036512971, 0.00014745222870260477, 2.446347934892401e-05, 2.7329191652825102e-05, 8.088388858595863e-05, 7.531927258241922e-05, 0.00017295591533184052, 0.05252426862716675], [0.7643661499023438, 0.04683690145611763, 0.006889530457556248, 0.0015992140397429466, 0.03807903826236725, 0.01547530759125948, 0.0024274331517517567, 0.07721491903066635, 6.371150811901316e-05, 2.1336063582566567e-05, 8.816142508294433e-05, 0.000335853110300377, 2.844283517333679e-05, 4.3501207983354107e-05, 0.0036157267168164253, 0.004941575229167938, 0.037973254919052124], [0.19517144560813904, 0.05839504301548004, 0.012696070596575737, 0.5632587671279907, 0.0027082546148449183, 0.04422884061932564, 0.0004030381969641894, 0.09544773399829865, 0.00035377044696360826, 0.000536356121301651, 7.78046014602296e-05, 2.2460399122792296e-05, 0.0012518352596089244, 0.0001873059809440747, 0.002509131096303463, 0.00020179111743345857, 0.022550296038389206], [0.5002495050430298, 0.03241099417209625, 0.025222670286893845, 0.07769428938627243, 0.08382061868906021, 0.052557431161403656, 0.0135029973462224, 0.17214134335517883, 0.00057251937687397, 0.002517190296202898, 0.00015763800183776766, 0.00022777655976824462, 0.0008719456382095814, 7.019796612439677e-05, 0.001091563142836094, 0.0022897934541106224, 0.03460157290101051], [0.4310643970966339, 0.021289434283971786, 0.006160495337098837, 
0.0009870436042547226, 0.2854451537132263, 0.11022341996431351, 0.0004944716347381473, 0.10603439807891846, 0.0011509193573147058, 0.0015419807750731707, 0.0010350528173148632, 0.000659246405120939, 0.0016612800536677241, 9.060092270374298e-05, 0.0009832715149968863, 0.0056259045377373695, 0.02555299736559391], [0.796432614326477, 0.006353548262268305, 0.0032617892138659954, 0.004934101365506649, 0.0033642577473074198, 0.007846800610423088, 0.003419620217755437, 0.13450399041175842, 0.001016809605062008, 0.0005014926427975297, 0.0005373733583837748, 0.0009752211044542491, 0.00024762199609540403, 0.00013472182035911828, 0.00016016718291211873, 0.0013363180914893746, 0.03497352451086044], [0.24919486045837402, 0.007206745911389589, 0.0001356502325506881, 0.0019404733320698142, 0.005103836301714182, 0.016646314412355423, 0.002942462917417288, 0.23032179474830627, 0.000714728725142777, 0.40486544370651245, 0.029963817447423935, 0.0007177259540185332, 0.005146562587469816, 2.7008532924810424e-05, 0.00014822804951108992, 0.0018870854983106256, 0.04303720220923424], [0.40319758653640747, 0.0197871346026659, 0.00014951803314033896, 0.0008175448747351766, 0.005618836730718613, 0.051850032061338425, 0.006674640346318483, 0.2511081099510193, 0.040659405291080475, 0.03697432205080986, 0.045180536806583405, 0.00874173454940319, 0.007354083936661482, 0.0007959581562317908, 6.845649477327242e-05, 0.00873834453523159, 0.11228380352258682], [0.10848839581012726, 0.00024361119722016156, 2.8878013836219907e-05, 2.0344685253803618e-05, 0.0001451286516385153, 0.0007513011805713177, 0.003403593087568879, 0.054972633719444275, 0.002998771844431758, 0.016414230689406395, 0.00018222177459392697, 0.7936817407608032, 0.002682511694729328, 0.00032181941787712276, 0.0006228039273992181, 0.000173580163391307, 0.014868473634123802], [0.5428603291511536, 0.013690683990716934, 0.0002944112056866288, 0.0008217784925363958, 0.0010721463477239013, 0.01077132485806942, 0.0022588202264159918, 
0.2994878888130188, 0.00425883661955595, 0.0010804233606904745, 0.04121650755405426, 0.000346769840689376, 0.0029246043413877487, 0.0009662922821007669, 0.0013209349708631635, 0.004413546994328499, 0.0722147598862648], [0.5985311269760132, 0.004378383047878742, 2.260666406073142e-06, 0.0006068684160709381, 0.0016325123142451048, 0.001627118093892932, 0.005191441625356674, 0.15198488533496857, 0.012242529541254044, 0.007205548696219921, 0.03400833159685135, 0.01812802255153656, 0.01433394942432642, 0.03324098512530327, 0.00713706249371171, 0.002238480607047677, 0.10751036554574966], [0.3219894766807556, 0.012197881005704403, 5.496832454809919e-05, 0.000931154761929065, 0.0009875481482595205, 0.001988510601222515, 0.0008117770194076002, 0.14286759495735168, 0.0001122108951676637, 0.07392741739749908, 0.061075158417224884, 0.002721543191000819, 0.1522398591041565, 0.003901849268004298, 0.007672846782952547, 0.023880938068032265, 0.19263917207717896], [0.6205939054489136, 0.0039581917226314545, 0.0005913144559599459, 0.0001976605853997171, 0.0012620574561879039, 0.001228285487741232, 0.00010953479068120942, 0.04177296161651611, 0.0003529075183905661, 0.03517037257552147, 0.007956725545227528, 0.01971692219376564, 0.09413152188062668, 0.004131590947508812, 0.003335351124405861, 0.021190563216805458, 0.144300177693367], [0.2399197220802307, 0.07880997657775879, 0.00259426049888134, 0.002576143480837345, 0.0008089802577160299, 0.015935951843857765, 0.002184772863984108, 0.1388407051563263, 0.004522915463894606, 0.0009857991244643927, 0.011517094448208809, 0.0011087798047810793, 0.12236630916595459, 0.0357777401804924, 0.03258981555700302, 0.003595049725845456, 0.30586597323417664], [0.8447940349578857, 0.0023220782168209553, 0.0007928755367174745, 0.0009924678597599268, 0.0005624364712275565, 0.001095983781851828, 0.0009029190987348557, 0.0418771468102932, 0.00025011805701069534, 0.00031846619094721973, 0.0003772870113607496, 0.001040619215928018, 0.00121454824693501, 
0.0006862554582767189, 0.0014170415233820677, 0.004296107217669487, 0.09705953299999237]], [[0.6794950366020203, 0.020080437883734703, 0.009186024777591228, 0.016070690006017685, 0.010829854756593704, 0.016985945403575897, 0.041587211191654205, 0.07047858089208603, 0.00965174287557602, 0.006120110861957073, 0.0038138360250741243, 0.009115786291658878, 0.013074948452413082, 0.00718285795301199, 0.005220686551183462, 0.028262505307793617, 0.05284372717142105], [0.7979868054389954, 0.010404558852314949, 0.02692222408950329, 0.010519581846892834, 0.004837049171328545, 0.008305289782583714, 0.018681367859244347, 0.04176031053066254, 0.012916214764118195, 0.008285220712423325, 0.0033736240584403276, 0.004157812334597111, 0.002895163604989648, 0.010550260543823242, 0.008037804625928402, 0.009152938611805439, 0.021213700994849205], [0.5050345659255981, 0.04889919236302376, 0.0021495306864380836, 0.005230682902038097, 0.020860718563199043, 0.07513469457626343, 0.005232284311205149, 0.15812332928180695, 0.001692757592536509, 0.0017810034332796931, 0.0042657991871237755, 0.01695844531059265, 0.008787930943071842, 0.0014454179909080267, 0.01248086616396904, 0.013968534767627716, 0.11795426160097122], [0.723012387752533, 0.02347753196954727, 0.0035447818227112293, 0.0031816321425139904, 0.011554001830518246, 0.023514436557888985, 0.010746429674327374, 0.10511188954114914, 0.0031964888330549, 0.003487018868327141, 0.00423961179330945, 0.0009529824601486325, 0.0007597259245812893, 0.0022798865102231503, 0.00235563307069242, 0.002616351703181863, 0.07596906274557114], [0.9389620423316956, 0.0031622087117284536, 0.00816851295530796, 0.00289873406291008, 0.0005961715360172093, 0.0027973565738648176, 0.0018213825533166528, 0.02155333384871483, 0.0006787073216401041, 0.0009381890413351357, 0.0015244201058521867, 0.0007928678533062339, 0.000431269669206813, 0.0006296270876191556, 0.0005316688329912722, 0.0006454806425608695, 0.013868081383407116], [0.753540575504303, 
0.001858274918049574, 0.0038684557657688856, 0.004067894537001848, 0.015002700500190258, 0.003196831326931715, 0.11537855863571167, 0.0061392392963171005, 0.0035417380277067423, 0.004894040059298277, 0.007757817395031452, 0.0036283310037106276, 0.005451475735753775, 0.0016782361781224608, 0.006084951106458902, 0.10869317501783371]], [[0.6333684325218201, 0.024028640240430832, 0.01582026295363903, 0.02619941160082817, 0.04270121082663536, 0.009178788401186466, 0.0038392902351915836, 0.21672624349594116, 0.0017483258852735162, 0.001017351751215756, 0.0010785945923998952, 0.0001320177543675527, 0.0011158701963722706, 0.0013306918554008007, 0.000298039783956483, 0.0008072047494351864, 0.020609553903341293], [1.938435343618039e-05, 6.173909059725702e-05, 0.9999021291732788, 6.603690962947439e-06, 3.5700384870551716e-08, 9.066732076234985e-08, 3.589256891700643e-07, 4.228381840221118e-06, 6.511343286774718e-08, 1.2281238526146154e-11, 3.181120078465938e-08, 5.9575171462711296e-08, 3.022314487566291e-08, 8.439465659648704e-07, 1.3673281951120941e-10, 1.6037619232633915e-08, 4.2792162275873125e-06], [2.3578810214530677e-05, 1.2059001619491028e-07, 6.968660272832494e-06, 0.9999454021453857, 2.1070180082460865e-05, 3.048452157372594e-08, 4.142956200325898e-09, 5.234975333223701e-07, 1.9411934317759005e-06, 2.668707033137707e-09, 8.73100826663184e-14, 6.171158584145076e-10, 3.570415962883544e-09, 1.8966605352943589e-07, 1.4805887360580527e-07, 4.436480349756522e-11, 1.0428436070242242e-07], [8.065016299951822e-05, 8.256847650045529e-07, 2.5425518401789304e-07, 5.274629802443087e-05, 0.9998537302017212, 7.893589099694509e-06, 2.3578591523687464e-08, 3.43811643688241e-06, 4.5912926238678153e-10, 6.059946144887363e-08, 4.253125074349384e-10, 1.0152786272289716e-13, 1.6957475423851065e-09, 9.410592305414411e-10, 5.470099040394416e-09, 3.573008200419281e-07, 1.0538805383930594e-08], [7.252969953697175e-05, 1.0088042472489178e-06, 7.699180173403875e-07, 9.171030512789002e-08, 
2.4529730580979958e-05, 0.9998767375946045, 1.5409394109155983e-05, 5.737579613196431e-06, 2.1189318744063712e-08, 1.1404707178641615e-09, 1.92392076314718e-06, 5.324307661425109e-10, 1.7075421209367808e-13, 2.0237015188606655e-11, 1.554867623543288e-10, 2.7191543239268867e-08, 1.2435561984602828e-06], [0.0001449573173886165, 1.961492017699129e-07, 6.18173388033938e-08, 4.283891996692546e-07, 2.922433282037673e-07, 3.328191814944148e-05, 0.9997778534889221, 4.209042526781559e-05, 3.019092886802355e-08, 1.7570700094893255e-07, 3.8337217844741645e-09, 8.989945143866862e-08, 1.9401731332635563e-09, 2.6702029259570437e-14, 5.938007863193207e-10, 1.386070841435938e-10, 5.102368163534265e-07], [0.007875271141529083, 2.78672530384938e-07, 3.2709865536162397e-06, 1.2520325221032635e-08, 3.734522024956277e-08, 3.981714513656698e-08, 1.071748783942894e-06, 0.9919500350952148, 4.3283617401357333e-07, 4.404072245778323e-11, 5.2820126938968315e-08, 8.038200105531246e-10, 6.446478550969914e-07, 1.78486253554766e-10, 9.004659693239187e-17, 4.770594475012047e-10, 0.00016886369849089533], [0.9675384759902954, 2.4402952547575296e-08, 2.4659111659275368e-05, 0.0005393843166530132, 9.950016828952357e-05, 9.271211638406385e-06, 3.642330170805508e-07, 0.0050777713768184185, 0.026093894615769386, 0.0005052406922914088, 4.626071677193977e-05, 3.5581527413341973e-07, 1.1286647350061685e-06, 7.7222457548487e-06, 4.0241860688183806e-07, 1.0882030210268567e-06, 5.439632514026016e-05], [3.0027919706299144e-07, 3.0919871697732138e-12, 2.84315648281519e-13, 2.217392935932594e-09, 6.027913279638142e-09, 1.4596358843821378e-11, 4.318633450850484e-09, 3.70443609121196e-09, 7.221189662232064e-06, 0.9999920129776001, 5.019703053221747e-07, 8.708393728351638e-11, 5.175761597087103e-09, 2.2277747138352288e-13, 1.1703649605010469e-08, 7.325244577582879e-12, 5.253080522654718e-12], [5.72791691411112e-07, 9.049626326085303e-11, 7.4393498306069e-11, 9.07126207917025e-15, 5.032383953995634e-10, 
1.0314469278682736e-08, 1.1090874746377821e-11, 2.6279504794501918e-08, 4.8859178924942626e-09, 3.091863050030952e-07, 0.999998927116394, 1.2974977536828192e-08, 1.603888133416831e-10, 3.0041871768027306e-11, 3.181084410643076e-14, 6.817750630716546e-08, 8.795264072603004e-09], [1.1951796352605015e-07, 1.9400122133750308e-11, 3.7851676654154787e-11, 2.199900689132256e-13, 2.863965354883138e-15, 1.702163687777869e-10, 3.0246940507794307e-09, 1.4474657028529236e-09, 3.899944367447006e-09, 1.523147052928664e-09, 5.010775794289657e-07, 0.9999991655349731, 5.912943468189269e-09, 6.186419432285817e-11, 5.683380502330415e-11, 1.9405043544251654e-11, 2.479856391346402e-07], [2.4267053959192708e-05, 1.472443926786582e-07, 7.291051673519178e-08, 3.203686205210943e-08, 5.360495380912766e-10, 2.116960713715102e-13, 8.399983819629142e-09, 1.3914322153141256e-05, 6.976647259904212e-09, 1.6974820482573705e-07, 3.115429336730813e-08, 4.852436177316122e-07, 0.999946117401123, 1.4210274457582273e-05, 1.5129073105413227e-08, 2.0338172035394564e-08, 5.701721761397494e-07], [1.1685061451771617e-07, 7.447960650996954e-12, 3.1582297310706053e-07, 6.020477172352656e-11, 6.321258794184104e-11, 5.942484785498303e-12, 1.2458434815132923e-15, 2.5889788091149057e-08, 1.6114299228320306e-07, 2.351414722656653e-11, 2.9442976057225678e-08, 2.647815300349521e-10, 8.733418326301035e-07, 0.9999984502792358, 2.9151566494078907e-08, 5.1976982717860665e-09, 1.964271412191465e-08], [7.690763936807343e-07, 5.7923849744456746e-11, 2.123421750932497e-11, 1.0758450130765596e-08, 1.7259421669635344e-10, 3.496171234462775e-12, 4.377409459216386e-12, 1.452169027388317e-11, 1.063384536675871e-11, 3.252171154599637e-08, 1.560671383793455e-11, 6.66970367824149e-11, 1.3440947022047567e-08, 2.2631687897956e-05, 0.9999750852584839, 1.4043592955204076e-06, 2.9219464181551302e-08], [2.985507080666139e-06, 1.4227438214220456e-06, 1.7013533637477707e-10, 1.1511919889573008e-12, 1.6513105549620377e-08, 
4.302720332804988e-11, 8.268533774336007e-11, 4.258034369541974e-09, 2.1723913555235145e-16, 3.6362386435229155e-09, 5.106020125822397e-06, 8.561303575793655e-12, 9.83621273320523e-09, 1.3094116901868347e-09, 4.0106687038132804e-07, 0.999988317489624, 1.691036345619068e-06], [0.0030385619029402733, 1.633214878893341e-06, 1.3128044429322472e-06, 5.003703407169269e-08, 5.8457025886582414e-09, 9.394044013788516e-07, 1.0661939597866876e-07, 1.0856862900254782e-05, 1.429447671341677e-09, 6.110028455408312e-11, 5.1160109251213726e-06, 1.8191908566222992e-06, 9.099260012135346e-08, 3.81958079742617e-06, 7.576409757348301e-07, 0.00038831771234981716, 0.9965465664863586], [0.9982079267501831, 8.643622209092428e-07, 3.3460573831689544e-06, 0.0002227737131761387, 2.980569661303889e-06, 1.2109315328245884e-08, 4.106748008325667e-07, 0.0002561875735409558, 4.2954678036721816e-08, 1.069768131856108e-06, 1.8953638658558702e-08, 9.561464509033613e-08, 0.00022737712424714118, 3.61794889158773e-07, 7.938553494568623e-07, 1.636926390347071e-05, 0.0010594949126243591]], [[0.6119003295898438, 0.021581873297691345, 0.021193664520978928, 0.012721186503767967, 0.015900934115052223, 0.01727457530796528, 0.009155241772532463, 0.11030912399291992, 0.016146371141076088, 0.009341476485133171, 0.009282168932259083, 0.0073842392303049564, 0.014071679674088955, 0.01539422944188118, 0.00883619673550129, 0.011063641868531704, 0.0884430930018425], [0.35931894183158875, 0.03959937393665314, 0.06138524413108826, 0.022490933537483215, 0.030552275478839874, 0.034835271537303925, 0.019451741129159927, 0.17165839672088623, 0.013603885658085346, 0.012843778356909752, 0.015116498805582523, 0.013518509455025196, 0.011545107699930668, 0.01215664017945528, 0.00508335093036294, 0.014057524502277374, 0.16278253495693207], [0.09382070600986481, 0.09909869730472565, 0.05581952631473541, 0.0935056135058403, 0.04949573799967766, 0.07117585092782974, 0.12767624855041504, 0.061477504670619965, 0.03662518784403801, 
0.15335127711296082, 0.013764445669949055, 0.0025596146006137133, 0.004863533657044172, 0.029972365126013756, 0.03622915595769882, 0.023739580065011978, 0.04682505875825882], [0.3372790515422821, 0.07769323140382767, 0.01718800514936447, 0.22605858743190765, 0.021703997626900673, 0.06861609220504761, 0.11675143241882324, 0.03624695539474487, 0.007911092601716518, 0.00880168005824089, 0.007015812676399946, 0.009533777832984924, 0.0010994685580953956, 0.0075613390654325485, 0.015131873078644276, 0.011829919181764126, 0.029577815905213356], [0.39484772086143494, 0.041872866451740265, 0.08691174536943436, 0.11149054020643234, 0.007923164404928684, 0.03480171784758568, 0.03658316656947136, 0.1314055174589157, 0.00800010934472084, 0.011399831622838974, 0.002736506750807166, 0.002895988989621401, 0.014981966465711594, 0.008690150454640388, 0.00405798340216279, 0.003336118534207344, 0.09806475788354874], [0.36681169271469116, 0.048759058117866516, 0.06123392656445503, 0.03902408853173256, 0.03148386999964714, 0.03719025105237961, 0.03373100236058235, 0.1484169214963913, 0.01929200254380703, 0.015458864159882069, 0.014254549518227577, 0.013169675134122372, 0.014441785402595997, 0.017074115574359894, 0.004936002194881439, 0.012161352671682835, 0.12256094068288803], [0.2650616466999054, 0.05970345437526703, 0.011520858854055405, 0.08978330343961716, 0.03528253361582756, 0.059876374900341034, 0.01722855679690838, 0.20641465485095978, 0.010591143742203712, 0.007842496037483215, 0.029254935681819916, 0.01792449876666069, 0.004247467964887619, 0.008911955170333385, 0.02435431256890297, 0.015565590932965279, 0.13643625378608704], [0.8111027479171753, 0.013256637379527092, 0.016896145418286324, 0.010062233544886112, 0.01424717903137207, 0.0110491206869483, 0.004007804673165083, 0.04021908715367317, 0.007732273545116186, 0.008060350082814693, 0.0056525953114032745, 0.002672546310350299, 0.006186809856444597, 0.006608298514038324, 0.0031156723853200674, 0.007105796132236719, 
0.03202463686466217], [0.13258491456508636, 0.04388359189033508, 0.016911109909415245, 0.009300592355430126, 0.02081601321697235, 0.032853689044713974, 0.07237568497657776, 0.02264307625591755, 0.14581716060638428, 0.14241068065166473, 0.02281036600470543, 0.030502665787935257, 0.16079901158809662, 0.08642520755529404, 0.004905191715806723, 0.0346621610224247, 0.020298752933740616], [0.4857543110847473, 0.056926943361759186, 0.014180963858962059, 0.020062662661075592, 0.03338610753417015, 0.06713421642780304, 0.018575672060251236, 0.08533292263746262, 0.04026877135038376, 0.008103077299892902, 0.008086378686130047, 0.004944315645843744, 0.017765112221240997, 0.02887001819908619, 0.006037246435880661, 0.012386160902678967, 0.09218505024909973], [0.6266052722930908, 0.01786693185567856, 0.005147726275026798, 0.005475573241710663, 0.003312538843601942, 0.019775541499257088, 0.005380944348871708, 0.16185548901557922, 0.0026647131890058517, 0.0025690579786896706, 0.0008620548178441823, 0.0054298751056194305, 0.011533264070749283, 0.003829886205494404, 0.0034344352316111326, 0.0032135036308318377, 0.12104308605194092], [0.36255550384521484, 0.031023580580949783, 0.0024865244049578905, 0.011223023757338524, 0.006665357388556004, 0.014492548070847988, 0.10592179000377655, 0.07772282510995865, 0.13918937742710114, 0.018755998462438583, 0.03568809852004051, 0.0003342463169246912, 0.020573170855641365, 0.09029614180326462, 0.006884979084134102, 0.008011145517230034, 0.06817576289176941], [0.05405600741505623, 0.021632317453622818, 0.0006661695661023259, 0.0017734096618369222, 0.01594589464366436, 0.02799496054649353, 0.06631243228912354, 0.012204421684145927, 0.10479939728975296, 0.05822107940912247, 0.02415485866367817, 0.007486693561077118, 0.44230952858924866, 0.07456058263778687, 0.006768277380615473, 0.07046602666378021, 0.010647974908351898], [0.1416899859905243, 0.036284830421209335, 0.012041773647069931, 0.00978955626487732, 0.0186851117759943, 0.02894003316760063, 
0.08818136900663376, 0.019620299339294434, 0.1371292918920517, 0.12121247500181198, 0.027334025129675865, 0.029566876590251923, 0.18601708114147186, 0.07799683511257172, 0.005384963471442461, 0.041254837065935135, 0.018870627507567406], [0.8988975286483765, 0.006032305303961039, 0.0009250526782125235, 0.0020901875104755163, 0.0023002855014055967, 0.008027924224734306, 0.009017824195325375, 0.036572836339473724, 0.0019775177352130413, 0.001221368322148919, 0.0008615312981419265, 0.0011286360677331686, 0.0012598119210451841, 0.0011694515123963356, 0.0007067133556120098, 0.0012603135546669364, 0.02655080333352089], [0.29498177766799927, 0.10912549495697021, 0.029262583702802658, 0.08615251630544662, 0.010567551478743553, 0.0639929547905922, 0.023371761664748192, 0.1629147082567215, 0.014571324922144413, 0.012601705268025398, 0.011871011927723885, 0.006539005320519209, 0.02212577871978283, 0.011297021992504597, 0.018693506717681885, 0.013690615072846413, 0.10824067890644073], [0.7812222242355347, 0.01291861291974783, 0.015142740681767464, 0.008516029454767704, 0.015193463303148746, 0.011530909687280655, 0.010483729653060436, 0.04752233996987343, 0.011744366958737373, 0.009586195461452007, 0.00792122446000576, 0.003679033135995269, 0.0067602889612317085, 0.010344968177378178, 0.0035452134907245636, 0.011068744584918022, 0.032819923013448715]], [[0.9251337051391602, 0.006425590254366398, 0.0013326085172593594, 0.004793581087142229, 0.0014584774617105722, 0.006270880810916424, 0.0009481451706960797, 0.01992804929614067, 0.002913089469075203, 0.0024434849619865417, 0.0017456605564802885, 0.0022675462532788515, 0.003355249995365739, 0.0021112968679517508, 0.0014518670504912734, 0.005709770135581493, 0.011711085215210915], [0.12398733198642731, 0.05022943392395973, 0.12278818339109421, 0.2391885668039322, 0.055548716336488724, 0.08650415390729904, 0.05161648988723755, 0.11533551663160324, 0.027826836332678795, 0.011841287836432457, 0.01185314916074276, 0.009901360608637333, 
0.019688894972205162, 0.01652410253882408, 0.006566739175468683, 0.0269883144646883, 0.023611020296812057], [0.22464239597320557, 0.06491526961326599, 0.052171267569065094, 0.24684637784957886, 0.1097550168633461, 0.08265739679336548, 0.06476961821317673, 0.06129208207130432, 0.01808682270348072, 0.010290087200701237, 0.01295983325690031, 0.008745373226702213, 0.016657251864671707, 0.007513593882322311, 0.0022994112223386765, 0.007452774792909622, 0.00894559919834137], [0.10885557532310486, 0.05081658437848091, 0.044273555278778076, 0.04091927409172058, 0.1204332485795021, 0.1158447116613388, 0.16989220678806305, 0.11132703721523285, 0.05524253100156784, 0.03680300712585449, 0.01943988725543022, 0.013577629812061787, 0.050738219171762466, 0.019201047718524933, 0.004744599107652903, 0.016651147976517677, 0.02123967558145523], [0.13410499691963196, 0.02061476558446884, 0.13468383252620697, 0.14294801652431488, 0.016381992027163506, 0.06059090420603752, 0.1561424285173416, 0.1323607861995697, 0.040975309908390045, 0.05908168479800224, 0.01741497963666916, 0.011528359726071358, 0.022537054494023323, 0.012685073539614677, 0.004618597216904163, 0.02059810608625412, 0.012733209878206253], [0.23986829817295074, 0.029994338750839233, 0.0809846743941307, 0.09272664040327072, 0.030431149527430534, 0.0532967634499073, 0.05385315790772438, 0.1552196890115738, 0.051516227424144745, 0.039866406470537186, 0.03118482232093811, 0.014827771112322807, 0.030238475650548935, 0.01780758425593376, 0.01358115952461958, 0.0381292887032032, 0.026473522186279297], [0.6702460646629333, 0.04039071127772331, 0.010881781578063965, 0.040993090718984604, 0.017644641920924187, 0.036171287298202515, 0.0033300805371254683, 0.05849412456154823, 0.019225353375077248, 0.008896773681044579, 0.02332553267478943, 0.018135597929358482, 0.024901246652007103, 0.007929094135761261, 0.003953901119530201, 0.004606270231306553, 0.010874389670789242], [0.9814404249191284, 0.002165840473026037, 
0.0005273839924484491, 0.0011563283624127507, 0.0007391585386358202, 0.002715606242418289, 0.00021227878460194916, 0.005128032993525267, 0.00044840783812105656, 0.00038368551759049296, 0.00039024706347845495, 0.0005696629523299634, 0.0005227657966315746, 0.00038487158599309623, 0.0003794225340243429, 0.0013633103808388114, 0.001472635893151164], [0.32013431191444397, 0.01341475173830986, 0.009093686006963253, 0.04590839147567749, 0.022945737466216087, 0.03237318992614746, 0.020527128130197525, 0.05668998509645462, 0.026710310950875282, 0.1174580305814743, 0.06167193129658699, 0.017793554812669754, 0.15245500206947327, 0.01663772016763687, 0.010135079734027386, 0.04125090688467026, 0.034800320863723755], [0.7257449626922607, 0.020953062921762466, 0.0028106123208999634, 0.010643127374351025, 0.008396069519221783, 0.028052086010575294, 0.0025198792573064566, 0.023551637306809425, 0.02948867902159691, 0.008803030475974083, 0.017441801726818085, 0.038056425750255585, 0.01912309229373932, 0.022980468347668648, 0.003315456910058856, 0.013302041217684746, 0.024817688390612602], [0.8992270827293396, 0.0067262048833072186, 0.0037738471291959286, 0.0050713843666017056, 0.004541181959211826, 0.006459873169660568, 0.0006238860660232604, 0.009045282378792763, 0.009381905198097229, 0.009805469773709774, 0.001834037364460528, 0.015132950618863106, 0.0069765495136380196, 0.0061210766434669495, 0.003013929817825556, 0.007298424374312162, 0.004966893699020147], [0.6119157671928406, 0.016551412642002106, 0.004575283266603947, 0.012285488657653332, 0.009446066804230213, 0.01088800374418497, 0.002462513977661729, 0.04286373034119606, 0.025588825345039368, 0.04634644091129303, 0.019791895523667336, 0.009067904204130173, 0.052891988307237625, 0.034999214112758636, 0.007858281023800373, 0.03673967346549034, 0.05572747811675072], [0.6649059653282166, 0.009184145368635654, 0.004191290121525526, 0.013850522227585316, 0.01730494201183319, 0.012185389176011086, 0.004381608683615923, 
0.0180215947329998, 0.0073500704020261765, 0.031854789704084396, 0.019856369122862816, 0.013474204577505589, 0.0696917325258255, 0.01758892461657524, 0.01586998626589775, 0.045877959579229355, 0.034410424530506134], [0.4487479627132416, 0.01283120084553957, 0.009737028740346432, 0.03140019252896309, 0.02394791506230831, 0.021185917779803276, 0.014423376880586147, 0.03310423716902733, 0.007790941745042801, 0.0367874950170517, 0.019738828763365746, 0.008874916471540928, 0.13407641649246216, 0.028462158516049385, 0.023419804871082306, 0.08842384815216064, 0.05704784393310547], [0.9669104814529419, 0.003034558380022645, 0.0005289777764119208, 0.0010848940582945943, 0.002162822987884283, 0.00490755308419466, 0.00032730502425692976, 0.0036787576973438263, 0.0006356577505357563, 0.0011706246295943856, 0.0013943002559244633, 0.001032605185173452, 0.0029417066834867, 0.00118972675409168, 0.0006834525265730917, 0.0021427369210869074, 0.006173804868012667], [0.4120328426361084, 0.03165428340435028, 0.01652018167078495, 0.022160783410072327, 0.012025549076497555, 0.024525921791791916, 0.012909539975225925, 0.0346548855304718, 0.07412867248058319, 0.040148187428712845, 0.020908406004309654, 0.024011891335248947, 0.10253993421792984, 0.0767844170331955, 0.01099012978374958, 0.01676901802420616, 0.06723534315824509], [0.9908773303031921, 0.000542798254173249, 8.300925401272252e-05, 0.0002533980878069997, 0.00022805252228863537, 0.0006212692824192345, 4.867378083872609e-05, 0.0018639866029843688, 0.0002034437784459442, 0.0002092993090627715, 0.00013177577056922019, 0.0003484897024463862, 0.00038077490171417594, 0.00026910213637165725, 0.00030158163281157613, 0.0013308931374922395, 0.002306313021108508]]], [[[0.23649276793003082, 0.009575363248586655, 0.0007070523570291698, 0.0015374531503766775, 0.0012267339043319225, 0.005538392346352339, 0.0015727514401078224, 0.3931201994419098, 0.0010759999277070165, 0.0004444090591277927, 0.0005441537941806018, 0.00015489981160499156, 
0.0007632747292518616, 0.0024629707913845778, 0.00022943763178773224, 0.0007042424404062331, 0.34384989738464355], [0.12603791058063507, 0.2567879855632782, 1.9506596800056286e-05, 0.00011261352483415976, 0.0020082637201994658, 0.244845911860466, 0.00035053110332228243, 0.19640083611011505, 0.0017314940923824906, 2.4647080863360316e-05, 0.00015852053184062243, 8.420042286161333e-05, 0.00022511796851176769, 0.0014762879582121968, 1.6738204067223705e-05, 0.00034344629966653883, 0.169375941157341], [0.2667919397354126, 0.0009445402538403869, 0.03838546201586723, 0.00010722418664954603, 3.0123646865831688e-05, 0.0007586829597130418, 0.00042929858318530023, 0.3687818646430969, 0.0001257827680092305, 0.0008805532124824822, 1.170871746580815e-05, 3.203524829586968e-05, 2.382057346039801e-06, 0.0001387391530442983, 1.489913665864151e-05, 2.072250390483532e-05, 0.3225440979003906], [0.25240787863731384, 0.0009592779679223895, 8.733053618925624e-06, 0.005342131946235895, 1.592511762282811e-05, 0.000510979734826833, 0.00013963374658487737, 0.41733303666114807, 6.04202868998982e-05, 7.206467125797644e-05, 9.382057214679662e-06, 4.89102887968329e-07, 3.6075623484066455e-06, 0.00012186798267066479, 1.7112070054281503e-06, 5.533490184461698e-05, 0.32295748591423035], [0.2866417169570923, 0.010120163671672344, 2.732828215812333e-05, 7.675612869206816e-05, 0.07337981462478638, 0.0013653499772772193, 1.9206754586775787e-05, 0.3546513319015503, 0.0018209987320005894, 6.871797813801095e-05, 0.00012114230048609897, 8.295612497022375e-05, 0.001291619031690061, 0.0031827157363295555, 9.813452379603405e-06, 0.00012900974252261221, 0.2670113444328308], [0.16606470942497253, 0.30059969425201416, 1.622009585844353e-05, 4.898137922282331e-05, 0.0003668418503366411, 0.03200509399175644, 4.740394069813192e-05, 0.2782410979270935, 0.00029426015680655837, 1.1166186595801264e-05, 7.531353185186163e-05, 7.454174192389473e-05, 0.0001449183328077197, 0.0013525830581784248, 9.3729968284606e-06, 
0.0005967199103906751, 0.22005097568035126], [0.22409360110759735, 0.0001981118693947792, 5.5427735787816346e-05, 0.0002474568609613925, 5.681469701812603e-06, 5.1353239541640505e-05, 0.26499027013778687, 0.28145670890808105, 3.3460337363067083e-06, 0.00018685536633711308, 6.383216532412916e-05, 1.3170049896871205e-05, 5.5807347962399945e-05, 3.0161529139149934e-05, 0.00012832177162636071, 5.7618140999693424e-05, 0.22836226224899292], [0.1995595246553421, 0.005892673507332802, 0.0006851264624856412, 0.006053902208805084, 0.0026101788971573114, 0.005052225198596716, 0.0011218657018616796, 0.40491876006126404, 0.0007635498768649995, 0.0015699805226176977, 0.0013346181949600577, 0.00024893501540645957, 0.0012974567944183946, 0.0012894216924905777, 0.0027221899945288897, 0.0009985225042328238, 0.3638811409473419], [0.015748417004942894, 0.0036024393048137426, 3.0689796403748915e-05, 0.00011882036051247269, 0.0003829338529612869, 0.0011954547371715307, 4.5041655539534986e-05, 0.010944324545562267, 0.10715772211551666, 1.9123013771604747e-05, 9.353139830636792e-06, 2.318828774150461e-05, 0.00030614729621447623, 0.8494649529457092, 1.8484159227227792e-05, 0.0001321829331573099, 0.01080086175352335], [0.3048360049724579, 0.0011948402971029282, 0.002164096338674426, 0.0017807262483984232, 0.0001600413816049695, 0.0004921670770272613, 0.0003017329436261207, 0.2903909385204315, 1.725131005514413e-05, 0.15673694014549255, 3.0251478165155277e-05, 2.6057023205794394e-05, 1.7679623852018267e-05, 0.00010164028208237141, 0.0007087530102580786, 8.948415779741481e-05, 0.2409513294696808], [0.3597284257411957, 0.0014826977858319879, 2.9184247978264466e-05, 9.311461326433346e-05, 5.959664485999383e-05, 0.002033697674050927, 0.0003328493912704289, 0.33513370156288147, 2.7585723728407174e-05, 6.568998651346192e-05, 0.04598112404346466, 3.110198304057121e-05, 8.600515138823539e-05, 6.983146158745512e-05, 1.4047523109184112e-05, 0.00035343787749297917, 0.25447797775268555], 
[0.29739445447921753, 0.006057217717170715, 4.130787056055851e-05, 7.26759299141122e-06, 0.0001116898565669544, 0.00417375797405839, 0.0005609277868643403, 0.2740154564380646, 9.651802974985912e-05, 3.515285061439499e-05, 8.231084393628407e-06, 0.22740326821804047, 0.00019201493705622852, 0.00010530885629123077, 8.536979294149205e-06, 0.00022549816640093923, 0.18956346809864044], [0.2682710289955139, 0.0015420912532135844, 8.533003210686729e-07, 2.8026937798131257e-05, 0.00023646163754165173, 0.0007433740538544953, 5.388245699577965e-05, 0.37844619154930115, 0.0007813562406226993, 5.683265499101253e-06, 3.973229104303755e-05, 1.5615118172718212e-05, 0.06846363842487335, 0.00020468801085371524, 1.0058330190076958e-05, 0.00019216093642171472, 0.28096523880958557], [0.0045647090300917625, 0.0011219752486795187, 3.761092011700384e-05, 6.128181121312082e-05, 0.0007528176065534353, 0.0013628704473376274, 6.049275179975666e-05, 0.002771766623482108, 0.9442093372344971, 3.6061639548279345e-05, 1.3617004697152879e-05, 4.668351903092116e-05, 0.0001984307891689241, 0.04232368618249893, 6.502510586869903e-06, 7.811482646502554e-05, 0.002354126190766692], [0.28933101892471313, 0.0008484618156217039, 0.00010030897101387382, 4.103552782908082e-05, 2.538410080887843e-05, 0.0009417132823728025, 0.0010526480618864298, 0.31054484844207764, 3.440405271248892e-05, 0.00044045443064533174, 5.6101434893207625e-05, 1.0321265108359512e-05, 4.170285683358088e-05, 1.4257343536883127e-05, 0.16003845632076263, 0.00019029082613997161, 0.23628857731819153], [0.30895987153053284, 0.004043697379529476, 3.239829675294459e-05, 0.00014185896725393832, 4.520246875472367e-05, 0.0007967217243276536, 8.723305654712021e-05, 0.3647679388523102, 7.612311310367659e-05, 2.9786029699607752e-05, 4.432657078723423e-05, 1.8494247342459857e-05, 0.0005809476133435965, 6.335557554848492e-05, 1.3103333913022652e-05, 0.0838891789317131, 0.2364097535610199], [0.20048923790454865, 0.008255542255938053, 
...
(output truncated: nested lists of floating-point values)
...
[0.2250499278306961, 0.04895733296871185, 0.013785665854811668, 0.005020629148930311, 0.00792643241584301, 0.03186631575226784, 0.013938482850790024, 0.271396279335022, 0.02267514169216156, 0.007961919531226158, 0.0034771845676004887, 0.030567822977900505, 0.02528628334403038, 0.012654193677008152, 0.006924149580299854, 0.010077578015625477, 0.262434720993042], [0.3020283579826355, 0.06289812922477722, 0.017301304265856743, 0.011179446242749691, 0.006016455590724945, 0.017042119055986404, 0.03613385185599327, 0.2194403111934662, 0.01605672389268875, 0.02131962962448597, 0.007852209731936455, 0.010359067469835281, 0.012205595150589943, 0.008597283624112606, 0.025482745841145515, 0.01104278676211834, 0.21504399180412292], [0.23796281218528748, 0.0670369565486908, 0.06441561132669449, 0.008521844632923603, 0.01696154475212097, 0.03187268599867821, 0.04601573571562767, 0.1900751292705536, 0.014263161458075047, 0.02239990048110485, 0.006141927558928728, 0.012937868945300579, 0.014200568199157715, 0.008146810345351696, 0.008377047255635262, 0.06010851636528969, 0.19056174159049988], [0.33841192722320557, 0.05038106441497803, 0.02013375610113144, 0.008323549292981625, 0.007523011416196823, 0.011559674516320229, 0.013525321148335934, 0.22538666427135468, 0.01617872714996338, 0.01860102266073227, 0.0009935613488778472, 0.02454315684735775, 0.008262021467089653, 0.019163036718964577, 0.004872401710599661, 0.013227381743490696, 0.2189137190580368], [0.2676387131214142, 0.04345650225877762, 0.05921970680356026, 0.008791954256594181, 0.007398023270070553, 0.016023466363549232, 0.026174215599894524, 0.191934734582901, 0.018210653215646744, 0.02745293453335762, 0.015342566184699535, 0.021244583651423454, 0.04558000713586807, 0.022756297141313553, 0.019447900354862213, 0.0283270925283432, 0.1810005158185959], [0.23023921251296997, 0.03771331533789635, 0.00433285953477025, 0.0019842812325805426, 0.0068510472774505615, 0.012707910500466824, 0.007208711933344603, 0.27782872319221497, 
0.01437422912567854, 0.009604205377399921, 0.024872267618775368, 0.01447662990540266, 0.01158254686743021, 0.011691499501466751, 0.03275063633918762, 0.009953297674655914, 0.29182863235473633], [0.287616103887558, 0.048705801367759705, 0.02851501666009426, 0.0043346332386136055, 0.010277453809976578, 0.018220730125904083, 0.017625797539949417, 0.19514334201812744, 0.019543014466762543, 0.016393378376960754, 0.010116017423570156, 0.03422543406486511, 0.03615939989686012, 0.01829969324171543, 0.02935090847313404, 0.033645447343587875, 0.19182783365249634], [0.273516446352005, 0.010180409997701645, 0.0021359883248806, 0.003978705033659935, 0.006034583318978548, 0.012492965906858444, 0.002306202193722129, 0.32091525197029114, 0.0057565392926335335, 0.0031509348191320896, 0.007490784861147404, 0.009531576186418533, 0.00661844527348876, 0.00788586214184761, 0.003700338304042816, 0.007180912885814905, 0.31712406873703003]], [[0.35998964309692383, 0.00720044644549489, 0.00952097773551941, 0.02097858302295208, 0.004048050846904516, 0.006635804660618305, 0.009357865899801254, 0.2452428787946701, 0.013117024675011635, 0.02917439490556717, 0.005705017596483231, 0.011636906303465366, 0.025099685415625572, 0.013271000236272812, 0.0032164352014660835, 0.009907032363116741, 0.22589831054210663], [0.2001110166311264, 0.03055390901863575, 0.04606832563877106, 0.18214459717273712, 0.028567230328917503, 0.03422768414020538, 0.02714725024998188, 0.15776167809963226, 0.011524679139256477, 0.031206343322992325, 0.004239538684487343, 0.008807740174233913, 0.06622479856014252, 0.01336232852190733, 0.002346902387216687, 0.00920367892831564, 0.14650234580039978], [0.04230611026287079, 0.03502862900495529, 0.33979862928390503, 0.19633087515830994, 0.02950000949203968, 0.019022446125745773, 0.026152078062295914, 0.13933001458644867, 0.003422748064622283, 0.0113377645611763, 0.0017469078302383423, 0.0019091337453573942, 0.0027396988589316607, 0.0034355332609266043, 0.0014984962763264775, 
0.015023568645119667, 0.13141736388206482], [0.12133914977312088, 0.026223640888929367, 0.013829448260366917, 0.26078295707702637, 0.026229368522763252, 0.018690595403313637, 0.08555667847394943, 0.17745764553546906, 0.006205730140209198, 0.040454357862472534, 0.017176903784275055, 0.004207426682114601, 0.005872828420251608, 0.007993195205926895, 0.004594189580529928, 0.009112872183322906, 0.17427298426628113], [0.2000785917043686, 0.019389281049370766, 0.01480416115373373, 0.09598270803689957, 0.007851284928619862, 0.017362261191010475, 0.040210265666246414, 0.2831098139286041, 0.0029720584861934185, 0.023673919960856438, 0.0070809959433972836, 0.002881062915548682, 0.03115232102572918, 0.003954009152948856, 0.001562010496854782, 0.0040627093985676765, 0.24387258291244507], [0.15942883491516113, 0.042274899780750275, 0.02302929200232029, 0.1309691220521927, 0.03720428794622421, 0.03163367137312889, 0.044747963547706604, 0.22667333483695984, 0.009123533964157104, 0.015994736924767494, 0.0070763020776212215, 0.005939931608736515, 0.041402436792850494, 0.010428872890770435, 0.0033966531045734882, 0.007228354457765818, 0.2034478634595871], [0.18066667020320892, 0.017416061833500862, 0.01610175333917141, 0.11018575727939606, 0.03102722018957138, 0.01920236088335514, 0.020321322605013847, 0.3049446642398834, 0.0038466574624180794, 0.0040113721042871475, 0.004581425338983536, 0.0075630624778568745, 0.009110287763178349, 0.004809110891073942, 0.0016243770951405168, 0.0031855714041739702, 0.26140230894088745], [0.5840312838554382, 0.004743863362818956, 0.014474052004516125, 0.009881310164928436, 0.00264731771312654, 0.004685430787503719, 0.002674694638699293, 0.1871562898159027, 0.002601411659270525, 0.007300317753106356, 0.0015796282095834613, 0.0023993945214897394, 0.0017056650249287486, 0.0018584171775728464, 0.0009000025456771255, 0.005964957643300295, 0.16539600491523743], [0.30828002095222473, 0.006220230832695961, 0.0059934575110673904, 0.00644401041790843, 
0.002393897855654359, 0.010106999427080154, 0.005361232906579971, 0.17305806279182434, 0.011036748997867107, 0.14138130843639374, 0.01077658124268055, 0.024243030697107315, 0.10418997704982758, 0.00816765334457159, 0.014450970105826855, 0.009320174343883991, 0.158575639128685], [0.0815197005867958, 0.020000943914055824, 0.16753031313419342, 0.03431420400738716, 0.02243422158062458, 0.029751300811767578, 0.012675809673964977, 0.12738651037216187, 0.04833992198109627, 0.04793912544846535, 0.009313643909990788, 0.029552778229117393, 0.04286762699484825, 0.028911998495459557, 0.12530866265296936, 0.04064905270934105, 0.13150420784950256], [0.5282158255577087, 0.009215841069817543, 0.02208750694990158, 0.014886071905493736, 0.0026185635942965746, 0.012227934785187244, 0.01554732397198677, 0.16134385764598846, 0.006239890120923519, 0.016384460031986237, 0.003174096578732133, 0.013376103714108467, 0.010847215540707111, 0.004526900127530098, 0.001723004737868905, 0.016203444451093674, 0.16138193011283875], [0.3043380081653595, 0.02286265231668949, 0.03255901858210564, 0.02116522192955017, 0.015146443620324135, 0.027407599613070488, 0.021879781037569046, 0.22911445796489716, 0.007677081506699324, 0.03117549605667591, 0.004477428738027811, 0.01851595565676689, 0.011677582748234272, 0.006602891720831394, 0.015908220782876015, 0.007692528888583183, 0.22179968655109406], [0.1752730756998062, 0.004588421434164047, 0.002817933913320303, 0.003289309097453952, 0.00123742560390383, 0.006643157918006182, 0.015229299664497375, 0.3235494792461395, 0.009337316267192364, 0.04946617782115936, 0.0074899159371852875, 0.005904505494982004, 0.08630012720823288, 0.013564500026404858, 0.006529935635626316, 0.0073045543394982815, 0.281474769115448], [0.3593403697013855, 0.004025507718324661, 0.006888154428452253, 0.005860699340701103, 0.0020617141854017973, 0.006544760428369045, 0.004472521133720875, 0.12614770233631134, 0.008367729373276234, 0.15482084453105927, 0.009447501040995121, 
0.021249212324619293, 0.13394621014595032, 0.006845593918114901, 0.020357808098196983, 0.010948236100375652, 0.11867548525333405], [0.4243904650211334, 0.00980836059898138, 0.01192525215446949, 0.008604561910033226, 0.00566175626590848, 0.01086607575416565, 0.0030110382940620184, 0.20327475666999817, 0.026776380836963654, 0.017595678567886353, 0.012794801034033298, 0.01770150288939476, 0.014544767327606678, 0.03148660808801651, 0.002583786379545927, 0.021791616454720497, 0.17718258500099182], [0.3249542713165283, 0.01082911528646946, 0.006215124856680632, 0.010009188205003738, 0.011470712721347809, 0.010403158143162727, 0.010292298160493374, 0.24442161619663239, 0.021513354033231735, 0.026710553094744682, 0.0015979440649971366, 0.012410338968038559, 0.041961729526519775, 0.023315489292144775, 0.004604635760188103, 0.01337346713989973, 0.2259170562028885], [0.5769861936569214, 0.005360886454582214, 0.015407702885568142, 0.00983339175581932, 0.003257320960983634, 0.005299969110637903, 0.002874945756047964, 0.18718747794628143, 0.0030531752854585648, 0.007916856557130814, 0.0019868374802172184, 0.003189179114997387, 0.001881532371044159, 0.0021618418395519257, 0.00131212396081537, 0.007276593241840601, 0.16501398384571075]], [[0.6959972381591797, 0.0077446275390684605, 0.005260814446955919, 0.009486541152000427, 0.007903705351054668, 0.009182424284517765, 0.014701749198138714, 0.10258606821298599, 0.00303898798301816, 0.010865113697946072, 0.0033028051257133484, 0.0031110954005271196, 0.0056053041480481625, 0.002030021511018276, 0.0021458358969539404, 0.017398711293935776, 0.0996389240026474], [0.006331637967377901, 0.0024512875825166702, 0.9547946453094482, 0.004563211463391781, 0.00018287448619958013, 0.0005222941399551928, 0.003179222112521529, 0.014406083151698112, 5.501753912540153e-05, 4.484011515160091e-05, 7.453135185642168e-05, 0.00029518091469071805, 0.0001010975320241414, 0.00010833601118065417, 4.48917162430007e-05, 0.00024407869204878807, 
0.012600668705999851], [0.5162607431411743, 0.007085038349032402, 0.015279430896043777, 0.1405944973230362, 0.004260573070496321, 0.008886747062206268, 0.0021301910746842623, 0.16376133263111115, 0.00020257163851056248, 4.803802221431397e-05, 0.00010678466060198843, 4.197925954940729e-05, 2.712254354264587e-05, 0.00010132988245459273, 0.00024568845401518047, 0.00047873269068077207, 0.1404891461133957], [0.4020403027534485, 0.004058153834193945, 0.010286534205079079, 0.0034978666808456182, 0.055199943482875824, 0.23959887027740479, 0.016667453572154045, 0.14295917749404907, 0.00016278470866382122, 0.0008327076211571693, 0.00011797354090958834, 5.316642636898905e-05, 2.631263851071708e-05, 1.9544495444279164e-05, 0.00011205946793779731, 0.0015669430140405893, 0.1228000819683075], [0.011891569942235947, 0.000536019098944962, 0.018842056393623352, 0.00038672584923915565, 0.0006348469760268927, 0.5546340346336365, 0.3308072090148926, 0.043068744242191315, 0.00023108867753762752, 0.0006083636544644833, 0.0004469580599106848, 0.00014205224579200149, 4.204640390526038e-06, 5.441520443127956e-06, 1.2392515600367915e-05, 0.0006022992893122137, 0.03714597597718239], [0.06790461391210556, 0.0013910544803366065, 0.0192977637052536, 0.0015960752498358488, 0.0051045408472418785, 0.03420741856098175, 0.7798225283622742, 0.045795269310474396, 0.0009580369805917144, 0.003074520966038108, 0.0014870609156787395, 0.0029787614475935698, 5.907970989937894e-05, 6.843272331025219e-06, 2.2279327822616324e-05, 0.0009316236828453839, 0.03536241874098778], [0.9783364534378052, 1.3265317647892516e-05, 5.307854735292494e-06, 2.6011528007074958e-06, 3.017735707544489e-06, 2.971591857203748e-05, 4.9377053073840216e-05, 0.013613603077828884, 4.718369837064529e-06, 1.877530621641199e-06, 1.7774432592432277e-07, 1.299366516605005e-07, 1.1565211934794206e-06, 1.1774271513331769e-07, 5.248184997697081e-09, 4.572282819026441e-07, 0.007938079535961151], [0.8924463391304016, 0.006859590765088797, 
0.0008357313927263021, 0.003109344281256199, 0.005411035381257534, 0.0059766145423054695, 0.0022917718160897493, 0.03939186409115791, 0.0007210785406641662, 0.0006299996166490018, 0.0004647286550607532, 0.00040604808600619435, 0.00022823124891147017, 0.00035850642598234117, 0.0002650823153089732, 0.0012534415582194924, 0.03935067355632782], [0.004391891416162252, 1.3718900845560711e-05, 6.766682054148987e-05, 2.313658796992968e-06, 7.263983661687234e-06, 4.908712435280904e-05, 0.0006549089448526502, 0.019540823996067047, 0.0005456091603264213, 0.9573616981506348, 0.002654122421517968, 0.0006231378647498786, 6.865190516691655e-05, 1.4445219676417764e-05, 0.0003623472875915468, 7.237040699692443e-05, 0.013570068404078484], [0.24095791578292847, 0.001101038884371519, 2.5554365493007936e-05, 5.337648872227874e-06, 9.855561984295491e-06, 0.00028948814724572003, 0.00027733712340705097, 0.3000878095626831, 0.0028884548228234053, 0.01611434854567051, 0.13610927760601044, 0.009755046106874943, 0.010220763273537159, 0.0012481944868341088, 0.00020628025231417269, 0.0012374324724078178, 0.27946582436561584], [0.7364034652709961, 0.007772201672196388, 0.00010314675455447286, 0.0003590492415241897, 1.2588564459292684e-05, 0.0007723228773102164, 0.00023753897403366864, 0.07869347184896469, 0.0005426991265267134, 0.0005315054440870881, 0.0008695297292433679, 0.07374822348356247, 0.00907106976956129, 0.0014584313612431288, 0.00013954263704363257, 0.0008263486088253558, 0.08845888078212738], [0.20631544291973114, 0.0010554296895861626, 0.0001994435442611575, 8.430961315752938e-05, 4.798006557393819e-05, 3.440720684011467e-05, 2.4605862563475966e-05, 0.05585185065865517, 0.00041770466486923397, 0.0010945653775706887, 0.0015753385378047824, 0.0011711008846759796, 0.6608336567878723, 0.016420327126979828, 0.0019185520941391587, 0.00260834489017725, 0.05034700408577919], [0.15306153893470764, 0.0005404168623499572, 0.013829533010721207, 0.0004605992289725691, 2.9478165743057616e-05, 
4.2960520659107715e-05, 4.493079177336767e-05, 0.15155275166034698, 0.0005545448511838913, 0.0022157258354127407, 0.0014832664746791124, 0.0012187822721898556, 0.02074042148888111, 0.44500479102134705, 0.015950236469507217, 0.029683463275432587, 0.16358661651611328], [0.13357573747634888, 0.0005421385867521167, 0.002285833703354001, 0.00029894046019762754, 0.00015404039004351944, 0.00014606394688598812, 0.00031683960696682334, 0.14957192540168762, 1.3634257811645512e-05, 0.003980165813118219, 0.001818205346353352, 0.0012275701155886054, 0.001928592217154801, 0.01177874393761158, 0.39223793148994446, 0.09572045505046844, 0.20440325140953064], [0.3336658477783203, 0.0020987752359360456, 0.001119056949391961, 0.0013755030231550336, 0.005426853429526091, 0.0008849873556755483, 0.00045718473847955465, 0.07830797880887985, 5.12144197273301e-06, 5.2092196710873395e-05, 0.0005658217123709619, 0.0003359355032444, 0.0007042557699605823, 0.0008648771326988935, 0.0024996171705424786, 0.4676204025745392, 0.10401563346385956], [0.8951225876808167, 0.0006138449534773827, 0.00037581889773719013, 0.00021495531836990267, 0.00012625947420019656, 0.00044330267701298, 4.806590732187033e-05, 0.039259616285562515, 3.7414527469081804e-05, 7.820419341442175e-06, 0.00016283313743770123, 0.00016219723329413682, 0.001255991286598146, 0.0008050307515077293, 0.00030190771212801337, 0.0016256648814305663, 0.05943657085299492], [0.8804554343223572, 0.010952111333608627, 0.0013115830952301621, 0.003889394225552678, 0.0059689865447580814, 0.006705315783619881, 0.0026708838995546103, 0.04049256816506386, 0.0008356599137187004, 0.0007534812903031707, 0.0005282926140353084, 0.0005306598031893373, 0.0003439544525463134, 0.0005244679050520062, 0.0004264125891495496, 0.002150031039491296, 0.041460685431957245]], [[0.16313815116882324, 0.010214782319962978, 0.007985773496329784, 0.009216719307005405, 0.005400913767516613, 0.007640828378498554, 0.006037918385118246, 0.3429224193096161, 
0.015088650397956371, 0.017681289464235306, 0.013900008983910084, 0.014689023606479168, 0.024341845884919167, 0.012888504192233086, 0.005105342250317335, 0.018789200112223625, 0.3249585032463074], [0.10556863993406296, 0.0213029608130455, 0.22882398962974548, 0.029935574159026146, 0.01614001765847206, 0.024913206696510315, 0.0601780042052269, 0.2148459404706955, 0.006354323122650385, 0.02204984612762928, 0.037534791976213455, 0.007483654655516148, 0.009778348729014397, 0.005424597300589085, 0.009583383798599243, 0.01371898502111435, 0.18636366724967957], [0.21925203502178192, 0.010619424283504486, 0.009154162369668484, 0.013447648845613003, 0.002133312402293086, 0.0037887978833168745, 0.06313080340623856, 0.3484101891517639, 0.009997552260756493, 0.005610581953078508, 0.004348789807409048, 0.002054534386843443, 0.001184592256322503, 0.0038006752729415894, 0.0069999839179217815, 0.003500851569697261, 0.2925661504268646], [0.25249430537223816, 0.03045513853430748, 0.16451053321361542, 0.06770873069763184, 0.00954813789576292, 0.03322198614478111, 0.08144077658653259, 0.18078094720840454, 0.0034067360684275627, 0.015687396749854088, 0.0050127641297876835, 0.0029873165767639875, 0.004060269799083471, 0.0025715664960443974, 0.0054634371772408485, 0.004585088696330786, 0.13606488704681396], [0.06218055635690689, 0.024736234918236732, 0.04211129993200302, 0.09440165013074875, 0.003490511793643236, 0.03348754346370697, 0.6195890307426453, 0.03570161014795303, 0.0026602039579302073, 0.01320437341928482, 0.003927464596927166, 0.0029557624366134405, 0.009014463983476162, 0.0020417345222085714, 0.014210077933967113, 0.008302469737827778, 0.027985019609332085], [0.10106872767210007, 0.02227054536342621, 0.09399963170289993, 0.04352429881691933, 0.01799512840807438, 0.03703578934073448, 0.1739225834608078, 0.2045392096042633, 0.005265017971396446, 0.03545690327882767, 0.03690201789140701, 0.011800568550825119, 0.01397752482444048, 0.005207898560911417, 0.009585777297616005, 
0.017163900658488274, 0.17028449475765228], [0.1737377792596817, 0.036137405782938004, 0.02912095934152603, 0.08318175375461578, 0.014746901579201221, 0.02534753829240799, 0.005852538160979748, 0.32061633467674255, 0.006611044052988291, 0.007093543652445078, 0.016293715685606003, 0.0015727926511317492, 0.015334190800786018, 0.004305475391447544, 0.0008549922495149076, 0.003098628483712673, 0.25609442591667175], [0.16611126065254211, 0.002160026226192713, 0.00027706960099749267, 0.0012991471448913217, 0.0010359458392485976, 0.0012573046842589974, 0.0008831858867779374, 0.4368273913860321, 0.0006482069147750735, 0.00022974541934672743, 0.000876160804182291, 0.00037930923281237483, 0.00035299212322570384, 0.0011706784134730697, 0.0001941129012266174, 0.0004047449037898332, 0.38589274883270264], [0.27780860662460327, 0.007181299850344658, 0.03430254012346268, 0.002727442653849721, 0.0024476083926856518, 0.005520181730389595, 0.04594927281141281, 0.2645586133003235, 0.004329823888838291, 0.010615751147270203, 0.042487066239118576, 0.004604010377079248, 0.018386365845799446, 0.004805687814950943, 0.0069451588205993176, 0.013882076367735863, 0.2534484267234802], [0.17509734630584717, 0.005132684949785471, 0.0024121240712702274, 0.005544045474380255, 0.002542763715609908, 0.0020307451486587524, 0.0033844350837171078, 0.35567519068717957, 0.05636337399482727, 0.010415504686534405, 0.0052816253155469894, 0.005710158962756395, 0.03493678197264671, 0.02345617488026619, 0.003200435545295477, 0.007878652773797512, 0.3009379506111145], [0.20018059015274048, 0.011447232216596603, 0.0001332415995420888, 0.007591171655803919, 0.0018935579573735595, 0.0034528563264757395, 0.003451855620369315, 0.3165396451950073, 0.07653333991765976, 0.00351185305044055, 0.004889857955276966, 0.004553612787276506, 0.010967076756060123, 0.05868019908666611, 0.011176601983606815, 0.003685247851535678, 0.28131207823753357], [0.08472973108291626, 0.002550333272665739, 0.0003626806428655982, 
0.0013660861877724528, 0.001070957980118692, 0.0028481835033744574, 0.0029948020819574594, 0.12200871855020523, 0.008415686897933483, 0.015148157253861427, 0.6234654188156128, 0.0015276491176337004, 0.008929545991122723, 0.006448919884860516, 0.0011488809250295162, 0.00272906431928277, 0.1142551377415657], [0.3884458839893341, 0.005400855094194412, 0.0007452070130966604, 0.001395656494423747, 0.0011131340870633721, 0.005875939037650824, 0.012734247371554375, 0.26736313104629517, 0.012306464836001396, 0.007459679152816534, 0.009618441574275494, 0.004871377721428871, 0.0348256416618824, 0.018677085638046265, 0.010374339297413826, 0.014959724619984627, 0.20383313298225403], [0.23664157092571259, 0.005337074864655733, 0.020726708695292473, 0.002912538591772318, 0.002009662101045251, 0.006205644924193621, 0.06205156445503235, 0.25222960114479065, 0.004345778841525316, 0.010430662892758846, 0.0376138910651207, 0.007728245574980974, 0.044588249176740646, 0.00898000504821539, 0.02461782470345497, 0.03506685420870781, 0.23851406574249268], [0.21425940096378326, 0.0021856981329619884, 0.00013983677490614355, 0.0004723976308014244, 0.0015020743012428284, 0.004206244368106127, 0.006867186166346073, 0.38373953104019165, 0.007286078296601772, 0.011573935858905315, 0.004320519044995308, 0.003174481214955449, 0.005137848202139139, 0.009484467096626759, 0.0014578505652025342, 0.005404961295425892, 0.3387875258922577], [0.17057380080223083, 0.0025483244098722935, 0.0021330469753593206, 0.009184647351503372, 0.0007252037758007646, 0.002898829523473978, 0.006204844452440739, 0.30981993675231934, 0.003075559390708804, 0.01724020391702652, 0.005073311273008585, 0.003156411461532116, 0.007416308857500553, 0.006591300014406443, 0.1697906255722046, 0.001247695181518793, 0.28231993317604065], [0.1663760542869568, 0.0020809604320675135, 0.0002614349068608135, 0.0010972103336825967, 0.0011253571137785912, 0.0012386712478473783, 0.0008872587932273746, 0.43411368131637573, 
0.0007039686315692961, 0.00027688537375070155, 0.0010195140494033694, 0.0004455139860510826, 0.0003689740551635623, 0.001333949388936162, 0.00023100862745195627, 0.00044686885667033494, 0.3879926800727844]], [[0.40364983677864075, 0.025120750069618225, 0.011074002832174301, 0.016334034502506256, 0.011914027854800224, 0.018830345943570137, 0.026117056608200073, 0.19828075170516968, 0.008338275365531445, 0.021790064871311188, 0.010964248329401016, 0.01199241355061531, 0.01866769790649414, 0.013665527105331421, 0.008113598451018333, 0.020461246371269226, 0.17468610405921936], [0.5635526776313782, 0.03385404124855995, 0.007266618311405182, 0.013103276491165161, 0.00887469481676817, 0.008410237729549408, 0.0028482729103416204, 0.20250868797302246, 0.0006163949728943408, 0.004819197580218315, 0.0018767673755064607, 0.0007006556261330843, 0.0020434099715203047, 0.0010770083172246814, 0.0007934764144010842, 0.001713984995149076, 0.1459406316280365], [0.37714195251464844, 0.14172081649303436, 0.008621757850050926, 0.00991392694413662, 0.010674310848116875, 0.00409815926104784, 0.007047262508422136, 0.22698773443698883, 0.00025643763365224004, 0.0005310469423420727, 0.00450561149045825, 0.00530346529558301, 0.003930752631276846, 0.002185943303629756, 0.0019242740236222744, 0.0038209883496165276, 0.19133560359477997], [0.3081434667110443, 0.11616382747888565, 0.0391848161816597, 0.02311808429658413, 0.013648797757923603, 0.014760948717594147, 0.012777508236467838, 0.24995149672031403, 0.000450747087597847, 0.00037882072501815856, 0.001114367856644094, 0.0010013937717303634, 0.009912222623825073, 0.0018290058942511678, 0.0013238437240943313, 0.003607409307733178, 0.20263321697711945], [0.09228277206420898, 0.11723052710294724, 0.16430756449699402, 0.3651537001132965, 0.013564775697886944, 0.021695947274565697, 0.007155416999012232, 0.11741132289171219, 0.00151314667891711, 0.0004281438887119293, 0.0003446707269176841, 0.0007054094457998872, 0.004101785831153393, 
0.0024136267602443695, 0.0038866810500621796, 0.0010112510062754154, 0.08679334819316864], [0.019979704171419144, 0.01239036489278078, 0.08644694089889526, 0.5665701031684875, 0.11214841902256012, 0.027260208502411842, 0.01073313970118761, 0.08942244946956635, 0.0017136252718046308, 0.000622992985881865, 0.00014602929877582937, 0.00023118713579606265, 0.003800582140684128, 0.0019534786697477102, 0.0011341077042743564, 0.0013442205963656306, 0.06410244107246399], [0.07835068553686142, 0.007088932674378157, 0.07685209810733795, 0.14028814435005188, 0.11182418465614319, 0.08017713576555252, 0.033603373914957047, 0.24489809572696686, 0.003469242015853524, 0.0038559818640351295, 0.0016143833054229617, 0.0013459809124469757, 0.0005649477825500071, 0.0020657978020608425, 0.003467536997050047, 0.008565433323383331, 0.2019680142402649], [0.6526908874511719, 0.008380125276744366, 0.0026245457120239735, 0.003257485805079341, 0.00205105054192245, 0.005095557309687138, 0.00795469805598259, 0.15926975011825562, 0.0013332751113921404, 0.003997142892330885, 0.0021102679893374443, 0.0008814124739728868, 0.002494734711945057, 0.0007861354388296604, 0.0012181694619357586, 0.003711279947310686, 0.14214341342449188], [0.5038552284240723, 0.0037317427340894938, 0.0033640835899859667, 0.0021810438483953476, 0.0026953844353556633, 0.013543082401156425, 0.01753557100892067, 0.2367292195558548, 0.008111598901450634, 0.007439303211867809, 0.005782516207545996, 0.0027748041320592165, 0.0007396257133223116, 9.606798994354904e-05, 0.00034089345717802644, 0.0010834061540663242, 0.18999645113945007], [0.3825465142726898, 0.0011468342272564769, 0.001488346024416387, 0.00292953965254128, 0.0037661674432456493, 0.008177045732736588, 0.008102930150926113, 0.265243262052536, 0.07681730389595032, 0.010812598280608654, 0.02568924054503441, 0.008674965240061283, 0.0039160787127912045, 0.0005494539509527385, 0.0005914760404266417, 0.0008790299179963768, 0.1986692100763321], [0.17469936609268188, 
0.0069505032151937485, 0.0016770476941019297, 0.0005822463426738977, 0.0005809149006381631, 0.0017111212946474552, 0.003899676725268364, 0.002958230674266815, 0.0012217595940455794, 0.1837649643421173], [0.017450235784053802, 0.05098915845155716, 0.015524549409747124, 0.06202205643057823, 0.030074365437030792, 0.07405059784650803, 0.060297850519418716, 0.3576740324497223, 0.005175878759473562, 0.001557640265673399, 0.00392924714833498, 0.0013206643052399158, 0.005709785036742687, 0.0022886116057634354, 0.005686908960342407, 0.00462358957156539, 0.3016248345375061], [0.022489123046398163, 0.07351426035165787, 0.030595459043979645, 0.3329576253890991, 0.016714639961719513, 0.056758470833301544, 0.044932927936315536, 0.20562781393527985, 0.008106258697807789, 0.005058916285634041, 0.004220655187964439, 0.0020410320721566677, 0.011312801390886307, 0.003253693925216794, 0.006771337706595659, 0.0018157936865463853, 0.17382921278476715], [0.00996533501893282, 0.027673326432704926, 0.015001049265265465, 0.18530084192752838, 0.08966254442930222, 0.0357201062142849, 0.01424003578722477, 0.3181236982345581, 0.005593562498688698, 0.0019186713034287095, 0.0010867591481655836, 0.0010455887531861663, 0.029361935332417488, 0.0022985092364251614, 0.0008409272413700819, 0.0011124672600999475, 0.2610546052455902], [0.011177428998053074, 0.04854544624686241, 0.012283788993954659, 0.02792965993285179, 0.06821314990520477, 0.0958329290151596, 0.00607072189450264, 0.3744601309299469, 0.005121776368469, 0.0049051446840167046, 0.0020049819722771645, 0.010967169888317585, 0.014469522051513195, 0.002558438340201974, 0.0030325050465762615, 0.0010383353801444173, 0.31138888001441956], [0.0018288403516635299, 0.00034177026827819645, 0.000868281233124435, 0.0009339270181953907, 0.00018933658429887146, 0.0005072789499536157, 0.0005265168729238212, 0.5551133751869202, 0.0003731333708856255, 0.0004173602210357785, 0.00012384286674205214, 0.00014897013898007572, 0.00047953444300219417, 
0.00022992586309555918, 0.00013811467215418816, 0.00014782029029447585, 0.4376320242881775], [0.028288448229432106, 0.013921759091317654, 0.019955700263381004, 0.01564882881939411, 0.014824780635535717, 0.03738304600119591, 0.016180751845240593, 0.3463776707649231, 0.022005783393979073, 0.025955889374017715, 0.008044268935918808, 0.026218431070446968, 0.07485531270503998, 0.006993490736931562, 0.006199105177074671, 0.009632807224988937, 0.3275138735771179], [0.015722323209047318, 0.012799189426004887, 0.024145575240254402, 0.005295650102198124, 0.024525925517082214, 0.04366506263613701, 0.008266536518931389, 0.1459624469280243, 0.13981643319129944, 0.029632318764925003, 0.06461349129676819, 0.050010647624731064, 0.21709710359573364, 0.05096643790602684, 0.008363843895494938, 0.02374880760908127, 0.13536809384822845], [0.012077665887773037, 0.013800821267068386, 0.0194963701069355, 0.0014476276701316237, 0.0062560392543673515, 0.02170725166797638, 0.004694177769124508, 0.44704803824424744, 0.032420624047517776, 0.014501435682177544, 0.0016951668076217175, 0.0038341255858540535, 0.010030636563897133, 0.015393426641821861, 0.00880120787769556, 0.0031624718103557825, 0.38363295793533325], [0.025811590254306793, 0.006270482204854488, 0.021383745595812798, 0.01874261163175106, 0.008837755769491196, 0.012617083266377449, 0.012052659876644611, 0.30576974153518677, 0.058332040905952454, 0.04825763404369354, 0.015365748666226864, 0.00734691321849823, 0.1366000920534134, 0.024931928142905235, 0.005069859325885773, 0.008471430279314518, 0.2841387093067169], [0.10116562247276306, 0.00922666396945715, 0.014383894391357899, 0.020008442923426628, 0.015994219109416008, 0.028799451887607574, 0.013592690229415894, 0.21266353130340576, 0.05109625309705734, 0.037287455052137375, 0.056167565286159515, 0.04833369702100754, 0.1298355758190155, 0.017450066283345222, 0.02126818336546421, 0.01967090740799904, 0.20305587351322174], [0.05394609645009041, 0.014758073724806309, 
0.019108371809124947, 0.009098870679736137, 0.012177100405097008, 0.04340613633394241, 0.006621689070016146, 0.22507818043231964, 0.053362827748060226, 0.03042423725128174, 0.019929885864257812, 0.02921587973833084, 0.21143800020217896, 0.03299817442893982, 0.020433923229575157, 0.011744922958314419, 0.2062576860189438], [0.019683953374624252, 0.004259009845554829, 0.026455890387296677, 0.007167668081820011, 0.005343365017324686, 0.012682690285146236, 0.003177955513820052, 0.3726280927658081, 0.018664361909031868, 0.021120937541127205, 0.028131481260061264, 0.015818949788808823, 0.08525123447179794, 0.017485465854406357, 0.007051728665828705, 0.018232619389891624, 0.3368445634841919], [0.0320599302649498, 0.00398058770224452, 0.007576689589768648, 0.005517119541764259, 0.006193910725414753, 0.015832168981432915, 0.0027940492145717144, 0.2618427872657776, 0.0355912446975708, 0.020826207473874092, 0.01384084951132536, 0.012926173396408558, 0.29909566044807434, 0.030012499541044235, 0.006058905739337206, 0.006729679182171822, 0.23912139236927032], [0.0017211873782798648, 0.0002902017149608582, 0.0007449788390658796, 0.0007946657133288682, 0.00015725726552773267, 0.0004422204801812768, 0.0004456284805200994, 0.5556318163871765, 0.0003441806766204536, 0.00038795583532191813, 0.00011028895096387714, 0.00013749119534622878, 0.0004364426131360233, 0.00021114558330737054, 0.0001244174491148442, 0.00013390782987698913, 0.43788620829582214]], [[0.014110767282545567, 0.0024705822579562664, 0.002178468741476536, 0.00584516953676939, 0.004958294797688723, 0.009759688749909401, 0.00439324788749218, 0.46703261137008667, 0.004235672298818827, 0.0037168862763792276, 0.02746523730456829, 0.006583313923329115, 0.008500880561769009, 0.0028269877657294273, 0.002390121342614293, 0.006044536828994751, 0.4274875819683075], [0.014817937277257442, 0.01925741694867611, 0.028608912602066994, 0.02303149923682213, 0.036657921969890594, 0.0022971874568611383, 0.01103015523403883, 
0.456119179725647, 0.0012087792856618762, 0.005246355663985014, 0.0015149918617680669, 0.004557821433991194, 0.009032451547682285, 0.0027235455345362425, 0.0019944782834500074, 0.0061264643445611, 0.37577494978904724], [0.01231284998357296, 0.16504409909248352, 0.013803359121084213, 0.00564383203163743, 0.0042946659959852695, 0.001500042388215661, 0.0014556397218257189, 0.43885353207588196, 0.0019745526369661093, 0.0004274443781469017, 0.00020986043091397732, 7.920734060462564e-05, 0.00011173914390383288, 0.0013359427684918046, 0.001320637995377183, 0.0008220553863793612, 0.35081055760383606], [0.0060074664652347565, 0.004852284677326679, 0.08951862901449203, 0.022039301693439484, 0.00435992144048214, 0.0026769288815557957, 0.0045935423113405704, 0.46727070212364197, 3.2905514672165737e-05, 1.4345622730616014e-05, 0.0001393159182043746, 4.3390155042288825e-05, 0.00014946729061193764, 4.6499193558702245e-05, 0.0006850280915386975, 0.0009455970721319318, 0.3966245949268341], [0.0790741890668869, 0.0030712017323821783, 0.0073019894771277905, 0.5538678169250488, 0.004058005288243294, 0.009985352866351604, 0.06533657014369965, 0.13933014869689941, 0.00015800043183844537, 0.0004964639665558934, 0.00029919258668087423, 0.0002250416437163949, 0.000303982145851478, 0.00011089473264291883, 0.018882932141423225, 0.004893782548606396, 0.11260446161031723], [0.02261258475482464, 0.0016960124485194683, 0.012181530706584454, 0.12831464409828186, 0.28239110112190247, 0.0186010729521513, 0.024396926164627075, 0.27063822746276855, 0.0005782949738204479, 0.0035174130462110043, 0.0006784854922443628, 0.001847911742515862, 0.0007668919861316681, 0.00015998353774193674, 0.0013151849852874875, 0.003741527209058404, 0.22656220197677612], [0.06165337935090065, 0.005149534437805414, 0.0006064609042368829, 0.006854188162833452, 0.028681613504886627, 0.08585155010223389, 0.025986842811107635, 0.4233226478099823, 0.017968611791729927, 0.0009039347642101347, 0.0019463000353425741, 
0.0009101430187001824, 0.0004345017368905246, 0.0011427525896579027, 0.00016804800543468446, 0.0012700501829385757, 0.3371495008468628], [0.0019813745748251677, 0.0014737880555912852, 0.0011357017792761326, 0.001283604302443564, 0.0023400718346238136, 0.0016586726997047663, 0.0010090606519952416, 0.5301252603530884, 0.0009908988140523434, 0.0012801074190065265, 0.001005787868052721, 0.0014225491322577, 0.0009969007223844528, 0.0004917752812616527, 0.0008698684396222234, 0.0007379172602668405, 0.4511966407299042], [0.014917157590389252, 0.0005298461765050888, 0.00012571160914376378, 0.003080169903114438, 0.0018169389804825187, 0.0008548601763322949, 0.021253390237689018, 0.445070743560791, 0.018381619825959206, 0.10098598152399063, 0.006466514430940151, 0.007354699075222015, 0.009819825179874897, 0.0005326664540916681, 0.00173884944524616, 0.003765143919736147, 0.36330586671829224], [0.12149844318628311, 0.0012529102386906743, 0.0001934340689331293, 0.0013229299802333117, 0.0004528471326921135, 0.0020638646092265844, 0.014508597552776337, 0.20786873996257782, 0.4181981384754181, 0.01956816017627716, 0.004226239863783121, 0.010012696497142315, 0.00897121336311102, 0.00893235020339489, 0.0007013690192252398, 0.0014338763430714607, 0.17879419028759003], [0.15141110122203827, 0.003202573861926794, 6.483922334155068e-05, 0.000140385702252388, 0.001051912666298449, 0.006273757666349411, 0.0064452276565134525, 0.34135207533836365, 0.050235822796821594, 0.07263109087944031, 0.019692091271281242, 0.01686038449406624, 0.026062026619911194, 0.004923694301396608, 0.0030663232319056988, 0.0028896499425172806, 0.2936969995498657], [0.01598452404141426, 0.00032322219340130687, 0.00014409887080546468, 0.00027765377308242023, 0.00010223880963167176, 0.00019296172831673175, 0.012768030166625977, 0.054579317569732666, 0.0011383716482669115, 0.0097724674269557, 0.828976035118103, 0.0006810228805989027, 0.02237636037170887, 0.0010828038211911917, 0.00251349457539618, 
0.0003637057961896062, 0.04872376099228859], [0.06842824071645737, 0.0004867010284215212, 2.3710548703093082e-05, 0.0007310395012609661, 4.5141492591938004e-05, 0.0002220443420810625, 0.0015508838696405292, 0.3573071360588074, 0.004921369720250368, 0.041440464556217194, 0.0019064503721892834, 0.09132841229438782, 0.06100711226463318, 0.03653496876358986, 0.015409774146974087, 0.015622846782207489, 0.3030337691307068], [0.025666601955890656, 0.0009164935909211636, 0.00010547880810918286, 0.0019255708903074265, 0.00018025054305326194, 7.468437979696319e-05, 0.0012315738713368773, 0.3011631369590759, 0.001192352850921452, 0.031612835824489594, 0.009583639912307262, 0.031684670597314835, 0.2531246840953827, 0.026209086179733276, 0.024296654388308525, 0.03905860334634781, 0.2519736588001251], [0.026538629084825516, 0.001564147649332881, 0.003308072919026017, 0.00563762616366148, 0.002290179021656513, 0.00032372851273976266, 0.0012666749535128474, 0.37881410121917725, 0.01853073574602604, 0.0014329113764688373, 0.0022116375621408224, 0.009517709724605083, 0.03530459478497505, 0.10692887753248215, 0.008047818206250668, 0.07240409404039383, 0.325878381729126], [0.037326615303754807, 7.227421156130731e-05, 4.257094133208739e-06, 8.799164061201736e-05, 1.0423711501061916e-05, 1.3401118849287741e-05, 0.00014162968727760017, 0.018169838935136795, 6.649984698015032e-06, 9.621450590202585e-05, 0.0004485661047510803, 0.000213238614378497, 0.0010632271878421307, 0.0005494948127306998, 0.9247152209281921, 0.00042902069981209934, 0.01665179617702961], [0.0019838919397443533, 0.0013083294034004211, 0.001013571978546679, 0.0011206583585590124, 0.0020337167661637068, 0.0014117201790213585, 0.000931464834138751, 0.5307431221008301, 0.0009215709287673235, 0.0011540006380528212, 0.0009628679836168885, 0.0013596871867775917, 0.0009609250701032579, 0.0004782821051776409, 0.0008783574448898435, 0.000755774206481874, 0.45198214054107666]], [[0.0854300931096077, 0.0030179969035089016, 
0.003994830884039402, 0.0056284512393176556, 0.004460444673895836, 0.002580961911007762, 0.002086714841425419, 0.378406286239624, 0.019140319898724556, 0.029018983244895935, 0.01661057397723198, 0.0064309281297028065, 0.04220118746161461, 0.004914135672152042, 0.012070589698851109, 0.02011053077876568, 0.36389702558517456], [0.10772743821144104, 0.08537203818559647, 0.02353050746023655, 0.04524604603648186, 0.05669783800840378, 0.20296666026115417, 0.07708732783794403, 0.18906867504119873, 0.011571051552891731, 0.004700229037553072, 0.005370364524424076, 0.003649464575573802, 0.011815380305051804, 0.004359361715614796, 0.00414521899074316, 0.005047038663178682, 0.16164535284042358], [0.042992495000362396, 0.03335381671786308, 0.006874927785247564, 0.00949427392333746, 0.0059476010501384735, 0.03842833265662193, 0.007127955090254545, 0.44648465514183044, 0.009897420182824135, 0.005644648801535368, 0.008060656487941742, 0.0017722725169733167, 0.005780004430562258, 0.006809908431023359, 0.0014666742645204067, 0.0019642391707748175, 0.36790013313293457], [0.02723478339612484, 0.010761923156678677, 0.0038043612148612738, 0.005540398880839348, 0.014922885224223137, 0.014753122813999653, 0.009761949069797993, 0.4393165409564972, 0.030955437570810318, 0.024643436074256897, 0.012778385542333126, 0.003452734788879752, 0.01537474524229765, 0.013690480962395668, 0.0014762291684746742, 0.002523239701986313, 0.36900946497917175], [0.037258297204971313, 0.007767591625452042, 0.004381546750664711, 0.032339051365852356, 0.0075458986684679985, 0.013827256858348846, 0.01443540956825018, 0.4348143935203552, 0.01115723792463541, 0.016055386513471603, 0.01146245189011097, 0.010428386740386486, 0.020782742649316788, 0.00823657214641571, 0.0034177976194769144, 0.0037883396726101637, 0.36230161786079407], [0.035372767597436905, 0.011724458076059818, 0.0033400419633835554, 0.017536710947752, 0.009815460070967674, 0.034825462847948074, 0.02377331256866455, 0.4405779242515564, 
0.008879315108060837, 0.007596019189804792, 0.004719038028270006, 0.004539698362350464, 0.011677262373268604, 0.010525363497436047, 0.0014933859929442406, 0.006522198207676411, 0.36708158254623413], [0.007848944514989853, 0.004437538795173168, 0.001386468531563878, 0.003685190575197339, 0.0011456039501354098, 0.00161272962577641, 0.000268500589299947, 0.45037078857421875, 0.03153678774833679, 0.0371718592941761, 0.006742254365235567, 0.012345647439360619, 0.0115497512742877, 0.031305402517318726, 0.004572288598865271, 0.00146691151894629, 0.3925533592700958], [0.004816779866814613, 0.002383073791861534, 0.00892702117562294, 0.0069795288145542145, 0.0023289548698812723, 0.0029933147598057985, 0.005681321024894714, 0.5037974715232849, 0.0027090199291706085, 0.013236936181783676, 0.0043112714774906635, 0.00250759138725698, 0.0021912134252488613, 0.002705693244934082, 0.006174657493829727, 0.0035611451603472233, 0.4246949851512909], [0.018210338428616524, 0.007109534461051226, 0.0036550371441990137, 0.018650906160473824, 0.0034694517962634563, 0.0022885838989168406, 0.0018192676361650229, 0.41893625259399414, 0.004611875396221876, 0.0035335258580744267, 0.00683232955634594, 0.010060845874249935, 0.03501167893409729, 0.005499502178281546, 0.0351838581264019, 0.04106462374329567, 0.3840624690055847], [0.018574588000774384, 0.003245359053835273, 0.001176782650873065, 0.009598059579730034, 0.0013845268404111266, 0.001814891118556261, 0.002023482695221901, 0.2843804359436035, 0.027884092181921005, 0.00533917173743248, 0.007165585644543171, 0.03314221277832985, 0.17999014258384705, 0.08728275448083878, 0.010214883834123611, 0.06921931356191635, 0.25756365060806274], [0.01686643622815609, 0.0015880335122346878, 0.001718031009659171, 0.002575324149802327, 0.00046804227167740464, 0.0008317902684211731, 0.0012658920604735613, 0.4040377736091614, 0.026010040193796158, 0.020744938403367996, 0.00807048287242651, 0.02047741413116455, 0.03527773171663284, 0.08031362295150757, 
0.0095042260363698, 0.017074642702937126, 0.35317546129226685], [0.0172945074737072, 0.007280809339135885, 0.015748262405395508, 0.011971774511039257, 0.0034243310801684856, 0.0025515342131257057, 0.0039002783596515656, 0.3353986144065857, 0.03345007449388504, 0.054423898458480835, 0.009309571236371994, 0.009196617640554905, 0.0383567251265049, 0.0792415589094162, 0.011035078205168247, 0.0546928308904171, 0.31272345781326294], [0.007672100327908993, 0.0029364190995693207, 0.002351032570004463, 0.011706591583788395, 0.001210115966387093, 0.0014930680627003312, 0.0012183137005195022, 0.4761522114276886, 0.003254745854064822, 0.003243603277951479, 0.0009521990432403982, 0.001919349073432386, 0.011584454216063023, 0.006537989713251591, 0.009366673417389393, 0.02173188142478466, 0.4366692900657654], [0.013616631738841534, 0.0025452175177633762, 0.003681562142446637, 0.012900063768029213, 0.0016006738878786564, 0.0019238409586250782, 0.0015870446804910898, 0.45826655626296997, 0.001953268889337778, 0.0016292273066937923, 0.0011369376443326473, 0.0029098610393702984, 0.006554546765983105, 0.004265179857611656, 0.017152519896626472, 0.06934789568185806, 0.39892899990081787], [0.008897403255105019, 0.0007873740396462381, 0.0007181507535278797, 0.0007311116205528378, 0.0006096087163314223, 0.0008161278674378991, 0.000388050772016868, 0.1990060657262802, 0.25867992639541626, 0.029603339731693268, 0.005455044098198414, 0.015667835250496864, 0.055782247334718704, 0.23900572955608368, 0.0018490678630769253, 0.01015045028179884, 0.1718524694442749], [0.010036182589828968, 0.0009416015818715096, 0.0015584181528538465, 0.0010633409256115556, 0.0005779470666311681, 0.0004175567300990224, 0.0004960694932378829, 0.3224344849586487, 0.11238254606723785, 0.06543972343206406, 0.011825899593532085, 0.020038258284330368, 0.04533001407980919, 0.0777665376663208, 0.01010243222117424, 0.014740225858986378, 0.30484864115715027], [0.00506402924656868, 0.002151767024770379, 0.008284646086394787, 
0.006230692379176617, 0.0021166489459574223, 0.0026998238172382116, 0.005223971791565418, 0.5057055950164795, 0.0026926833670586348, 0.013170181773602962, 0.004327911883592606, 0.0024216766469180584, 0.002121603349223733, 0.002613781252875924, 0.005846674554049969, 0.0034707067534327507, 0.4258575439453125]], [[0.022880421951413155, 0.03529440611600876, 0.019696753472089767, 0.050164226442575455, 0.03654306381940842, 0.019499309360980988, 0.0031719955150038004, 0.43258923292160034, 0.006270706653594971, 0.003737811231985688, 0.0020605844911187887, 0.001559276832267642, 0.008204489015042782, 0.011015087366104126, 0.007928246632218361, 0.002333955140784383, 0.3370504379272461], [0.10088784992694855, 0.06402290612459183, 0.01898825354874134, 0.02063124068081379, 0.005467532202601433, 0.07689423114061356, 0.02549259178340435, 0.2859945595264435, 0.027556829154491425, 0.008004356175661087, 0.011517329141497612, 0.03605891764163971, 0.026614727452397346, 0.03240635246038437, 0.003420459106564522, 0.012771648354828358, 0.24327027797698975], [0.08177720755338669, 0.008181660436093807, 0.00527913449332118, 0.016394097357988358, 0.00894406158477068, 0.012210753746330738, 0.015227371826767921, 0.43873006105422974, 0.004990908782929182, 0.00395273556932807, 0.005402241367846727, 0.012191261164844036, 0.015615029260516167, 0.006603778339922428, 0.00502715865150094, 0.006678436417132616, 0.35279402136802673], [0.12789654731750488, 0.018632663413882256, 0.03575440123677254, 0.028624137863516808, 0.025642065331339836, 0.04912392422556877, 0.09279120713472366, 0.3144552409648895, 0.0044326079078018665, 0.004398720804601908, 0.007934763096272945, 0.0055986326187849045, 0.008985401131212711, 0.00439957482740283, 0.0018971936078742146, 0.003387649543583393, 0.2660452723503113], [0.05445573106408119, 0.013056723400950432, 0.017959441989660263, 0.041777029633522034, 0.0649150162935257, 0.04769081622362137, 0.06029227003455162, 0.32089272141456604, 0.007306837011128664, 
0.006061144173145294, 0.009770243428647518, 0.005389487836509943, 0.052157700061798096, 0.006284154951572418, 0.003173522185534239, 0.008830040693283081, 0.2799871265888214], [0.06801784783601761, 0.06522335857152939, 0.027051173150539398, 0.04242398217320442, 0.023999491706490517, 0.1040259525179863, 0.05790415406227112, 0.2381482720375061, 0.05209377035498619, 0.012707652524113655, 0.0026207517366856337, 0.01460757851600647, 0.030093079432845116, 0.03553071990609169, 0.0030516120605170727, 0.01591617241501808, 0.20658448338508606], [0.06857061386108398, 0.0468660406768322, 0.026686836034059525, 0.025448715314269066, 0.014131818898022175, 0.018788283690810204, 0.007903056219220161, 0.4045487642288208, 0.006904031615704298, 0.007226529065519571, 0.006309757474809885, 0.00576722202822566, 0.0026206308975815773, 0.0033020966220647097, 0.004183491226285696, 0.008033758029341698, 0.3427083194255829], [0.011853308416903019, 0.0019049011170864105, 0.0020108323078602552, 0.00835198163986206, 0.005694197025150061, 0.0032695652917027473, 0.0036214820574969053, 0.5135090351104736, 0.0014499028911814094, 0.00243352516554296, 0.0033011252526193857, 0.002650870941579342, 0.0028537230100482702, 0.0021097061689943075, 0.002249825047329068, 0.004644565749913454, 0.4280913472175598], [0.028149627149105072, 0.0343836173415184, 0.025277364999055862, 0.012027369812130928, 0.014542841352522373, 0.035944972187280655, 0.015858503058552742, 0.3811124265193939, 0.0019646536093205214, 0.008018952794373035, 0.039537396281957626, 0.008972112089395523, 0.0027407859452068806, 0.001302420161664486, 0.03561503440141678, 0.023814842104911804, 0.3307371139526367], [0.020547104999423027, 0.007403127383440733, 0.00887544360011816, 0.02184613235294819, 0.013340843841433525, 0.005759109742939472, 0.005755488760769367, 0.4319256544113159, 0.007227728143334389, 0.007625177036970854, 0.01320359855890274, 0.018372714519500732, 0.023425132036209106, 0.008892761543393135, 0.010289324447512627, 
0.027462823316454887, 0.368047833442688], [0.03553614392876625, 0.0033822571858763695, 0.002275908598676324, 0.004806940909475088, 0.009293694980442524, 0.0026291401591151953, 0.009329186752438545, 0.37922099232673645, 0.06237481161952019, 0.03264700248837471, 0.0010076938197016716, 0.00669941445812583, 0.018506808206439018, 0.0463733971118927, 0.040306784212589264, 0.01613820530474186, 0.3294716477394104], [0.03902363032102585, 0.011857344768941402, 0.029996488243341446, 0.010018144734203815, 0.004983636550605297, 0.0033116769045591354, 0.003961519338190556, 0.382441908121109, 0.016015594825148582, 0.06205238774418831, 0.00618678517639637, 0.019296489655971527, 0.01242037583142519, 0.01670861802995205, 0.010608470067381859, 0.026647863909602165, 0.3444690406322479], [0.024816537275910378, 0.020351756364107132, 0.05591588467359543, 0.017345983535051346, 0.018865063786506653, 0.01279106829315424, 0.01165543869137764, 0.3361521363258362, 0.010467136278748512, 0.067290298640728, 0.010313466191291809, 0.032353851944208145, 0.00629299646243453, 0.005577710922807455, 0.010494670830667019, 0.047657277435064316, 0.31165868043899536], [0.01933697611093521, 0.021230395883321762, 0.030936021357774734, 0.009255247190594673, 0.0056896451860666275, 0.016703080385923386, 0.011242924258112907, 0.39001569151878357, 0.0020494975615292788, 0.009716367349028587, 0.023819079622626305, 0.01601150445640087, 0.003656909801065922, 0.0024994804989546537, 0.043761271983385086, 0.0476105660200119, 0.3464653491973877], [0.021183954551815987, 0.0024019889533519745, 0.004433831200003624, 0.003882960882037878, 0.0018094646511599422, 0.0032343307975679636, 0.0132086630910635, 0.4638918340206146, 0.012041247449815273, 0.004053337499499321, 0.03659071400761604, 0.00323711265809834, 0.011096393689513206, 0.015315958298742771, 0.00036005032598041, 0.00608314061537385, 0.39717498421669006], [0.02599569410085678, 0.0029310311656445265, 0.009629348292946815, 0.003399013541638851, 0.0012451994698494673, 
0.0015715686604380608, 0.0036003459244966507, 0.34181633591651917, 0.018057383596897125, 0.04739438742399216, 0.005861514713615179, 0.07060449570417404, 0.06857673078775406, 0.04932856559753418, 0.0038784630596637726, 0.029119020327925682, 0.3169908821582794], [0.012022064067423344, 0.0018082662718370557, 0.0018135112477466464, 0.0076340315863490105, 0.005415252409875393, 0.003205433487892151, 0.003551595378667116, 0.5135653614997864, 0.0014594000531360507, 0.0024116092827171087, 0.003419057000428438, 0.002717947354540229, 0.0029872881714254618, 0.002136907773092389, 0.0022560011129826307, 0.004935144912451506, 0.4286612570285797]], [[0.00929126888513565, 0.017191775143146515, 0.011651807464659214, 0.01235037948936224, 0.0045356242917478085, 0.009318961761891842, 0.008056349121034145, 0.4993261992931366, 0.0023149061016738415, 0.001564012374728918, 0.001460631494410336, 0.0017081897240132093, 0.0009094775305129588, 0.0021517244167625904, 0.001912733307108283, 0.002200718503445387, 0.4140552580356598], [0.02446877770125866, 0.04359372705221176, 0.029880598187446594, 0.00696707284078002, 0.0021395254880189896, 0.0012497034622356296, 0.009011239744722843, 0.47458216547966003, 0.00720573402941227, 0.00378995924256742, 0.0016322663286700845, 0.0035026094410568476, 0.006198841612786055, 0.004283383022993803, 0.002555274637416005, 0.00538793858140707, 0.3735511302947998], [0.02930302359163761, 0.08564784377813339, 0.014852220192551613, 0.00471562659367919, 0.0027983069885522127, 0.0010617021471261978, 0.00814543291926384, 0.46687400341033936, 0.0015140946488827467, 0.0011518927058205009, 0.001203755964525044, 0.0008280725451186299, 0.004201377276331186, 0.003927871584892273, 0.003063049167394638, 0.002539727371186018, 0.3681719899177551], [0.029762674123048782, 0.10279069095849991, 0.39416149258613586, 0.01715431548655033, 0.01005059015005827, 0.0016533824382349849, 0.00619015796110034, 0.2362605780363083, 0.0019211572362110019, 0.001063868636265397, 
0.007907158695161343, 0.020426703616976738, 0.01360974833369255, 0.09100428223609924, 0.31222906708717346], [0.05911274626851082, 0.007463172543793917, 0.001607951009646058, 0.0021520424634218216, 0.0027579511515796185, 0.029024049639701843, 0.031181633472442627, 0.3219906985759735, 0.00858994759619236, 0.04992857947945595, 0.004128327127546072, 0.024572070688009262, 0.019279589876532555, 0.010351375676691532, 0.04234553128480911, 0.09213536977767944, 0.293379008769989], [0.04326111450791359, 0.001792263356037438, 0.001884629251435399, 0.002049521543085575, 0.0016244181897491217, 0.0033100415021181107, 0.020199742168188095, 0.40729179978370667, 0.05345752090215683, 0.01687440276145935, 0.008109679445624352, 0.004959262907505035, 0.02565581165254116, 0.02906103804707527, 0.0010828787926584482, 0.011575673706829548, 0.3678101897239685], [0.053817104548215866, 0.003218897385522723, 0.008273841813206673, 0.0031428206712007523, 0.002010904485359788, 0.003404677379876375, 0.022902250289916992, 0.26990529894828796, 0.07430361211299896, 0.1382715255022049, 0.003058353206142783, 0.04542470723390579, 0.03490329906344414, 0.05246013402938843, 0.004059332888573408, 0.040145374834537506, 0.24069781601428986], [0.006460660602897406, 0.0013142969692125916, 0.002969311783090234, 0.0046918513253331184, 0.004023895598948002, 0.002570820739492774, 0.009793058037757874, 0.4923669993877411, 0.0024306396953761578, 0.0026485170237720013, 0.010228059254586697, 0.002346064429730177, 0.005267712753266096, 0.0025212131440639496, 0.004633523058146238, 0.0036575598642230034, 0.4420759081840515]], [[0.013448210433125496, 0.0036647433880716562, 0.0012319096131250262, 0.0033310509752482176, 0.00045568624045699835, 0.0029230229556560516, 0.012249689549207687, 0.41929471492767334, 0.030255531892180443, 0.01950734667479992, 0.007640687748789787, 0.007631195243448019, 0.04963015019893646, 0.006127719301730394, 0.0011967391474172473, 0.027146028354763985, 0.39426562190055847], [0.021480988711118698, 
0.20030753314495087, 0.043708205223083496, 0.007401058450341225, 0.0058999271132051945, 0.08866851031780243, 0.016682833433151245, 0.300577312707901, 0.02600044012069702, 0.0027000524569302797, 0.0015832876088097692, 0.0006119809113442898, 0.0017696209251880646, 0.0030809317249804735, 0.0005779119092039764, 0.0019842691253870726, 0.27696508169174194], [0.012251793406903744, 0.046869199723005295, 0.02878580056130886, 0.005244807805866003, 0.0007193086203187704, 0.017941992729902267, 0.0037366722244769335, 0.45272722840309143, 0.01414021197706461, 0.004053107462823391, 0.0025738326366990805, 0.0006255095941014588, 0.0006205015815794468, 0.0017212223028764129, 0.00038242997834458947, 0.0034947048407047987, 0.40411174297332764], [0.0041989777237176895, 0.00040324119618162513, 0.001691016019321978, 0.021200507879257202, 0.0019671281334012747, 0.0017548376927152276, 0.011250638402998447, 0.5035886168479919, 0.0011291989358142018, 0.0011552784126251936, 0.0030268945265561342, 0.0007752844248898327, 0.0003820057900156826, 0.0001932286686496809, 9.917026909533888e-05, 0.002725322265177965, 0.44445863366127014], [0.00999806821346283, 0.0020211555529385805, 0.0015719713410362601, 0.0025809563230723143, 0.021287374198436737, 0.021788880228996277, 0.009073415771126747, 0.49419695138931274, 0.00348889478482306, 0.0039589665830135345, 0.002632461255416274, 0.0007558973738923669, 0.002064186381176114, 0.00031555493478663266, 0.0004444464866537601, 0.001451763091608882, 0.42236897349357605], [0.01917724311351776, 0.010487818159162998, 0.0038471068255603313, 0.0035521609243005514, 0.004626747220754623, 0.049995020031929016, 0.010201611556112766, 0.45090365409851074, 0.025451844558119774, 0.004574872553348541, 0.006401558872312307, 0.0018549918895587325, 0.0020246573258191347, 0.004124858416616917, 0.0019844488706439734, 0.0014888044679537416, 0.39930251240730286], [0.01637098379433155, 0.010938969440758228, 0.0013468789402395487, 0.005412278231233358, 0.0012291853781789541, 
0.015919027850031853, 0.04087802767753601, 0.4309636652469635, 0.01244663167744875, 0.017555953934788704, 0.014009170234203339, 0.009744644165039062, 0.006223001051694155, 0.003360082395374775, 0.0022692023776471615, 0.020978758111596107, 0.39035356044769287], [0.002892599208280444, 0.0006124047213234007, 0.0008798516937531531, 0.001340598682872951, 0.00022416871797759086, 0.0011904370039701462, 0.002633914817124605, 0.5279956459999084, 0.0005072889616712928, 0.0012499174335971475, 0.0009055507835000753, 0.0005750969285145402, 0.00020429751020856202, 0.00021154091518837959, 0.0003355686494614929, 0.0015126094222068787, 0.456728458404541], [0.04075222089886665, 0.012954768724739552, 0.005223289132118225, 0.0033225025981664658, 0.0019433426205068827, 0.009072254411876202, 0.0054921857081353664, 0.21823474764823914, 0.0463719442486763, 0.011752347461879253, 0.00147629389539361, 0.007255225442349911, 0.14189735054969788, 0.2490113228559494, 0.011619366705417633, 0.03508799523115158, 0.19853287935256958], [0.016528114676475525, 0.0036279086489230394, 0.002210112288594246, 0.0009533863631077111, 0.0008303510840050876, 0.00223417766392231, 0.004966397304087877, 0.32700714468955994, 0.00591988256201148, 0.04761603847146034, 0.0007012527785263956, 0.008117501623928547, 0.013477770611643791, 0.01972106657922268, 0.006106044631451368, 0.2420887053012848, 0.29789406061172485], [0.0019254329381510615, 0.00026645755860954523, 0.0008027945295907557, 0.0005093683139421046, 6.479684088844806e-05, 0.00023225342738442123, 0.000847706978674978, 0.5265746712684631, 8.768107363721356e-05, 0.0006163345533423126, 0.0007661805720999837, 0.0013035588199272752, 0.0005238872836343944, 0.0002731305721681565, 0.002233217703178525, 0.006698576267808676, 0.45627400279045105], [0.007177839521318674, 0.0005732851568609476, 0.0008213947876356542, 0.0004909157287329435, 0.0001423762005288154, 0.0008940909756347537, 0.002146889688447118, 0.413679838180542, 0.0013246947200968862, 0.004548837896436453, 
0.0003284709819126874, 0.006007183808833361, 0.005461385939270258, 0.0025567919947206974, 0.0010900050401687622, 0.1837591826915741, 0.36899688839912415], [0.02561011165380478, 0.0021829032339155674, 0.0010717230616137385, 0.00046676420606672764, 0.0006157277966849506, 0.0027721738442778587, 0.005489006172865629, 0.47369009256362915, 0.0017233727267012, 0.007978984154760838, 0.00045849656453356147, 0.0019395765848457813, 0.0303040724247694, 0.001505997614003718, 0.0004254317900631577, 0.014789247885346413, 0.42897629737854004], [0.03696446493268013, 0.010212318040430546, 0.003192691830918193, 0.0015125449281185865, 0.0025499719195067883, 0.01926470175385475, 0.011892521753907204, 0.41882482171058655, 0.04431489482522011, 0.021493088454008102, 0.0007703466108068824, 0.0014951257035136223, 0.010517427697777748, 0.016892556101083755, 0.004053744953125715, 0.01722819358110428, 0.37882062792778015], [0.0012698386562988162, 0.0008998833945952356, 0.0019171577878296375, 0.0005755862803198397, 0.0003075127606280148, 0.0007472692523151636, 0.00040291453478857875, 0.5199238061904907, 0.0015692142769694328, 0.004481369163841009, 0.00031185447005555034, 0.0026504655834287405, 0.0005258583114482462, 0.0004566351417452097, 0.008261977694928646, 0.00889710895717144, 0.44680145382881165], [0.017162121832370758, 0.008930070325732231, 0.002960881683975458, 0.008458403870463371, 0.0014808080159127712, 0.002465372672304511, 0.004272541031241417, 0.37467750906944275, 0.006283679511398077, 0.048825398087501526, 0.00467823026701808, 0.01272621750831604, 0.007684091571718454, 0.0016075108433142304, 0.004277735482901335, 0.15360160171985626, 0.33990785479545593], [0.0027782453689724207, 0.0005548556218855083, 0.0008010520250536501, 0.001239580218680203, 0.00020432769088074565, 0.0011087950551882386, 0.002456858055666089, 0.5290663838386536, 0.00046014960389584303, 0.0011583742452785373, 0.000824282004032284, 0.0005381780210882425, 0.00018530858505982906, 0.0001946971460711211, 
0.00030399125535041094, 0.001403992297127843, 0.4567209482192993]], [[0.007425202056765556, 0.0025974437594413757, 0.0026101202238351107, 0.0008209756342694163, 0.0010302385780960321, 0.014317428693175316, 0.003922363743185997, 0.5023362636566162, 0.002805171301588416, 0.0015503281028941274, 0.0008579694549553096, 0.0011240445310249925, 0.001057591405697167, 0.0021739264484494925, 0.0010860738111659884, 0.002075693104416132, 0.45220914483070374], [0.006770264357328415, 0.005972276441752911, 0.0021420244593173265, 0.005134768318384886, 0.0009162653004750609, 0.0005742288194596767, 0.00016202159167733043, 0.5221955180168152, 0.0002853810728993267, 4.2538566049188375e-05, 3.8002737710485235e-05, 0.0004491515865083784, 0.002704157028347254, 0.00027734399191103876, 0.00019701973360497504, 5.448918091133237e-05, 0.45208466053009033], [0.008566777221858501, 0.10941433906555176, 0.0014877432258799672, 0.0033271692227572203, 0.0007327860221266747, 0.0012509416555985808, 0.00037246147985570133, 0.47473740577697754, 0.0002347318222746253, 7.626810111105442e-05, 2.266075534862466e-05, 2.5693872885312885e-05, 0.00020165428577456623, 0.000552683777641505, 0.0003840049612335861, 0.00012086399510735646, 0.3984917402267456], [0.013074098154902458, 0.011503158137202263, 0.009244999848306179, 0.019243838265538216, 0.002696577226743102, 0.004209107253700495, 0.0013205696595832705, 0.49940025806427, 2.3900136511656456e-05, 7.994800398591906e-06, 7.149560406105593e-05, 5.2491996029857546e-05, 9.829633927438408e-05, 0.00014081670087762177, 0.00014764699153602123, 0.00020243963808752596, 0.4385622441768646], [0.026646461337804794, 0.006142487283796072, 0.03462157025933266, 0.7925723791122437, 0.0004254003579262644, 0.003416311228647828, 0.004597627092152834, 0.06949139386415482, 0.000125798731460236, 2.772988227661699e-05, 0.000123670426546596, 3.09559291054029e-05, 1.8254648239235394e-05, 0.00011723047646228224, 0.00016665719158481807, 0.0001639237452764064, 0.06131211668252945], 
[0.05091645196080208, 0.005027181003242731, 0.010756370611488819, 0.0665605217218399, 0.018176529556512833, 0.01172067690640688, 0.0027567048091441393, 0.4416898787021637, 0.00011143204756081104, 2.68235871772049e-05, 2.610041519801598e-05, 0.00019703178259078413, 0.00021132739493623376, 9.477348066866398e-05, 7.093177555361763e-05, 0.00011916201037820429, 0.39153802394866943], [0.011334138922393322, 0.0032153422944247723, 0.0017572115175426006, 0.0024629547260701656, 0.042147472500801086, 0.38664260506629944, 0.005464859306812286, 0.28278908133506775, 0.000676966505125165, 0.0006975207943469286, 3.374054722371511e-05, 0.0008054838399402797, 0.0001781523897079751, 0.00024365585704799742, 0.00010423064668430015, 0.001475379685871303, 0.2599712610244751], [0.008622555062174797, 0.011306416243314743, 0.0019826276693493128, 0.0018644648371264338, 0.003865110920742154, 0.01828988827764988, 0.004855854902416468, 0.4800243675708771, 0.0028362588491290808, 0.001980483066290617, 0.0007950146682560444, 0.004053756594657898, 0.0030991502571851015, 0.0038108404260128736, 0.001117386040277779, 0.007953991182148457, 0.44354182481765747], [0.0025143208913505077, 0.000777240376919508, 0.00018222740618512034, 0.00043763197027146816, 0.00032251790980808437, 0.006862354464828968, 0.00038709273212589324, 0.4879594147205353, 0.05079947039484978, 0.0043563698418438435, 0.00039685971569269896, 0.001448364811949432, 0.0017803650116547942, 0.0008035608916543424, 3.635586108430289e-05, 0.0003442949673626572, 0.4405914545059204], [0.004076679237186909, 0.0003453223325777799, 0.00010504107194719836, 6.428507185773924e-05, 0.00023372520809061825, 0.006864870432764292, 0.0011121939169242978, 0.2626172602176666, 0.4548511207103729, 0.028709890320897102, 0.004185290541499853, 0.003194513963535428, 0.0017407663399353623, 0.00314951385371387, 6.387726898537949e-05, 7.771598029648885e-05, 0.22860784828662872], [0.032990384846925735, 0.00014013412874192, 0.0003217288467567414, 7.145150448195636e-05, 
0.0004984312108717859, 0.0006993322167545557, 0.0021276529878377914, 0.48495247960090637, 0.004532465711236, 0.011890790425240993, 0.009924204088747501, 0.011929348111152649, 0.0010077401529997587, 0.00040214572800323367, 0.0011490372708067298, 0.000364261562936008, 0.43699848651885986], [0.07505974173545837, 0.00037197707570157945, 0.00014326436212286353, 0.00016874067659955472, 1.0254897460981738e-05, 0.001567467232234776, 0.002743912162259221, 0.4310899078845978, 0.017633169889450073, 0.0048622558824718, 0.07169146835803986, 0.008951961062848568, 0.0009754991624504328, 0.0016519746277481318, 0.0006357599631883204, 0.00016508132102899253, 0.38227760791778564], [0.04973914846777916, 0.00028656210633926094, 0.00023465976119041443, 9.031646914081648e-05, 2.1595305952359922e-06, 5.35813596798107e-05, 0.0004354088450782001, 0.4788591265678406, 0.0004155740316491574, 0.015408193692564964, 0.0012672515586018562, 0.017031732946634293, 0.005494358018040657, 0.0004708004416897893, 0.00019231485202908516, 0.0010652100900188088, 0.42895376682281494], [0.004097452387213707, 0.0006705439300276339, 8.236811117967591e-05, 0.00018505503248889, 2.786518234643154e-05, 0.0003961905313190073, 0.00014462658145930618, 0.46966102719306946, 0.004676370415836573, 0.002447503386065364, 0.0007320625591091812, 0.003017253475263715, 0.05609467998147011, 0.03042515367269516, 0.00020597438560798764, 0.0014591793296858668, 0.42567673325538635], [0.0027298363856971264, 0.0015751059399917722, 5.002762554795481e-05, 0.00017712997214403003, 0.0009842074941843748, 0.0012724032858386636, 0.00048385566333308816, 0.4958328902721405, 0.005367135163396597, 0.0011284564388915896, 0.0008783271186985075, 0.002233640756458044, 0.012682851403951645, 0.026623155921697617, 0.00038559251697734, 0.009621751494705677, 0.43797364830970764], [0.031615011394023895, 0.000279454659903422, 0.00013370125088840723, 0.0005555158131755888, 7.119304063962772e-05, 0.0008404497639276087, 0.00015780021203681827, 
0.42147353291511536, 0.0003714954655151814, 0.0003968031669501215, 0.00698791304603219, 0.0059258099645376205, 0.0036003245040774345, 0.019507480785250664, 0.12392933666706085, 0.0005593386013060808, 0.3835948407649994], [0.008464308455586433, 0.010421286337077618, 0.0017435017507523298, 0.0016736970283091068, 0.003416377352550626, 0.015824034810066223, 0.004358411300927401, 0.48361077904701233, 0.002666371176019311, 0.0018336459761485457, 0.0007482916698791087, 0.003748172428458929, 0.002972487360239029, 0.0035974611528217793, 0.001035738387145102, 0.007420164067298174, 0.44646525382995605]], [[0.013429098762571812, 0.0036419862881302834, 0.011602386832237244, 0.028114715591073036, 0.0010216834489256144, 0.005451391916722059, 0.014386426657438278, 0.3882030248641968, 0.015244131907820702, 0.048722583800554276, 0.003299308242276311, 0.034027066081762314, 0.05746200308203697, 0.0023432376328855753, 0.00691359955817461, 0.006130550522357225, 0.3600068688392639], [0.012528679333627224, 0.011536486446857452, 0.16012762486934662, 0.06681282073259354, 0.010236645117402077, 0.025194769725203514, 0.02052130550146103, 0.36283063888549805, 0.00019723729928955436, 0.00025557904154993594, 0.00025309386546723545, 4.766435449710116e-05, 0.0002895476936828345, 5.679569949279539e-05, 0.0001920399081427604, 4.5437205699272454e-05, 0.32887357473373413], [0.001518090721219778, 0.0005738929030485451, 0.0010259355185553432, 0.0029834455344825983, 0.000458627037005499, 0.0027533764950931072, 0.0013766746269538999, 0.5308411717414856, 2.408939690212719e-05, 0.0001123573092627339, 3.4430879622959765e-06, 2.3866443370934576e-05, 6.017334817443043e-05, 1.3530736396205612e-05, 2.140068136213813e-05, 1.9084871382801794e-05, 0.45819076895713806], [0.00364386267028749, 0.0008668334339745343, 0.002377714030444622, 0.007918374612927437, 0.005713435355573893, 0.01023100409656763, 0.00792755838483572, 0.5135588645935059, 0.0004096633056178689, 0.0005258533637970686, 2.253528691653628e-05, 
0.00017303484492003918, 4.085870023118332e-05, 8.950007031671703e-05, 9.385969315189868e-05, 0.00011150473437737674, 0.4462955594062805], [0.01878291182219982, 0.0001994948397623375, 0.0026753866113722324, 0.00344107486307621, 0.013488512486219406, 0.02392675168812275, 0.8094979524612427, 0.06581313163042068, 0.00169488659594208, 0.002502292860299349, 0.00018336030188947916, 2.2459464162238874e-05, 8.241332579927985e-06, 5.986135874991305e-05, 1.2208309271954931e-05, 2.9859245842089877e-05, 0.057661570608615875], [0.018263019621372223, 0.0009714372572489083, 0.008089694194495678, 0.005888329818844795, 0.0034387505147606134, 0.017152858898043633, 0.21500320732593536, 0.38266468048095703, 0.003197029698640108, 0.004877487663179636, 0.0026732904370874166, 0.0002160479489248246, 0.00011549169721547514, 0.00014358569751493633, 0.0003788558824453503, 0.00014024302072357386, 0.33678603172302246], [0.0018867620965465903, 0.00015349597379099578, 1.5062193597259466e-05, 0.001711062272079289, 0.00011897140211658552, 0.0017383157974109054, 0.0004760660231113434, 0.5249843001365662, 0.0002814894251059741, 0.0006330386968329549, 5.538395998883061e-05, 0.0020936818327754736, 0.0030259904451668262, 8.97209465620108e-05, 0.00010048997501144186, 9.444625902688131e-05, 0.4625416696071625], [0.0019404401537030935, 0.00026710497331805527, 0.00043646080303005874, 0.0013448568060994148, 0.0002520801208447665, 0.0007993921753950417, 0.001253852853551507, 0.5319430828094482, 0.00019108355627395213, 0.00038255180697888136, 0.00011508238821988925, 0.00033804005943238735, 0.0006259795045480132, 0.00014497406664304435, 0.0001294982503168285, 0.00026305404026061296, 0.45957252383232117], [0.004359433893114328, 5.899837560718879e-05, 5.3954863687977195e-05, 4.453422661754303e-05, 5.324056473909877e-06, 8.950634219218045e-05, 0.0024279041681438684, 0.5195211172103882, 0.0012249051360413432, 0.022827623412013054, 0.0006791167543269694, 0.0005938378162682056, 0.0022384419571608305, 
0.00018041506700683385, 0.00011777713370975107, 0.00018077304412145168, 0.4453962743282318], [0.001174118253402412, 9.919091098709032e-06, 6.992869657551637e-06, 1.2367061572149396e-05, 1.4958388874219963e-06, 1.5068068933032919e-05, 5.1338567573111504e-05, 0.5365377068519592, 7.132679456844926e-05, 0.00040911592077463865, 5.2560735639417544e-05, 0.000568644842132926, 0.0010704934829846025, 0.00015808446914888918, 5.533575313165784e-05, 5.29913158970885e-05, 0.45975250005722046], [0.008563623763620853, 0.0003431167860981077, 0.0002186862548114732, 0.00018562193145044148, 4.889450792688876e-06, 5.6954762840177864e-05, 0.0002719702897593379, 0.5050053000450134, 0.0007429488468915224, 0.007736029103398323, 0.0003802100254688412, 0.02151867002248764, 0.010546011850237846, 0.001344748423434794, 0.0002716335584409535, 0.003991624340415001, 0.43881791830062866], [0.0029660931322723627, 3.356702291057445e-05, 1.0141250641027e-05, 4.3203133827773854e-05, 5.798979600513121e-06, 2.59182106674416e-05, 0.00019521309877745807, 0.5272481441497803, 0.00017121329437941313, 0.0020905574783682823, 0.00047549520968459547, 0.001543577411212027, 0.006083074025809765, 0.0015674022724851966, 0.001168195391073823, 0.0036872567143291235, 0.4526851773262024], [0.010147598572075367, 0.00010542810196056962, 2.791151200653985e-05, 0.00012226782564539462, 1.7211483282153495e-05, 3.7915913708275184e-05, 0.0005615013651549816, 0.5207734107971191, 0.00027389396564103663, 0.00097418058430776, 0.00020318047609180212, 0.00013933134323451668, 0.0036254117731004953, 0.015212814323604107, 0.004228777252137661, 0.006219623610377312, 0.4373294413089752], [0.005111003760248423, 4.909404015052132e-05, 0.00010454035509610549, 7.38386734155938e-05, 1.671996506047435e-05, 5.804696775157936e-05, 0.0007894966402091086, 0.533173143863678, 3.253994873375632e-05, 0.00039570798981003463, 7.7376353146974e-05, 2.6055749913211912e-05, 0.0011553148506209254, 0.002018977887928486, 0.0007838904857635498, 
0.0022354591637849808, 0.45389893651008606], [0.0026686619967222214, 7.297358388314024e-05, 4.0600421925773844e-05, 2.6441079171490856e-05, 6.856406798760872e-06, 3.103095878032036e-05, 0.0005197684513404965, 0.5150782465934753, 3.3197928132722154e-05, 0.0009468038915656507, 0.00015436076500918716, 0.0005838457145728171, 0.00041339886956848204, 0.00022115104366093874, 0.00026926607824862003, 0.03097512573003769, 0.44795823097229004], [0.0024070434737950563, 1.4101845408731606e-05, 1.3391898392001167e-05, 0.00010181788093177602, 2.8720201953547075e-05, 4.992862886865623e-05, 0.00023564348521176726, 0.535584568977356, 2.4876797397155315e-05, 8.097758109215647e-05, 1.265942682948662e-05, 3.748729795916006e-05, 0.00019603455439209938, 6.647333066212013e-05, 6.171799032017589e-05, 0.0011960945557802916, 0.4598884880542755], [0.0018854152876883745, 0.00024195419973693788, 0.0003892919630743563, 0.0011964889708906412, 0.0002212225808762014, 0.0007074782624840736, 0.0011521686101332307, 0.5327167510986328, 0.00017868781287688762, 0.00036445166915655136, 0.00010535194451222196, 0.0003095974388998002, 0.000576945545617491, 0.00013362325262278318, 0.0001223348081111908, 0.0002493462816346437, 0.4594489336013794]], [[0.006175864953547716, 0.003945250995457172, 0.00628415122628212, 0.00897055584937334, 0.0038659111596643925, 0.004598570987582207, 0.0033187842927873135, 0.048225488513708115, 0.24289733171463013, 0.13769705593585968, 0.015315343625843525, 0.0843270868062973, 0.22311508655548096, 0.05308913812041283, 0.016709063202142715, 0.0939989686012268, 0.04746631532907486], [0.015252792276442051, 0.04862416908144951, 0.09531359374523163, 0.13979670405387878, 0.06462687999010086, 0.19970370829105377, 0.1847115010023117, 0.11240889877080917, 0.007423353847116232, 0.004724742844700813, 0.00517642218619585, 0.003517323173582554, 0.009624018333852291, 0.003620052244514227, 0.001573661807924509, 0.003775583580136299, 0.10012663900852203], [0.013099843636155128, 
0.031403522938489914, 0.0437195710837841, 0.1527964323759079, 0.05508628487586975, 0.14972637593746185, 0.12614937126636505, 0.20968861877918243, 0.006676316261291504, 0.005382124800235033, 0.0042049698531627655, 0.0040451777167618275, 0.00628983648493886, 0.004927252884954214, 0.001016740221530199, 0.0023255967535078526, 0.1834620237350464], [0.019437162205576897, 0.012755387462675571, 0.023538270965218544, 0.06124407425522804, 0.04496537894010544, 0.06899400800466537, 0.1826242357492447, 0.29200685024261475, 0.008207333274185658, 0.005039602052420378, 0.0029892302118241787, 0.0024144684430211782, 0.006978695280849934, 0.0028308345936238766, 0.0022003480698913336, 0.005020852200686932, 0.25875332951545715], [0.009907702915370464, 0.0018682557856664062, 0.006101534701883793, 0.01869630068540573, 0.0050467136316001415, 0.009216560050845146, 0.032967571169137955, 0.48622605204582214, 0.0034733815118670464, 0.003229400608688593, 0.0010227082530036569, 0.0009959868621081114, 0.0026514981873333454, 0.0010646467562764883, 0.0006860384601168334, 0.00045955844689160585, 0.41638606786727905], [0.006017141509801149, 0.0028323547448962927, 0.005344317760318518, 0.012749806977808475, 0.006344646215438843, 0.008436936885118484, 0.03260650485754013, 0.4763806462287903, 0.011315946467220783, 0.00689718360081315, 0.0013661691918969154, 0.004801446571946144, 0.00597792724147439, 0.004388711880892515, 0.0008289804100058973, 0.0023347896058112383, 0.41137656569480896], [0.006094898097217083, 0.003370960708707571, 0.003929188009351492, 0.01474632415920496, 0.01013614796102047, 0.012572094798088074, 0.008729614317417145, 0.39078688621520996, 0.0410468615591526, 0.032445598393678665, 0.009561053477227688, 0.031201500445604324, 0.04865817353129387, 0.009710405021905899, 0.003362491028383374, 0.012224867939949036, 0.36142292618751526], [0.004317966289818287, 0.00032809170079417527, 0.0006002673762850463, 0.0023465193808078766, 0.0006917279679328203, 0.0011808231938630342, 
0.003571414854377508, 0.5369846224784851, 0.0004878594190813601, 0.0006555644213221967, 0.0005305142258293927, 0.0003943619958590716, 0.0011346598621457815, 0.00046252066385932267, 0.00034405410406179726, 0.0007058824994601309, 0.445263147354126], [0.0047135562635958195, 0.001496805576607585, 0.0008691507391631603, 0.0047647589817643166, 0.0006039981381036341, 0.0010083969682455063, 0.01732650399208069, 0.2374427169561386, 0.016609011217951775, 0.05085546895861626, 0.011805305257439613, 0.034794338047504425, 0.28470730781555176, 0.022188395261764526, 0.010919101536273956, 0.0834425613284111, 0.21645261347293854], [0.0036470501217991114, 0.0006071061943657696, 0.00031795151880942285, 0.0025497472379356623, 0.0002803914248943329, 0.00037247707950882614, 0.001475389814004302, 0.43816325068473816, 0.0046654753386974335, 0.005412637256085873, 0.0037714422214776278, 0.0107771847397089, 0.06160571798682213, 0.0150587884709239, 0.004622121341526508, 0.05366591736674309, 0.39300736784935], [0.0019491214770823717, 0.00033375126076862216, 0.00025799963623285294, 0.0011548256734386086, 8.759389311308041e-05, 0.00020983777358196676, 0.0008302581263706088, 0.508965015411377, 0.0017834737664088607, 0.0022154662292450666, 0.00023068675363902003, 0.0023993130307644606, 0.02320173755288124, 0.010359846986830235, 0.0013436316512525082, 0.004490940365940332, 0.4401865601539612], [0.004381018225103617, 0.001511939219199121, 0.0007991287857294083, 0.0035301579628139734, 0.0007575963390991092, 0.0007673099753446877, 0.001495296019129455, 0.4250960946083069, 0.006629580166190863, 0.008303679525852203, 0.005802220664918423, 0.006066073663532734, 0.051677949726581573, 0.028533771634101868, 0.022534245625138283, 0.040578022599220276, 0.39153581857681274], [0.006490889471024275, 0.0007184636197052896, 0.0018010372295975685, 0.0038288177456706762, 0.0013747181510552764, 0.0024147359654307365, 0.004594300873577595, 0.34336721897125244, 0.006001268047839403, 0.0060815042816102505, 
0.0022297606337815523, 0.006361328065395355, 0.07384029775857925, 0.05843540281057358, 0.03870110213756561, 0.13202668726444244, 0.3117324709892273], [0.004799284972250462, 0.0005600673030130565, 0.000984428683295846, 0.003986151423305273, 0.0004072503943461925, 0.000391324982047081, 0.003005967941135168, 0.4744367301464081, 0.0012301995884627104, 0.0024145557545125484, 0.0007176084909588099, 0.0016344105824828148, 0.015994219109416008, 0.007028825581073761, 0.012257235124707222, 0.048636484891176224, 0.42151525616645813], [0.0007758397259749472, 9.463675087317824e-05, 0.00010434713476570323, 0.0006669131107628345, 0.00022609616280533373, 0.0004218752437736839, 0.0006185831152833998, 0.5243107676506042, 0.0005813146126456559, 0.0010741215664893389, 0.00032726104836910963, 0.001012048334814608, 0.005130719393491745, 0.002348396461457014, 0.0016353614628314972, 0.010688836686313152, 0.4499829113483429], [0.008535031229257584, 0.00124287698417902, 0.0007581211975775659, 0.0037881650496274233, 0.0013703524600714445, 0.0027693803422152996, 0.006553663406521082, 0.4495081603527069, 0.00795458722859621, 0.007624363526701927, 0.001745105255395174, 0.00566369853913784, 0.01906799152493477, 0.01299371663480997, 0.00965962279587984, 0.04445880278944969, 0.41630634665489197], [0.003962979186326265, 0.0002826295094564557, 0.0005205389461480081, 0.0020721559412777424, 0.0006098378798924387, 0.0010362726170569658, 0.0031905127689242363, 0.5386877059936523, 0.000428089959314093, 0.0005787468398921192, 0.0004807990335393697, 0.00034740526461973786, 0.0010038474574685097, 0.0004103929386474192, 0.0003062132454942912, 0.0006334269419312477, 0.44544848799705505]], [[0.0057395449839532375, 0.034898482263088226, 0.021690944209694862, 0.0015437914989888668, 0.0007724487804807723, 0.0008452574256807566, 0.011978999711573124, 0.17404402792453766, 0.4507349729537964, 0.1070479154586792, 0.018422221764922142, 0.004494649823755026, 0.005217648111283779, 0.002134372480213642, 
0.008646692149341106, 0.003765205154195428, 0.1480228304862976], [0.00017560465494170785, 0.014144674874842167, 0.8812409043312073, 0.021323075518012047, 0.0007535643526352942, 0.0003517350705806166, 0.0063140373677015305, 0.03975151106715202, 0.0005505993030965328, 0.0004255290259607136, 3.3057815016945824e-05, 6.526439392473549e-06, 1.342506493529072e-05, 0.0001334703410975635, 0.00044544198317453265, 8.956203964771703e-05, 0.0342472568154335], [0.004669233225286007, 0.0018677617190405726, 0.005225381813943386, 0.4255354404449463, 0.004361957777291536, 0.0020976110827177763, 0.004492426756769419, 0.2924737334251404, 5.1102048018947244e-05, 2.8029238819726743e-05, 0.0005088843172416091, 7.189149619080126e-05, 0.00020526518346741796, 3.0187296943040565e-05, 5.2813076763413846e-05, 0.00032151761115528643, 0.2580067813396454], [0.008065272122621536, 0.0005423752008937299, 0.006205158308148384, 0.012323830276727676, 0.47498512268066406, 0.041645441204309464, 0.10149016231298447, 0.1886427402496338, 2.139467869710643e-05, 7.940344949020073e-05, 1.2292213796172291e-05, 2.0096278603887185e-05, 1.3449286598188337e-05, 6.081704668758903e-06, 5.505526496563107e-05, 0.002120624529197812, 0.16377151012420654], [0.00031651402241550386, 0.0016607873840257525, 0.04897366091609001, 0.0030587073415517807, 0.007253868505358696, 0.826690673828125, 0.065462626516819, 0.023037049919366837, 0.0028736458625644445, 0.0003362225543241948, 0.00012965593487024307, 2.970620244013844e-06, 6.758063477718679e-07, 2.3727472580503672e-05, 6.281709647737443e-05, 2.9669639843632467e-05, 0.02008666843175888], [0.0008942944114096463, 7.756378181511536e-05, 0.005150484852492809, 0.00036026540328748524, 0.006401918362826109, 0.00420917896553874, 0.927274763584137, 0.028796246275305748, 0.0011359560303390026, 0.0010280614951625466, 3.362509960425086e-05, 8.512562999385409e-06, 1.401302370140911e-06, 1.5549273939541308e-06, 1.1907707630598452e-05, 7.5080993156007025e-06, 0.024606691673398018], 
[0.0021271570585668087, 7.24953060853295e-05, 7.968379941303283e-05, 3.654774991446175e-05, 8.778824849287048e-06, 0.00013967775157652795, 0.0001836732408264652, 0.5406206250190735, 0.0008528832113370299, 0.00021936013945378363, 0.00024103006580844522, 6.219970964593813e-05, 1.4126781934464816e-05, 2.107683485519374e-06, 5.116896204526711e-07, 1.825200888561085e-05, 0.45532092452049255], [0.1939414143562317, 0.004207228776067495, 0.007781782187521458, 0.004237873945385218, 0.0041242255829274654, 0.0059728133492171764, 0.011701170355081558, 0.3897494673728943, 0.0017077679513022304, 0.00295432866550982, 0.0024328476283699274, 0.0028704700525850058, 0.0023356906604021788, 0.0020061719696968794, 0.0015157136367633939, 0.010150437243282795, 0.3523106873035431], [0.0001477715850342065, 1.1027561413357034e-05, 5.8816603996092454e-05, 1.5217424333968665e-05, 9.242963869837695e-07, 1.3892597507947357e-06, 0.0014048184966668487, 0.05455755442380905, 0.017759917303919792, 0.855491042137146, 0.023981744423508644, 0.00023354555014520884, 2.8893135095131584e-05, 4.698729753727093e-06, 0.00034749569022096694, 8.048285963013768e-06, 0.0459471270442009], [0.0005433694459497929, 1.5446536053786986e-05, 7.632502820342779e-05, 3.2322859624400735e-05, 7.213153367047198e-06, 1.6642412447254173e-05, 0.00036609749076887965, 0.33601677417755127, 0.0033035362139344215, 0.01100230123847723, 0.3101525604724884, 0.014827695675194263, 0.029352525249123573, 0.0025342488661408424, 0.0007018875912763178, 0.0004939206410199404, 0.2905570864677429], [0.0018131741089746356, 5.0253485824214295e-05, 1.2261157280590851e-05, 9.541688996250741e-06, 2.5826655019045575e-06, 1.1392711712687742e-05, 4.182837074040435e-05, 0.09569817781448364, 0.00014276904403232038, 0.0048253838904201984, 0.0019814225379377604, 0.8089271187782288, 0.0021713904570788145, 0.0002474809007253498, 5.86905343880062e-06, 0.0005838021752424538, 0.0834755226969719], [0.0006300511304289103, 5.9208097809460014e-05, 
2.971307185362093e-05, 1.7761936760507524e-05, 4.7044948587426916e-05, 6.217254849616438e-06, 0.0003699070366565138, 0.027058668434619904, 0.0002372639428358525, 0.001391625264659524, 0.005546775180846453, 0.0020576382521539927, 0.9324154257774353, 0.0037649297155439854, 0.0007491814321838319, 0.0004918042686767876, 0.025126680731773376], [0.001699381391517818, 0.0009227353148162365, 0.002186755882576108, 0.0001108287979150191, 2.2968608391238376e-05, 2.5683664716780186e-05, 8.489964966429397e-05, 0.1112765371799469, 0.0005128133343532681, 0.0007594844209961593, 0.0009199709165841341, 0.0010126670822501183, 0.04829464852809906, 0.7155154943466187, 0.011472368612885475, 0.005590327084064484, 0.09959232807159424], [0.0004729439096990973, 9.460962610319257e-05, 0.0014289006358012557, 0.0004330741357989609, 2.1109253793838434e-05, 3.439404963501147e-06, 0.0006501249736174941, 0.0957702025771141, 2.7550726372282952e-05, 0.0019254108192399144, 0.0052498686127364635, 0.0008173897513188422, 0.001191838993690908, 0.031604181975126266, 0.6989022493362427, 0.07844530791044235, 0.08296176046133041], [0.0006847004988230765, 2.7315905754221603e-05, 0.0002011048054555431, 3.82870202884078e-05, 0.00015577209705952555, 7.022283534752205e-05, 0.0001641656126594171, 0.06406591087579727, 1.7839237216321635e-06, 5.3118659707251936e-05, 0.0004949983558617532, 0.0012367035960778594, 8.386502304347232e-05, 0.00031186151318252087, 0.0007020328775979578, 0.876522421836853, 0.0551857054233551], [0.002597780665382743, 0.00016966034309007227, 0.00019751473155338317, 0.0005385612021200359, 0.00012868043268099427, 0.00011791646829806268, 0.0001660403941059485, 0.5229778289794922, 0.00011765754607040435, 1.8530892702983692e-05, 0.0001089180150302127, 3.386388561921194e-05, 0.0023883029352873564, 0.0017526083393022418, 0.00030817664810456336, 0.0020702555775642395, 0.46630778908729553], [0.18669544160366058, 0.0038705889601260424, 0.007368527818471193, 0.0037887864746153355, 0.004014580510556698, 
0.005711921025067568, 0.011122770607471466, 0.39559870958328247, 0.0016193253686651587, 0.0027696313336491585, 0.00231151538901031, 0.002583674853667617, 0.0023099915124475956, 0.001943324925377965, 0.001440319698303938, 0.0094740130007267, 0.3573768734931946]], [[0.003440571017563343, 0.010880602523684502, 0.02744508907198906, 0.019404573366045952, 0.01659337431192398, 0.02782086282968521, 0.041557200253009796, 0.4520622491836548, 0.0028665766585618258, 0.000810016121249646, 0.0037664955016225576, 0.0011141609866172075, 0.0011485782451927662, 0.0012300190282985568, 0.000790411897469312, 0.0007010676781646907, 0.3883681297302246], [0.10289955884218216, 0.029845770448446274, 0.045795366168022156, 0.023173559457063675, 0.0036170308012515306, 0.000955504656303674, 0.008273197337985039, 0.419643759727478, 0.0010368991643190384, 0.0012868238845840096, 0.00041938628419302404, 0.0016917828470468521, 0.005136492662131786, 0.0014233733527362347, 0.0005495689692907035, 0.0024534522090107203, 0.3517986238002777], [0.0680876225233078, 0.2535924017429352, 0.04967757686972618, 0.07282523810863495, 0.015687506645917892, 0.006525369361042976, 0.002380979713052511, 0.28112366795539856, 0.001131804077886045, 0.0005172902019694448, 0.00031889084493741393, 0.0005281688063405454, 0.004586036782711744, 0.0020697657018899918, 0.001068195910193026, 0.0008402175735682249, 0.23903927206993103], [0.018545055761933327, 0.10370425879955292, 0.4498167335987091, 0.08382479101419449, 0.0035887337289750576, 0.00277325208298862, 0.002717266557738185, 0.17746742069721222, 0.0004359005833975971, 8.122670988086611e-05, 0.00018335366621613503, 0.00023879151558503509, 0.00042872902122326195, 0.0011281869374215603, 0.0005219769082032144, 0.0008287490927614272, 0.1537155956029892], [0.015519064851105213, 0.020838193595409393, 0.15030981600284576, 0.6000524163246155, 0.004322231747210026, 0.014264766126871109, 0.006336112041026354, 0.09699022769927979, 0.001233358052559197, 0.00014386122347787023, 
0.00039052986539900303, 0.00040046489448286593, 7.404699135804549e-05, 0.0005283773061819375, 0.001007423154078424, 0.0007875253795646131, 0.08680156618356705], [0.003047418314963579, 0.003457003040239215, 0.039402082562446594, 0.48182621598243713, 0.16994164884090424, 0.009276178665459156, 0.004938339348882437, 0.15056729316711426, 0.00036906340392306447, 0.00012627645628526807, 5.937435707892291e-05, 0.00018193147843703628, 0.0002493946230970323, 0.00022274210641626269, 0.0003394064260646701, 0.0006558246677741408, 0.13533978164196014], [0.008527873083949089, 0.007258834782987833, 0.010181975550949574, 0.17046326398849487, 0.20007948577404022, 0.44867488741874695, 0.007201803382486105, 0.07501641660928726, 0.0034886577632278204, 0.000571828568354249, 0.00018957628344651312, 0.0002765682293102145, 0.00023334509751293808, 0.0007874305010773242, 0.00026527224690653384, 0.0002959469857160002, 0.0664868950843811], [0.007320607081055641, 0.00566095532849431, 0.002247384050861001, 0.004465071018785238, 0.004407054278999567, 0.0045122550800442696, 0.005406326148658991, 0.4965779483318329, 0.004300887696444988, 0.0035423680674284697, 0.0034445892088115215, 0.00217936048284173, 0.003026118967682123, 0.004386344458907843, 0.0018377554370090365, 0.0025727797765284777, 0.4441121518611908], [0.2214091718196869, 0.002929472364485264, 0.0017475795466452837, 0.0013661517295986414, 0.008364797569811344, 0.020577499642968178, 0.07796277105808258, 0.24973013997077942, 0.09809868037700653, 0.08607316762208939, 0.0031449098605662584, 0.005501076579093933, 0.002677298616617918, 0.00028063173522241414, 0.00010143366671400145, 0.0015421733260154724, 0.21849308907985687], [0.037128616124391556, 0.0006448199856095016, 0.00039114427636377513, 0.0004617329977918416, 0.0005827924469485879, 0.0010484462836757302, 0.004569334909319878, 0.03115818277001381, 0.7952329516410828, 0.05366581305861473, 0.03291746973991394, 0.010212584398686886, 0.003347051562741399, 0.0005680648027919233, 
0.00017261238826904446, 7.270722562680021e-05, 0.027825601398944855], [0.022663377225399017, 0.00039646547520533204, 0.0007369153317995369, 0.0003822481376118958, 0.00012961715401615947, 0.0015209624543786049, 0.014932522550225258, 0.38320720195770264, 0.11310999095439911, 0.08783471584320068, 0.011071893386542797, 0.01976374350488186, 0.0027226272504776716, 0.002128613879904151, 0.0005077141686342657, 0.00042511546052992344, 0.3384663164615631], [0.015247219242155552, 0.0008372714510187507, 0.0003675698535516858, 0.0002641345199663192, 0.0001396610023221001, 0.0006273903418332338, 0.0023229257203638554, 0.09065282344818115, 0.2543458044528961, 0.09058079868555069, 0.30392321944236755, 0.07051252573728561, 0.07200298458337784, 0.015055429190397263, 0.00094330043066293, 0.0002604113833513111, 0.08191651850938797], [0.006750714965164661, 0.0006810250342823565, 0.00020807948021683842, 0.00018833656213246286, 1.3156471140973736e-05, 0.00013248895993456244, 0.0005804749089293182, 0.1443941295146942, 0.053339485079050064, 0.05998692661523819, 0.0811389610171318, 0.4085361361503601, 0.09753821790218353, 0.016631102189421654, 0.0025038940366357565, 0.0008385027176700532, 0.12653839588165283], [0.004715628456324339, 0.000935802236199379, 0.0002654826093930751, 0.0002353394083911553, 4.3023403122788295e-05, 9.805598529055715e-05, 0.0003767317975871265, 0.07784398645162582, 0.001655018306337297, 0.03298838436603546, 0.00918527040630579, 0.0998644232749939, 0.6178584098815918, 0.07097785919904709, 0.0028955070301890373, 0.009885328821837902, 0.07017578184604645], [0.012364134192466736, 0.002775064669549465, 0.001260725548490882, 0.001959483837708831, 0.00024812520132400095, 0.00016816471179481596, 0.000226765958359465, 0.3868865966796875, 0.0032958402298390865, 0.0037555666640400887, 0.0035683782771229744, 0.022087788209319115, 0.08879271149635315, 0.11169511079788208, 0.006073730997741222, 0.00809676293283701, 0.3467450439929962], [0.010810469277203083, 0.0017047140281647444, 
0.0004338170401751995, 0.001174248056486249, 0.00012571302067954093, 0.00010604305134620517, 0.0001508207933511585, 0.05373001471161842, 0.0012077274732291698, 0.0049799238331615925, 0.00605483865365386, 0.021346213296055794, 0.14662303030490875, 0.41330698132514954, 0.2739194631576538, 0.01490285899490118, 0.0494232214987278], [0.006803965661674738, 0.005220658145844936, 0.0020801853388547897, 0.004016949329525232, 0.00393712380900979, 0.0040673245675861835, 0.005031764507293701, 0.4991752505302429, 0.004091160371899605, 0.0034153074957430363, 0.0033397572115063667, 0.0020689526572823524, 0.002870053518563509, 0.004139008466154337, 0.0017325193621218204, 0.002451210515573621, 0.4455588459968567]], [[0.014440380036830902, 0.0024352974724024534, 0.0018017878755927086, 0.009026736952364445, 0.0008410246227867901, 0.002888438757508993, 0.02956835925579071, 0.4710801839828491, 0.004819110501557589, 0.002614728407934308, 0.011192835867404938, 0.008896052837371826, 0.01319106575101614, 0.003592859022319317, 0.0018756671342998743, 0.016239065676927567, 0.4054964482784271], [0.04798517003655434, 0.024043407291173935, 0.02687765657901764, 0.4849538207054138, 0.015073864720761776, 0.1215098649263382, 0.06075640767812729, 0.11261707544326782, 0.0011261440813541412, 0.0008208783692680299, 0.002831036690622568, 0.0014744948130100965, 0.0006539718597196043, 0.000732982181943953, 0.00035468724672682583, 0.001168350805528462, 0.09702024608850479], [0.02463426999747753, 0.016254328191280365, 0.016583509743213654, 0.2966495454311371, 0.009841088205575943, 0.025865640491247177, 0.018203619867563248, 0.31003156304359436, 0.0005840617814101279, 0.0007133132894523442, 0.0032667270861566067, 0.0011311076814308763, 0.0004842136986553669, 0.00022001925390213728, 0.0007448350079357624, 0.0017880105879157782, 0.2730042636394501], [0.009727727621793747, 0.00655414629727602, 0.003984907176345587, 0.011692997068166733, 0.048646315932273865, 0.007386611308902502, 0.018537964671850204, 
0.47611796855926514, 0.0015214915620163083, 0.0005093598156236112, 0.0005313513684086502, 0.00011774931772379205, 0.0016934757586568594, 0.00034772080834954977, 0.0011858053039759398, 0.0007519843056797981, 0.4106924831867218], [0.002327314577996731, 0.00047327266656793654, 0.000505640113260597, 0.002556554740294814, 0.0004941041115671396, 0.0031087035313248634, 0.008915291167795658, 0.535338282585144, 0.00023432080342900008, 0.0002102612634189427, 0.00011343327787471935, 0.00018520398589316756, 0.00013442046474665403, 4.476711910683662e-05, 6.43779058009386e-05, 0.0001545885461382568, 0.44513949751853943], [0.004443404730409384, 0.0015456087421625853, 0.0005804640240967274, 0.0008983268053270876, 0.002845048438757658, 0.004139023832976818, 0.010338282212615013, 0.5248148441314697, 0.0011919512180611491, 0.0007699420093558729, 0.00034105320810340345, 0.0004539853835012764, 0.0008091671043075621, 0.00041490676812827587, 7.163146801758558e-05, 0.0003782362036872655, 0.44596412777900696], [0.010116705670952797, 0.0028069966938346624, 0.0007607156876474619, 0.0027354543562978506, 0.0043440586887300014, 0.0017847944982349873, 0.014208587817847729, 0.49573445320129395, 0.002725639846175909, 0.0014008230064064264, 0.0003409167402423918, 0.004040850792080164, 0.006733515299856663, 0.004132870119065046, 0.0005884367856197059, 0.002992566442117095, 0.44455257058143616], [0.005210913252085447, 0.002713436260819435, 0.0024838570971041918, 0.003947200253605843, 0.004117727745324373, 0.0026788036338984966, 0.005776779726147652, 0.49914804100990295, 0.0023259783629328012, 0.002620273968204856, 0.002508284756913781, 0.0028121185023337603, 0.004384955856949091, 0.0022111220750957727, 0.0022741868160665035, 0.004002066794782877, 0.45078423619270325], [0.024539778009057045, 0.0006011960213072598, 0.0005050112958997488, 0.0028635242488235235, 0.0003974633291363716, 0.0005619957810267806, 0.02069980651140213, 0.2555364668369293, 0.01254504919052124, 0.01745782233774662, 
0.06002669408917427, 0.1348293572664261, 0.08860830962657928, 0.0180517565459013, 0.008795454166829586, 0.12487274408340454, 0.22910761833190918], [0.006941982079297304, 0.00038756447611376643, 0.00015058134158607572, 0.0009448044584132731, 0.00017708410450723022, 0.0004227448080200702, 0.004905796609818935, 0.47828903794288635, 0.004162816796451807, 0.007043411955237389, 0.0051765404641628265, 0.013008101843297482, 0.018811719492077827, 0.011008846573531628, 0.005730487406253815, 0.026834027841687202, 0.416004478931427], [0.015111254528164864, 0.0005826752167195082, 0.00024791978648863733, 0.000573134224396199, 5.744310328736901e-05, 0.00013534702884498984, 0.006220465060323477, 0.4845297932624817, 0.009056572802364826, 0.009682436473667622, 0.004036608152091503, 0.014424985274672508, 0.005704889073967934, 0.010819056071341038, 0.013044719584286213, 0.006429624743759632, 0.4193432033061981], [0.018200015649199486, 0.00038784457137808204, 0.0002531968639232218, 0.00016438277089037, 2.6519939638092183e-05, 0.00025292151258327067, 0.002164713339880109, 0.5075667500495911, 0.003474367782473564, 0.0005658806767314672, 0.0002546230098232627, 0.0005690058460459113, 0.010042199864983559, 0.005828756373375654, 0.0013137548230588436, 0.014839527197182178, 0.4340955913066864], [0.015351295471191406, 0.0007287039770744741, 0.00016824135673232377, 0.0009152168058790267, 0.0002544331655371934, 0.0002455971552990377, 0.0010894873412325978, 0.4655109643936157, 0.0015341357793658972, 0.0007732787053100765, 0.0009815931553021073, 0.0008880509994924068, 0.020219115540385246, 0.014216090552508831, 0.004407091531902552, 0.07239099591970444, 0.4003257751464844], [0.013404683209955692, 0.0004175263165961951, 0.00029183723381720483, 0.0014615829568356276, 0.0004576477222144604, 0.0003948585654143244, 0.0038722888566553593, 0.4164462983608246, 0.0028963391669094563, 0.002000988693907857, 0.0038559637032449245, 0.004672961309552193, 0.02300344780087471, 0.018134830519557, 
0.010075806640088558, 0.11463042348623276, 0.38398250937461853], [0.0032383911311626434, 0.0001139999003498815, 8.294112194562331e-05, 0.001077339518815279, 0.00019792802049778402, 6.725641287630424e-05, 0.00046361604472622275, 0.5209450125694275, 0.0006567223463207483, 0.0005903710261918604, 0.00016616906214039773, 0.00034436825080774724, 0.001607914688065648, 0.0034573241136968136, 0.001863019773736596, 0.008785981684923172, 0.45634159445762634], [0.01300361193716526, 0.00018407718744128942, 5.160276123206131e-05, 0.0003921446914318949, 0.00043437289423309267, 9.893515380099416e-05, 0.002178665716201067, 0.5124618411064148, 0.0035824012011289597, 0.00043897164869122207, 0.00012950101518072188, 0.0003230892471037805, 0.0071645393036305904, 0.002322521060705185, 0.001444441033527255, 0.007167648524045944, 0.448621541261673], [0.005100694485008717, 0.002380759222432971, 0.002169976942241192, 0.003463252680376172, 0.003490941133350134, 0.002369044115766883, 0.005142476875334978, 0.5019798278808594, 0.0021299703512340784, 0.0023996764793992043, 0.002260474720969796, 0.002536521991714835, 0.003969251178205013, 0.002007785951718688, 0.0020910478197038174, 0.003558834781870246, 0.4529494345188141]]], [[[0.15162232518196106, 0.10971194505691528, 0.03927284851670265, 0.12180411070585251, 0.007406389806419611, 0.04297800362110138, 0.05651146546006203, 0.24628019332885742, 0.0047410340048372746, 0.0035994015634059906, 2.9562075724243186e-05, 0.0009802805725485086, 0.0004473467415664345, 0.001506675616838038, 3.284397826064378e-05, 0.0013963790843263268, 0.21167916059494019], [0.0008379274513572454, 0.14040544629096985, 0.04065341502428055, 0.004631598945707083, 0.000680327124428004, 0.0030875559896230698, 0.0010817584116011858, 0.4129577875137329, 0.001525851315818727, 0.0004354039265308529, 0.0001817269658204168, 6.307653529802337e-05, 0.00010641593689797446, 0.001676537562161684, 7.237045065267012e-05, 0.0006852989317849278, 0.39091756939888], [0.000435569672845304, 
0.015458516776561737, 0.021969569846987724, 0.0013672400964424014, 0.00019268198229838163, 0.0014097458915784955, 0.0007354922709055245, 0.49640288949012756, 0.0001775386044755578, 0.00037457322468981147, 0.00025415749405510724, 0.00011290940165054053, 3.335673682158813e-05, 0.00019666469597723335, 0.0001063113086274825, 0.00026891144807450473, 0.4605039060115814], [0.0007417799206450582, 0.01002402976155281, 0.011279181577265263, 0.02220406010746956, 0.003365602344274521, 0.0031453724950551987, 0.0019389520166441798, 0.49212297797203064, 0.000198520821868442, 0.0009109890088438988, 0.00040686383727006614, 0.0003546166990417987, 0.0006436220137402415, 0.0003706440329551697, 7.661977724637836e-05, 0.0006709762383252382, 0.4515452980995178], [0.002102070953696966, 0.005805108230561018, 0.0033333466853946447, 0.0037518555764108896, 0.0075051672756671906, 0.008867616765201092, 0.0026106261648237705, 0.4984143078327179, 0.0011590142967179418, 0.0006260921363718808, 0.0002037344384007156, 0.0004197617236059159, 0.002084961626678705, 0.001419029082171619, 2.968913031509146e-05, 0.0005458381492644548, 0.46112164855003357], [0.004511271137744188, 0.04235135763883591, 0.012413250282406807, 0.0035222498700022697, 0.008466082625091076, 0.056719664484262466, 0.022529175505042076, 0.4257921278476715, 0.005296122748404741, 0.0023711728863418102, 0.0005830498412251472, 0.0006439212011173368, 0.0015566437505185604, 0.00810290314257145, 0.0004547798016574234, 0.002120276214554906, 0.40256598591804504], [0.0063224961049854755, 0.011514393612742424, 0.0058580306358635426, 0.008719604462385178, 0.0110350102186203, 0.034775421023368835, 0.014921019785106182, 0.48174166679382324, 0.00034953668364323676, 0.0005827781278640032, 8.551823702873662e-05, 0.0003156126767862588, 0.00047284780885092914, 0.00024783299886621535, 3.6832920159213245e-05, 0.0006621272768825293, 0.42235925793647766], [0.0022581887897104025, 0.001147789298556745, 0.0012683223467320204, 0.0018038679845631123, 
0.0002506634045857936, 0.000794385327026248, 0.0015311819734051824, 0.5178257822990417, 0.0003137900785077363, 0.0007425009971484542, 0.00019786010670941323, 0.0004004598595201969, 0.0002937010722234845, 0.00017818355991039425, 9.80863842414692e-05, 0.0007366269128397107, 0.4701586663722992], [0.013858581893146038, 0.00709044374525547, 0.004777722526341677, 0.0023933923803269863, 0.0017736002337187529, 0.006418530363589525, 0.003300949465483427, 0.48851776123046875, 0.00790928304195404, 0.004056336358189583, 0.00014034259947948158, 0.0007017344469204545, 0.0005179547588340938, 0.006900356151163578, 0.0001466434623580426, 0.0015325637068599463, 0.4499638080596924], [0.007691337261348963, 0.0013388273073360324, 0.003775484161451459, 0.002919538877904415, 0.0006412917282432318, 0.0027484893798828125, 0.0020052860490977764, 0.5124815702438354, 0.0015227263793349266, 0.0033990847878158092, 0.00012225699902046472, 0.000259931170148775, 7.550359441665933e-05, 0.00023588957265019417, 0.000149906292790547, 0.004306497052311897, 0.4563263952732086], [0.002330730203539133, 0.0003014556714333594, 0.0004531018785201013, 0.0012653361773118377, 0.0005290283588692546, 0.0009622600628063083, 0.0008559423731639981, 0.5113089084625244, 0.00015497059212066233, 0.0006678461795672774, 0.00030202747439034283, 0.0005115241510793567, 0.0001591813488630578, 1.72836116689723e-05, 3.5303582990309224e-05, 0.0010801417520269752, 0.4790648818016052], [0.005819212645292282, 0.00021483530872501433, 0.0006852124934084713, 0.0018345721764490008, 0.00024525431217625737, 0.000588182476349175, 0.0014814576134085655, 0.5106444954872131, 0.0009050146327354014, 0.004380492493510246, 0.0009218018967658281, 0.0018163217464461923, 0.00014580517017748207, 2.1437563191284426e-05, 7.335636473726481e-05, 0.0016012336127460003, 0.4686214327812195], [0.008563409559428692, 0.0014967328170314431, 0.0018438724800944328, 0.0038195978850126266, 0.005937446374446154, 0.008097606711089611, 0.002516165841370821, 
0.4588181972503662, 0.05804136022925377, 0.020236942917108536, 0.0007753559038974345, 0.0022065294906497, 0.0021761099342256784, 0.000581152446102351, 8.617012645117939e-05, 0.0008391879964619875, 0.42396411299705505], [0.0035708188079297543, 0.0018005740130320191, 0.002772502601146698, 0.00040547605021856725, 0.0012572268024086952, 0.00506405858322978, 0.002426678780466318, 0.43854063749313354, 0.09574415534734726, 0.028675615787506104, 0.0011997005203738809, 0.0018275566399097443, 0.0010486081009730697, 0.005789272021502256, 0.00032231383374892175, 0.0015148274833336473, 0.40803998708724976], [0.0008133704541251063, 0.0004984234692528844, 0.0021919405553489923, 0.005278429947793484, 0.0003053145483136177, 0.0009671378065831959, 0.0005959445261396468, 0.515633225440979, 0.0010578557848930359, 0.002933816285803914, 0.0012055500410497189, 0.0010139814112335443, 0.00035427865805104375, 0.0003622338699642569, 0.0022079104091972113, 0.0010145187843590975, 0.4635660946369171], [0.00954220350831747, 0.002636347198858857, 0.010769193060696125, 0.009865867905318737, 0.0008806141559034586, 0.0023686010390520096, 0.004097809083759785, 0.45221176743507385, 0.0071501350030303, 0.04089708253741264, 0.002739539137110114, 0.008865737356245518, 0.001317090936936438, 0.0012959641171619296, 0.0017854789039120078, 0.029335923492908478, 0.4142405688762665], [0.0022447144147008657, 0.0010734116658568382, 0.0011683915508911014, 0.001717484905384481, 0.00023086908913683146, 0.0007393963751383126, 0.00145628210157156, 0.5185094475746155, 0.00030199639149941504, 0.0007291302317753434, 0.00019331704243086278, 0.0003905851044692099, 0.00028759113047271967, 0.00016863590280991048, 9.724332630867139e-05, 0.0007416990702040493, 0.46994978189468384]], [[0.03129994124174118, 0.05397389084100723, 0.03612878918647766, 0.06880293041467667, 0.009010836482048035, 0.042996156960725784, 0.18930235505104065, 0.08727110177278519, 0.021433716639876366, 0.16774217784404755, 0.007310076616704464, 
0.01882689632475376, 0.06909318268299103, 0.011242610402405262, 0.004475067835301161, 0.09938367456197739, 0.08170662820339203], [0.09871657937765121, 0.09463932365179062, 0.03297179937362671, 0.0182070042937994, 0.021062668412923813, 0.11692299693822861, 0.20972827076911926, 0.19322611391544342, 0.004545035772025585, 0.004832039587199688, 0.0009570408728905022, 0.0018668370321393013, 0.010231228545308113, 0.004413231275975704, 0.0008356698672287166, 0.007240993436425924, 0.17960324883460999], [0.03629371151328087, 0.02435954660177231, 0.01011139526963234, 0.011824710294604301, 0.017329847440123558, 0.05727069452404976, 0.04356948658823967, 0.4040355980396271, 0.0030012577772140503, 0.0023756285663694143, 0.0008529227925464511, 0.0005335372989065945, 0.004823393654078245, 0.001697343192063272, 0.0003192056610714644, 0.004285227041691542, 0.37731653451919556], [0.032456330955028534, 0.19714994728565216, 0.15353699028491974, 0.023963244631886482, 0.024091064929962158, 0.05091731250286102, 0.044780388474464417, 0.22628700733184814, 0.008570521138608456, 0.0028142097871750593, 0.0018266913248226047, 0.000771894701756537, 0.003745390335097909, 0.01064717024564743, 0.0028436852153390646, 0.0037821924779564142, 0.2118159681558609], [0.021946735680103302, 0.11858170479536057, 0.08319760113954544, 0.04987845569849014, 0.005240418016910553, 0.01651344634592533, 0.010430018417537212, 0.3407377004623413, 0.0032757564913481474, 0.006637522019445896, 0.0010336334817111492, 0.006381471175700426, 0.009055445902049541, 0.0016053136205300689, 0.0001432521385140717, 0.009627648629248142, 0.3157138228416443], [0.05243082344532013, 0.2586538791656494, 0.14304545521736145, 0.07273533940315247, 0.0032944355625659227, 0.01708938181400299, 0.03147002309560776, 0.20326100289821625, 0.004605574067682028, 0.00763388816267252, 0.00042283316724933684, 0.002630773466080427, 0.006925344932824373, 0.0013102114899083972, 0.000327576941344887, 0.007790989242494106, 0.18637242913246155], 
[0.0497361496090889, 0.27932223677635193, 0.11239410191774368, 0.06700069457292557, 0.02463410422205925, 0.048935454338788986, 0.04947773367166519, 0.17399564385414124, 0.007480769883841276, 0.006080171559005976, 0.0004352664982434362, 0.001014033448882401, 0.008351865224540234, 0.0008619399741292, 0.0003128921380266547, 0.010354693047702312, 0.1596122831106186], [0.005096247885376215, 0.004514685366302729, 0.0037696922663599253, 0.003816161770373583, 0.0015304754488170147, 0.0049208952113986015, 0.0015945304185152054, 0.5016276836395264, 0.0028718002140522003, 0.001891767606139183, 0.000566433125641197, 0.0014097377425059676, 0.003448096802458167, 0.0022105479147285223, 0.0003155835438519716, 0.0019418180454522371, 0.45847389101982117], [0.14860279858112335, 0.00627906946465373, 0.003943925723433495, 0.004860382527112961, 0.004753129556775093, 0.016632191836833954, 0.01723095215857029, 0.23678094148635864, 0.013921421952545643, 0.08884069323539734, 0.0037801654543727636, 0.02108956314623356, 0.1472177803516388, 0.008337331935763359, 0.0005608194624073803, 0.05095003917813301, 0.22621877491474152], [0.29361915588378906, 0.0029370232950896025, 0.0022143095266073942, 0.0014191137161105871, 0.0013048818800598383, 0.003201795509085059, 0.007958369329571724, 0.2999245524406433, 0.009542430751025677, 0.016123412176966667, 0.001946056610904634, 0.005301306024193764, 0.02014773152768612, 0.001099143992178142, 0.0003889049985446036, 0.04376094415783882, 0.28911083936691284], [0.019498111680150032, 0.004896295722573996, 0.005043943412601948, 0.002846852410584688, 0.003291438100859523, 0.0029365152586251497, 0.0026671786326915026, 0.48742565512657166, 0.011153140105307102, 0.004587016999721527, 0.0016815053531900048, 0.00145218544639647, 0.0022949164267629385, 0.0012774126371368766, 0.00011621848534559831, 0.004543577320873737, 0.444288045167923], [0.08067811280488968, 0.0038186467718333006, 0.0038021670188754797, 0.001214314135722816, 0.0018836510134860873, 
0.002768467180430889, 0.006292213220149279, 0.3533766567707062, 0.07308633625507355, 0.04606712982058525, 0.003590243635699153, 0.004387885332107544, 0.03129352629184723, 0.005709726829081774, 0.0010956136975437403, 0.047655295580625534, 0.3332800567150116], [0.07768674939870834, 0.0012278907233849168, 0.0010796755086630583, 0.0004826653457712382, 0.0021101697348058224, 0.006026304326951504, 0.011747465468943119, 0.15186335146427155, 0.017967049032449722, 0.10085295140743256, 0.0016179749509319663, 0.018610544502735138, 0.009471165016293526, 0.0022461065091192722, 0.001770619535818696, 0.4545133709907532, 0.1407259702682495], [0.08089234679937363, 0.0022588029969483614, 0.002382456324994564, 0.0014732616255059838, 0.001473354990594089, 0.0026552025228738785, 0.0014451079769060016, 0.3696480393409729, 0.01832922361791134, 0.05042244866490364, 0.0019298945553600788, 0.00955882016569376, 0.0549248605966568, 0.004644791595637798, 0.0005785772809758782, 0.04746290668845177, 0.3499198257923126], [0.01620936021208763, 0.0012749010929837823, 0.0005969212506897748, 0.0003393683291506022, 0.000526351504959166, 0.0024368988815695047, 0.0004092109447810799, 0.49986138939857483, 0.008397913537919521, 0.003379252040758729, 0.0004526883421931416, 0.0009417283581569791, 0.005914950743317604, 0.003761844476684928, 6.29035203019157e-05, 0.006380514241755009, 0.4490537643432617], [0.17847469449043274, 0.027851495891809464, 0.008031203411519527, 0.003897752845659852, 0.0019736175891011953, 0.009174268692731857, 0.026784775778651237, 0.2657906115055084, 0.04462268203496933, 0.023058690130710602, 0.004798270296305418, 0.00992236565798521, 0.11499764025211334, 0.005517386831343174, 0.004190279170870781, 0.025762612000107765, 0.24515166878700256], [0.005230667535215616, 0.004257251974195242, 0.0035367209929972887, 0.003674691077321768, 0.0014655701816082, 0.004584519658237696, 0.0015474701067432761, 0.5020777583122253, 0.0029590483754873276, 0.0019484206568449736, 0.0005718856700696051, 
0.005963052622973919, 0.010287081822752953, 0.40726107358932495], [0.09136522561311722, 0.006062550004571676, 0.001603035139851272, 0.001733393408358097, 0.0003329587052576244, 0.003800787264481187, 0.0014865024713799357, 0.38748699426651, 0.01800359971821308, 0.015988672152161598, 0.0006138475146144629, 0.0213943962007761, 0.010658537037670612, 0.03979094699025154, 0.0054314760491251945, 0.025079913437366486, 0.3691672086715698], [0.007324482314288616, 0.005489994306117296, 0.001962576759979129, 0.001324493088759482, 0.0009386801975779235, 0.005905526224523783, 0.0008926731534302235, 0.4418574571609497, 0.00871247984468937, 0.01325586810708046, 0.0006926454952917993, 0.016734357923269272, 0.010610931552946568, 0.02821049839258194, 0.006918969098478556, 0.02438165806233883, 0.4247867465019226], [0.003925632685422897, 0.004230535123497248, 0.0010590058518573642, 0.0006869681528769433, 0.0003802390128839761, 0.000535926956217736, 0.0010150398593395948, 0.46231839060783386, 0.005874479189515114, 0.016517847776412964, 0.002310746582224965, 0.008089551702141762, 0.002861416433006525, 0.011940768919885159, 0.009251350536942482, 0.024087822064757347, 0.4449143409729004], [0.013991579413414001, 0.0026920612435787916, 0.0004286356270313263, 0.0012821572599932551, 0.0005682572955265641, 0.0017876577330753207, 0.0012407383183017373, 0.42504575848579407, 0.02766994945704937, 0.030809173360466957, 0.002387475920841098, 0.01229769829660654, 0.006653984542936087, 0.02307051420211792, 0.001481684623286128, 0.038690682500600815, 0.4099019765853882], [0.01480164472013712, 0.007414062973111868, 0.0005500276456587017, 0.004616001155227423, 0.0013686147285625339, 0.00413471320644021, 0.0012240175856277347, 0.4411126673221588, 0.02192091755568981, 0.015876512974500656, 0.001365436241030693, 0.005048608873039484, 0.01549906563013792, 0.017442476004362106, 0.00177295773755759, 0.031113913282752037, 0.4147384762763977], [0.05755488574504852, 0.005670232232660055, 0.004319096449762583, 
0.003675673855468631, 0.0011407688725739717, 0.008057341910898685, 0.0011659596348181367, 0.3978501856327057, 0.03677073121070862, 0.014605476520955563, 0.0012661413056775928, 0.021521707996726036, 0.009133722633123398, 0.027279064059257507, 0.004793264903128147, 0.025246990844607353, 0.37994876503944397], [0.00315808760933578, 0.009061409160494804, 0.0012865527532994747, 0.00216081365942955, 0.0009869079804047942, 0.002581524895504117, 0.0005197927239350975, 0.45337435603141785, 0.011839378625154495, 0.00985421147197485, 0.0017092888010665774, 0.003585747443139553, 0.0038926894776523113, 0.015279467217624187, 0.009867019020020962, 0.031809236854314804, 0.4390336275100708], [0.008756712079048157, 0.010189360938966274, 0.0019374573603272438, 0.0012815456138923764, 0.0009366818121634424, 0.006824163720011711, 0.003176322439685464, 0.42618700861930847, 0.019318632781505585, 0.028950825333595276, 0.0020517874509096146, 0.013749299570918083, 0.00409420533105731, 0.03339924290776253, 0.0033742424566298723, 0.02643943950533867, 0.40933308005332947], [0.0122181735932827, 0.012281388975679874, 0.01749054342508316, 0.01346633117645979, 0.009635481052100658, 0.008993065916001797, 0.00810133945196867, 0.4418259263038635, 0.00723161268979311, 0.005751136690378189, 0.010379642248153687, 0.008538834750652313, 0.008417852222919464, 0.008858553133904934, 0.005616511218249798, 0.009633349254727364, 0.41156017780303955]], [[0.009939809329807758, 0.007224703673273325, 0.007533363066613674, 0.007282021455466747, 0.0034862966276705265, 0.00901864841580391, 0.031616855412721634, 0.45856714248657227, 0.007089737802743912, 0.005893452558666468, 0.005734927020967007, 0.002829513978213072, 0.0008206665515899658, 0.0009188369731418788, 0.0010493493173271418, 0.014137896709144115, 0.42685678601264954], [0.05443664640188217, 0.029737835749983788, 0.022491198033094406, 0.01302969641983509, 0.0009477597195655107, 0.0014010426821187139, 0.022573504596948624, 0.44702789187431335, 
0.0001566989376442507, 0.00010116443445440382, 0.0001250420173164457, 0.0005660986062139273, 0.0007146843709051609, 0.000161769799888134, 8.476022048853338e-05, 0.0007350319065153599, 0.4057091772556305], [0.02425786480307579, 0.2085607349872589, 0.04924154654145241, 0.04065050557255745, 0.005116268526762724, 0.003940454684197903, 0.02292780950665474, 0.33670246601104736, 0.0003809206828009337, 0.00017320620827376842, 0.00012582748604472727, 0.0005399397923611104, 0.0007989048608578742, 0.0007496858015656471, 0.00017536936502438039, 0.001251032343134284, 0.3044074773788452], [0.017011234536767006, 0.08842890709638596, 0.03877810016274452, 0.04709337279200554, 0.005090255755931139, 0.007245996501296759, 0.04618887975811958, 0.38638895750045776, 0.00029823428485542536, 0.0008584815659560263, 0.0003715864149853587, 0.0006675109616480768, 0.00035607171594165266, 0.0012847973266616464, 0.00014487490989267826, 0.004457451403141022, 0.3553353250026703], [0.06363087892532349, 0.053292203694581985, 0.05096591264009476, 0.36944296956062317, 0.03395163267850876, 0.04834141954779625, 0.02734738402068615, 0.18110227584838867, 0.0002906589361373335, 0.00019604693807195872, 0.0012373443460091949, 0.00010409109381726012, 0.00012401021376717836, 0.0002704971411731094, 0.0005050405743531883, 0.003530337940901518, 0.1656673401594162], [0.014341513626277447, 0.003986303694546223, 0.030320309102535248, 0.3685612678527832, 0.40015092492103577, 0.06286061555147171, 0.05971454083919525, 0.02730235457420349, 0.0008103272411972284, 0.0007862814818508923, 0.0008068971219472587, 0.00015983544290065765, 0.0006843036389909685, 0.0005336704198271036, 0.0002578137500677258, 0.0043155960738658905, 0.024407442659139633], [0.005483838729560375, 0.0033265429083257914, 0.024957021698355675, 0.06485161185264587, 0.44609877467155457, 0.21358022093772888, 0.11668811738491058, 0.057974692434072495, 0.003344930475577712, 0.0035220894496887922, 0.0005914203356951475, 0.00038499984657391906, 
0.00045922843855805695, 0.0006797462701797485, 0.0007015742594376206, 0.0034659935627132654, 0.05388921499252319], [0.009029662236571312, 0.008867698721587658, 0.002036694437265396, 0.005566149950027466, 0.003628699341788888, 0.005201783962547779, 0.008890935219824314, 0.48325616121292114, 0.0030989625956863165, 0.004227292723953724, 0.0037787994369864464, 0.0013994371984153986, 0.0019835950806736946, 0.0029882160015404224, 0.0013262588763609529, 0.00817120261490345, 0.4465484023094177], [0.8085571527481079, 0.0007493507582694292, 0.0019036189187318087, 0.0015540625900030136, 0.0038444935344159603, 0.007121680304408073, 0.0633232444524765, 0.03853936493396759, 0.021813571453094482, 0.014556693844497204, 0.0019070605048909783, 0.0004331294330768287, 0.0001968226715689525, 0.00015141199401114136, 6.762483099009842e-05, 0.0011697685113176703, 0.0341109074652195], [0.3206234872341156, 0.0010715130483731627, 0.0004725066537503153, 0.000787086202763021, 0.002252891194075346, 0.014332323335111141, 0.03703133761882782, 0.07742665708065033, 0.39005160331726074, 0.05841495096683502, 0.02228580228984356, 0.0027372236363589764, 0.0007897275499999523, 0.0009922193130478263, 0.00016248947940766811, 0.002158122370019555, 0.06841004639863968], [0.15167462825775146, 0.003144986229017377, 0.0004673275980167091, 0.0015654508024454117, 0.0017308281967416406, 0.004515755455940962, 0.062292736023664474, 0.1616169810295105, 0.1045684963464737, 0.3141041696071625, 0.02399604395031929, 0.014539767988026142, 0.0037524055223912, 0.0017867519054561853, 0.0002666361106093973, 0.0057896836660802364, 0.14418728649616241], [0.18430300056934357, 0.0006995412986725569, 0.00035440968349575996, 0.0007710016216151416, 0.0003348179452586919, 0.002954046707600355, 0.03137532249093056, 0.046687643975019455, 0.13489992916584015, 0.4926917850971222, 0.026139475405216217, 0.019150640815496445, 0.0064958231523633, 0.004663608502596617, 0.00016776786651462317, 0.006610635668039322, 0.041700541973114014], 
[0.10179366916418076, 0.0007348553626798093, 0.00019881267508026212, 0.0004953754832968116, 2.999552270921413e-05, 0.0004993690527044237, 0.005266892723739147, 0.2633466124534607, 0.015496296808123589, 0.08973632007837296, 0.15794748067855835, 0.08470945060253143, 0.032944921404123306, 0.0021199495531618595, 0.0006800106493756175, 0.009472664445638657, 0.23452730476856232], [0.07440144568681717, 0.00022271893976721913, 0.00046802894212305546, 0.0010866225929930806, 0.00010922667570412159, 0.00038501838571392, 0.002403436228632927, 0.05347945913672447, 0.0028376237023621798, 0.04160892218351364, 0.07838019728660583, 0.03517412766814232, 0.5606391429901123, 0.07125737518072128, 0.005065568257123232, 0.024564482271671295, 0.04791658744215965], [0.048744507133960724, 0.0014646576019003987, 0.000533664075192064, 0.0008612934616394341, 9.188978583551943e-05, 0.0005595221882686019, 0.0017017755890265107, 0.07265777885913849, 0.0032121159601956606, 0.036121610552072525, 0.012174506671726704, 0.05168210715055466, 0.3044191002845764, 0.2770574390888214, 0.0061468705534935, 0.1158781349658966, 0.06669303774833679], [0.017357023432850838, 0.0005461283726617694, 0.0009654095047153533, 0.000442716118413955, 0.000257193052675575, 0.0005188124487176538, 0.0058730789460241795, 0.008355407044291496, 0.000914216972887516, 0.06323366612195969, 0.0023887401912361383, 0.009233508259057999, 0.29986700415611267, 0.4443662762641907, 0.0030868363101035357, 0.13496631383895874, 0.00762767530977726], [0.008935264311730862, 0.008518635295331478, 0.001973578939214349, 0.0052887131460011005, 0.0034949309192597866, 0.005065209232270718, 0.008939981460571289, 0.4839116334915161, 0.002998156240209937, 0.004387673921883106, 0.0037572146393358707, 0.0014034658670425415, 0.002018640749156475, 0.003026488935574889, 0.0013112741289660335, 0.00826748926192522, 0.44670164585113525]], [[0.012184658087790012, 0.008505215868353844, 0.014193563722074032, 0.008205908350646496, 0.0014411743031814694, 
0.009403233416378498, 0.003979966510087252, 0.4882141649723053, 0.004072886426001787, 0.0006297666113823652, 0.000201590868528001, 0.0009201199864037335, 0.0007396361907012761, 0.0008097634417936206, 9.02313768165186e-05, 0.0005449629970826209, 0.44586315751075745], [0.0013898422475904226, 0.008414855226874352, 0.019608808681368828, 0.0016067869728431106, 0.00014330419071484357, 0.0018221481004729867, 0.0019456377485767007, 0.5011254549026489, 0.0003726936411112547, 8.073732169577852e-05, 3.754526551347226e-05, 0.00020925446006003767, 0.00027266511460766196, 0.00048708345275372267, 6.716827192576602e-05, 6.075066630728543e-05, 0.46235519647598267], [0.000252473633736372, 0.011854629032313824, 0.02528519369661808, 0.0012852048967033625, 0.00020065312855876982, 0.0005648799706250429, 0.0005235031130723655, 0.506248950958252, 0.0001067768462235108, 3.2294177799485624e-05, 1.528438224340789e-05, 2.609648981888313e-05, 6.311253673629835e-05, 0.0002040141262114048, 8.059528227022383e-06, 1.6905221855267882e-05, 0.4533120095729828], [0.002926149405539036, 0.28323885798454285, 0.24611446261405945, 0.0254647396504879, 0.004542201291769743, 0.005212891846895218, 0.012094838544726372, 0.2143026888370514, 0.0008276054286397994, 0.0006083512562327087, 0.00013954140013083816, 0.00042719399789348245, 0.0014161287108436227, 0.0012400433188304305, 0.00013553403550758958, 0.0006795660592615604, 0.20062923431396484], [0.01579451374709606, 0.0871802270412445, 0.19255691766738892, 0.3597927391529083, 0.018881194293498993, 0.02390315756201744, 0.07935640215873718, 0.11356460303068161, 0.0011755885789170861, 0.0006382029387168586, 6.039217987563461e-05, 0.0007043501827865839, 0.0005545559106394649, 0.0008882189868018031, 0.00010235571971861646, 0.0007649580365978181, 0.1040816381573677], [0.02811458893120289, 0.12556779384613037, 0.12480274587869644, 0.07554753124713898, 0.028632069006562233, 0.02600189670920372, 0.13272206485271454, 0.23228637874126434, 0.00257579842582345, 
0.0016351820668205619, 0.0005269265966489911, 0.0019535976462066174, 0.0030801454558968544, 0.002008943585678935, 0.0003782814310397953, 0.002560045337304473, 0.21160608530044556], [0.016431162133812904, 0.00885338056832552, 0.008457045070827007, 0.027995018288493156, 0.05877500772476196, 0.03618244454264641, 0.017765508964657784, 0.42946168780326843, 0.00294468249194324, 0.0005550188943743706, 0.00017534277867525816, 0.0005801208899356425, 0.0013792435638606548, 0.0016008998500183225, 0.00018133661069441587, 0.00020087146549485624, 0.38846132159233093], [0.004859008826315403, 0.006090878508985043, 0.0040610311552882195, 0.005147299263626337, 0.0019026636146008968, 0.01046158280223608, 0.006691939663141966, 0.4869372248649597, 0.002904450288042426, 0.0014126470778137445, 0.0017575236270204186, 0.0012702866224572062, 0.0034210714511573315, 0.0034555403981357813, 0.0017269020900130272, 0.0019563522655516863, 0.4559434950351715], [0.004845303483307362, 0.0017335828160867095, 0.006482381839305162, 0.0052496930584311485, 0.006233659107238054, 0.006766089238226414, 0.005063262302428484, 0.47900670766830444, 0.019519057124853134, 0.00405309209600091, 0.0002688068198040128, 0.0029997490346431732, 0.0015211037825793028, 0.0017624662723392248, 0.00018829450709745288, 0.0009046376217156649, 0.45340219140052795], [0.0007962834788486362, 0.0005222181789577007, 0.002212675055488944, 0.0013036631280556321, 0.003965959884226322, 0.001140089938417077, 0.0006054844707250595, 0.49249377846717834, 0.022363988682627678, 0.0016585325356572866, 0.0011970446212217212, 0.0007815089193172753, 0.0018286594422534108, 0.0006528430967591703, 6.864647730253637e-05, 9.251816663891077e-05, 0.46831613779067993], [0.004625837318599224, 0.0024228421971201897, 0.003148886142298579, 0.0005834151525050402, 0.021894307807087898, 0.005230520386248827, 0.0022548476699739695, 0.45780667662620544, 0.05236639827489853, 0.00411509582772851, 0.002727513900026679, 0.0013582556275650859, 0.004267165903002024, 
0.001986091025173664, 0.0005386286647990346, 0.001028838800266385, 0.4336446225643158], [0.0033834499772638083, 0.0008692671544849873, 0.0026557990349829197, 0.00033716074540279806, 0.0007147450814954937, 0.000865574402268976, 0.0025856709107756615, 0.45415636897087097, 0.054058998823165894, 0.021769464015960693, 0.008878658525645733, 0.006614568643271923, 0.008991510607302189, 0.0036782962270081043, 0.00027585314819589257, 0.0009506583446636796, 0.42921391129493713], [0.027775224298238754, 0.0006304891430772841, 0.0007492569275200367, 0.0011337787145748734, 0.0007583643309772015, 0.0041227685287594795, 0.05064774677157402, 0.3734520375728607, 0.07708753645420074, 0.0488978810608387, 0.017656449228525162, 0.01979784481227398, 0.00943046249449253, 0.006686373148113489, 0.0005436926730908453, 0.0020358620677143335, 0.35859429836273193], [0.00742443697527051, 0.001189874135889113, 0.004027462098747492, 0.0025093574076890945, 0.0013430585386231542, 0.0017698142910376191, 0.012047508731484413, 0.44061076641082764, 0.013889811933040619, 0.021956544369459152, 0.002472400199621916, 0.012589454650878906, 0.036182206124067307, 0.0167858824133873, 0.0014867889694869518, 0.0029481761157512665, 0.4207664728164673], [0.0008840393857099116, 0.0007128884899429977, 0.0007232290226966143, 0.00018893781816586852, 0.00043874632683582604, 0.00045278071775101125, 0.00022807817731518298, 0.4652222692966461, 0.004917670972645283, 0.0021742689423263073, 0.001956837484613061, 0.0011202013120055199, 0.06077861413359642, 0.01824222318828106, 0.00046264741104096174, 0.002999867545440793, 0.4384966790676117], [0.002743187127634883, 0.0012244120007380843, 0.0012345373397693038, 0.0002988884225487709, 0.0016728475457057357, 0.0008148871129378676, 0.0010718100238591433, 0.3036087155342102, 0.004517070017755032, 0.011793745681643486, 0.0014015306951478124, 0.012156683020293713, 0.2692471444606781, 0.09917345643043518, 0.002273885067552328, 0.004460975993424654, 0.2823062837123871], 
[0.004642104264348745, 0.005710650701075792, 0.0038544596172869205, 0.004685578402131796, 0.001754248165525496, 0.009675242938101292, 0.006343010812997818, 0.48870575428009033, 0.002827167045325041, 0.0013951309956610203, 0.001721624401398003, 0.0012315827189013362, 0.0033350202720612288, 0.003388363169506192, 0.001680687884800136, 0.0018939882284030318, 0.45715540647506714]], [[0.005110002122819424, 0.005243502091616392, 0.044925522059202194, 0.18013958632946014, 0.02472485415637493, 0.1627327799797058, 0.40163204073905945, 0.08183026313781738, 0.0010027880780398846, 0.0010755606926977634, 0.011116763576865196, 0.00484110601246357, 0.001587292063049972, 0.0002706963859964162, 6.567491800524294e-05, 0.00400706147775054, 0.06969451159238815], [0.18214844167232513, 0.02809913456439972, 0.004850266966968775, 0.003685017814859748, 0.0020814971067011356, 0.00032684349571354687, 0.0030828863382339478, 0.41418221592903137, 2.054480319202412e-05, 0.0001406587107339874, 1.9035733203054406e-05, 0.0002993363596033305, 0.00430124718695879, 0.0007239219848997891, 1.1524194633238949e-05, 0.0022125791292637587, 0.35381487011909485], [0.004572773352265358, 0.6229727864265442, 0.022383665665984154, 0.002413122681900859, 0.000362670689355582, 0.002742021344602108, 0.0059003704227507114, 0.18106311559677124, 0.00011430172889959067, 0.00015165729564614594, 2.6816562694875756e-06, 2.7619146294455277e-06, 9.666436380939558e-05, 0.0007024533115327358, 3.468029899522662e-05, 0.0008733842987567186, 0.15561091899871826], [0.004115269053727388, 0.023489195853471756, 0.6333683133125305, 0.03843390569090843, 0.011588061228394508, 0.004509551916271448, 0.0018771549221128225, 0.14559462666511536, 0.0001255011884495616, 0.000122419762192294, 5.126211362949107e-06, 3.501359242363833e-05, 3.606339305406436e-05, 0.00044822378549724817, 1.091876401915215e-05, 0.0022400752641260624, 0.13400059938430786], [0.0019748113118112087, 0.0008361928630620241, 0.013866727240383625, 0.9531517028808594, 
0.0035323600750416517, 0.003416527761146426, 0.0007410639664158225, 0.011606544256210327, 6.039542768121464e-06, 9.631342254579067e-05, 2.0979675241505902e-07, 9.657464397605509e-05, 2.539563865866512e-06, 7.30191186448792e-06, 2.7621141271083616e-05, 0.0004698503471445292, 0.01016773097217083], [0.015710238367319107, 0.004889908246695995, 0.004747701808810234, 0.023744938895106316, 0.551334023475647, 0.02658320777118206, 0.009716896340250969, 0.18731112778186798, 0.00018207498942501843, 0.0010387522634118795, 4.93815605295822e-06, 1.2415423952916171e-05, 0.00013230141485109925, 0.000416814349591732, 5.4612778512819204e-06, 0.005046966951340437, 0.16912229359149933], [0.0034168993588536978, 0.0022271759808063507, 0.0033042575232684612, 0.004180824849754572, 0.018737608566880226, 0.49540263414382935, 0.014411557465791702, 0.2417408674955368, 0.0014099132968112826, 0.0009504570043645799, 1.805807914934121e-05, 1.002754106593784e-05, 3.997509338660166e-05, 9.342974954051897e-05, 2.634658812894486e-05, 0.00112089142203331, 0.2129090428352356], [0.0021914467215538025, 0.00832028966397047, 0.004788258112967014, 0.005528679117560387, 0.003466195659711957, 0.013044467195868492, 0.008945983834564686, 0.4852476418018341, 0.0014283099444583058, 0.003144088201224804, 0.0022380719892680645, 0.0008132868679240346, 0.0008117115939967334, 0.0017450281884521246, 0.001616528956219554, 0.0018050218932330608, 0.454865038394928], [0.4379085600376129, 0.00028174620820209384, 3.1670977477915585e-05, 0.00015586770314257592, 0.0027239841874688864, 0.0009933231631293893, 0.17001473903656006, 0.16477473080158234, 0.004492960404604673, 0.08151695877313614, 0.00017584662418812513, 0.0016925687668845057, 0.0005805432447232306, 1.2447393601178192e-05, 1.0126451570613426e-06, 0.0015085089253261685, 0.13313452899456024], [0.003525580745190382, 4.802578405360691e-05, 2.01742604986066e-05, 9.991535989684053e-06, 4.661239927372662e-06, 5.132077421876602e-05, 0.0005717097665183246, 
0.022776108235120773, 0.8859135508537292, 0.06343701481819153, 0.000866693735588342, 0.0017111633205786347, 0.00015655916649848223, 0.000185528420843184, 2.203381882281974e-05, 4.2796200432348996e-05, 0.020657191053032875], [0.011056514456868172, 3.670415026135743e-05, 3.75458002963569e-05, 5.2443712775129825e-05, 3.189638664480299e-05, 2.9558484584413236e-06, 0.005105303134769201, 0.009075704962015152, 0.003393452614545822, 0.9445227384567261, 0.0015669281128793955, 0.01678871177136898, 0.0006078589358367026, 3.815459422185086e-05, 1.7540629414725117e-05, 6.0220550949452445e-05, 0.007605218794196844], [0.001715721096843481, 4.4702055674861185e-06, 3.7012682696513366e-06, 3.2903128612815635e-06, 3.4372243362668087e-07, 6.439051389861561e-07, 0.0006992665003053844, 0.00806423556059599, 0.0006165258237160742, 0.03605213388800621, 0.9346634149551392, 0.006596107501536608, 0.003923584707081318, 9.183640941046178e-05, 9.569924441166222e-05, 0.0001224019069923088, 0.007346579805016518], [0.0001551469904370606, 5.273045644571539e-07, 3.6399751479621045e-07, 2.6008397981058806e-05, 3.606203557993126e-09, 2.4337593274026403e-08, 8.297465683426708e-06, 0.0022804904729127884, 2.89407040554579e-07, 0.0016876587178558111, 0.00042468419997021556, 0.9910705089569092, 0.0023586973547935486, 5.395537300501019e-06, 1.2956435057276394e-05, 5.2216324547771364e-05, 0.0019168899161741138], [0.0006996840238571167, 6.358242899295874e-06, 3.444561116339173e-07, 8.608779467067507e-07, 6.041139499757264e-07, 9.932793432199105e-08, 8.998684279504232e-06, 0.004120247904211283, 3.383163402759237e-07, 0.00014349669800139964, 1.1060445103794336e-05, 0.0007158363587222993, 0.9806085228919983, 0.008883124217391014, 1.6464431610074826e-05, 0.0012194132432341576, 0.003564612939953804], [0.00262492336332798, 0.0007247307221405208, 0.0001397337473463267, 2.2053094653529115e-05, 1.2582573617692105e-05, 9.890898581943475e-06, 5.660822716890834e-05, 0.053488463163375854, 0.00022304743470158428, 
0.001291738823056221, 1.1688776794471778e-05, 0.0016349911456927657, 0.10247543454170227, 0.7778118848800659, 0.0005079564871266484, 0.010389856062829494, 0.048574384301900864], [0.0003329257888253778, 7.444038783432916e-05, 0.0001273355446755886, 8.453674672637135e-05, 2.3071950636222027e-05, 2.8033704438712448e-05, 0.00013234779180493206, 0.018939178436994553, 7.5294128691894e-06, 0.0002344320819247514, 0.00016444017819594592, 0.00033245462691411376, 0.011586690321564674, 0.01243089884519577, 0.9226889610290527, 0.015689915046095848, 0.01712280884385109], [0.0022594965994358063, 0.007946429774165154, 0.004695436917245388, 0.0053703333251178265, 0.003358474001288414, 0.012818355113267899, 0.008875174447894096, 0.48547399044036865, 0.0014509111642837524, 0.0032204673625528812, 0.0022641364485025406, 0.0008676875731907785, 0.000867484079208225, 0.001839510165154934, 0.0016459976322948933, 0.0019465988734737039, 0.45509955286979675]]], [[[0.006081664934754372, 0.05992679297924042, 0.004632278345525265, 0.04761708155274391, 0.0069939917884767056, 0.03733282908797264, 0.04673796519637108, 0.39511778950691223, 0.0018650954589247704, 0.0007704297895543277, 0.0002778128255158663, 0.0020284021738916636, 0.0011147432960569859, 0.00067733513424173, 7.294692477444187e-05, 0.0015523826004937291, 0.3872005045413971], [0.005391435232013464, 0.08358006924390793, 0.006630939897149801, 0.011355679482221603, 0.004883507266640663, 0.020148931071162224, 0.010913971811532974, 0.4339543879032135, 0.0006111536640673876, 0.00014911442121956497, 0.00017661698802839965, 0.00026286710635758936, 0.0004035455349367112, 0.0008672158000990748, 2.0717823645099998e-05, 0.0002563974994700402, 0.4203934967517853], [0.00915137305855751, 0.2797113358974457, 0.019832463935017586, 0.018241873010993004, 0.003129567950963974, 0.011380055919289589, 0.011015127412974834, 0.32218578457832336, 0.0005929506733082235, 0.00021194826695136726, 0.00023431847512256354, 0.0003328909515403211, 0.0003763137210626155, 
0.0003916354908142239, 6.570235564140603e-05, 0.00043183378875255585, 0.3227148652076721], [0.013154246844351292, 0.3600543141365051, 0.023770911619067192, 0.030796343460679054, 0.016679560765624046, 0.03596251830458641, 0.030871694907546043, 0.24113529920578003, 0.0031435987912118435, 0.0007975840708240867, 0.0004333582182880491, 0.0005949624464847147, 0.0008247648947872221, 0.001094144769012928, 0.0002791965671349317, 0.001881363452412188, 0.23852622509002686], [0.013965611346065998, 0.5769703388214111, 0.05015614256262779, 0.06386855244636536, 0.011596623808145523, 0.02515888400375843, 0.035795003175735474, 0.10849734395742416, 0.001974851591512561, 0.0014403314562514424, 0.00027517142007127404, 0.0006395029486157, 0.0007528035785071552, 0.0004802368930540979, 9.630419663153589e-05, 0.0019671532791107893, 0.1063651442527771], [0.009709489531815052, 0.3894999623298645, 0.02210947871208191, 0.08367262780666351, 0.011140666902065277, 0.02128802239894867, 0.029205838218331337, 0.2157878875732422, 0.002081731567159295, 0.0003966822405345738, 0.00011556350364116952, 0.0005021410179324448, 0.0004580656823236495, 0.00030238102772273123, 3.912653119186871e-05, 0.0010188381420448422, 0.21267148852348328], [0.011339440010488033, 0.20982499420642853, 0.008171171881258488, 0.04262728989124298, 0.02856399677693844, 0.10336685925722122, 0.018930068239569664, 0.2827814221382141, 0.0021787085570394993, 0.000823981303256005, 0.0005304586375132203, 0.0017707452643662691, 0.0014833150198683143, 0.001551308436319232, 0.0003602523938752711, 0.0035466367844492197, 0.2821493148803711], [0.007176012732088566, 0.00820113904774189, 0.0007596280192956328, 0.003672394435852766, 0.0012578731402754784, 0.004433946218341589, 0.006090362556278706, 0.4700566530227661, 0.0012794070644304156, 0.0010410482063889503, 0.0005839013610966504, 0.0007950154831632972, 0.001299087656661868, 0.0008044294081628323, 0.0001501230290159583, 0.0016047084936872125, 0.49079430103302], [0.15465664863586426, 
0.005045577883720398, 0.003161515574902296, 0.004799775779247284, 0.0040612961165606976, 0.0044054691679775715, 0.01666777953505516, 0.4275640547275543, 0.0021218350157141685, 0.0031049291137605906, 0.0024828100576996803, 0.0017261586617678404, 0.0016498207114636898, 0.0015639223856851459, 0.00406023720279336, 0.003891457337886095, 0.460977166891098]], [[0.027384456247091293, 0.003865094855427742, 0.006935993675142527, 0.006513173691928387, 0.003044238081201911, 0.0122421495616436, 0.21293111145496368, 0.3195265233516693, 0.028109388425946236, 0.008688928559422493, 0.0027171308174729347, 0.019530262798070908, 0.0040236786007881165, 0.001430067582987249, 0.000350464804796502, 0.007164575159549713, 0.33554279804229736], [0.010306515730917454, 0.02785869874060154, 0.0005697893211618066, 0.0038877699989825487, 0.004114500246942043, 0.003510303096845746, 0.0004360430466476828, 0.4481392204761505, 0.0001262290170416236, 0.00014625293260905892, 0.00022977754997555166, 0.0002344312088098377, 0.0016184109263122082, 0.00039523933082818985, 2.6176930987276137e-05, 0.00031901895999908447, 0.4980815649032593], [0.0022807903587818146, 0.05350009351968765, 0.0004899122286587954, 0.005731653887778521, 0.00285477377474308, 0.000935403979383409, 0.0001341810857411474, 0.43902426958084106, 0.00011182844900758937, 0.00014399029896594584, 1.776874159986619e-05, 2.7661961212288588e-05, 0.000830171920824796, 6.179954652907327e-05, 2.576399674580898e-05, 0.0001455369492759928, 0.4936845004558563], [0.0016307136975228786, 0.007147143129259348, 0.0004076235927641392, 0.0013824062189087272, 0.005103759933263063, 0.00895671546459198, 0.0001787373039405793, 0.46844273805618286, 9.635076276026666e-05, 2.7759524527937174e-05, 5.796884579467587e-06, 1.1905499377462547e-05, 2.6403076844871975e-05, 2.5728535547386855e-05, 8.997942131827585e-06, 5.638348011416383e-05, 0.50649094581604], [0.002747944323346019, 0.0021481853909790516, 0.00017988868057727814, 0.008824770338833332, 0.012845930643379688, 
0.012115215882658958, 0.00032963082776404917, 0.45364251732826233, 1.377021999360295e-05, 1.625001459615305e-05, 5.548761237150757e-06, 6.019344709784491e-06, 1.4527436178468633e-05, 4.137966243433766e-06, 5.909572337259306e-06, 7.527507841587067e-05, 0.5070245265960693], [0.0064710755832493305, 0.0055014523677527905, 0.006762563716620207, 0.016172390431165695, 0.07339484244585037, 0.008078213781118393, 0.004816546104848385, 0.42597219347953796, 0.0008864799165166914, 0.0004943885141983628, 6.378938996931538e-05, 0.00011360318603692576, 0.0003861173172481358, 0.0002852885809261352, 6.726421270286664e-05, 0.0006380290142260492, 0.4498957097530365], [0.01601148210465908, 0.002178492955863476, 0.0002915983786806464, 0.0049717240035533905, 0.017615729942917824, 0.030614901334047318, 0.002784779528155923, 0.45284345746040344, 0.0013775276020169258, 0.00012832324136979878, 4.610059841070324e-05, 0.00014426627603825182, 0.00017321540508419275, 0.0002519008703529835, 1.649900514166802e-05, 0.00016625228454358876, 0.47038379311561584], [0.008576407097280025, 0.00485766539350152, 0.0021057312842458487, 0.0023646976333111525, 0.0021793998312205076, 0.008902637287974358, 0.007513664662837982, 0.4587917625904083, 0.001238831551745534, 0.003447196679189801, 0.001358703593723476, 0.0023625066969543695, 0.0009326458675786853, 0.0008537550456821918, 0.0008821932133287191, 0.00303064426407218, 0.49060145020484924], [0.07086576521396637, 0.0005989865749143064, 2.8617532734642737e-05, 0.0008393687894567847, 0.0007259768899530172, 0.006472375709563494, 0.009653646498918533, 0.41931697726249695, 0.0041388534009456635, 0.007326140534132719, 0.002127461601048708, 0.000667815562337637, 0.0002826448471751064, 3.635108078015037e-05, 9.59646513365442e-06, 0.00029489691951312125, 0.47661450505256653], [0.00646407064050436, 0.0002647223591338843, 3.608215047279373e-05, 0.00014765237574465573, 0.00103684701025486, 0.0006052081589587033, 0.0013685396406799555, 0.4640445113182068, 
0.006257316097617149, 0.0014311211416497827, 0.00023412483278661966, 0.00036500670830719173, 0.0005167955532670021, 2.4727249183342792e-05, 2.7253681764705107e-06, 3.0023329600226134e-05, 0.5171705484390259], [0.006129029672592878, 0.00013018301979172975, 0.0002194285043515265, 0.00020228374341968447, 9.450138168176636e-05, 0.0006227205158211291, 0.0013360349694266915, 0.4303603768348694, 0.0009678167989477515, 0.03947090357542038, 0.003807917470112443, 0.001652649836614728, 0.00023406754189636558, 0.0001700032444205135, 0.00016199004312511533, 0.0006948598311282694, 0.5137451887130737], [0.004958172794431448, 0.000171467472682707, 2.084590778395068e-05, 0.00018925512267742306, 0.00011369951971573755, 0.00015574732969980687, 0.0064537073485553265, 0.2364617884159088, 0.0024626562371850014, 0.05859846621751785, 0.3970663547515869, 0.02125915139913559, 0.005378312431275845, 0.00023079576203599572, 0.0003951511171180755, 0.0013739216374233365, 0.26471051573753357], [0.0014909330056980252, 7.429483230225742e-05, 1.2632801372092217e-05, 5.102091017761268e-05, 1.266022718482418e-05, 6.795165973016992e-05, 7.344643381657079e-05, 0.45816490054130554, 0.00019593347678892314, 0.00035940087400376797, 0.0007398570887744427, 0.006173113361001015, 0.001410710858181119, 0.000392199115594849, 9.738588232721668e-06, 0.00039338134229183197, 0.5303778052330017], [0.0024054632522165775, 0.00017946798470802605, 1.3819018931826577e-05, 7.923242810647935e-05, 1.5097161849553231e-05, 3.873049354297109e-05, 0.00018185861699748784, 0.4292594790458679, 9.185023372992873e-05, 0.0009503445471636951, 0.0025395466946065426, 0.004461972508579493, 0.055510953068733215, 0.006417848169803619, 0.0003238137869630009, 0.0017766391392797232, 0.4957539439201355], [0.004398868419229984, 0.0010472489520907402, 2.8715712687699124e-05, 0.00013836275320500135, 3.299454692751169e-05, 0.00011848005669889972, 0.0005804836982861161, 0.45545029640197754, 7.839587487978861e-05, 0.0009533732663840055, 
0.00017155252862721682, 0.0008656001882627606, 0.007362784817814827, 0.0048707895912230015, 0.00041766653885133564, 0.005084885284304619, 0.5183994174003601], [0.0017970808548852801, 0.00026023320970125496, 4.874147634836845e-05, 0.00022750595235265791, 0.00015482639719266444, 0.00011236843420192599, 0.0007157096406444907, 0.4581325948238373, 1.6165839042514563e-05, 0.00035737428697757423, 0.00023007097479421645, 0.00027071614749729633, 0.005896392278373241, 0.002517733257263899, 0.002036839025095105, 0.004224543925374746, 0.5230010747909546], [0.010017584078013897, 0.005770355463027954, 0.002671315334737301, 0.002746246987953782, 0.0026496490463614464, 0.010977867059409618, 0.009256823919713497, 0.4536496698856354, 0.001648116740398109, 0.004471372347325087, 0.0017110984772443771, 0.0031337442342191935, 0.0011752437567338347, 0.0011686974903568625, 0.0011543374275788665, 0.0038848938420414925, 0.48391303420066833]], [[0.03853359818458557, 0.036859918385744095, 0.011397325433790684, 0.026413539424538612, 0.01571391150355339, 0.02060040459036827, 0.22436775267124176, 0.2719573974609375, 0.016298364847898483, 0.013952111825346947, 0.006871000397950411, 0.015526541508734226, 0.008867987431585789, 0.003469701623544097, 0.0008460694225504994, 0.015167880803346634, 0.2731565237045288], [0.12904447317123413, 0.09081084281206131, 0.01562613993883133, 0.18045374751091003, 0.09362813085317612, 0.08303964138031006, 0.17073331773281097, 0.1069255843758583, 0.007132493890821934, 0.0024554196279495955, 0.0017182434676215053, 0.0015517818974331021, 0.003766770474612713, 0.001856318092904985, 0.0002420110540697351, 0.004484777804464102, 0.10653036087751389], [0.051792167127132416, 0.046312082558870316, 0.026903217658400536, 0.258914053440094, 0.15026314556598663, 0.09999839216470718, 0.08199159801006317, 0.13014163076877594, 0.00332284695468843, 0.0033064833842217922, 0.002324905479326844, 0.0012790506007149816, 0.0033780867233872414, 0.001674981089308858, 0.00044809156679548323, 
0.004268608056008816, 0.13368074595928192], [0.101529560983181, 0.0892266109585762, 0.012720931321382523, 0.06323404610157013, 0.06039601191878319, 0.07705161720514297, 0.16258500516414642, 0.17579643428325653, 0.03917108476161957, 0.008783639408648014, 0.007864728569984436, 0.005652969237416983, 0.010728993453085423, 0.003939260728657246, 0.0014604658354073763, 0.006658356636762619, 0.17320029437541962], [0.10826022177934647, 0.036093614995479584, 0.00422231899574399, 0.01809084601700306, 0.007913530804216862, 0.02203749120235443, 0.10944864898920059, 0.32041695713996887, 0.01915033534169197, 0.005699771922081709, 0.0027201364282518625, 0.003512016963213682, 0.004465777892619371, 0.0008267273660749197, 0.0003198097983840853, 0.003226133529096842, 0.3335956037044525], [0.06991403549909592, 0.012323886156082153, 0.000727494596503675, 0.0024990320671349764, 0.0014475154457613826, 0.012466980144381523, 0.08723749965429306, 0.3941297233104706, 0.0043054865673184395, 0.0019219801761209965, 0.0006718478398397565, 0.0012584858341142535, 0.0008273684070445597, 0.0003590746782720089, 0.00015489688667003065, 0.0013140714727342129, 0.40844064950942993], [0.02205503173172474, 0.013402307406067848, 0.003075401997193694, 0.003126043826341629, 0.0026873883325606585, 0.01464426051825285, 0.03294937685132027, 0.4233415424823761, 0.013440222479403019, 0.007379074115306139, 0.0027028846088796854, 0.00595076521858573, 0.003715357044711709, 0.002559725660830736, 0.0014290065737441182, 0.0072624352760612965, 0.4402792155742645], [0.011862031184136868, 0.004002700559794903, 0.0010597293730825186, 0.0042723920196294785, 0.0032027927227318287, 0.005350660998374224, 0.011668965220451355, 0.4587627053260803, 0.0015704173129051924, 0.0019171726889908314, 0.0030264118686318398, 0.003633410669863224, 0.004682071041315794, 0.002473787870258093, 0.001309214043430984, 0.0076927486807107925, 0.47351276874542236], [0.02444065362215042, 0.002264506882056594, 0.0002843729453161359, 
0.0039218757301568985, 0.0020329179242253304, 0.002478779759258032, 0.017001666128635406, 0.09802453219890594, 0.030774159356951714, 0.11070332676172256, 0.05051247030496597, 0.1828005611896515, 0.294685959815979, 0.023409536108374596, 0.00047149747842922807, 0.051007989794015884, 0.10518523305654526], [0.016597818583250046, 0.0020784277003258467, 0.0009343185229226947, 0.0035360793117433786, 0.002437220187857747, 0.0011332413414493203, 0.005135592073202133, 0.1517927348613739, 0.02151290513575077, 0.0657721534371376, 0.022096829488873482, 0.12264952808618546, 0.28496846556663513, 0.04092462360858917, 0.0032316772267222404, 0.09079134464263916, 0.16440702974796295], [0.009898381307721138, 0.0031737873796373606, 0.0008604836766608059, 0.0027260123752057552, 0.0007575763156637549, 0.0008582966402173042, 0.0014013038016855717, 0.14467753469944, 0.015510574914515018, 0.010667411610484123, 0.021475881338119507, 0.05291607230901718, 0.40640559792518616, 0.12622502446174622, 0.003934914246201515, 0.04111674427986145, 0.15739446878433228], [0.02213294617831707, 0.007840126752853394, 0.0009984615026041865, 0.002623229054734111, 0.0007805950008332729, 0.0008756379829719663, 0.003412999212741852, 0.206177219748497, 0.020407510921359062, 0.007533859461545944, 0.01610538363456726, 0.02503262646496296, 0.24688909947872162, 0.07922980934381485, 0.007530231960117817, 0.12342324107885361, 0.2290070354938507], [0.021368658170104027, 0.010447652079164982, 0.002178157912567258, 0.004343140870332718, 0.0004877011233475059, 0.0006717191427014768, 0.007746066432446241, 0.28512218594551086, 0.006749959662556648, 0.004146149847656488, 0.00617354828864336, 0.013535212725400925, 0.08681921660900116, 0.0354754664003849, 0.0022108464036136866, 0.19497251510620117, 0.3175518214702606], [0.019153660163283348, 0.003163108602166176, 0.000399059324990958, 0.0013737499248236418, 0.000366052525350824, 0.000996222603134811, 0.0024955912958830595, 0.3931885063648224, 0.0008913466008380055, 
0.0009396941750310361, 0.0007030289270915091, 0.0024722840171307325, 0.014049634337425232, 0.015328606590628624, 0.004726288840174675, 0.10836031287908554, 0.4313928484916687], [0.013429294340312481, 0.0038024834357202053, 0.0016467941459268332, 0.0020564934238791466, 0.0011626757914200425, 0.0017001541564241052, 0.002337594050914049, 0.38051554560661316, 0.005712383426725864, 0.0036202860064804554, 0.0011358940973877907, 0.005030148662626743, 0.022634310647845268, 0.023514915257692337, 0.004136438947170973, 0.10728643089532852, 0.42027828097343445], [0.00884063821285963, 0.0013195527717471123, 0.0003157604660373181, 0.0013408466475084424, 0.0006067503127269447, 0.0010109319118782878, 0.0017813529120758176, 0.4488208293914795, 0.003582499222829938, 0.0015068219508975744, 0.0014086280716583133, 0.0016915180021896958, 0.006518447771668434, 0.0051133520901203156, 0.0018672674195840955, 0.014190413057804108, 0.5000842809677124], [0.012381944805383682, 0.004203279037028551, 0.0011711594415828586, 0.004673803225159645, 0.0035671130754053593, 0.005806634668260813, 0.012241595424711704, 0.4557145833969116, 0.0018272607121616602, 0.0022711586207151413, 0.0036467837635427713, 0.004205690696835518, 0.0053556752391159534, 0.0028828370850533247, 0.0015599740436300635, 0.008588887751102448, 0.4699016809463501]], [[0.001630918006412685, 0.0036330276634544134, 0.0017476840876042843, 0.008779381401836872, 0.0012572674313560128, 0.010803107172250748, 0.0069645983166992664, 0.4637279510498047, 0.0004857521562371403, 0.001219079946167767, 0.0005375376786105335, 0.00043037760769948363, 0.0004086603003088385, 0.00037500864709727466, 0.00035355924046598375, 0.0011502847773954272, 0.49649578332901], [0.005849900655448437, 0.026236917823553085, 0.0029655976686626673, 0.005820889491587877, 0.001578476163558662, 0.0015933995600789785, 0.013776613399386406, 0.4570389986038208, 7.735843246337026e-05, 0.0001210536720464006, 3.135226870654151e-05, 8.219595474656671e-05, 0.00027954234974458814, 
3.8645386666757986e-05, 7.09274536347948e-05, 0.000435490976087749, 0.48400259017944336], [0.007559177000075579, 0.14521479606628418, 0.01958632469177246, 0.013652811758220196, 0.001643932075239718, 0.004145259037613869, 0.013748853467404842, 0.38054025173187256, 0.0006117882439866662, 0.0004906049580313265, 2.169937579310499e-05, 0.00016530000721104443, 0.00016657485684845597, 8.909327152650803e-05, 8.825505210552365e-05, 0.0005184361943975091, 0.4117567241191864], [0.016508817672729492, 0.10554523766040802, 0.0065936134196817875, 0.02353733219206333, 0.0015333673218265176, 0.010534252971410751, 0.01612282171845436, 0.39454999566078186, 0.0007361700409092009, 0.0002221357135567814, 3.769283648580313e-05, 0.0001482527586631477, 0.00010547341662459075, 7.131034362828359e-05, 4.5916272938484326e-05, 0.00046974472934380174, 0.4232378602027893], [0.02959698811173439, 0.01268436573445797, 0.00953881535679102, 0.43772831559181213, 0.05048434063792229, 0.01491355337202549, 0.04251565411686897, 0.1940881907939911, 0.000400698947487399, 0.00016604083066340536, 4.636454468709417e-05, 0.00024080794537439942, 0.00021130035747773945, 0.00016402745677623898, 2.4400264010182582e-05, 0.0005704581853933632, 0.20662568509578705], [0.006177200935781002, 0.005787085276097059, 0.01466528419405222, 0.20637789368629456, 0.5009527802467346, 0.044982749968767166, 0.02846948243677616, 0.09063484519720078, 0.0005874054040759802, 0.00035206295433454216, 0.0002605296322144568, 0.0005908579332754016, 0.0017104543512687087, 0.0005926968879066408, 0.00022485233785118908, 0.0006102999323047698, 0.09702354669570923], [0.01510736346244812, 0.011606606654822826, 0.01218446809798479, 0.1088191568851471, 0.16073215007781982, 0.22981631755828857, 0.03207985311746597, 0.19987715780735016, 0.0049982802011072636, 0.001009913394227624, 0.001004306715913117, 0.0013966960832476616, 0.002690874971449375, 0.0017732917331159115, 0.00029171674395911396, 0.0018956500571221113, 0.21471616625785828], 
[0.006383002735674381, 0.005902440287172794, 0.0016148111317306757, 0.007346749305725098, 0.0025664924178272486, 0.008729341439902782, 0.011538311839103699, 0.45823484659194946, 0.002566457027569413, 0.002229247009381652, 0.0016701137647032738, 0.001487839501351118, 0.0018377351807430387, 0.0018774971831589937, 0.0008590968791395426, 0.002762381685897708, 0.48239368200302124], [0.04524953290820122, 0.0013594976626336575, 0.0004093957832083106, 0.002909192582592368, 0.002391014015302062, 0.005804012063890696, 0.044699136167764664, 0.41011562943458557, 0.0087862154468894, 0.00261941971257329, 0.00020449739531613886, 0.0003144172951579094, 0.0002249893150292337, 3.501161336316727e-05, 4.206320954835974e-05, 0.0010243634460493922, 0.4738115668296814], [0.09132824093103409, 0.005113589111715555, 0.0012245092075318098, 0.007615723647177219, 0.01507547777146101, 0.029535191133618355, 0.054246921092271805, 0.23267139494419098, 0.17377068102359772, 0.11762631684541702, 0.004653451964259148, 0.0022293049842119217, 0.002251217607408762, 0.0008152805967256427, 9.909580694511533e-05, 0.002008657669648528, 0.2597349286079407], [0.030890826135873795, 0.0048774913884699345, 0.0022368342615664005, 0.0021380609832704067, 0.004099604208022356, 0.016608424484729767, 0.02315337583422661, 0.22653543949127197, 0.22455313801765442, 0.193069189786911, 0.009306511841714382, 0.0021771180909126997, 0.004968162160366774, 0.0034003539476543665, 0.0001488552225055173, 0.00161849707365036, 0.25021809339523315], [0.016647247597575188, 0.0011671415995806456, 0.0012498158030211926, 0.004579117987304926, 0.004774804692715406, 0.011453363113105297, 0.017292439937591553, 0.14293035864830017, 0.0943334773182869, 0.3871225118637085, 0.12513461709022522, 0.015245389193296432, 0.0074633886106312275, 0.005015241447836161, 0.001106962445192039, 0.0036120624281466007, 0.16087201237678528], [0.021398290991783142, 0.0023197627160698175, 0.0004182607226539403, 0.0020134795922785997, 0.0001864724763436243, 
0.0018597646849229932, 0.010608920827507973, 0.42670655250549316, 0.009306972846388817, 0.013215397484600544, 0.003056164598092437, 0.010228910483419895, 0.004213388543576002, 0.0009899141732603312, 0.0001486779801780358, 0.0017029246082529426, 0.49162614345550537], [0.020024023950099945, 0.0005738435429520905, 0.0006840116693638265, 0.003592725610360503, 0.0009128357050940394, 0.0018631581915542483, 0.00553504191339016, 0.2339477241039276, 0.005209977738559246, 0.011542570777237415, 0.008849975652992725, 0.05570434778928757, 0.28781193494796753, 0.08509176969528198, 0.0027863369323313236, 0.005720905493944883, 0.27014878392219543], [0.026004817336797714, 0.0013846780639141798, 0.0009464538306929171, 0.004057134967297316, 0.0025667804293334484, 0.0030928037595003843, 0.003819472389295697, 0.16580355167388916, 0.004551692865788937, 0.029466545209288597, 0.012271486222743988, 0.02901923656463623, 0.29240652918815613, 0.2157498151063919, 0.002414435613900423, 0.021567465737462044, 0.18487711250782013], [0.007009573746472597, 0.0006911451346240938, 0.0005664720665663481, 0.0010569181758910418, 0.0031400129664689302, 0.002296663820743561, 0.004644713830202818, 0.022301090881228447, 0.0023411069996654987, 0.052116237580776215, 0.004019484389573336, 0.018984483554959297, 0.3202293813228607, 0.4991047978401184, 0.004182165954262018, 0.0328456312417984, 0.024470103904604912], [0.006987396627664566, 0.006830199621617794, 0.0018728243885561824, 0.00775423226878047, 0.0029497963842004538, 0.009837541729211807, 0.013146414421498775, 0.4543441832065582, 0.0032984348945319653, 0.002733840374276042, 0.0020361572969704866, 0.0018087761709466577, 0.0022521631326526403, 0.0024986821226775646, 0.0011204505572095513, 0.00325174443423748, 0.4772772490978241]], [[0.004985570441931486, 0.0070844898000359535, 0.010517451912164688, 0.00269911321811378, 0.011646711267530918, 0.0020164859015494585, 0.00781127717345953, 0.12247852236032486, 0.12794484198093414, 0.25989097356796265, 
0.040366459637880325, 0.016538472846150398, 0.20354953408241272, 0.012263654731214046, 0.001551253953948617, 0.02000334858894348, 0.14865191280841827], [0.007212472148239613, 0.021706944331526756, 0.6324887871742249, 0.012274417094886303, 0.02395448088645935, 0.02845582738518715, 0.07491730153560638, 0.08988158404827118, 0.0006989810499362648, 0.00547898281365633, 0.0014704149216413498, 0.0008368089911527932, 0.0007665411103516817, 0.0002848693693522364, 0.0006771578919142485, 0.0007200897671282291, 0.09817446023225784], [0.004238876048475504, 0.0031583376694470644, 0.006447446066886187, 0.011673263274133205, 0.0355844683945179, 0.041932422667741776, 0.011973629705607891, 0.41724345088005066, 0.00019813873223029077, 0.0003567976818885654, 0.0019134156173095107, 0.0007581845857203007, 0.00019221074762754142, 8.615856495453045e-05, 0.0005158367566764355, 0.0006077094003558159, 0.4631195068359375], [0.0109877809882164, 0.018733065575361252, 0.02712864615023136, 0.027788721024990082, 0.1262953281402588, 0.2742388844490051, 0.0710548460483551, 0.20928955078125, 0.0010762745514512062, 0.0010656617814674973, 0.0021682600490748882, 0.0005878574447706342, 0.0013631522888317704, 0.0007512095617130399, 0.0012044229079037905, 0.001546715502627194, 0.2247195690870285], [0.004225891549140215, 0.0010135946795344353, 0.00979903806000948, 0.010551226325333118, 0.017262037843465805, 0.22785498201847076, 0.6028282046318054, 0.042211636900901794, 0.002911378862336278, 0.027683200314641, 0.0031484225764870644, 0.001322540221735835, 0.0002842957910615951, 0.0003951598482672125, 0.0005421005771495402, 0.003169513773173094, 0.04479667916893959], [0.0032131564803421497, 0.0003200930077582598, 0.004851092584431171, 0.0033079730346798897, 0.0030305986292660236, 0.021792355924844742, 0.8670767545700073, 0.04086114838719368, 0.0003363770665600896, 0.009304952807724476, 0.0015144629869610071, 0.00019917835015803576, 4.9718284572009e-05, 6.780373223591596e-05, 0.00037671293830499053, 
0.0005737515166401863, 0.04312386363744736], [0.01597990095615387, 0.007580767385661602, 0.003855861024931073, 0.04266727715730667, 0.01571275293827057, 0.02619338594377041, 0.011654693633317947, 0.3867860436439514, 0.012727024033665657, 0.006532070692628622, 0.011278621852397919, 0.016349267214536667, 0.008474690839648247, 0.0027264319360256195, 0.0009684975375421345, 0.0031377419363707304, 0.4273749589920044], [0.007080857176333666, 0.004761289805173874, 0.0032202876172959805, 0.0046277400106191635, 0.0031745564192533493, 0.007106042467057705, 0.012047868221998215, 0.4487597346305847, 0.0014414878096431494, 0.0023271956015378237, 0.0033661103807389736, 0.0017005859408527613, 0.00100427377037704, 0.000732457498088479, 0.0009026590269058943, 0.004035356920212507, 0.49371153116226196], [0.0012632374418899417, 6.81255551171489e-05, 0.0009554591961205006, 0.00016757726552896202, 0.00017009727889671922, 0.0002328252448933199, 0.008141440339386463, 0.054396990686655045, 0.010663696564733982, 0.6404402256011963, 0.15377749502658844, 0.04531555250287056, 0.016736837103962898, 0.000921139435376972, 0.0010158420773223042, 0.0024056462571024895, 0.06332771480083466], [0.004471431020647287, 0.0001234428636962548, 0.0002918621466960758, 0.001051347702741623, 0.0005096677341498435, 0.00044376685400493443, 0.0016345218755304813, 0.09005781263113022, 0.017780892550945282, 0.07711305469274521, 0.2641834020614624, 0.3048361539840698, 0.09093035012483597, 0.014937801286578178, 0.00754490727558732, 0.01795584335923195, 0.10613381862640381], [0.008581430651247501, 0.00022472925775218755, 0.00013691693311557174, 0.0018651180434972048, 0.0006004610913805664, 0.000910055881831795, 0.001432877266779542, 0.1491088718175888, 0.0059226457960903645, 0.022053668275475502, 0.05656634271144867, 0.351685106754303, 0.182596817612648, 0.026104005053639412, 0.004961181897670031, 0.021125998347997665, 0.1661236435174942], [0.004873867146670818, 0.0001714636164251715, 0.00016864134522620589, 
0.0006482871831394732, 0.00040015208651311696, 0.0002832242171280086, 0.0042347293347120285, 0.28713932633399963, 0.0005784629611298442, 0.009179624728858471, 0.03152197599411011, 0.02446536161005497, 0.11830843240022659, 0.02632470801472664, 0.06196695938706398, 0.10570980608463287, 0.3240249454975128], [0.0016174393240362406, 0.00016903350478969514, 0.00022815738338977098, 0.00023923083790577948, 0.00013738579582422972, 0.0007686250610277057, 0.0060425978153944016, 0.2819910943508148, 0.00011113385698990896, 0.0008490153704769909, 0.0013234179932624102, 0.0010876395972445607, 0.01586979441344738, 0.08942229300737381, 0.036007389426231384, 0.24640458822250366, 0.31773123145103455], [0.0007072212174534798, 2.0695446437457576e-05, 0.00034197827335447073, 0.0002634183911141008, 0.00010389957606093958, 0.00025751246721483767, 0.013238305225968361, 0.17560835182666779, 2.8504695364972576e-05, 0.003198443679139018, 0.001508731278590858, 0.0007918964838609099, 0.0024740584194660187, 0.02437790296971798, 0.08212650567293167, 0.50252765417099, 0.19242490828037262], [0.004707667510956526, 0.0005381138762459159, 0.00024868079344742, 0.001247760490514338, 0.0002658366283867508, 0.0011880019446834922, 0.0016888439422473311, 0.3188669681549072, 0.0012510968372225761, 0.004443360026925802, 0.012069962918758392, 0.009359556250274181, 0.011358045041561127, 0.01979394257068634, 0.013770177960395813, 0.24160723388195038, 0.357594758272171], [0.00971157569438219, 0.000529648270457983, 0.0005717400345019996, 0.0032357056625187397, 0.0011286125518381596, 0.003886112244799733, 0.012129511684179306, 0.34335795044898987, 0.003968046046793461, 0.006197195965796709, 0.008548915386199951, 0.006712019443511963, 0.014836153946816921, 0.020977962762117386, 0.030314521864056587, 0.1442478597164154, 0.3896464407444], [0.007409220561385155, 0.0048757800832390785, 0.0032984695862978697, 0.004708785098046064, 0.0033497947733849287, 0.007243669591844082, 0.013466081582009792, 0.4478532373905182, 
[... lengthy numerical output truncated: the printed nested list of floating-point values (apparently per-layer, per-head attention weights) is omitted here for brevity ...]
0.0026437968481332064, 0.39069658517837524], [0.0011126119643449783, 0.0008566984906792641, 0.0015441240975633264, 0.0013757634442299604, 0.0003270170127507299, 0.0003222278319299221, 0.0013188146986067295, 0.25308871269226074, 0.008395613171160221, 0.5076196789741516, 0.012135365977883339, 0.011380940675735474, 0.012624471448361874, 0.0004343383479863405, 0.0002993814996443689, 0.00037496781442314386, 0.18678931891918182], [0.001287969178520143, 0.00014162737352307886, 0.00010223017306998372, 8.578803681302816e-05, 2.6313786293030716e-05, 5.501299165189266e-05, 0.0001998942025238648, 0.5743443965911865, 0.002284629736095667, 0.011306922882795334, 0.005270640831440687, 0.028095664456486702, 0.0029770461842417717, 0.0004024481459055096, 0.00020054751075804234, 0.0009020920842885971, 0.372316837310791], [0.005382280796766281, 0.0004892282304354012, 0.00013536213373299688, 0.0002897864324040711, 1.9454235371085815e-05, 5.586165571003221e-05, 0.00019238462846260518, 0.519152820110321, 0.00447918102145195, 0.007109100930392742, 0.0022714370861649513, 0.07928027212619781, 0.014724121429026127, 0.0015532708493992686, 0.00016877238522283733, 0.0007627729792147875, 0.3639339208602905], [0.00322076422162354, 0.0008951859199441969, 0.00012561694893520325, 0.0009224353707395494, 5.0538175855763257e-05, 7.667708996450529e-05, 0.0002950096095446497, 0.5780929327011108, 0.0027451463975012302, 0.001182031468488276, 0.0011533120414242148, 0.004235925152897835, 0.01962188631296158, 0.0020296715665608644, 0.0002578027197159827, 0.001153075136244297, 0.38394200801849365], [0.00434714974835515, 0.0010242698481306434, 0.000792507256846875, 0.0005922834388911724, 9.064083860721439e-05, 0.001411275938153267, 0.0026627755723893642, 0.4084112346172333, 0.0014300968032330275, 0.0040126461535692215, 0.0011538645485416055, 0.006161487195640802, 0.03202470764517784, 0.16125911474227905, 0.01335094217211008, 0.056355878710746765, 0.3049190640449524], [0.000712214969098568, 
0.00027582934126257896, 0.00032272498356178403, 0.0005960959824733436, 0.0001361142349196598, 0.0013408291852101684, 0.0043208240531384945, 0.5212433338165283, 0.00015730220184195787, 0.0037728215102106333, 4.344137414591387e-05, 0.00044489881838671863, 0.0038201187271624804, 0.0178498774766922, 0.010918423533439636, 0.09472732245922089, 0.33931779861450195], [0.0012212443398311734, 0.00014976495003793389, 4.5542590669356287e-05, 0.00010695974924601614, 4.146676292293705e-05, 0.00020151174976490438, 0.00020072115876246244, 0.5637592673301697, 9.507463983027264e-05, 0.00012587971286848187, 5.1853974582627416e-05, 0.00024788876180537045, 0.00043225576519034803, 0.0020064222626388073, 0.005542363505810499, 0.028193891048431396, 0.39757785201072693], [0.001453680801205337, 0.00039528272463940084, 7.770554657327011e-05, 0.0004527504206635058, 2.138262425432913e-05, 0.0003704636183101684, 0.001218548626638949, 0.6029167771339417, 0.0005022900295443833, 0.00025750978966243565, 5.918797614867799e-05, 0.0003361859708093107, 0.0005047914455644786, 0.0012252734741196036, 0.0006497654831036925, 0.0030441759154200554, 0.38651418685913086], [0.009416715241968632, 0.008238406851887703, 0.003558121155947447, 0.010110217146575451, 0.0017424746183678508, 0.0036868543829768896, 0.008248402737081051, 0.5419082641601562, 0.005468996707350016, 0.0036076223477721214, 0.0015987649094313383, 0.0025632374454289675, 0.002939678728580475, 0.0016528087435290217, 0.000902921543456614, 0.002282053930684924, 0.3920743763446808]], [[0.012134929187595844, 0.015492218546569347, 0.00249616801738739, 0.003497753757983446, 0.002424771897494793, 0.014491617679595947, 0.022283319383859634, 0.5162327289581299, 0.003880753880366683, 0.0028357268311083317, 0.0040343222208321095, 0.0021872492507100105, 0.003482232103124261, 0.001503036473877728, 0.0014324317453429103, 0.003211831906810403, 0.3883788585662842], [0.043539442121982574, 0.049938809126615524, 0.0034543536603450775, 0.005733744706958532, 
0.0036039550323039293, 0.0013478315668180585, 0.013629858382046223, 0.49165448546409607, 0.0007937345071695745, 0.00047885117237456143, 0.001056457287631929, 0.002231602557003498, 0.0043318974785506725, 0.0006782227428629994, 0.0002386728156125173, 0.0026422881055623293, 0.37464573979377747], [0.010391141287982464, 0.07678362727165222, 0.0017067412845790386, 0.002578400308266282, 0.0011302907951176167, 0.001854997011832893, 0.003262014128267765, 0.5244001746177673, 0.0007317272247746587, 0.00018029891361948103, 3.861123332171701e-05, 7.738912245258689e-05, 0.0002810621226672083, 0.00044971765601076186, 0.00017114549700636417, 0.0003009303763974458, 0.3756616711616516], [0.005359490867704153, 0.039256688207387924, 0.0039106253534555435, 0.0056546349078416824, 0.0017336340388283134, 0.0037045013159513474, 0.0020847301930189133, 0.5379480123519897, 0.0004363158659543842, 0.00022993260063230991, 0.00028519838815554976, 0.0001994931371882558, 0.0002671126276254654, 0.0006542391492985189, 0.0004170140309724957, 0.0005312151624821126, 0.39732715487480164], [0.0071548097766935825, 0.015030900947749615, 0.0022806760389357805, 0.040182601660490036, 0.004317756742238998, 0.0035896445624530315, 0.0006421880680136383, 0.5485563278198242, 0.0003515915013849735, 6.875969847897068e-05, 1.3138859685568605e-05, 6.25635075266473e-05, 3.7071466067573056e-05, 0.00010738960554590449, 4.0497910958947614e-05, 6.492144166259095e-05, 0.37749916315078735], [0.04407753050327301, 0.022334247827529907, 0.002799727488309145, 0.024307359009981155, 0.05942212790250778, 0.016018759459257126, 0.028734520077705383, 0.4510243833065033, 0.0010471963323652744, 0.0003456895356066525, 0.0005133538506925106, 0.0005866107530891895, 0.001040262053720653, 0.00047369630192406476, 0.0007485056412406266, 0.005572082474827766, 0.34095388650894165], [0.0072427657432854176, 0.01898932084441185, 0.0011713637504726648, 0.03213275223970413, 0.1006462499499321, 0.0642051100730896, 0.028596658259630203, 
0.4202783405780792, 0.0011296669254079461, 0.0005845446139574051, 0.00020177591068204492, 0.00022540071222465485, 0.00090057123452425, 0.0008120982674881816, 0.0014189507346600294, 0.0036600595340132713, 0.3178043067455292], [0.009843875654041767, 0.013248169794678688, 0.002443675184622407, 0.007478818763047457, 0.004410834982991219, 0.010061610490083694, 0.005105671472847462, 0.525078296661377, 0.0033773358445614576, 0.0027245597448199987, 0.0026539049576967955, 0.001539757358841598, 0.0019662249833345413, 0.0027861190028488636, 0.003019885392859578, 0.0022794322576373816, 0.4019818603992462], [0.045635562390089035, 0.001657123677432537, 8.148018241627142e-05, 0.0029530313331633806, 0.004975186660885811, 0.0013855256838724017, 0.004934057593345642, 0.5185210704803467, 0.015969615429639816, 0.004108143504709005, 0.0008596765692345798, 0.0023761061020195484, 0.0014269081875681877, 9.271092858398333e-05, 3.947590084862895e-05, 0.0017787711694836617, 0.39320558309555054], [0.006608237512409687, 0.0019977150950580835, 0.00012090996460756287, 0.0005137175903655589, 0.0035579074174165726, 0.0017350773559883237, 0.0012826380552724004, 0.3680080473423004, 0.3344712555408478, 0.012228765524923801, 0.0012088961666449904, 0.0010860528564080596, 0.0010729036293923855, 0.001363361836411059, 0.00015761498070787638, 0.0003474878612905741, 0.2642394006252289], [0.009772375226020813, 0.0029733136761933565, 0.00036108086351305246, 0.00047197440289892256, 0.00044530851300805807, 0.002168416976928711, 0.0035653586965054274, 0.5050207376480103, 0.019815821200609207, 0.044796548783779144, 0.006979561876505613, 0.003442759858444333, 0.0007534479955211282, 0.0003773870994336903, 0.00020334319560788572, 0.000575678248424083, 0.3982768952846527], [0.0069777388125658035, 0.001052972744219005, 0.00018323374388273805, 0.0006594301667064428, 0.0015799329848960042, 0.0009073065011762083, 0.0013683093711733818, 0.289193332195282, 0.008397076278924942, 0.020308421924710274, 0.4120912253856659, 
0.024003615602850914, 0.0063629294745624065, 0.0013744058087468147, 0.0013069044798612595, 0.001271651010029018, 0.22296154499053955], [0.026758113875985146, 0.005109068937599659, 0.0012372881174087524, 0.002216388937085867, 0.00018580644973553717, 0.0011955249356105924, 0.002065366832539439, 0.44828879833221436, 0.017320923507213593, 0.014791283756494522, 0.02089679427444935, 0.04813483729958534, 0.05452694743871689, 0.009819847531616688, 0.0005457749939523637, 0.0018719785148277879, 0.345035195350647], [0.026897814124822617, 0.000751082377973944, 4.34111243521329e-05, 0.00047024624655023217, 0.0004092449089512229, 0.0003270387533120811, 0.0011243716580793262, 0.4636116623878479, 0.0008360311039723456, 0.0005534213851206005, 0.0031460970640182495, 0.010762319900095463, 0.11747442930936813, 0.011497611179947853, 0.0016384117770940065, 0.01548363734036684, 0.344973087310791], [0.008135799318552017, 0.0004903532681055367, 2.207469333370682e-05, 8.739673648960888e-05, 0.00026599777629598975, 0.0005724495276808739, 0.0009990244871005416, 0.5526165962219238, 0.0006004685419611633, 0.0005107761244289577, 0.00028252805350348353, 0.0006867104675620794, 0.00789592880755663, 0.027793321758508682, 0.0024983808398246765, 0.013901054859161377, 0.382641077041626], [0.0023997470270842314, 0.00035468500573188066, 3.624863165896386e-05, 0.00023783418873790652, 0.0006601939676329494, 0.00025841142632998526, 0.000258804444456473, 0.5626267194747925, 0.0002852977777365595, 0.0001182344785775058, 0.0015177460154518485, 0.0004161216493230313, 0.009563029743731022, 0.024062810465693474, 0.022292302921414375, 0.009626522660255432, 0.36528530716896057], [0.009326458908617496, 0.01057326141744852, 0.0018305826233699918, 0.006012782454490662, 0.0036366982385516167, 0.007874221540987492, 0.004345749504864216, 0.5343969464302063, 0.002854512305930257, 0.002143129473552108, 0.0021332998294383287, 0.0012479722499847412, 0.0016462380299344659, 0.0021972416434437037, 0.0023778406903147697, 
0.0019760008435696363, 0.40542706847190857]], [[0.03539246320724487, 0.044933613389730453, 0.01424036268144846, 0.01662490889430046, 0.007354631554335356, 0.014308118261396885, 0.020172230899333954, 0.18362025916576385, 0.13767682015895844, 0.07754334807395935, 0.013296143151819706, 0.017110107466578484, 0.16701680421829224, 0.043021488934755325, 0.010231142863631248, 0.033163104206323624, 0.16429448127746582], [0.0078009855933487415, 0.05217166244983673, 0.12751184403896332, 0.20309984683990479, 0.06861145794391632, 0.1436161994934082, 0.06360876560211182, 0.1725260615348816, 0.0055521572940051556, 0.003432175377383828, 0.001461488544009626, 0.0032003019005060196, 0.0018620840273797512, 0.005717918276786804, 0.001196408411487937, 0.004492998123168945, 0.13413763046264648], [0.009762837551534176, 0.01698572374880314, 0.014574809931218624, 0.022866861894726753, 0.011399970389902592, 0.0379522331058979, 0.022190438583493233, 0.5024021863937378, 0.0021173555869609118, 0.0008548451005481184, 0.0006060765008442104, 0.002613954246044159, 0.0009099978487938643, 0.006754170637577772, 0.0006300437962636352, 0.006266572047024965, 0.3411119282245636], [0.009083726443350315, 0.03258905187249184, 0.02741721272468567, 0.09585630148649216, 0.02725798450410366, 0.06443633884191513, 0.034123435616493225, 0.4008280038833618, 0.00296266982331872, 0.0018283167155459523, 0.0014760495396330953, 0.0015239306958392262, 0.0010932724690064788, 0.006165963131934404, 0.001183610293082893, 0.004995749797672033, 0.2871783375740051], [0.005635023582726717, 0.025492656975984573, 0.05356141924858093, 0.20595863461494446, 0.04244311526417732, 0.051231324672698975, 0.035264793783426285, 0.3242082893848419, 0.004400957841426134, 0.0020648320205509663, 0.003443267662078142, 0.00299617531709373, 0.0017341814236715436, 0.002465439960360527, 0.001053825719282031, 0.0033922214061021805, 0.23465386033058167], [0.01027767639607191, 0.043764904141426086, 0.10498126596212387, 0.28035253286361694, 
0.041418030858039856, 0.05303163826465607, 0.09587504714727402, 0.16408975422382355, 0.02081490494310856, 0.011076890863478184, 0.0082430774345994, 0.012575964443385601, 0.0033818064257502556, 0.007693647872656584, 0.0021150538232177496, 0.01279410533607006, 0.12751366198062897], [0.01582477055490017, 0.022368989884853363, 0.039018917828798294, 0.08423114567995071, 0.026230165734887123, 0.029954206198453903, 0.036084167659282684, 0.4014510214328766, 0.006331016309559345, 0.014243196696043015, 0.009860847145318985, 0.007150176912546158, 0.0029570087790489197, 0.0027406620793044567, 0.004102359525859356, 0.008958259597420692, 0.2884930670261383], [0.009568012319505215, 0.013815954327583313, 0.013416863046586514, 0.02583499066531658, 0.004744658712297678, 0.009024454280734062, 0.0033490112982690334, 0.5218000411987305, 0.0060274130664765835, 0.002814018167555332, 0.0030198285821825266, 0.0062002320773899555, 0.0033513852395117283, 0.0036827269941568375, 0.0019118750933557749, 0.003539275610819459, 0.36789920926094055], [0.004616744816303253, 0.008426404558122158, 0.00856455322355032, 0.012175400741398335, 0.006738622672855854, 0.016575131565332413, 0.00757252611219883, 0.3495500087738037, 0.016819272190332413, 0.020453333854675293, 0.009071428328752518, 0.03663598373532295, 0.03397708758711815, 0.07366131246089935, 0.01667950116097927, 0.09219849109649658, 0.2862841784954071], [0.003030609805136919, 0.00172978185582906, 0.0023018240462988615, 0.002950159600004554, 0.001144362729974091, 0.0026662119198590517, 0.0009847070323303342, 0.5052744150161743, 0.006031906232237816, 0.004067783709615469, 0.004534607287496328, 0.02026423066854477, 0.008977444842457771, 0.02421395666897297, 0.005714773200452328, 0.029973097145557404, 0.37614017724990845], [0.006042018067091703, 0.0015921663725748658, 0.006547433789819479, 0.0054186261259019375, 0.004201770294457674, 0.004739725962281227, 0.0017338477773591876, 0.4252581000328064, 0.01110632810741663, 0.014356458559632301, 
0.008014691062271595, 0.0235330518335104, 0.014514587819576263, 0.060156408697366714, 0.026707950979471207, 0.053572364151477814, 0.3325044512748718], [0.004690694622695446, 0.011563602834939957, 0.004993120674043894, 0.004510411527007818, 0.0024428791366517544, 0.005140496417880058, 0.0030557038262486458, 0.4147408902645111, 0.03425651043653488, 0.01852973736822605, 0.006151593755930662, 0.00696222810074687, 0.018271394073963165, 0.06530910730361938, 0.019650662317872047, 0.05348753184080124, 0.32624346017837524], [0.004002328496426344, 0.004696457181125879, 0.00311167910695076, 0.009681333787739277, 0.003591477405279875, 0.004770220257341862, 0.002968086628243327, 0.4327305555343628, 0.014797625131905079, 0.01466042548418045, 0.0094672292470932, 0.012029741890728474, 0.008206343278288841, 0.042285412549972534, 0.036833468824625015, 0.06440751254558563, 0.33176007866859436], [0.003689211793243885, 0.004823937080800533, 0.007552321068942547, 0.014900093898177147, 0.0015733237378299236, 0.004042487591505051, 0.002183783333748579, 0.36661848425865173, 0.029090160503983498, 0.05831487849354744, 0.024194179102778435, 0.13349194824695587, 0.008249381557106972, 0.013522415421903133, 0.01272218395024538, 0.031696904450654984, 0.28333431482315063], [0.0031649013981223106, 0.0007944824174046516, 0.0030693113803863525, 0.0015005599707365036, 0.0002264968934468925, 0.0007838332676328719, 0.0004966675187461078, 0.4968203604221344, 0.011753031983971596, 0.015968112275004387, 0.004265510477125645, 0.09578412771224976, 0.0031019498128443956, 0.00823297630995512, 0.003583510173484683, 0.011168297380208969, 0.3392859101295471], [0.002776053035631776, 0.0021700880024582148, 0.005255658645182848, 0.004108811728656292, 0.0006918672588653862, 0.003894504625350237, 0.0025797849521040916, 0.45555153489112854, 0.008823426440358162, 0.038578420877456665, 0.020596977323293686, 0.0806693509221077, 0.0075058164075016975, 0.016817545518279076, 0.011806937865912914, 0.014642786234617233, 
0.3235304653644562], [0.00860940758138895, 0.01150702778249979, 0.011126082390546799, 0.022280879318714142, 0.00390278035774827, 0.0076196459122002125, 0.0029445027466863394, 0.5325202345848083, 0.005541256628930569, 0.002665544394403696, 0.002962361555546522, 0.006112887058407068, 0.0031019311863929033, 0.0033277960028499365, 0.0017908780137076974, 0.003203160595148802, 0.37078356742858887]], [[0.02105846256017685, 0.019101882353425026, 0.0041950903832912445, 0.014711951836943626, 0.003899168223142624, 0.007391365710645914, 0.005572853144258261, 0.2256893813610077, 0.2139909863471985, 0.05929117649793625, 0.01667448878288269, 0.02313026413321495, 0.13342057168483734, 0.034869614988565445, 0.004419872537255287, 0.017629200592637062, 0.19495368003845215], [0.09151596575975418, 0.21008838713169098, 0.043939247727394104, 0.1662834882736206, 0.04082610458135605, 0.10099838674068451, 0.040499746799468994, 0.1167466789484024, 0.018107185140252113, 0.005967850796878338, 0.005310032516717911, 0.006435499060899019, 0.03703780472278595, 0.009771501645445824, 0.0008294267463497818, 0.0036287070252001286, 0.10201394557952881], [0.08679255098104477, 0.07269985973834991, 0.017979495227336884, 0.028063902631402016, 0.014814398251473904, 0.04933301359415054, 0.024485468864440918, 0.3604743778705597, 0.012586617842316628, 0.0049517834559082985, 0.0036596893332898617, 0.0038979060482233763, 0.02348783053457737, 0.005996506195515394, 0.0027627393137663603, 0.0071326992474496365, 0.2808811366558075], [0.08767355978488922, 0.13635078072547913, 0.035239845514297485, 0.13724327087402344, 0.03329010680317879, 0.044328223913908005, 0.03435865789651871, 0.1449677050113678, 0.035363081842660904, 0.017686452716588974, 0.028604324907064438, 0.026636935770511627, 0.048387184739112854, 0.03152349218726158, 0.013895610347390175, 0.015499196946620941, 0.12895148992538452], [0.0657811239361763, 0.06601422280073166, 0.018645694479346275, 0.05202465504407883, 0.04328853636980057, 0.07643458247184753, 
0.029198497533798218, 0.3133590519428253, 0.019482124596834183, 0.006449607666581869, 0.003896522568538785, 0.005924086552113295, 0.02998710609972477, 0.010225072503089905, 0.002904109191149473, 0.007847762666642666, 0.24853724241256714], [0.054852887988090515, 0.08701568841934204, 0.01684819906949997, 0.06211966276168823, 0.049249399453401566, 0.10840268433094025, 0.05587795376777649, 0.24187427759170532, 0.027101662009954453, 0.004100144375115633, 0.003203147789463401, 0.0032158272806555033, 0.032954346388578415, 0.02719397284090519, 0.0028123941738158464, 0.011115944012999535, 0.21206185221672058], [0.03719504550099373, 0.047675490379333496, 0.012049965560436249, 0.012270371429622173, 0.012005084194242954, 0.049066029489040375, 0.020988933742046356, 0.4266993999481201, 0.007705371826887131, 0.002430735854431987, 0.002085383515805006, 0.002285297028720379, 0.015211105346679688, 0.00838780589401722, 0.002025302965193987, 0.006716660223901272, 0.33520203828811646], [0.007602925878018141, 0.004827121738344431, 0.0013496172614395618, 0.0022429560776799917, 0.0007215003133751452, 0.0040450310334563255, 0.005783813539892435, 0.5568481087684631, 0.0012822586577385664, 0.0004943335079587996, 0.0008652068208903074, 0.0007596623618155718, 0.0018599514150992036, 0.001120822736993432, 0.0010843497002497315, 0.0020300946198403835, 0.4070822596549988], [0.02303033135831356, 0.06632174551486969, 0.03152666613459587, 0.033328745514154434, 0.02979150041937828, 0.04299883171916008, 0.00791440811008215, 0.3318972587585449, 0.037177301943302155, 0.022176750004291534, 0.00870597641915083, 0.007801912259310484, 0.06906630843877792, 0.012161072343587875, 0.0063965716399252415, 0.019194433465600014, 0.25051015615463257], [0.007432466372847557, 0.014276672154664993, 0.004443021956831217, 0.011394022032618523, 0.006200187373906374, 0.013448784127831459, 0.0032542645931243896, 0.467684805393219, 0.024856312200427055, 0.015386702492833138, 0.004439453594386578, 0.007139013614505529, 
0.05107295140624046, 0.008044007234275341, 0.004028948489576578, 0.012876519002020359, 0.34402185678482056], [0.006402334664016962, 0.014948786236345768, 0.007157870568335056, 0.010115891695022583, 0.005376024171710014, 0.008278830908238888, 0.0030313043389469385, 0.48465245962142944, 0.013261470943689346, 0.015316649340093136, 0.005895006004720926, 0.0063235219568014145, 0.027343595400452614, 0.00614633783698082, 0.007121018599718809, 0.01744796894490719, 0.36118099093437195], [0.011436976492404938, 0.037711530923843384, 0.011878268793225288, 0.01698177494108677, 0.01417006365954876, 0.023017000406980515, 0.008193421177566051, 0.4359745979309082, 0.019543413072824478, 0.019091855734586716, 0.009413733147084713, 0.0074509247206151485, 0.031808339059352875, 0.0086257578805089, 0.007620914373546839, 0.018267804756760597, 0.318813681602478], [0.0063243736512959, 0.03181307017803192, 0.007777595892548561, 0.016617251560091972, 0.010072224773466587, 0.020546574145555496, 0.003996904473751783, 0.4953460395336151, 0.015177024528384209, 0.01090487651526928, 0.001627835794351995, 0.002320181345567107, 0.01817052811384201, 0.006770993582904339, 0.0024850498884916306, 0.010009783320128918, 0.34003961086273193], [0.018122117966413498, 0.02519756555557251, 0.012873017229139805, 0.01301340851932764, 0.011127838864922523, 0.030749518424272537, 0.012082856148481369, 0.45068156719207764, 0.00900660827755928, 0.0107443081215024, 0.0034732487984001637, 0.0028818019200116396, 0.014477398246526718, 0.010671505704522133, 0.013698055408895016, 0.027970634400844574, 0.3332284986972809], [0.007621950004249811, 0.005902812350541353, 0.003439998021349311, 0.0035460893996059895, 0.0021943405736237764, 0.00785152055323124, 0.00796153862029314, 0.48468881845474243, 0.0061890799552202225, 0.009650468826293945, 0.0035811858251690865, 0.0032489814329892397, 0.009475122205913067, 0.007896237075328827, 0.022331183776259422, 0.028303522616624832, 0.3861171007156372], [0.010218862444162369, 
0.010801728814840317, 0.003073164727538824, 0.008666586130857468, 0.005444214213639498, 0.019965698942542076, 0.011238189414143562, 0.47701990604400635, 0.0071023004129529, 0.011924095451831818, 0.00234043225646019, 0.0029674337711185217, 0.009586242958903313, 0.010551417246460915, 0.01121382787823677, 0.030090976506471634, 0.367794930934906], [0.006975265685468912, 0.0037775631062686443, 0.0010782586177811027, 0.0019481683848425746, 0.0006216369802132249, 0.0035798177123069763, 0.005189040210098028, 0.5588832497596741, 0.0010735791875049472, 0.00046839407877996564, 0.0007885974482633173, 0.0006893896497786045, 0.0017185320612043142, 0.000998837174847722, 0.0010542400414124131, 0.001888980739749968, 0.40926653146743774]], [[0.004648192785680294, 0.003442854853346944, 0.0026514327619224787, 0.010619414038956165, 0.006526973098516464, 0.003910184372216463, 0.0034715752117335796, 0.49433350563049316, 0.016302580013871193, 0.02519787661731243, 0.004452883265912533, 0.005397086497396231, 0.011241826228797436, 0.0031498554162681103, 0.0016576299676671624, 0.0027170495595782995, 0.40027916431427], [0.038913097232580185, 0.07803847640752792, 0.03787631541490555, 0.36051270365715027, 0.04141930118203163, 0.06129005551338196, 0.05830315873026848, 0.17197036743164062, 0.005536051467061043, 0.0042009176686406136, 0.0013033278519287705, 0.002835135441273451, 0.00472866278141737, 0.0029692533425986767, 0.00022088691184762865, 0.0005025758873671293, 0.1293797641992569], [0.05911184474825859, 0.08094269782304764, 0.09678555279970169, 0.32575079798698425, 0.12397237122058868, 0.0544070228934288, 0.03683457896113396, 0.09767790883779526, 0.0067354231141507626, 0.004448756575584412, 0.009519814513623714, 0.01185952965170145, 0.007989023812115192, 0.0021988616790622473, 0.0007562171667814255, 0.0014569781487807631, 0.07955274730920792], [0.04702761769294739, 0.0604376383125782, 0.04433160275220871, 0.05070260167121887, 0.04511605203151703, 0.05895973742008209, 0.11933939158916473, 
0.3063926100730896, 0.010535024106502533, 0.004275477025657892, 0.0025875333230942488, 0.005445805378258228, 0.005525761749595404, 0.004163868259638548, 0.0013128508580848575, 0.0023702525068074465, 0.23147620260715485], [0.07964374125003815, 0.05127798020839691, 0.07750055193901062, 0.05841361731290817, 0.015070920810103416, 0.14958012104034424, 0.1835750937461853, 0.19679200649261475, 0.009893614798784256, 0.006574552971869707, 0.0020257041323930025, 0.004653098061680794, 0.004912586882710457, 0.002804320538416505, 0.0012980365427210927, 0.002898741513490677, 0.15308527648448944], [0.05142050236463547, 0.03977026417851448, 0.018897950649261475, 0.021504629403352737, 0.00429795915260911, 0.07824846357107162, 0.21567653119564056, 0.32269376516342163, 0.003890097141265869, 0.003306406084448099, 0.00033910846104845405, 0.0013466336531564593, 0.0014239212032407522, 0.0034608645364642143, 0.0007846765220165253, 0.0017554879887029529, 0.23118266463279724], [0.019600534811615944, 0.02895163930952549, 0.008638061583042145, 0.0042654480785131454, 0.003703672206029296, 0.08109024912118912, 0.015439261682331562, 0.471588134765625, 0.008895229548215866, 0.002278968458995223, 0.0007404423085972667, 0.0011031500762328506, 0.001379833440296352, 0.003823250997811556, 0.0006155156879685819, 0.0016413830453529954, 0.34624528884887695], [0.01047151442617178, 0.010869299992918968, 0.005988932680338621, 0.010030053555965424, 0.006244510877877474, 0.013405241072177887, 0.014496971853077412, 0.5180286765098572, 0.004804976750165224, 0.002887410344555974, 0.00278679421171546, 0.0027594459243118763, 0.0035072937607765198, 0.003620272036641836, 0.0014990769559517503, 0.0044788033701479435, 0.38412073254585266], [0.01577749475836754, 0.002443083329126239, 0.000985043472610414, 0.004559807945042849, 0.0016035408480092883, 0.003919276874512434, 0.005553583614528179, 0.14113229513168335, 0.06641194969415665, 0.19951926171779633, 0.05228813365101814, 0.08286251872777939, 0.2813529074192047, 
[... output truncated: the cell prints the model's raw attention weights as nested lists of floats — one 17-value probability distribution per token position, for each attention head and layer ...]
0.0007727932534180582, 0.00019142891687806696, 0.0007229851908050478, 0.00026804913068190217, 0.00030213070567697287, 0.0007308548665605485, 0.1787145882844925], [0.001321085961535573, 0.024015599861741066, 0.016293777152895927, 0.023575162515044212, 0.003283160040155053, 0.011694109998643398, 0.004126603249460459, 0.6879144906997681, 6.340537947835401e-05, 0.00012875768879894167, 0.00016115524340420961, 3.621394716901705e-05, 0.00012140576291130856, 4.55836379842367e-05, 0.00016726837202440947, 0.0003153449506498873, 0.2267368882894516], [0.009141659364104271, 0.08237382024526596, 0.017047986388206482, 0.09430057555437088, 0.009857485070824623, 0.045781295746564865, 0.01893915794789791, 0.5224566459655762, 0.00030265742680057883, 0.00012161168706370518, 0.00028831709641963243, 6.121781916590407e-05, 0.00014336152526084334, 0.0001676673418842256, 8.097315730992705e-05, 0.00030263231019489467, 0.19863292574882507], [0.009433622471988201, 0.09246213734149933, 0.01818373054265976, 0.22767426073551178, 0.05350743234157562, 0.09488721191883087, 0.019041597843170166, 0.34482529759407043, 0.0005079461843706667, 0.00043087685480713844, 0.0002716234012041241, 9.168343967758119e-05, 0.0006290461169555783, 0.0007786031346768141, 0.0005363051895983517, 0.0013717144029214978, 0.13536690175533295], [0.015284883789718151, 0.10623462498188019, 0.034661464393138885, 0.1645023375749588, 0.027306262403726578, 0.06057338789105415, 0.0940045639872551, 0.3338431715965271, 0.0005853463662788272, 0.0002055448858300224, 0.0005140762077644467, 0.00014623426250182092, 0.0006559526664204895, 0.00039254321018233895, 0.0003486171190161258, 0.0010051511926576495, 0.15973585844039917], [0.03406502306461334, 0.07375169545412064, 0.028445472940802574, 0.08269959688186646, 0.01772785745561123, 0.1501976102590561, 0.057427335530519485, 0.35381844639778137, 0.003376561449840665, 0.0016309269703924656, 0.0027231350541114807, 0.001046848250553012, 0.0038115649949759245, 0.0018797150114551187, 
0.0004354999109636992, 0.0019243810093030334, 0.1850382387638092], [0.031044378876686096, 0.07606549561023712, 0.03163041174411774, 0.052209511399269104, 0.011481339111924171, 0.02878413163125515, 0.15662439167499542, 0.3389674127101898, 0.0045960890129208565, 0.0024492552038282156, 0.0022612647153437138, 0.0011915897484868765, 0.003949080593883991, 0.001848690677434206, 0.0011421144008636475, 0.004432867746800184, 0.2513218820095062], [0.011731116101145744, 0.00788077898323536, 0.0008507791790179908, 0.002779273083433509, 0.0010942775988951325, 0.004038802348077297, 0.002311753574758768, 0.5421214699745178, 0.06768756359815598, 0.04160356894135475, 0.028307568281888962, 0.009283688850700855, 0.06931769102811813, 0.00840551033616066, 0.0026959136594086885, 0.005769119132310152, 0.19412115216255188], [0.0026390759740024805, 0.0007802514592185616, 0.00021088791254442185, 0.0008833729079924524, 0.00015455765242222697, 0.00034645397681742907, 0.00031303163268603384, 0.7197991609573364, 0.014235266484320164, 0.006716256495565176, 0.0025569095741957426, 0.0014836564660072327, 0.005455496720969677, 0.0029846525285393, 0.00422503100708127, 0.004242599010467529, 0.23297329246997833], [0.0054240841418504715, 0.005965266842395067, 0.0008701793267391622, 0.003537815995514393, 0.0004694205126725137, 0.0021527146454900503, 0.000971428060438484, 0.609679639339447, 0.018107783049345016, 0.024918559938669205, 0.017089668661355972, 0.013821265660226345, 0.024988412857055664, 0.004319941624999046, 0.00562899699434638, 0.0046049985103309155, 0.2574498653411865], [0.007058931514620781, 0.002013731049373746, 0.0003378711699042469, 0.001891269814223051, 0.0003427169576752931, 0.0012942911125719547, 0.001309575280174613, 0.5599812269210815, 0.04482473433017731, 0.021705903112888336, 0.03798069432377815, 0.012835314497351646, 0.02319633588194847, 0.0074927546083927155, 0.005582157522439957, 0.009797673672437668, 0.2623548209667206], [0.009803524240851402, 0.0029658377170562744, 
0.0010147469583898783, 0.0012084090849384665, 0.0004892810829915106, 0.0015343212289735675, 0.0020414497703313828, 0.5519570708274841, 0.048851098865270615, 0.023732688277959824, 0.01146373338997364, 0.018584730103611946, 0.032608337700366974, 0.016888994723558426, 0.005994120147079229, 0.011097241193056107, 0.2597644627094269], [0.014140384271740913, 0.0031386781483888626, 0.0004538831708487123, 0.0032695969566702843, 0.0008648817893117666, 0.0035730735398828983, 0.006762362085282803, 0.5959579944610596, 0.016225961968302727, 0.008355597034096718, 0.009427226148545742, 0.003593903034925461, 0.023407243192195892, 0.01857495680451393, 0.017281439155340195, 0.023952100425958633, 0.25102075934410095], [0.004708952270448208, 0.001267062034457922, 0.0006041160668246448, 0.0015428881160914898, 0.00017277186270803213, 0.0012620817869901657, 0.0012238433118909597, 0.5757011771202087, 0.004138552583754063, 0.004114903509616852, 0.0029165279120206833, 0.002329937182366848, 0.012792681343853474, 0.01566375233232975, 0.03188899904489517, 0.09874361008405685, 0.24092817306518555], [0.007550605107098818, 0.0019501439528539777, 0.0008454478229396045, 0.00232049822807312, 0.0003437183331698179, 0.0015127371298149228, 0.004983719903975725, 0.6133180260658264, 0.003920156508684158, 0.006099069956690073, 0.0034598740749061108, 0.0011328920954838395, 0.0051718661561608315, 0.006166866049170494, 0.04804175719618797, 0.024923225864768028, 0.2682594358921051], [0.014222539961338043, 0.029148094356060028, 0.01745740696787834, 0.020666802302002907, 0.005656434688717127, 0.012613311409950256, 0.03119693696498871, 0.4950505495071411, 0.003500905353575945, 0.0030985630583018064, 0.002852467354387045, 0.001683223876170814, 0.0038790518883615732, 0.001813106588087976, 0.0017073705093935132, 0.004356758203357458, 0.3510964810848236]], [[0.011158901266753674, 0.004521003924310207, 0.0007136596250347793, 0.0007620264077559114, 0.0013626370346173644, 0.0027816647198051214, 0.0026639881543815136, 
0.06025908887386322, 0.26260703802108765, 0.12689010798931122, 0.013043787330389023, 0.019864559173583984, 0.3376324474811554, 0.07171697914600372, 0.011365552432835102, 0.0148816779255867, 0.0577748566865921], [0.002692459849640727, 0.08263766020536423, 0.3456130623817444, 0.1343354880809784, 0.056612368673086166, 0.10466238111257553, 0.08449030667543411, 0.08749958127737045, 0.0038051169831305742, 0.005857839714735746, 0.0025309559423476458, 0.0007692703511565924, 0.01071303803473711, 0.004306906834244728, 0.002287847688421607, 0.003971503581851721, 0.0672142431139946], [0.0009448195924051106, 0.0059924800880253315, 0.04499656707048416, 0.011285275220870972, 0.01102572400122881, 0.011542621068656445, 0.028732718899846077, 0.568437397480011, 0.0002918278332799673, 0.0007985151023603976, 0.0010200971737504005, 0.00020962917187716812, 0.0008846409618854523, 0.0005966214812360704, 0.006591382902115583, 0.003406726522371173, 0.30324292182922363], [0.0027174961287528276, 0.027588602155447006, 0.09413395822048187, 0.02922937460243702, 0.01616697758436203, 0.029608123004436493, 0.0493009053170681, 0.457540899515152, 0.0012701961677521467, 0.0021479427814483643, 0.001426215749233961, 0.00035268341889604926, 0.0013589683221653104, 0.0011366907274350524, 0.0045110746286809444, 0.002552115125581622, 0.2789577841758728], [0.003021674230694771, 0.045711830258369446, 0.18625934422016144, 0.09758056700229645, 0.04106978327035904, 0.052408237010240555, 0.1505175232887268, 0.21909047663211823, 0.0040421634912490845, 0.009419073350727558, 0.0022433525882661343, 0.0010041756322607398, 0.006586202885955572, 0.0021299482323229313, 0.004082442726939917, 0.00436494080349803, 0.1704683154821396], [0.0032792820129543543, 0.08111587911844254, 0.29668134450912476, 0.0739460438489914, 0.024001166224479675, 0.07609695941209793, 0.3239842653274536, 0.04282943159341812, 0.0029396878089755774, 0.004696402233093977, 0.0025891661643981934, 0.0008170694927684963, 0.00800387654453516, 
0.004587074741721153, 0.004224342294037342, 0.00638478621840477, 0.04382329806685448], [0.004747460596263409, 0.02744414657354355, 0.06158469244837761, 0.022616134956479073, 0.03276015818119049, 0.044029105454683304, 0.0463341660797596, 0.49174752831459045, 0.003418539883568883, 0.0055325767025351524, 0.0020336902234703302, 0.000597115489654243, 0.003906251396983862, 0.004696831572800875, 0.0036662518978118896, 0.005834254901856184, 0.23905107378959656], [0.010392608121037483, 0.22914732992649078, 0.1337614506483078, 0.11753973364830017, 0.02351071685552597, 0.1414770632982254, 0.16925881803035736, 0.10205116868019104, 0.003986799623817205, 0.00211681192740798, 0.0012361564440652728, 0.0011931228218600154, 0.007259796839207411, 0.003208471927791834, 0.001928925747051835, 0.00350143457762897, 0.048429690301418304], [0.01589983142912388, 0.014701459556818008, 0.019344279542565346, 0.0051245903596282005, 0.007373419590294361, 0.015348684042692184, 0.024280158802866936, 0.10684330761432648, 0.09720287472009659, 0.10340415686368942, 0.014108635485172272, 0.02476648800075054, 0.34748783707618713, 0.05735474079847336, 0.023075560107827187, 0.03820529580116272, 0.08547863364219666], [0.02258485183119774, 0.006129854824393988, 0.008313841186463833, 0.004836616571992636, 0.00945699866861105, 0.006378726102411747, 0.013692068867385387, 0.33381393551826477, 0.043640367686748505, 0.08847491443157196, 0.007146009709686041, 0.011188755743205547, 0.07713112235069275, 0.021844208240509033, 0.03167419880628586, 0.037142928689718246, 0.2765505313873291], [0.018296917900443077, 0.004053308628499508, 0.01671082153916359, 0.004922103136777878, 0.005493265576660633, 0.006995729636400938, 0.016017628833651543, 0.376251220703125, 0.0168628953397274, 0.031931012868881226, 0.010762694291770458, 0.01932031475007534, 0.038631368428468704, 0.02878636121749878, 0.07986465841531754, 0.08324635773897171, 0.24185340106487274], [0.03560624644160271, 0.005974023137241602, 0.006559197790920734, 
0.0025752075016498566, 0.004245084710419178, 0.0052162278443574905, 0.014633269980549812, 0.2707018554210663, 0.05423135682940483, 0.09029130637645721, 0.01172156073153019, 0.017886098474264145, 0.09245486557483673, 0.02883598580956459, 0.03858368471264839, 0.06316250562667847, 0.25732147693634033], [0.030650025233626366, 0.011764624156057835, 0.008643248118460178, 0.0024146963842213154, 0.003851533867418766, 0.00712395366281271, 0.01014288142323494, 0.31867292523384094, 0.06474132835865021, 0.0849815309047699, 0.012711056508123875, 0.01607181690633297, 0.0888357013463974, 0.031135128811001778, 0.038838621228933334, 0.04391677677631378, 0.22550411522388458], [0.01888444274663925, 0.01145879551768303, 0.021535808220505714, 0.003775527235120535, 0.002406398067250848, 0.012178423814475536, 0.04780573397874832, 0.17063415050506592, 0.04587945342063904, 0.03875456377863884, 0.018893133848905563, 0.034133147448301315, 0.18193329870700836, 0.09197019040584564, 0.0766286700963974, 0.09655112028121948, 0.12657716870307922], [0.017223220318555832, 0.002907813061028719, 0.008872142061591148, 0.0025642900727689266, 0.0021096616983413696, 0.0030711661092936993, 0.057768262922763824, 0.14524421095848083, 0.016294686123728752, 0.03436644375324249, 0.022975541651248932, 0.026953907683491707, 0.033345017582178116, 0.05669255182147026, 0.28809434175491333, 0.14350973069667816, 0.1380070298910141], [0.027745669707655907, 0.005201238207519054, 0.018724678084254265, 0.004337739665061235, 0.002414804883301258, 0.004956688266247511, 0.06717684119939804, 0.13012005388736725, 0.028314024209976196, 0.07089640200138092, 0.0216904878616333, 0.03879619762301445, 0.0755167081952095, 0.06026041880249977, 0.18603254854679108, 0.13385190069675446, 0.12396354228258133], [0.0066335913725197315, 0.03936045616865158, 0.04464258253574371, 0.033172063529491425, 0.012921283021569252, 0.04102327674627304, 0.05410892888903618, 0.5457468628883362, 0.002781172515824437, 0.003042014315724373, 
0.001652611419558525, 0.0011335837189108133, 0.004574459511786699, 0.0022479919716715813, 0.004736421629786491, 0.004569776356220245, 0.19765296578407288]], [[0.12167654186487198, 0.0075149559415876865, 0.00568244606256485, 0.015084643848240376, 0.01070312224328518, 0.013011274859309196, 0.02738182432949543, 0.4257955253124237, 0.027130404487252235, 0.023754164576530457, 0.00953770149499178, 0.013212757185101509, 0.04177046939730644, 0.009404133073985577, 0.0007858492317609489, 0.012394112534821033, 0.23516003787517548], [0.036025520414114, 0.10796020925045013, 0.008994265459477901, 0.0045096902176737785, 0.002820024499669671, 0.016944676637649536, 0.0127598587423563, 0.39483538269996643, 0.04500686004757881, 0.017008762806653976, 0.009139886125922203, 0.016575990244746208, 0.008166804909706116, 0.03665744885802269, 0.0004256327520124614, 0.012831593863666058, 0.26933741569519043], [0.04463111236691475, 0.06399090588092804, 0.08614081889390945, 0.01091488916426897, 0.009027852676808834, 0.02923930250108242, 0.023150887340307236, 0.44600412249565125, 0.005042591597884893, 0.02119682915508747, 0.004272188991308212, 0.0035210398491472006, 0.0038310610689222813, 0.002406490035355091, 0.0005241000326350331, 0.005662405397742987, 0.24044331908226013], [0.056566815823316574, 0.012478964403271675, 0.005365872755646706, 0.09342630952596664, 0.0054843612015247345, 0.00976174883544445, 0.005522636231034994, 0.4518509805202484, 0.009336377494037151, 0.011234820820391178, 0.00781663041561842, 0.05988069623708725, 0.010795553214848042, 0.007403965573757887, 0.0015120870666578412, 0.008721054531633854, 0.24284112453460693], [0.08154693245887756, 0.0027746197301894426, 0.0013148515718057752, 0.011014983057975769, 0.10185054689645767, 0.018399015069007874, 0.0054785641841590405, 0.45541641116142273, 0.004451461136341095, 0.0068691265769302845, 0.0016468436224386096, 0.00757928192615509, 0.009812194854021072, 0.00464586028829217, 0.0010231704218313098, 0.0028087319806218147, 
0.2833673059940338], [0.05010572075843811, 0.00985656213015318, 0.000960049219429493, 0.0021305258851498365, 0.006634508725255728, 0.2027469277381897, 0.015112022869288921, 0.37733858823776245, 0.023084938526153564, 0.003908269107341766, 0.0024688125122338533, 0.006782107055187225, 0.003491209354251623, 0.052300289273262024, 0.001991088269278407, 0.0075134700164198875, 0.23357492685317993], [0.016254669055342674, 0.014096998609602451, 0.005467211827635765, 0.0045744129456579685, 0.006139964796602726, 0.04467669501900673, 0.06678405404090881, 0.4398675262928009, 0.01339893788099289, 0.07942674309015274, 0.0029487742576748133, 0.002640861552208662, 0.022433796897530556, 0.014198639430105686, 0.009087512269616127, 0.030869876965880394, 0.22713324427604675], [0.027892235666513443, 0.017620913684368134, 0.00917307659983635, 0.010625910013914108, 0.009526582434773445, 0.02136579342186451, 0.017181389033794403, 0.5515143275260925, 0.008893921039998531, 0.009031180292367935, 0.008005949668586254, 0.007138391491025686, 0.009437083266675472, 0.00878322683274746, 0.002699170960113406, 0.007876801304519176, 0.2732340395450592], [0.06346085667610168, 0.038456402719020844, 0.0005582711310125887, 0.005327790044248104, 0.004390807822346687, 0.024011097848415375, 0.008978872559964657, 0.253012090921402, 0.39613470435142517, 0.01788020133972168, 0.0029545528814196587, 0.006175209302455187, 0.006637741345912218, 0.029089756309986115, 6.256261258386075e-05, 0.0026612654328346252, 0.14020775258541107], [0.051207736134529114, 0.028233295306563377, 0.006243335083127022, 0.01151212491095066, 0.0074438066221773624, 0.02501414157450199, 0.025927409529685974, 0.31088733673095703, 0.02345028519630432, 0.25393667817115784, 0.009313829243183136, 0.018825778737664223, 0.012124288827180862, 0.008955847471952438, 0.0008209029911085963, 0.0067854346707463264, 0.19931770861148834], [0.07341303676366806, 0.009023376740515232, 0.0012193082366138697, 0.0036357028875499964, 0.0009598745382390916, 
0.004928015638142824, 0.005239473190158606, 0.22134119272232056, 0.007957222871482372, 0.005379486363381147, 0.5196216106414795, 0.011210243217647076, 0.005913604516535997, 0.0015146328369155526, 0.0001174936187453568, 0.0021524287294596434, 0.1263732761144638], [0.09293791651725769, 0.008962472900748253, 0.003793872892856598, 0.02564520575106144, 0.004378491081297398, 0.009970835410058498, 0.007413683459162712, 0.47899267077445984, 0.006562027148902416, 0.0032391552813351154, 0.00940283015370369, 0.07483521848917007, 0.013380778022110462, 0.0031927209347486496, 0.0005609721993096173, 0.0039094723761081696, 0.252821683883667], [0.144046351313591, 0.0074857925064861774, 0.00152568647172302, 0.017268776893615723, 0.022590212523937225, 0.005724957678467035, 0.014552626758813858, 0.38225996494293213, 0.009616016410291195, 0.009515472687780857, 0.005666916258633137, 0.02074669487774372, 0.1149546280503273, 0.009165883995592594, 0.00021735642803832889, 0.009443937800824642, 0.22521871328353882], [0.011630082502961159, 0.0037700356915593147, 0.00020818108168896288, 0.0004194157081656158, 0.0016446671215817332, 0.013736303895711899, 0.008694665506482124, 0.563090443611145, 0.009616745635867119, 0.0033997532445937395, 0.00030985273770056665, 0.0008021551184356213, 0.0012680522631853819, 0.15150929987430573, 0.0008413230534642935, 0.004538369830697775, 0.22452062368392944], [0.006878736428916454, 0.0006689493311569095, 0.00032080794335342944, 0.00022600509691983461, 0.0004274733364582062, 0.013320536352694035, 0.01279827393591404, 0.5222806334495544, 0.00021103110339026898, 0.0009719950030557811, 0.00010051353456219658, 0.0006432710215449333, 0.0001427538227289915, 0.003683994524180889, 0.18318185210227966, 0.005724438466131687, 0.24841876327991486], [0.027138030156493187, 0.002518105087801814, 0.0008993195369839668, 0.0006400993443094194, 0.0008186582126654685, 0.006884696893393993, 0.02197936736047268, 0.6433086395263672, 0.001396615756675601, 0.0023123135324567556, 
0.0009465551120229065, 0.000827718002256006, 0.001291204011067748, 0.004476301837712526, 0.0005229961825534701, 0.012062284164130688, 0.27197712659835815], [0.03181227669119835, 0.013683241792023182, 0.007585412822663784, 0.007766473572701216, 0.007233914919197559, 0.01794290728867054, 0.020610477775335312, 0.525634229183197, 0.006388143170624971, 0.008704784326255322, 0.0070458874106407166, 0.006547593045979738, 0.006779894232749939, 0.0083800433203578, 0.0026259450241923332, 0.009229892864823341, 0.3120289444923401]], [[0.03988378867506981, 0.016114575788378716, 0.0019474404398351908, 0.003629102371633053, 0.0048802695237100124, 0.0040347822941839695, 0.0175474863499403, 0.422654926776886, 0.05543402582406998, 0.1350107192993164, 0.004487651400268078, 0.017643090337514877, 0.06273152679204941, 0.004586480092257261, 0.0026807342655956745, 0.006280739791691303, 0.2004527747631073], [0.008343123830854893, 0.06480352580547333, 0.06734392046928406, 0.15420940518379211, 0.020311636850237846, 0.007981322705745697, 0.03950008377432823, 0.44986358284950256, 0.0035799748729914427, 0.006267112214118242, 0.00032282964093610644, 0.0013669135514646769, 0.007876849733293056, 0.000252675439696759, 4.106622509425506e-05, 0.0006186027894727886, 0.16731739044189453], [0.0007404094212688506, 0.00471192691475153, 0.028277264907956123, 0.02903577871620655, 0.0038026783149689436, 0.002236943459138274, 0.0075379288755357265, 0.692842960357666, 7.785890920786187e-05, 0.0017359106568619609, 0.0001720025611575693, 0.00022148275456856936, 0.00040373875526711345, 2.4747483621467836e-05, 0.0003487702342681587, 0.0007463162182830274, 0.2270832657814026], [0.0010223849676549435, 0.008215422742068768, 0.03839661180973053, 0.03378839045763016, 0.005131866317242384, 0.004833152983337641, 0.01119320560246706, 0.6800437569618225, 0.0002889969327952713, 0.0015977438306435943, 0.00013645924627780914, 0.0002946999447885901, 0.0019201133400201797, 9.586686792317778e-05, 7.265232125064358e-05, 
0.0003744226705748588, 0.21259425580501556], [0.006782756187021732, 0.01822287030518055, 0.06442795693874359, 0.2947053015232086, 0.033405888825654984, 0.027554195374250412, 0.2540339231491089, 0.15079092979431152, 0.002170210238546133, 0.023842360824346542, 0.00027821955154649913, 0.0013749625068157911, 0.006694023963063955, 0.0005033547058701515, 0.0004729980428237468, 0.0014454862102866173, 0.1132945716381073], [0.006512910593301058, 0.022717759013175964, 0.14369069039821625, 0.09887434542179108, 0.023103442043066025, 0.010689659975469112, 0.146686851978302, 0.35266202688217163, 0.0024755720514804125, 0.004967536311596632, 0.00045513565419241786, 0.0012000019196420908, 0.015088378451764584, 0.0009681986994110048, 0.00019964328384958208, 0.0014922262635082006, 0.1682155579328537], [0.0059058330953121185, 0.019681943580508232, 0.04358895495533943, 0.060179464519023895, 0.013434232212603092, 0.010341964662075043, 0.05094344913959503, 0.5666913390159607, 0.0018343155970796943, 0.0027329346630722284, 0.00017579644918441772, 0.0005925801815465093, 0.0028153720777481794, 0.001525204279460013, 0.00048009847523644567, 0.0020878068171441555, 0.2169886976480484], [0.006440408062189817, 0.01626414805650711, 0.01415314432233572, 0.01805269531905651, 0.0033530760556459427, 0.007003031205385923, 0.016636159271001816, 0.6600470542907715, 0.00398850254714489, 0.008421809412539005, 0.001386127551086247, 0.003300258656963706, 0.010614327155053616, 0.0020348820835351944, 0.0008414218900725245, 0.00505413394421339, 0.22240883111953735], [0.012853391468524933, 0.004776861052960157, 0.0032535125501453876, 0.003657883033156395, 0.002092045033350587, 0.0011796079343184829, 0.019057398661971092, 0.377326101064682, 0.03503330424427986, 0.15644468367099762, 0.004486836027354002, 0.01563706248998642, 0.1515296846628189, 0.00624211085960269, 0.0011050363536924124, 0.0059441314078867435, 0.19938041269779205], [0.0030522027518600225, 0.0014701259788125753, 0.0025209663435816765, 
0.004543520510196686, 0.0018592218402773142, 0.0005791988805867732, 0.0039147580973804, 0.34437522292137146, 0.03269872069358826, 0.2222934067249298, 0.007597112096846104, 0.03466250002384186, 0.10350389033555984, 0.005456297658383846, 0.0015545738860964775, 0.005576569586992264, 0.22434177994728088], [0.0018067974597215652, 0.0011239623418077826, 0.002497342647984624, 0.001246894709765911, 0.0005790196591988206, 0.0005001184181310236, 0.0032356134615838528, 0.5607758164405823, 0.01831120438873768, 0.015686875209212303, 0.013449402526021004, 0.018523871898651123, 0.05473627150058746, 0.024943485856056213, 0.005261460784822702, 0.015012739226222038, 0.26230910420417786], [0.004094633273780346, 0.001175370649434626, 0.001451934571377933, 0.001484404900111258, 0.0005301795899868011, 0.0004408232925925404, 0.007176279090344906, 0.43427199125289917, 0.04389841482043266, 0.05618460848927498, 0.008489617146551609, 0.013249777257442474, 0.11628926545381546, 0.017227115109562874, 0.00130241340957582, 0.005977923516184092, 0.2867552638053894], [0.004619250539690256, 0.001441951491869986, 0.0010173094924539328, 0.0006104934145696461, 0.0003160855558235198, 0.0015940176090225577, 0.013335556723177433, 0.5105360746383667, 0.016380852088332176, 0.053148671984672546, 0.003988656215369701, 0.00916975922882557, 0.04018981382250786, 0.012267998419702053, 0.0022186245769262314, 0.005697234068065882, 0.3234676122665405], [0.003948579076677561, 0.0011868085712194443, 0.0015574127901345491, 0.000564271118491888, 0.0003261391830164939, 0.001464462373405695, 0.013518979772925377, 0.5092573165893555, 0.00967163685709238, 0.02442529983818531, 0.0047238050028681755, 0.007272594142705202, 0.07750549167394638, 0.06930477172136307, 0.007030998356640339, 0.020727451890707016, 0.24751393496990204], [0.002848736010491848, 0.0007423038478009403, 0.0048252884298563, 0.0009277939680032432, 0.00026626561884768307, 0.0007104513933882117, 0.0038932666648179293, 0.35254186391830444, 0.011644139885902405, 
0.024805789813399315, 0.008339283987879753, 0.008197507821023464, 0.10998876392841339, 0.10456009954214096, 0.02098405361175537, 0.10858126729726791, 0.23614317178726196], [0.0019770367071032524, 0.0005874845664948225, 0.005322036799043417, 0.001890881103463471, 0.0007340035517700016, 0.0004957995843142271, 0.02190230041742325, 0.25823909044265747, 0.03236871585249901, 0.05970766022801399, 0.007474488578736782, 0.009765254333615303, 0.16584376990795135, 0.14644932746887207, 0.02097996324300766, 0.07949724048376083, 0.1867649257183075], [0.002232104307040572, 0.005492780823260546, 0.0063850162550807, 0.005171177908778191, 0.0010029423283413053, 0.0024654208682477474, 0.005871399771422148, 0.7407540082931519, 0.0016971953446045518, 0.003997058607637882, 0.0009317698422819376, 0.0014509912580251694, 0.004707454703748226, 0.001061597722582519, 0.0006480918964371085, 0.0029277054127305746, 0.21320320665836334]], [[0.13112637400627136, 0.098299041390419, 0.03797876834869385, 0.08319570124149323, 0.01173576433211565, 0.06636365503072739, 0.3366886079311371, 0.09494784474372864, 0.015217344276607037, 0.002942705526947975, 0.0007382103358395398, 0.0037700894754379988, 0.03500792384147644, 0.003419391345232725, 0.0005956703098490834, 0.0026796257589012384, 0.07529328763484955], [0.08929329365491867, 0.17417871952056885, 0.07677249610424042, 0.05096729099750519, 0.014344840310513973, 0.03537403792142868, 0.04329613223671913, 0.19510518014431, 0.054338205605745316, 0.01832316629588604, 0.004299004562199116, 0.005284843500703573, 0.011951224878430367, 0.008844699710607529, 0.01188187301158905, 0.01629089191555977, 0.18945419788360596], [0.018616752699017525, 0.026499483734369278, 0.1842060536146164, 0.05412338301539421, 0.04867803305387497, 0.022157808765769005, 0.049234408885240555, 0.2578485310077667, 0.01987922005355358, 0.026008866727352142, 0.012836821377277374, 0.007937817834317684, 0.0034649462904781103, 0.013394936919212341, 0.026508202776312828, 0.031100325286388397, 
*(Output truncated: a large nested array of floating-point values, printed by the notebook cell above, has been omitted here for brevity.)*
0.029814638197422028, 0.011814289726316929, 0.008182005025446415, 0.0032363005448132753, 0.022044910117983818, 0.056778859347105026, 0.26119160652160645, 0.009015236049890518, 0.009280598722398281, 0.004600833635777235, 0.013636325486004353, 0.021582169458270073, 0.023674391210079193, 0.007662277203053236, 0.018768150359392166, 0.467022567987442], [0.039222002029418945, 0.043641261756420135, 0.02286531776189804, 0.004107121843844652, 0.0040258243680000305, 0.045961979776620865, 0.04067198932170868, 0.20811690390110016, 0.0079402020201087, 0.018867485225200653, 0.006081747822463512, 0.00906203594058752, 0.016396893188357353, 0.016430266201496124, 0.054756175726652145, 0.06617768108844757, 0.3956751823425293], [0.02755782939493656, 0.0332137756049633, 0.01498511154204607, 0.007432879880070686, 0.00479091377928853, 0.036900416016578674, 0.03010246902704239, 0.24406196177005768, 0.004864844027906656, 0.008963802829384804, 0.0019401784520596266, 0.004367450252175331, 0.007558388635516167, 0.012306895107030869, 0.011382388882339, 0.01914852298796177, 0.5304221510887146], [0.013592668808996677, 0.01773017644882202, 0.018561914563179016, 0.029333721846342087, 0.00874125026166439, 0.01348085142672062, 0.01935093104839325, 0.1219952255487442, 0.0028546026442199945, 0.005693513434380293, 0.0035189699847251177, 0.00231643277220428, 0.004245359916239977, 0.002541603520512581, 0.004276231862604618, 0.004385537002235651, 0.7273810505867004]], [[0.009952981024980545, 0.04738154634833336, 0.005876624956727028, 0.00358164613135159, 0.0065017701126635075, 0.033864617347717285, 0.20481941103935242, 0.16318625211715698, 0.01629616506397724, 0.009542688727378845, 0.011546830646693707, 0.008731205016374588, 0.006188470404595137, 0.01297728717327118, 0.016307521611452103, 0.028125157579779625, 0.41511979699134827], [0.09298623353242874, 0.00035160023253411055, 0.0002208138903370127, 0.0005738306790590286, 0.00026605636230669916, 0.001048707403242588, 0.05959992855787277, 
0.09746970236301422, 0.006356380879878998, 0.013193927705287933, 0.0026062987744808197, 0.002854249905794859, 0.011599290184676647, 0.010597079992294312, 0.0027982122264802456, 0.023695431649684906, 0.6737822890281677], [0.06408531218767166, 0.0017792348517104983, 0.0005354708409868181, 0.0022419702727347612, 0.0016292397631332278, 0.0034711831249296665, 0.10403171926736832, 0.17723245918750763, 0.006626461166888475, 0.01894967071712017, 0.004174056928604841, 0.005490654148161411, 0.022562619298696518, 0.010317179374396801, 0.008353099226951599, 0.04515991359949112, 0.5233598351478577], [0.033806588500738144, 0.0013781951274722815, 0.0008582998998463154, 0.000281147105852142, 0.0002780203358270228, 0.002088468987494707, 0.015017666853964329, 0.14943912625312805, 0.009429430589079857, 0.006226611789315939, 0.004327022936195135, 0.0029319231398403645, 0.007113713771104813, 0.010861149057745934, 0.0021692633163183928, 0.008762134239077568, 0.7450311183929443], [0.025979332625865936, 0.004423297941684723, 0.0023171042557805777, 0.001342316623777151, 0.00048344553215429187, 0.005202721804380417, 0.08007914572954178, 0.17749916017055511, 0.013963770121335983, 0.011629631742835045, 0.006705607753247023, 0.00352342426776886, 0.006770276464521885, 0.014785654842853546, 0.0034628526773303747, 0.016041573137044907, 0.6257906556129456], [0.04126323014497757, 0.000779440626502037, 0.0002913215139415115, 0.00040699890814721584, 0.00019081206119153649, 0.0005449484451673925, 0.04097115620970726, 0.131930872797966, 0.00479675829410553, 0.004714667331427336, 0.00319970422424376, 0.001387981348671019, 0.005079562775790691, 0.00883167702704668, 0.00305686192587018, 0.013283140026032925, 0.7392708659172058], [0.02620951645076275, 0.008779098279774189, 0.010552973486483097, 0.0015640028286725283, 0.0023082434199750423, 0.009384912438690662, 0.01815946400165558, 0.13902446627616882, 0.012570589780807495, 0.009399722330272198, 0.01834506168961525, 0.008715079165995121, 
0.0056093232706189156, 0.017529653385281563, 0.015946317464113235, 0.01876763068139553, 0.6771338582038879], [0.11347553879022598, 0.025345291942358017, 0.015065250918269157, 0.01908656768500805, 0.011953286826610565, 0.02736031450331211, 0.17395128309726715, 0.11931822448968887, 0.029880160465836525, 0.030435921624302864, 0.028830060735344887, 0.025185909122228622, 0.038336385041475296, 0.043343670666217804, 0.02363160438835621, 0.06330331414937973, 0.21149714291095734], [0.02720571495592594, 0.025808049365878105, 0.002410859102383256, 0.015553703531622887, 0.03539576753973961, 0.12781444191932678, 0.07928664982318878, 0.19401277601718903, 0.0026284398045390844, 0.006241445429623127, 0.003444816218689084, 0.0034225096460431814, 0.006596450228244066, 0.0057450272142887115, 0.003905800636857748, 0.01936313696205616, 0.44116446375846863], [0.014987330883741379, 0.01654433086514473, 0.004670808557420969, 0.005397018976509571, 0.01839473471045494, 0.0595112070441246, 0.056780021637678146, 0.24405008554458618, 0.005477063823491335, 0.005542357452213764, 0.012659691274166107, 0.004686854314059019, 0.006718002259731293, 0.018565839156508446, 0.008033482357859612, 0.022138889878988266, 0.49584221839904785], [0.023367779329419136, 0.00929228775203228, 0.0057346755638718605, 0.010682579129934311, 0.03002633899450302, 0.03418240323662758, 0.06527934223413467, 0.15962931513786316, 0.0058802710846066475, 0.016777807846665382, 0.005971783772110939, 0.005403390619903803, 0.013459893874824047, 0.018081102520227432, 0.011667517945170403, 0.0308490339666605, 0.5537144541740417], [0.033160556107759476, 0.018078090623021126, 0.003678599139675498, 0.015722043812274933, 0.022328738123178482, 0.035864416509866714, 0.035239096730947495, 0.20345520973205566, 0.008283070288598537, 0.00493867602199316, 0.006215236149728298, 0.0013074162416160107, 0.011122860945761204, 0.016681626439094543, 0.0038104248233139515, 0.013658096082508564, 0.5664558410644531], [0.008265090174973011, 
0.07553514093160629, 0.012609598226845264, 0.009644989855587482, 0.020348671823740005, 0.18203967809677124, 0.04354019835591316, 0.1949760913848877, 0.0037799314595758915, 0.003312955843284726, 0.004396146163344383, 0.0024564391933381557, 0.0014623665483668447, 0.005814823321998119, 0.0032012425363063812, 0.010792434215545654, 0.41782429814338684], [0.02328040637075901, 0.031033296138048172, 0.004317097365856171, 0.02397785894572735, 0.03226369246840477, 0.155626580119133, 0.10364849865436554, 0.21960018575191498, 0.0026270789094269276, 0.005100365728139877, 0.0027488917112350464, 0.0032041475642472506, 0.005506304558366537, 0.004633365664631128, 0.002524188719689846, 0.014377628453075886, 0.3655303418636322], [0.054188668727874756, 0.01049194484949112, 0.02951459214091301, 0.00977024994790554, 0.028101466596126556, 0.031393393874168396, 0.12242131680250168, 0.18197093904018402, 0.010021853260695934, 0.02751804329454899, 0.018421322107315063, 0.0161918755620718, 0.016567107290029526, 0.025737939402461052, 0.007757308427244425, 0.06321997195482254, 0.34671199321746826], [0.03405828773975372, 0.04465678706765175, 0.044988345354795456, 0.0158408060669899, 0.036020077764987946, 0.07103562355041504, 0.11676257103681564, 0.22488602995872498, 0.011830955743789673, 0.009717056527733803, 0.009797455742955208, 0.0113305589184165, 0.015888774767518044, 0.018153980374336243, 0.0065796454437077045, 0.011301378719508648, 0.3171516954898834], [0.07268758118152618, 0.06594744324684143, 0.049214161932468414, 0.06161388009786606, 0.06561169028282166, 0.07568227499723434, 0.16902905702590942, 0.05642540752887726, 0.040534183382987976, 0.04548986628651619, 0.035365715622901917, 0.03673427551984787, 0.051265012472867966, 0.04364994168281555, 0.030975796282291412, 0.05540724843740463, 0.044366504997015]], [[0.005204841028898954, 0.03407548367977142, 0.002107110572978854, 0.1225392073392868, 0.015153540298342705, 0.050548799335956573, 0.6777923107147217, 0.04215997830033302, 
0.0019765684846788645, 0.0022024039644747972, 0.0008170875371433794, 0.0005653582629747689, 0.0031630704179406166, 0.00042511516949161887, 0.00014279474271461368, 0.0006465389742515981, 0.04047980159521103], [0.14373140037059784, 0.02910265326499939, 0.008474052883684635, 0.00771362753584981, 0.006538440473377705, 0.017508842051029205, 0.006405793130397797, 0.17720316350460052, 0.011413845233619213, 0.02183070033788681, 0.0028578867204487324, 0.0057427240535616875, 0.020800096914172173, 0.0018734498880803585, 0.0013526835246011615, 0.008654248900711536, 0.528796374797821], [0.02394009567797184, 0.017240235581994057, 0.012724577449262142, 0.01197814755141735, 0.004304948262870312, 0.018189184367656708, 0.002811320824548602, 0.106321319937706, 0.0020229408983141184, 0.0029710757080465555, 0.0015049027279019356, 0.0019795820116996765, 0.003968261182308197, 0.0018299929797649384, 0.0014813030138611794, 0.002951623173430562, 0.7837804555892944], [0.07994788140058517, 0.014072759076952934, 0.03966859355568886, 0.026510458439588547, 0.006724148523062468, 0.0042208340018987656, 0.0015313595067709684, 0.07985836267471313, 0.005621130578219891, 0.025973543524742126, 0.005685980431735516, 0.007424330338835716, 0.010493740439414978, 0.0017971335910260677, 0.0018485661130398512, 0.009578398428857327, 0.6790427565574646], [0.04743397980928421, 0.03711431100964546, 0.08984775096178055, 0.04952513054013252, 0.027655702084302902, 0.046471353620290756, 0.011732588522136211, 0.14207734167575836, 0.0027335607446730137, 0.00987397599965334, 0.003886388847604394, 0.0024463788140565157, 0.005156119354069233, 0.001468632253818214, 0.0023436136543750763, 0.010650315321981907, 0.5095828175544739], [0.12473386526107788, 0.06508834660053253, 0.02551909349858761, 0.016711929813027382, 0.016396967694163322, 0.050547242164611816, 0.01509903371334076, 0.23998820781707764, 0.011749571189284325, 0.013211091049015522, 0.004407459404319525, 0.00413055345416069, 0.018108565360307693, 
0.005678538698703051, 0.002544565824791789, 0.014523865655064583, 0.37156108021736145], [0.03510328754782677, 0.03967694193124771, 0.04875274375081062, 0.04046376422047615, 0.026164203882217407, 0.10067471116781235, 0.03682190552353859, 0.19395391643047333, 0.0035414870362728834, 0.008903183043003082, 0.006341889966279268, 0.0020892443135380745, 0.008663601242005825, 0.006561058573424816, 0.005935242865234613, 0.018559131771326065, 0.4177936911582947], [0.0015197531320154667, 0.0026193922385573387, 0.0029102624393999577, 0.002283281646668911, 0.0006328781601041555, 0.0021114093251526356, 0.0032510675955563784, 0.052734874188899994, 0.0003216416807845235, 0.0007327760104089975, 0.0007540354272350669, 0.0002972047950606793, 0.0010756902629509568, 0.0002729342086240649, 0.0004701016878243536, 0.0013630398316308856, 0.9266497492790222], [0.013844256289303303, 0.007088549435138702, 0.0024541073944419622, 0.002832758706063032, 0.0003913055988959968, 0.0017200567526742816, 0.0017184932949021459, 0.03334829583764076, 0.009892580099403858, 0.012575415894389153, 0.0038891471922397614, 0.00887538492679596, 0.013123350217938423, 0.0020779240876436234, 0.0011346886167302728, 0.0037584041710942984, 0.8812752366065979], [0.10204585641622543, 0.05041077360510826, 0.015535542741417885, 0.01755315065383911, 0.001965764444321394, 0.006713863927870989, 0.010417301207780838, 0.022889409214258194, 0.1259976178407669, 0.1013011485338211, 0.009401791729032993, 0.061093609780073166, 0.08726859092712402, 0.020065616816282272, 0.0038671300280839205, 0.01900539919734001, 0.34446752071380615], [0.0034805976320058107, 0.004502805881202221, 0.004970182664692402, 0.0036184703931212425, 0.0003958944871556014, 0.0020440497901290655, 0.003213439369574189, 0.021063873544335365, 0.006273280829191208, 0.016385797411203384, 0.008800084702670574, 0.008656669408082962, 0.01625555381178856, 0.005845724139362574, 0.003200517501682043, 0.010672219097614288, 0.8806210160255432], [0.027767449617385864, 
0.02237100526690483, 0.009050381369888783, 0.00980295054614544, 0.0015436640242114663, 0.004774736240506172, 0.005631148815155029, 0.03611321002244949, 0.0648488700389862, 0.038524650037288666, 0.012134503573179245, 0.03662845864892006, 0.062070515006780624, 0.0170111283659935, 0.004340749699622393, 0.014499379321932793, 0.6328871250152588], [0.01581813581287861, 0.003917735535651445, 0.0013730424689128995, 0.0033506813924759626, 0.0002956163662020117, 0.0017931296024471521, 0.004185955971479416, 0.02266732230782509, 0.009949246421456337, 0.013715957291424274, 0.004446935374289751, 0.011877600103616714, 0.01683041825890541, 0.003616312285885215, 0.0017293255077674985, 0.007327555678784847, 0.8771049380302429], [0.006913828197866678, 0.005203189793974161, 0.0014318230096250772, 0.0014848390128463507, 0.0003076017019338906, 0.0026958186645060778, 0.0020691261161118746, 0.048091497272253036, 0.00479669077321887, 0.0034601581282913685, 0.0038403982762247324, 0.003438242245465517, 0.008097176440060139, 0.004802146460860968, 0.0026416077744215727, 0.007860969752073288, 0.8928649425506592], [0.0025386104825884104, 0.0012687371345236897, 0.0012562754563987255, 0.0013890565605834126, 0.00016666334704495966, 0.0016160283703356981, 0.004016486927866936, 0.02181178145110607, 0.003016178496181965, 0.006748720537871122, 0.004395125433802605, 0.006255495827645063, 0.007816332392394543, 0.010647032409906387, 0.011209199205040932, 0.024585027247667313, 0.8912632465362549], [0.00715211033821106, 0.011410025879740715, 0.007488959934562445, 0.007057667709887028, 0.0012691087322309613, 0.00901399552822113, 0.0116407610476017, 0.047445062547922134, 0.012017988599836826, 0.030119163915514946, 0.010706176050007343, 0.016342613846063614, 0.017787788063287735, 0.025879979133605957, 0.017043061554431915, 0.05693989619612694, 0.7106855511665344], [0.002402522834017873, 0.003384125418961048, 0.0087100425735116, 0.008039746433496475, 0.0018000125419348478, 0.0034357854165136814, 
0.0035392078571021557, 0.03138187900185585, 0.0014184446772560477, 0.004346534144133329, 0.004506658297032118, 0.0019616528879851103, 0.003484683111310005, 0.00155422103125602, 0.002486765617504716, 0.00468825688585639, 0.9128594994544983]], [[0.019963618367910385, 0.017944909632205963, 0.008660348132252693, 0.0829276368021965, 0.008861251175403595, 0.01492066215723753, 0.012928025797009468, 0.13546733558177948, 0.00262853573076427, 0.0014781263889744878, 0.003441938664764166, 0.0023351472336798906, 0.01256809663027525, 0.0022736184764653444, 0.0034237003419548273, 0.0075202807784080505, 0.6626569032669067], [0.001451709191314876, 0.02922528237104416, 0.011584442108869553, 0.010571611113846302, 0.0013170856982469559, 0.008208170533180237, 0.006780949886888266, 0.08624384552240372, 0.0052875555120408535, 0.0038706744089722633, 0.004101715981960297, 0.0027224882505834103, 0.0015493856044486165, 0.0028075301088392735, 0.0006345934816636145, 0.006242303643375635, 0.8174006938934326], [0.0001719670108286664, 0.0035341333132237196, 0.021625351160764694, 0.0036666709929704666, 0.000366371386917308, 0.0018375549698248506, 0.0027572105173021555, 0.04479680582880974, 0.00033248934778384864, 0.001326196943409741, 0.002177425194531679, 0.001169707509689033, 0.00035607090103439987, 0.00029292749240994453, 0.0004649751936085522, 0.0027508726343512535, 0.9123733639717102], [0.0009402955183759332, 0.006159205920994282, 0.005289788357913494, 0.03663551062345505, 0.003113113809376955, 0.0038910494185984135, 0.0035400190390646458, 0.054859526455402374, 0.002104731509461999, 0.0027322738897055387, 0.002351493341848254, 0.004767206497490406, 0.005189696326851845, 0.001345552853308618, 0.0006212627631612122, 0.003477463498711586, 0.8629818558692932], [0.0007799412705935538, 0.002961116610094905, 0.002921845531091094, 0.008369104005396366, 0.008242540061473846, 0.006066012196242809, 0.004739702679216862, 0.06448233872652054, 0.0012943390756845474, 0.003386968746781349, 
0.0032445744145661592, 0.00345986126922071, 0.003348858328536153, 0.0014997366815805435, 0.00042004932765848935, 0.0017598906997591257, 0.8830230832099915], [0.0011904079001396894, 0.01727544330060482, 0.010482990182936192, 0.006945687346160412, 0.002596807898953557, 0.03711846098303795, 0.013369420543313026, 0.08942043036222458, 0.00395588856190443, 0.004293615464121103, 0.0016496313037350774, 0.0031550463754683733, 0.0008638176368549466, 0.005054011009633541, 0.0011756974272429943, 0.004799703136086464, 0.7966529130935669], [0.0023757999297231436, 0.011649273335933685, 0.011796004138886929, 0.008691658265888691, 0.003217199118807912, 0.0060715265572071075, 0.013582095503807068, 0.09593246132135391, 0.007818793877959251, 0.014080547727644444, 0.0024773161858320236, 0.003488111076876521, 0.003472256241366267, 0.0015034826938062906, 0.002870396710932255, 0.0073859053663909435, 0.8035871982574463], [0.0007149599259719253, 0.004970194306224585, 0.007638526149094105, 0.0038881544023752213, 0.0009603402577340603, 0.0033868192695081234, 0.004386423621326685, 0.06881207227706909, 0.0008635923149995506, 0.0020303898490965366, 0.0020160796120762825, 0.000930745794903487, 0.0011523665161803365, 0.0009559186873957515, 0.0013196816435083747, 0.0020660783629864454, 0.8939076066017151], [0.0027490798383951187, 0.06937077641487122, 0.0027846633456647396, 0.010555428452789783, 0.004104809835553169, 0.016678549349308014, 0.005868773441761732, 0.07777124643325806, 0.16701601445674896, 0.02342330850660801, 0.004890852142125368, 0.008676744997501373, 0.017197806388139725, 0.019120611250400543, 0.002128365682438016, 0.00863741710782051, 0.5590255856513977], [0.0003786626330111176, 0.006038539577275515, 0.0031546750105917454, 0.002935584168881178, 0.0008885682909749448, 0.002901389030739665, 0.006924943532794714, 0.04647042974829674, 0.007295882795006037, 0.06296028941869736, 0.0011092759668827057, 0.0033829393796622753, 0.0038824931252747774, 0.0013635430950671434, 
0.0022644298151135445, 0.005618731491267681, 0.8424296975135803], [0.0005607164348475635, 0.003467060159891844, 0.003592583118006587, 0.007783203385770321, 0.0010583025868982077, 0.0011570238275453448, 0.0033899289555847645, 0.043512772768735886, 0.0023253445979207754, 0.0033401160035282373, 0.06721832603216171, 0.0019491189159452915, 0.006089289207011461, 0.0014075723011046648, 0.005742185283452272, 0.00932177621871233, 0.8380846381187439], [0.002854106714949012, 0.013637579046189785, 0.0017915490316227078, 0.004972030408680439, 0.0011234721168875694, 0.007480855565518141, 0.007400804664939642, 0.05811676010489464, 0.009035220369696617, 0.006369575392454863, 0.002448309911414981, 0.039697445929050446, 0.006502930074930191, 0.004875097889453173, 0.003461618674919009, 0.012202311307191849, 0.8180304169654846], [0.006158399861305952, 0.01355404406785965, 0.002361282706260681, 0.016571441665291786, 0.006993391085416079, 0.004436856601387262, 0.0062576839700341225, 0.09070660918951035, 0.00902932696044445, 0.011618823744356632, 0.005224694963544607, 0.0052330526523292065, 0.08861920237541199, 0.005328644532710314, 0.0035941810347139835, 0.008537694811820984, 0.7157747149467468], [0.0006229078280739486, 0.015455705113708973, 0.0012113885022699833, 0.0027264016680419445, 0.0016949604032561183, 0.01601889356970787, 0.002823124174028635, 0.0633392482995987, 0.008544481359422207, 0.005897350609302521, 0.0011063824640586972, 0.0031771736685186625, 0.002762431977316737, 0.04461854323744774, 0.007901536300778389, 0.0131955835968256, 0.8089039325714111], [1.9743392840609886e-05, 0.0004500415816437453, 0.0004715374088846147, 0.00039749735151417553, 8.697786688571796e-05, 0.0004887345130555332, 0.0008893915801309049, 0.025480810552835464, 0.0001090382065740414, 0.0008887277217581868, 0.00031570845749229193, 0.00028931332053616643, 0.00013554411998484284, 0.000275845464784652, 0.018402287736535072, 0.003188299247995019, 0.9481105208396912], [0.00037971074925735593, 
0.006612955126911402, 0.0020894003100693226, 0.0037192211020737886, 0.0006177524919621646, 0.004095480777323246, 0.003083091462031007, 0.044498663395643234, 0.0010159629164263606, 0.0017183946911245584, 0.0009487641509622335, 0.0017218609573319554, 0.0007459381013177335, 0.0025372295640408993, 0.00574138667434454, 0.015787392854690552, 0.904686689376831], [0.001449964358471334, 0.009770967066287994, 0.01471242681145668, 0.006604079622775316, 0.0021621491760015488, 0.005783585831522942, 0.007534752134233713, 0.0740838572382927, 0.003073271131142974, 0.005856260657310486, 0.00516884122043848, 0.002985231811180711, 0.0042382958345115185, 0.0026556849479675293, 0.0028897086158394814, 0.004687050823122263, 0.8463438153266907]], [[0.016096120700240135, 0.13382329046726227, 0.014357886277139187, 0.021329296752810478, 0.02052520215511322, 0.07405827939510345, 0.408550500869751, 0.1327538937330246, 0.0018403137801215053, 0.0024281744845211506, 0.0011819369392469525, 0.0015521059976890683, 0.0058926865458488464, 0.0023284454364329576, 0.00250889896415174, 0.004178596194833517, 0.1565943956375122], [0.037153638899326324, 0.038786303251981735, 0.02145415171980858, 0.01877027563750744, 0.003492446383461356, 0.010193943977355957, 0.4106038510799408, 0.0827629566192627, 0.002281911438331008, 0.0074691204354166985, 0.000514674000442028, 0.001166763948276639, 0.0034067693632096052, 0.0007805405184626579, 0.0007238438702188432, 0.004326911643147469, 0.3561118245124817], [0.001365329371765256, 0.0035628604236990213, 0.08014890551567078, 0.040386129170656204, 0.007226244080811739, 0.0015976839931681752, 0.04852479696273804, 0.03362465649843216, 0.0003860781725961715, 0.003915050532668829, 0.0007521227817051113, 0.000802406168077141, 0.0008350430289283395, 0.00021051250223536044, 0.0006633169250562787, 0.0017157656839117408, 0.7742831110954285], [0.01886221393942833, 0.006805638782680035, 0.07404080033302307, 0.11813027411699295, 0.013047714717686176, 0.0032274613622576, 
0.18080078065395355, 0.03769517317414284, 0.0007883197977207601, 0.00939881894737482, 0.0008239536546170712, 0.001461843610741198, 0.0025893014390021563, 0.00025677744997665286, 0.0007734624086879194, 0.004707112908363342, 0.5265903472900391], [0.011932761408388615, 0.006194130517542362, 0.058642152696847916, 0.11044751852750778, 0.022584617137908936, 0.013125652447342873, 0.3767683804035187, 0.043282393366098404, 0.0004197584348730743, 0.005317150615155697, 0.00017148190818261355, 0.0005991854704916477, 0.0009085366036742926, 0.0003171299467794597, 0.0011311935959383845, 0.003780751023441553, 0.34437718987464905], [0.06856103241443634, 0.03890499100089073, 0.023272819817066193, 0.03888506442308426, 0.004010130185633898, 0.016962900757789612, 0.4789724051952362, 0.08434727042913437, 0.0011918001109734178, 0.004493058659136295, 0.0001661858696024865, 0.0008081780397333205, 0.0016254259971901774, 0.0003497777506709099, 0.00077697669621557, 0.003230108181014657, 0.2334417998790741], [0.11682802438735962, 0.009409385733306408, 0.021767914295196533, 0.035798151046037674, 0.003938011359423399, 0.008100315928459167, 0.08025515824556351, 0.08431868255138397, 0.013857990503311157, 0.03613949939608574, 0.004222785122692585, 0.00971829704940319, 0.01599297672510147, 0.006514201872050762, 0.003820879617705941, 0.017440643161535263, 0.5318770408630371], [0.003087541786953807, 0.004042636137455702, 0.0030598745215684175, 0.002984779654070735, 0.000565577473025769, 0.0017542984569445252, 0.019127970561385155, 0.07924424856901169, 0.0008036916260607541, 0.0014210441149771214, 0.000969588290899992, 0.0006972216651774943, 0.0013304202584549785, 0.0007782673928886652, 0.0008404210093431175, 0.002356303157284856, 0.8769360184669495], [0.00957976933568716, 0.005308450665324926, 0.004039777908474207, 0.004764830693602562, 0.0005008836160413921, 0.0008811689913272858, 0.006140991114079952, 0.03846541419625282, 0.010539148934185505, 0.0108424611389637, 0.0034803643357008696, 
0.00765868229791522, 0.013385786674916744, 0.0035465536639094353, 0.002243749564513564, 0.007352892309427261, 0.8712690472602844], [0.02937508001923561, 0.009161903522908688, 0.03872396796941757, 0.010456687770783901, 0.0009737348882481456, 0.0019515060121193528, 0.01184465829282999, 0.027693478390574455, 0.0035726658534258604, 0.02475596033036709, 0.0010291518410667777, 0.008745943196117878, 0.012134036980569363, 0.0014358563348650932, 0.0039915586821734905, 0.014203697443008423, 0.7999500036239624], [0.0030119242146611214, 0.0006561827030964196, 0.0032892420422285795, 0.0023429165594279766, 0.0001588680170243606, 0.00018379726679995656, 0.005729831289499998, 0.02208845131099224, 0.002577513223513961, 0.007097077090293169, 0.0030587513465434313, 0.003712387289851904, 0.004916946403682232, 0.0019920659251511097, 0.005571035202592611, 0.017654256895184517, 0.91595858335495], [0.04107620194554329, 0.007080568466335535, 0.004519632086157799, 0.0042799669317901134, 0.0004111203597858548, 0.0007392471306957304, 0.006474243942648172, 0.04911993816494942, 0.008500535972416401, 0.010510614141821861, 0.0033881955314427614, 0.010264644399285316, 0.014346550218760967, 0.0040259999223053455, 0.0032017845660448074, 0.008532804436981678, 0.8235278725624084], [0.011058983393013477, 0.0024936494883149862, 0.0023217913694679737, 0.0016212797490879893, 0.0002243022172478959, 0.0006530315149575472, 0.01382074411958456, 0.03666466102004051, 0.006207116413861513, 0.012667644768953323, 0.003621638985350728, 0.007075544912368059, 0.012804175727069378, 0.0060827829875051975, 0.00492517976090312, 0.014195364899933338, 0.8635622262954712], [0.013653285801410675, 0.003218978177756071, 0.0027714110910892487, 0.0016830825479701161, 0.000355742551619187, 0.0008025728748179972, 0.013211856596171856, 0.05723506212234497, 0.011904837563633919, 0.00827172864228487, 0.007605534978210926, 0.008654761128127575, 0.022202743217349052, 0.021843893453478813, 0.013907914981245995, 0.04029374197125435, 
0.1434815227985382], [0.0019068621331825852, 0.004382157698273659, 0.0018128089141100645, 0.00594665901735425, 0.0024996513966470957, 0.005481114611029625, 0.00356927327811718, 0.3514494299888611, 0.0007200753898359835, 0.00167653092648834, 0.0008328179828822613, 0.0009431089856661856, 0.001657589222304523, 0.0013315133983269334, 0.0017236035782843828, 0.00315262982621789, 0.6109141111373901], [0.008972134441137314, 0.011384209617972374, 0.0017613935051485896, 0.009415870532393456, 0.003064833115786314, 0.007505562622100115, 0.006640893407166004, 0.33340805768966675, 0.004573074635118246, 0.007375861518085003, 0.0043639978393912315, 0.003601818112656474, 0.007725631818175316, 0.007405474316328764, 0.005547629203647375, 0.008368537761271, 0.5688850283622742], [0.039139069616794586, 0.02945924922823906, 0.006195316556841135, 0.02591659501194954, 0.019007060676813126, 0.03293213993310928, 0.012089717201888561, 0.2783716917037964, 0.01563883386552334, 0.01757432520389557, 0.008490326814353466, 0.01594448834657669, 0.039579931646585464, 0.017504308372735977, 0.00606664689257741, 0.01680098846554756, 0.4192892611026764], [0.0020675009582191706, 0.0020794107113033533, 0.0004512038140092045, 0.0013710932107642293, 0.0008696326403878629, 0.0031197366770356894, 0.0028233258053660393, 0.3159424364566803, 0.0008264543721452355, 0.0026929317973554134, 0.0006418319535441697, 0.0012668330455198884, 0.004074615426361561, 0.0018067086348310113, 0.003187961643561721, 0.006306775379925966, 0.6504716277122498], [0.008379057049751282, 0.007660302333533764, 0.0016622405964881182, 0.007345058489590883, 0.004450858570635319, 0.011136407032608986, 0.007255153264850378, 0.3107338547706604, 0.006476256996393204, 0.010816916823387146, 0.006973301526159048, 0.004971171263605356, 0.017492862418293953, 0.007554244715720415, 0.003542584367096424, 0.009426993317902088, 0.5741226077079773], [0.02080857753753662, 0.017689505591988564, 0.0028127802070230246, 0.008033149875700474, 
0.0027605409268289804, 0.019351810216903687, 0.014678510837256908, 0.3184175491333008, 0.015050753951072693, 0.016025638207793236, 0.019362716004252434, 0.014855110086500645, 0.013317299075424671, 0.016443397849798203, 0.012139454483985901, 0.021641263738274574, 0.4666120707988739], [0.003064144402742386, 0.002824552357196808, 0.0005567368934862316, 0.0025749315973371267, 0.0012546905782073736, 0.004475915804505348, 0.00318732438609004, 0.33179470896720886, 0.003044947749003768, 0.00405132444575429, 0.003746705362573266, 0.0027252414729446173, 0.008651207201182842, 0.005377228371798992, 0.004926381632685661, 0.00733488192781806, 0.6104092001914978], [0.0020273199770599604, 0.002550124190747738, 0.0004479243070818484, 0.001750982366502285, 0.0014624390751123428, 0.005270801484584808, 0.0035210230853408575, 0.33058226108551025, 0.0022681145928800106, 0.004073954187333584, 0.0014548917533829808, 0.0038858617190271616, 0.006654147990047932, 0.004717937204986811, 0.002741041826084256, 0.008752353489398956, 0.6178388595581055], [0.016818545758724213, 0.015338807366788387, 0.003248669672757387, 0.00875090528279543, 0.006946306675672531, 0.027110446244478226, 0.01833324506878853, 0.277842253446579, 0.019766584038734436, 0.033802684396505356, 0.014293723739683628, 0.01551347877830267, 0.04968661069869995, 0.026264827698469162, 0.018351318314671516, 0.027776110917329788, 0.42015540599823], [0.0026396960020065308, 0.005944147240370512, 0.002647501416504383, 0.008423368446528912, 0.00321288057602942, 0.007306626997888088, 0.00531102204695344, 0.35773780941963196, 0.0011507720919325948, 0.0022929287515580654, 0.001232140464708209, 0.0014707983937114477, 0.00245307176373899, 0.002024380723014474, 0.0023018536157906055, 0.004039027728140354, 0.5898120403289795]], [[0.04800168424844742, 0.083704873919487, 0.0362408272922039, 0.0612020418047905, 0.01939435862004757, 0.042116910219192505, 0.059538647532463074, 0.027630694210529327, 0.11539368331432343, 0.043917082250118256, 
0.028212858363986015, 0.0462634451687336, 0.16703559458255768, 0.11943523585796356, 0.031129876151680946, 0.03965875878930092, 0.031123431399464607], [0.024722959846258163, 0.12860937416553497, 0.07810866087675095, 0.06464578956365585, 0.046371445059776306, 0.05859127268195152, 0.068037249147892, 0.06396441161632538, 0.06323465704917908, 0.060312479734420776, 0.02402551844716072, 0.09665482491254807, 0.027649683877825737, 0.034777283668518066, 0.053983185440301895, 0.03702535480260849, 0.06928592920303345], [0.036931660026311874, 0.023660529404878616, 0.03167734667658806, 0.012308362871408463, 0.01149784866720438, 0.016100024804472923, 0.025463471189141273, 0.25650081038475037, 0.01571366935968399, 0.028854407370090485, 0.023389151319861412, 0.031172964721918106, 0.03470131382346153, 0.014117686077952385, 0.014034667983651161, 0.03899049013853073, 0.3848855495452881], [0.03456014394760132, 0.04433441907167435, 0.034770604223012924, 0.04458116367459297, 0.0323064811527729, 0.03720846772193909, 0.029751814901828766, 0.23611228168010712, 0.03089768998324871, 0.01945125125348568, 0.018567804247140884, 0.029807372018694878, 0.025672396644949913, 0.016509700566530228, 0.037210896611213684, 0.028814375400543213, 0.2994431257247925], [0.07590455561876297, 0.03289211168885231, 0.021031538024544716, 0.025601878762245178, 0.023594042286276817, 0.05189882591366768, 0.03998985141515732, 0.18357814848423004, 0.02816777490079403, 0.034736260771751404, 0.013865520246326923, 0.07580208033323288, 0.018452633172273636, 0.02570684254169464, 0.051033712923526764, 0.07774655520915985, 0.21999764442443848], [0.04332859441637993, 0.059592265635728836, 0.036509159952402115, 0.04535648971796036, 0.0506095290184021, 0.0830661952495575, 0.05717173218727112, 0.1383264809846878, 0.039717722684144974, 0.04031620919704437, 0.021302545443177223, 0.07544399797916412, 0.020062098279595375, 0.028950637206435204, 0.0510261245071888, 0.049120478332042694, 0.1600998491048813], [0.01835658960044384, 
0.01365467719733715, 0.00576998433098197, 0.014165760949254036, 0.008845683187246323, 0.025218207389116287, 0.0257381834089756, 0.26286232471466064, 0.02751028537750244, 0.0195473525673151, 0.029272977262735367, 0.026405496522784233, 0.04130988195538521, 0.026949208229780197, 0.013819715939462185, 0.04643242061138153, 0.3941412568092346], [0.005077761132270098, 0.008481858298182487, 0.006909430492669344, 0.0063754115253686905, 0.00626607658341527, 0.009907298721373081, 0.011660799384117126, 0.3494066894054413, 0.003568548010662198, 0.0036061976570636034, 0.006961391307413578, 0.004364240448921919, 0.008432558737695217, 0.004373473580926657, 0.0063056801445782185, 0.010208433493971825, 0.5480942130088806], [0.04479110985994339, 0.020938469097018242, 0.029449041932821274, 0.011638815514743328, 0.01205146498978138, 0.021687813103199005, 0.01566934399306774, 0.20999109745025635, 0.04454429820179939, 0.045737188309431076, 0.03080802410840988, 0.030575744807720184, 0.06096185743808746, 0.029980752617120743, 0.04901446774601936, 0.027733733877539635, 0.31442686915397644], [0.020245997235178947, 0.011888718232512474, 0.01683388650417328, 0.008572207763791084, 0.007634407840669155, 0.015447032637894154, 0.017190929502248764, 0.19868670403957367, 0.01987418159842491, 0.05860292539000511, 0.04583500698208809, 0.06363064050674438, 0.06451527774333954, 0.028580324724316597, 0.02860206924378872, 0.06422074884176254, 0.3296389877796173], [0.013758668676018715, 0.009595423936843872, 0.006749797612428665, 0.004344916436821222, 0.003931278362870216, 0.010937336832284927, 0.007002472877502441, 0.2755400538444519, 0.011350201442837715, 0.02099747397005558, 0.04609410837292671, 0.019317712634801865, 0.04497183859348297, 0.01591489277780056, 0.01662302203476429, 0.027341432869434357, 0.46552935242652893], [0.00878020841628313, 0.014445461332798004, 0.011212329380214214, 0.013908833265304565, 0.0068495012819767, 0.017579704523086548, 0.02152176946401596, 0.18696393072605133, 
0.03176922723650932, 0.04556567221879959, 0.06310161203145981, 0.08729543536901474, 0.03383221477270126, 0.05239216983318329, 0.05396859720349312, 0.0716511458158493, 0.27916210889816284], [0.011757960543036461, 0.03441855311393738, 0.018426017835736275, 0.013140322640538216, 0.011919529177248478, 0.0438603013753891, 0.02173873968422413, 0.25471290946006775, 0.026234019547700882, 0.0267641544342041, 0.013883348554372787, 0.03176715224981308, 0.02942175790667534, 0.03432067856192589, 0.05594983324408531, 0.03221704438328743, 0.3394675552845001], [0.018792476505041122, 0.03650365769863129, 0.03408867493271828, 0.015645088627934456, 0.01188909262418747, 0.03398241847753525, 0.03505265340209007, 0.25461286306381226, 0.016482515260577202, 0.02781561017036438, 0.016233807429671288, 0.027501383796334267, 0.02342269755899906, 0.023273857310414314, 0.0534541979432106, 0.03316257894039154, 0.33808639645576477], [0.04774869233369827, 0.02932026982307434, 0.01863635890185833, 0.01836075633764267, 0.008509619161486626, 0.043175600469112396, 0.04796186834573746, 0.09238579869270325, 0.036234937608242035, 0.05946137756109238, 0.027325576171278954, 0.11441640555858612, 0.07072995603084564, 0.04912677779793739, 0.11453230679035187, 0.10464799404144287, 0.11742564290761948], [0.050868045538663864, 0.027037782594561577, 0.01641017571091652, 0.019620303064584732, 0.008655347861349583, 0.03136634826660156, 0.03830014169216156, 0.22560995817184448, 0.014443653635680676, 0.03386171907186508, 0.016614075750112534, 0.033631712198257446, 0.02582203783094883, 0.02169068157672882, 0.03762481361627579, 0.07834500819444656, 0.32009828090667725], [0.0058554490096867085, 0.008349255658686161, 0.007420639973133802, 0.007008175365626812, 0.006696092430502176, 0.009897248819470406, 0.012517311610281467, 0.34621790051460266, 0.00386608368717134, 0.004134634044021368, 0.00832203309983015, 0.004663162399083376, 0.009876835159957409, 0.00481819175183773, 0.007104278542101383, 0.011770089156925678, 
0.54148268699646]], [[0.2074430286884308, 0.14741578698158264, 0.0033191435504704714, 0.04809736832976341, 0.01198700163513422, 0.08085609972476959, 0.0562736950814724, 0.06281157582998276, 0.09421762079000473, 0.012885448522865772, 0.020288599655032158, 0.012258591130375862, 0.09828852862119675, 0.06451018899679184, 0.002725484548136592, 0.015925418585538864, 0.06069640815258026], [0.024529313668608665, 0.0593734011054039, 0.017329687252640724, 0.02269977331161499, 0.010221821255981922, 0.0484471395611763, 0.01769878901541233, 0.30250242352485657, 0.016154900193214417, 0.016319243237376213, 0.005342547316104174, 0.005754286888986826, 0.0094306580722332, 0.01667174883186817, 0.009067065082490444, 0.008337298408150673, 0.41011983156204224], [0.00961106363683939, 0.01739589497447014, 0.017575876787304878, 0.009604421444237232, 0.007021583151072264, 0.013477222993969917, 0.006432646885514259, 0.34103113412857056, 0.002910499693825841, 0.003699944820255041, 0.0010820941533893347, 0.001586521277204156, 0.0011604282772168517, 0.0029735988937318325, 0.0018788606394082308, 0.00433000735938549, 0.5582281947135925], [0.019364019855856895, 0.03563128039240837, 0.02533000521361828, 0.035002484917640686, 0.02213258668780327, 0.04294087737798691, 0.013286074623465538, 0.29597947001457214, 0.006531943567097187, 0.005418973974883556, 0.0016909103142097592, 0.0026729563251137733, 0.0026244947221130133, 0.0039365640841424465, 0.0019549408461898565, 0.003959295339882374, 0.48154303431510925], [0.009628181345760822, 0.03215481713414192, 0.021827371791005135, 0.031453363597393036, 0.018662652000784874, 0.030965562909841537, 0.00971188023686409, 0.31667032837867737, 0.004043431952595711, 0.004723276477307081, 0.001941994414664805, 0.0016159628285095096, 0.002147024730220437, 0.0032830198761075735, 0.0027111289091408253, 0.0036455481313169003, 0.5048143863677979], [0.040183790028095245, 0.06514546275138855, 0.02323908545076847, 0.0454246923327446, 0.02404799498617649, 
0.11239853501319885, 0.05933195725083351, 0.23828360438346863, 0.01974361203610897, 0.017040347680449486, 0.00971305463463068, 0.009082472883164883, 0.008680508472025394, 0.01414855383336544, 0.007187818642705679, 0.01137139555066824, 0.2949770987033844], [0.017410650849342346, 0.007138041313737631, 0.003709490643814206, 0.009871086105704308, 0.0063913362100720406, 0.022145552560687065, 0.011498366482555866, 0.343851774930954, 0.00515216076746583, 0.005045154131948948, 0.0018807390006259084, 0.002558512380346656, 0.001532169757410884, 0.002005973132327199, 0.001107715885154903, 0.003863105783239007, 0.5548381805419922], [0.004079516977071762, 0.0047203125432133675, 0.002821495523676276, 0.003850964829325676, 0.0016170816961675882, 0.0039874231442809105, 0.0015067528001964092, 0.3421068787574768, 0.0018677576445043087, 0.0021211367566138506, 0.0019017898011952639, 0.0013703879667446017, 0.0016723050503060222, 0.0017877464415505528, 0.0010031778365373611, 0.0031638951040804386, 0.6204213500022888], [0.04106517881155014, 0.02077764831483364, 0.0013252486241981387, 0.008748630993068218, 0.0031970154959708452, 0.009084319695830345, 0.0027802586555480957, 0.3039073944091797, 0.0253163930028677, 0.00783852580934763, 0.02427324280142784, 0.005749383941292763, 0.018512684851884842, 0.006781077943742275, 0.0016864087665453553, 0.0036968847271054983, 0.5152596831321716], [0.012803342193365097, 0.011074734851717949, 0.0019112450536340475, 0.0040617771446704865, 0.00308970850892365, 0.004915929399430752, 0.0032301375176757574, 0.3082076907157898, 0.012998942285776138, 0.007384052500128746, 0.01677638851106167, 0.004890452139079571, 0.005245671607553959, 0.004357943776994944, 0.002176630310714245, 0.0046351198107004166, 0.5922402143478394], [0.001983937807381153, 0.001338261878117919, 0.00037334332591854036, 0.001632934552617371, 0.0006188882980495691, 0.001111381221562624, 0.00016824305930640548, 0.2932882010936737, 0.0008275036234408617, 0.000866173708345741, 
0.009019426070153713, 0.0006821405841037631, 0.0009236318292096257, 0.0006526560173369944, 0.0005210568197071552, 0.0011271530529484153, 0.6848650574684143], [0.01473785750567913, 0.03897244110703468, 0.0044228206388652325, 0.011582763865590096, 0.005530445836484432, 0.012332597747445107, 0.007379963528364897, 0.29217132925987244, 0.022224081680178642, 0.010707680135965347, 0.04970953240990639, 0.015391586348414421, 0.010536017827689648, 0.004699649754911661, 0.003318654838949442, 0.005951371509581804, 0.4903312027454376], [0.012756599113345146, 0.005584961734712124, 0.0003656526387203485, 0.0028600210789591074, 0.0012193903094157577, 0.0033296719193458557, 0.002186309080570936, 0.3172440230846405, 0.011467457748949528, 0.004850554745644331, 0.005832584574818611, 0.0048761009238660336, 0.009281857870519161, 0.004790020175278187, 0.001543831080198288, 0.0035061032976955175, 0.6083047986030579], [0.01446382887661457, 0.015259920619428158, 0.0024432679638266563, 0.005181195680052042, 0.0026743989437818527, 0.010456634685397148, 0.002858532126992941, 0.3055904507637024, 0.017479103058576584, 0.010918620973825455, 0.021726764738559723, 0.007857512682676315, 0.01921847276389599, 0.014361138455569744, 0.006669118534773588, 0.010390042327344418, 0.5324509143829346], [0.00508884247392416, 0.006047142669558525, 0.0015585072105750442, 0.003651480423286557, 0.002216636436060071, 0.006175580900162458, 0.004610598087310791, 0.28795498609542847, 0.007041393779218197, 0.005238358862698078, 0.019474858418107033, 0.007629035506397486, 0.011667465791106224, 0.010918911546468735, 0.037192609161138535, 0.0171225443482399, 0.5664111375808716], [0.009754200465977192, 0.01005519088357687, 0.0027675193268805742, 0.006834049243479967, 0.0037423474714159966, 0.010609932243824005, 0.005040863994508982, 0.29746177792549133, 0.014197624288499355, 0.010821344330906868, 0.017718350514769554, 0.011069850996136665, 0.017552195116877556, 0.016413239762187004, 0.006116026546806097, 
0.019156591966748238, 0.5406889915466309], [0.0050877779722213745, 0.005290530156344175, 0.0033858553506433964, 0.0044509475119411945, 0.0019621127285063267, 0.004259302746504545, 0.0015664122765883803, 0.33982330560684204, 0.0023756742011755705, 0.0027328773867338896, 0.0025786100886762142, 0.001844326383434236, 0.002129345666617155, 0.0022728934418410063, 0.0012245237594470382, 0.004008973482996225, 0.6150065660476685]], [[0.05618243291974068, 0.2741358280181885, 0.09632766991853714, 0.07832100987434387, 0.0835067629814148, 0.18855415284633636, 0.1566363126039505, 0.025941016152501106, 0.0029971522744745016, 0.001959498506039381, 0.0007813793490640819, 0.0006628622650168836, 0.004309113137423992, 0.002362916711717844, 0.002040873747318983, 0.0021966425701975822, 0.02308446727693081], [0.03346271067857742, 0.077074334025383, 0.020460108295083046, 0.021783312782645226, 0.008212226442992687, 0.05149408429861069, 0.04030098766088486, 0.2847473621368408, 0.01152790617197752, 0.011548184789717197, 0.0028917156159877777, 0.005928909871727228, 0.01574292965233326, 0.0068449582904577255, 0.0064364150166511536, 0.015974383801221848, 0.3855694532394409], [0.016293957829475403, 0.030578477308154106, 0.02606211043894291, 0.026662101969122887, 0.014459903351962566, 0.029192639514803886, 0.04235725477337837, 0.295195996761322, 0.0070534637197852135, 0.017041651532053947, 0.002871521282941103, 0.007637311704456806, 0.01679398864507675, 0.004611434880644083, 0.003939683549106121, 0.013077064417302608, 0.4461714029312134], [0.031093744561076164, 0.0347423292696476, 0.027745530009269714, 0.03439443185925484, 0.01247028075158596, 0.026796694844961166, 0.03315943479537964, 0.2898346185684204, 0.014364472590386868, 0.01565786637365818, 0.0045464495196938515, 0.010651420801877975, 0.025935987010598183, 0.006835500244051218, 0.0027143689803779125, 0.01221383921802044, 0.4168429672718048], [0.008422421291470528, 0.01071685180068016, 0.006573622114956379, 0.00888756848871708, 
0.005669698119163513, 0.013364444486796856, 0.039052627980709076, 0.33330732583999634, 0.0011352422880008817, 0.004425784572958946, 0.001581091433763504, 0.002609552349895239, 0.0068012019619345665, 0.0013644706923514605, 0.004122323356568813, 0.006186676677316427, 0.5457791686058044], [0.022102737799286842, 0.04145985096693039, 0.016822926700115204, 0.021822649985551834, 0.010476550087332726, 0.048897262662649155, 0.056118521839380264, 0.28729087114334106, 0.008364906534552574, 0.015172748826444149, 0.004519328940659761, 0.007900619879364967, 0.01319943182170391, 0.008985215798020363, 0.011731725186109543, 0.023695075884461403, 0.40143951773643494], [0.19030217826366425, 0.07101424038410187, 0.019784459844231606, 0.05834343284368515, 0.024618420749902725, 0.07370980083942413, 0.01655205525457859, 0.18270185589790344, 0.020684864372015, 0.05508466437458992, 0.004532761871814728, 0.010942146182060242, 0.01257702149450779, 0.0026696305721998215, 0.004249166697263718, 0.013324275612831116, 0.23890899121761322], [0.00343344290740788, 0.0018569489475339651, 0.0017211452359333634, 0.003444572677835822, 0.0013160778908059, 0.0019132639281451702, 0.0012120791943743825, 0.31317800283432007, 0.0007890362176112831, 0.0014767281245440245, 0.0009965880308300257, 0.0011177058331668377, 0.0020568494219332933, 0.0009606924722902477, 0.0016576608177274466, 0.004134895280003548, 0.6587343811988831], [0.016159964725375175, 0.026301125064492226, 0.002665724139660597, 0.011301453225314617, 0.002337057376280427, 0.0139158945530653, 0.010839746333658695, 0.31047338247299194, 0.012295790947973728, 0.004322476219385862, 0.0017919761594384909, 0.003039939794689417, 0.015270252712070942, 0.005859699100255966, 0.002085032407194376, 0.005300505552440882, 0.5560399889945984], [0.008601350709795952, 0.017361970618367195, 0.004159077536314726, 0.010683134198188782, 0.0035069817677140236, 0.011520921252667904, 0.018825039267539978, 0.3069593608379364, 0.019341062754392624, 0.024759512394666672, 
0.005204393062740564, 0.00882721971720457, 0.027684073895215988, 0.013680237345397472, 0.0038942911196500063, 0.01611429639160633, 0.49887707829475403], [0.005718030035495758, 0.00431881844997406, 0.0017100432887673378, 0.002536152023822069, 0.0014022400137037039, 0.004803814925253391, 0.005190389230847359, 0.32394370436668396, 0.0037135633174329996, 0.006583063863217831, 0.00435568718239665, 0.005466507747769356, 0.01612507365643978, 0.005814654286950827, 0.004786695819348097, 0.014074505306780338, 0.5894571542739868], [0.009385529905557632, 0.007813487201929092, 0.002013125456869602, 0.00750894658267498, 0.002573978155851364, 0.007493186742067337, 0.011063402518630028, 0.31510862708091736, 0.00472642108798027, 0.005400654394179583, 0.001698134932667017, 0.0027712963055819273, 0.008107032626867294, 0.002659829333424568, 0.0011809028219431639, 0.003839410375803709, 0.6066559553146362], [0.023887677118182182, 0.03224392980337143, 0.00695052882656455, 0.011273288168013096, 0.005296899937093258, 0.015338189899921417, 0.01803724281489849, 0.3041954040527344, 0.02121538110077381, 0.008265489712357521, 0.0039183697663247585, 0.004489358048886061, 0.032892171293497086, 0.01347479410469532, 0.0062412372790277, 0.015099842101335526, 0.4771801829338074], [0.009308203123509884, 0.013097485527396202, 0.0020599409472197294, 0.007820328697562218, 0.003675005864351988, 0.012056520208716393, 0.010700075887143612, 0.3255276083946228, 0.004543522372841835, 0.0033079974818974733, 0.0022691485937684774, 0.0025945971719920635, 0.011508548632264137, 0.008368651382625103, 0.005574628245085478, 0.011517897248268127, 0.5660699009895325], [0.008813844993710518, 0.009122364223003387, 0.0036607286892831326, 0.004631753545254469, 0.0023903970140963793, 0.00773899769410491, 0.006384776905179024, 0.28648608922958374, 0.019776713103055954, 0.0071369344368577, 0.017064454033970833, 0.010294297710061073, 0.022359345108270645, 0.039768584072589874, 0.025850174948573112, 0.05024990066885948, 
0.4782707095146179], [0.04512160271406174, 0.0341656431555748, 0.009562162682414055, 0.02559579163789749, 0.010760623030364513, 0.02610526606440544, 0.02129720337688923, 0.27874723076820374, 0.016410348936915398, 0.00865959096699953, 0.008814923465251923, 0.008417502045631409, 0.02331508696079254, 0.0190575048327446, 0.014341109432280064, 0.0313006155192852, 0.41832786798477173], [0.00435033580288291, 0.0020859953947365284, 0.001911466009914875, 0.00408110860735178, 0.0015553674893453717, 0.0020089251920580864, 0.0012865117751061916, 0.3109433948993683, 0.0012692773016169667, 0.0019976466428488493, 0.0015325964195653796, 0.001660522073507309, 0.003067702054977417, 0.0015166333178058267, 0.002194985980167985, 0.005449657328426838, 0.6530879735946655]], [[0.04232114180922508, 0.09649496525526047, 0.14928686618804932, 0.026820320636034012, 0.02290155366063118, 0.11227355897426605, 0.23739971220493317, 0.08867177367210388, 0.004175014328211546, 0.04837368056178093, 0.008059220388531685, 0.016035662963986397, 0.015324994921684265, 0.004734131507575512, 0.009198764339089394, 0.024865470826625824, 0.09306320548057556], [0.011350112035870552, 0.013305430300533772, 0.03328068181872368, 0.01918182335793972, 0.009499961510300636, 0.011643611826002598, 0.027522534132003784, 0.28266119956970215, 0.01953517459332943, 0.03351131081581116, 0.0075009106658399105, 0.007862258702516556, 0.02282257005572319, 0.006842359900474548, 0.005692195147275925, 0.010853681713342667, 0.4769342839717865], [0.004513021092861891, 0.009924188256263733, 0.21584582328796387, 0.015526071190834045, 0.02126963622868061, 0.017546894028782845, 0.060557592660188675, 0.23653392493724823, 0.00602701585739851, 0.02567918598651886, 0.0029657084960490465, 0.0019612533506006002, 0.004334204830229282, 0.0034998648334294558, 0.007672810461372137, 0.007519862614572048, 0.3586229383945465], [0.019241122528910637, 0.018564864993095398, 0.03537897765636444, 0.08180919289588928, 0.057457081973552704, 
0.018250595778226852, 0.03951866552233696, 0.24245139956474304, 0.01797865703701973, 0.03656528890132904, 0.009643707424402237, 0.007593615911900997, 0.019110914319753647, 0.006289719603955746, 0.008322046138346195, 0.00626251008361578, 0.3755616545677185], [0.00936030875891447, 0.003056734800338745, 0.022374963387846947, 0.013503648340702057, 0.007125658914446831, 0.010499159805476665, 0.05613376945257187, 0.2740703225135803, 0.00987249892205, 0.0406830869615078, 0.008957182988524437, 0.006763970945030451, 0.007040590979158878, 0.005469941534101963, 0.007926110178232193, 0.0057427422143518925, 0.5114192962646484], [0.009155905805528164, 0.005224063992500305, 0.027860505506396294, 0.010310674086213112, 0.010266975499689579, 0.016401495784521103, 0.04853912815451622, 0.29373931884765625, 0.009256888180971146, 0.03005668707191944, 0.005743303801864386, 0.004920328967273235, 0.007401268929243088, 0.005481506697833538, 0.00827424693852663, 0.010651263408362865, 0.4967164993286133], [0.00552574684843421, 0.00856723915785551, 0.04260649532079697, 0.02267889492213726, 0.048777222633361816, 0.029723694548010826, 0.26542070508003235, 0.19538618624210358, 0.005196097306907177, 0.035904355347156525, 0.0015820814296603203, 0.0016703640576452017, 0.0061356620863080025, 0.0043607973493635654, 0.02419430948793888, 0.012237961404025555, 0.2900322377681732], [0.0030194553546607494, 0.004915549885481596, 0.007716563064604998, 0.006234907079488039, 0.004013874102383852, 0.0052932859398424625, 0.005869765300303698, 0.35623010993003845, 0.0021981706377118826, 0.004564995877444744, 0.002698264317587018, 0.0036055566743016243, 0.005008687265217304, 0.0025475553702563047, 0.006630823481827974, 0.0067527382634580135, 0.5726997256278992], [0.02040507085621357, 0.0032341263722628355, 0.0173238143324852, 0.0076188319362699986, 0.001606703968718648, 0.0032612630166113377, 0.014017333276569843, 0.10487155616283417, 0.11201010644435883, 0.10296545177698135, 0.1182243674993515, 
[Interactive BertViz attention-visualization output omitted: serialized per-head attention-weight matrices for the token pair "[CLS] the cat sleeps on the mat [SEP] le chat do ##rs sur le tap ##is [SEP]" (identical `left_text` and `right_text`, `default_filter: "all"`). The raw weights are not meaningful in print; run the notebook to render the visualization.]
            ],
            "text/plain": [
              "<IPython.core.display.Javascript object>"
            ]
          },
          
\"metadata\": {\n            \"tags\": []\n          }\n        },\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"application/javascript\": [\n              \"/**\\n\",\n              \" * @fileoverview Transformer Visualization D3 javascript code.\\n\",\n              \" *\\n\",\n              \" *\\n\",\n              \" *  Based on: https://github.com/tensorflow/tensor2tensor/blob/master/tensor2tensor/visualization/attention.js\\n\",\n              \" *\\n\",\n              \" * Change log:\\n\",\n              \" *\\n\",\n              \" * 12/19/18  Jesse Vig   Assorted cleanup. Changed orientation of attention matrices.\\n\",\n              \" */\\n\",\n              \"\\n\",\n              \"requirejs(['jquery', 'd3'], function($, d3) {\\n\",\n              \"\\n\",\n              \"const TEXT_SIZE = 15;\\n\",\n              \"const BOXWIDTH = 110;\\n\",\n              \"const BOXHEIGHT = 22.5;\\n\",\n              \"const MATRIX_WIDTH = 115;\\n\",\n              \"const CHECKBOX_SIZE = 20;\\n\",\n              \"const TEXT_TOP = 30;\\n\",\n              \"const HEAD_COLORS = d3.scale.category10();\\n\",\n              \"\\n\",\n              \"var params = window.params;\\n\",\n              \"var config = {};\\n\",\n              \"initialize();\\n\",\n              \"\\n\",\n              \"function lighten(color) {\\n\",\n              \"  var c = d3.hsl(color);\\n\",\n              \"  var increment = (1 - c.l) * 0.6;\\n\",\n              \"  c.l += increment;\\n\",\n              \"  c.s -= increment;\\n\",\n              \"  return c;\\n\",\n              \"}\\n\",\n              \"\\n\",\n              \"function transpose(mat) {\\n\",\n              \"  return mat[0].map(function(col, i) {\\n\",\n              \"    return mat.map(function(row) {\\n\",\n              \"      return row[i];\\n\",\n              \"    });\\n\",\n              \"  });\\n\",\n              \"}\\n\",\n              
\"\\n\",\n              \"function zip(a, b) {\\n\",\n              \"  return a.map(function (e, i) {\\n\",\n              \"    return [e, b[i]];\\n\",\n              \"  });\\n\",\n              \"}\\n\",\n              \"\\n\",\n              \"function render() {\\n\",\n              \"\\n\",\n              \"  var attnData = config.attention[config.filter];\\n\",\n              \"  var leftText = attnData.left_text;\\n\",\n              \"  var rightText = attnData.right_text;\\n\",\n              \"  var attentionHeads = attnData.attn[config.layer];\\n\",\n              \"\\n\",\n              \"  $(\\\"#vis svg\\\").empty();\\n\",\n              \"  $(\\\"#vis\\\").empty();\\n\",\n              \"\\n\",\n              \"  var height = config.initialTextLength * BOXHEIGHT + TEXT_TOP;\\n\",\n              \"  var svg = d3.select(\\\"#vis\\\")\\n\",\n              \"            .append('svg')\\n\",\n              \"            .attr(\\\"width\\\", \\\"100%\\\")\\n\",\n              \"            .attr(\\\"height\\\", height + \\\"px\\\");\\n\",\n              \"\\n\",\n              \"  var attData = [];\\n\",\n              \"  for (var i=0; i < config.nHeads; i++) {\\n\",\n              \"    var att = attentionHeads[i];\\n\",\n              \"    var att_trans = transpose(att);\\n\",\n              \"    attData.push(zip(att_trans, att));\\n\",\n              \"  }\\n\",\n              \"\\n\",\n              \"  renderText(svg, leftText, true, attData, 0);\\n\",\n              \"  renderText(svg, rightText, false, attData, MATRIX_WIDTH + BOXWIDTH);\\n\",\n              \"\\n\",\n              \"  renderAttentionHighlights(svg, attData);\\n\",\n              \"\\n\",\n              \"  svg.append(\\\"g\\\").classed(\\\"attentionHeads\\\", true);\\n\",\n              \"\\n\",\n              \"  renderAttention(svg, attentionHeads);\\n\",\n              \"\\n\",\n              \"  drawCheckboxes(0, svg, attentionHeads);\\n\",\n              \"\\n\",\n         
     \"}\\n\",\n              \"\\n\",\n              \"function renderText(svg, text, isLeft, attData, leftPos) {\\n\",\n              \"  // attData: list of tuples (att, att_trans), one for each layer. att and att_trans are attention matrics for each layer.\\n\",\n              \"  //           att is of shape [nHeads, source_len, target_len)\\n\",\n              \"  var id = isLeft ? \\\"left\\\" : \\\"right\\\";\\n\",\n              \"  var textContainer = svg.append(\\\"svg:g\\\")\\n\",\n              \"                         .attr(\\\"id\\\", id);\\n\",\n              \"\\n\",\n              \"  textContainer.append(\\\"g\\\").classed(\\\"attentionBoxes\\\", true)\\n\",\n              \"               .selectAll(\\\"g\\\")\\n\",\n              \"               .data(attData)\\n\",\n              \"               .enter()\\n\",\n              \"               .append(\\\"g\\\")\\n\",\n              \"               .selectAll(\\\"rect\\\")\\n\",\n              \"               .data(function(d) {return d;})\\n\",\n              \"               .enter()\\n\",\n              \"               .append(\\\"rect\\\")\\n\",\n              \"               .attr(\\\"x\\\", function(d, i, j) {\\n\",\n              \"                 return leftPos + boxOffsets(j);\\n\",\n              \"               })\\n\",\n              \"               .attr(\\\"y\\\", function(d, i) {\\n\",\n              \"                 return (+1) * BOXHEIGHT;\\n\",\n              \"               })\\n\",\n              \"               .attr(\\\"width\\\", BOXWIDTH / activeHeads())\\n\",\n              \"               .attr(\\\"height\\\", function() { return BOXHEIGHT; })\\n\",\n              \"               .attr(\\\"fill\\\", function(d, i, j) {\\n\",\n              \"                  return HEAD_COLORS(j);\\n\",\n              \"                })\\n\",\n              \"               .style(\\\"opacity\\\", 0.0);\\n\",\n              \"\\n\",\n              \"  var 
tokenContainer = textContainer.append(\\\"g\\\").selectAll(\\\"g\\\")\\n\",\n              \"                                    .data(text)\\n\",\n              \"                                    .enter()\\n\",\n              \"                                    .append(\\\"g\\\");\\n\",\n              \"\\n\",\n              \"  tokenContainer.append(\\\"rect\\\")\\n\",\n              \"                .classed(\\\"background\\\", true)\\n\",\n              \"                .style(\\\"opacity\\\", 0.0)\\n\",\n              \"                .attr(\\\"fill\\\", \\\"lightgray\\\")\\n\",\n              \"                .attr(\\\"x\\\", leftPos)\\n\",\n              \"                .attr(\\\"y\\\", function(d, i) {\\n\",\n              \"                  return TEXT_TOP + i * BOXHEIGHT;\\n\",\n              \"                })\\n\",\n              \"                .attr(\\\"width\\\", BOXWIDTH)\\n\",\n              \"                .attr(\\\"height\\\", BOXHEIGHT);\\n\",\n              \"\\n\",\n              \"  var textEl = tokenContainer.append(\\\"text\\\")\\n\",\n              \"                              .text(function(d) { return d; })\\n\",\n              \"                              .attr(\\\"font-size\\\", TEXT_SIZE + \\\"px\\\")\\n\",\n              \"                              .style(\\\"cursor\\\", \\\"default\\\")\\n\",\n              \"                              .style(\\\"-webkit-user-select\\\", \\\"none\\\")\\n\",\n              \"                              .attr(\\\"x\\\", leftPos)\\n\",\n              \"                              .attr(\\\"y\\\", function(d, i) {\\n\",\n              \"                                return TEXT_TOP + i * BOXHEIGHT;\\n\",\n              \"                              });\\n\",\n              \"\\n\",\n              \"  if (isLeft) {\\n\",\n              \"    textEl.style(\\\"text-anchor\\\", \\\"end\\\")\\n\",\n              \"           .attr(\\\"dx\\\", BOXWIDTH - 0.5 * 
TEXT_SIZE)\\n\",\n              \"           .attr(\\\"dy\\\", TEXT_SIZE);\\n\",\n              \"  } else {\\n\",\n              \"    textEl.style(\\\"text-anchor\\\", \\\"start\\\")\\n\",\n              \"           .attr(\\\"dx\\\", + 0.5 * TEXT_SIZE)\\n\",\n              \"           .attr(\\\"dy\\\", TEXT_SIZE);\\n\",\n              \"  }\\n\",\n              \"\\n\",\n              \"  tokenContainer.on(\\\"mouseover\\\", function(d, index) {\\n\",\n              \"    textContainer.selectAll(\\\".background\\\")\\n\",\n              \"                 .style(\\\"opacity\\\", function(d, i) {\\n\",\n              \"                   return i == index ? 1.0 : 0.0;\\n\",\n              \"                 });\\n\",\n              \"\\n\",\n              \"    svg.selectAll(\\\".attentionHeads\\\").style(\\\"display\\\", \\\"none\\\");\\n\",\n              \"\\n\",\n              \"    svg.selectAll(\\\".lineHeads\\\")  // To get the nesting to work.\\n\",\n              \"       .selectAll(\\\".attLines\\\")\\n\",\n              \"       .attr(\\\"stroke-opacity\\\", function(d) {\\n\",\n              \"          return 1.0;\\n\",\n              \"        })\\n\",\n              \"       .attr(\\\"y1\\\", function(d, i) {\\n\",\n              \"        if (isLeft) {\\n\",\n              \"          return TEXT_TOP + index * BOXHEIGHT + (BOXHEIGHT/2);\\n\",\n              \"        } else {\\n\",\n              \"          return TEXT_TOP + i * BOXHEIGHT + (BOXHEIGHT/2);\\n\",\n              \"        }\\n\",\n              \"     })\\n\",\n              \"     .attr(\\\"x1\\\", BOXWIDTH)\\n\",\n              \"     .attr(\\\"y2\\\", function(d, i) {\\n\",\n              \"       if (isLeft) {\\n\",\n              \"          return TEXT_TOP + i * BOXHEIGHT + (BOXHEIGHT/2);\\n\",\n              \"        } else {\\n\",\n              \"          return TEXT_TOP + index * BOXHEIGHT + (BOXHEIGHT/2);\\n\",\n              \"        }\\n\",\n              \"     
})\\n\",\n              \"     .attr(\\\"x2\\\", BOXWIDTH + MATRIX_WIDTH)\\n\",\n              \"     .attr(\\\"stroke-width\\\", 2)\\n\",\n              \"     .attr(\\\"stroke\\\", function(d, i, j) {\\n\",\n              \"        return HEAD_COLORS(j);\\n\",\n              \"      })\\n\",\n              \"     .attr(\\\"stroke-opacity\\\", function(d, i, j) {\\n\",\n              \"      if (isLeft) {d = d[0];} else {d = d[1];}\\n\",\n              \"      if (config.headVis[j]) {\\n\",\n              \"        if (d) {\\n\",\n              \"          return d[index];\\n\",\n              \"        } else {\\n\",\n              \"          return 0.0;\\n\",\n              \"        }\\n\",\n              \"      } else {\\n\",\n              \"        return 0.0;\\n\",\n              \"      }\\n\",\n              \"     });\\n\",\n              \"\\n\",\n              \"    function updateAttentionBoxes() {\\n\",\n              \"      var id = isLeft ? \\\"right\\\" : \\\"left\\\";\\n\",\n              \"      var leftPos = isLeft ? 
MATRIX_WIDTH + BOXWIDTH : 0;\\n\",\n              \"      svg.select(\\\"#\\\" + id)\\n\",\n              \"         .selectAll(\\\".attentionBoxes\\\")\\n\",\n              \"         .selectAll(\\\"g\\\")\\n\",\n              \"         .selectAll(\\\"rect\\\")\\n\",\n              \"         .attr(\\\"x\\\", function(d, i, j) { return leftPos + boxOffsets(j); })\\n\",\n              \"         .attr(\\\"y\\\", function(d, i) { return TEXT_TOP + i * BOXHEIGHT; })\\n\",\n              \"         .attr(\\\"width\\\", BOXWIDTH/activeHeads())\\n\",\n              \"         .attr(\\\"height\\\", function() { return BOXHEIGHT; })\\n\",\n              \"         .style(\\\"opacity\\\", function(d, i, j) {\\n\",\n              \"            if (isLeft) {d = d[0];} else {d = d[1];}\\n\",\n              \"            if (config.headVis[j])\\n\",\n              \"              if (d) {\\n\",\n              \"                return d[index];\\n\",\n              \"              } else {\\n\",\n              \"                return 0.0;\\n\",\n              \"              }\\n\",\n              \"            else\\n\",\n              \"              return 0.0;\\n\",\n              \"         });\\n\",\n              \"    }\\n\",\n              \"\\n\",\n              \"    updateAttentionBoxes();\\n\",\n              \"  });\\n\",\n              \"\\n\",\n              \"  textContainer.on(\\\"mouseleave\\\", function() {\\n\",\n              \"    d3.select(this).selectAll(\\\".background\\\")\\n\",\n              \"                   .style(\\\"opacity\\\", 0.0);\\n\",\n              \"    svg.selectAll(\\\".attLines\\\").attr(\\\"stroke-opacity\\\", 0.0);\\n\",\n              \"    svg.selectAll(\\\".attentionHeads\\\").style(\\\"display\\\", \\\"inline\\\");\\n\",\n              \"    svg.selectAll(\\\".attentionBoxes\\\")\\n\",\n              \"       .selectAll(\\\"g\\\")\\n\",\n              \"       .selectAll(\\\"rect\\\")\\n\",\n              \"       
.style(\\\"opacity\\\", 0.0);\\n\",\n              \"  });\\n\",\n              \"}\\n\",\n              \"\\n\",\n              \"function renderAttentionHighlights(svg, attention) {\\n\",\n              \"  var line_container = svg.append(\\\"g\\\");\\n\",\n              \"  line_container.selectAll(\\\"g\\\")\\n\",\n              \"                .data(attention)\\n\",\n              \"                .enter()\\n\",\n              \"                .append(\\\"g\\\")\\n\",\n              \"                .classed(\\\"lineHeads\\\", true)\\n\",\n              \"                .selectAll(\\\"line\\\")\\n\",\n              \"                .data(function(d){return d;})\\n\",\n              \"                .enter()\\n\",\n              \"                .append(\\\"line\\\").classed(\\\"attLines\\\", true);\\n\",\n              \"}\\n\",\n              \"\\n\",\n              \"function renderAttention(svg, attentionHeads) {\\n\",\n              \"  var line_container = svg.selectAll(\\\".attentionHeads\\\");\\n\",\n              \"  line_container.html(null);\\n\",\n              \"  for(var h=0; h<attentionHeads.length; h++) {\\n\",\n              \"    for(var s=0; s<attentionHeads[h].length; s++) {\\n\",\n              \"      for(var a=0; a<attentionHeads[h][s].length; a++) {\\n\",\n              \"        line_container.append(\\\"line\\\")\\n\",\n              \"        .attr(\\\"y1\\\", TEXT_TOP + s * BOXHEIGHT + (BOXHEIGHT/2))\\n\",\n              \"        .attr(\\\"x1\\\", BOXWIDTH)\\n\",\n              \"        .attr(\\\"y2\\\", TEXT_TOP + a * BOXHEIGHT + (BOXHEIGHT/2))\\n\",\n              \"        .attr(\\\"x2\\\", BOXWIDTH + MATRIX_WIDTH)\\n\",\n              \"        .attr(\\\"stroke-width\\\", 2)\\n\",\n              \"        .attr(\\\"stroke\\\", HEAD_COLORS(h))\\n\",\n              \"        .attr(\\\"stroke-opacity\\\", function() {\\n\",\n              \"          if (config.headVis[h]) {\\n\",\n              \"            return 
attentionHeads[h][s][a]/activeHeads();\\n\",\n              \"          } else {\\n\",\n              \"            return 0.0;\\n\",\n              \"          }\\n\",\n              \"        }());\\n\",\n              \"      }\\n\",\n              \"    }\\n\",\n              \"  }\\n\",\n              \"}\\n\",\n              \"\\n\",\n              \"// Checkboxes\\n\",\n              \"function boxOffsets(i) {\\n\",\n              \"  var numHeadsAbove = config.headVis.reduce(\\n\",\n              \"      function(acc, val, cur) {return val && cur < i ? acc + 1: acc;}, 0);\\n\",\n              \"  return numHeadsAbove * (BOXWIDTH / activeHeads());\\n\",\n              \"}\\n\",\n              \"\\n\",\n              \"function activeHeads() {\\n\",\n              \"  return config.headVis.reduce(function(acc, val) {\\n\",\n              \"    return val ? acc + 1: acc;\\n\",\n              \"  }, 0);\\n\",\n              \"}\\n\",\n              \"\\n\",\n              \"function drawCheckboxes(top, svg, attentionHeads) {\\n\",\n              \"  var checkboxContainer = svg.append(\\\"g\\\");\\n\",\n              \"  var checkbox = checkboxContainer.selectAll(\\\"rect\\\")\\n\",\n              \"                                  .data(config.headVis)\\n\",\n              \"                                  .enter()\\n\",\n              \"                                  .append(\\\"rect\\\")\\n\",\n              \"                                  .attr(\\\"fill\\\", function(d, i) {\\n\",\n              \"                                    return HEAD_COLORS(i);\\n\",\n              \"                                  })\\n\",\n              \"                                  .attr(\\\"x\\\", function(d, i) {\\n\",\n              \"                                    return i * CHECKBOX_SIZE;\\n\",\n              \"                                  })\\n\",\n              \"                                  .attr(\\\"y\\\", top)\\n\",\n              \"   
                               .attr(\\\"width\\\", CHECKBOX_SIZE)\\n\",\n              \"                                  .attr(\\\"height\\\", CHECKBOX_SIZE);\\n\",\n              \"\\n\",\n              \"  function updateCheckboxes() {\\n\",\n              \"    checkboxContainer.selectAll(\\\"rect\\\")\\n\",\n              \"                              .data(config.headVis)\\n\",\n              \"                              .attr(\\\"fill\\\", function(d, i) {\\n\",\n              \"      var headColor = HEAD_COLORS(i);\\n\",\n              \"      var color = d ? headColor : lighten(headColor);\\n\",\n              \"      return color;\\n\",\n              \"    });\\n\",\n              \"  }\\n\",\n              \"\\n\",\n              \"  updateCheckboxes();\\n\",\n              \"\\n\",\n              \"  checkbox.on(\\\"click\\\", function(d, i) {\\n\",\n              \"    if (config.headVis[i] && activeHeads() == 1) return;\\n\",\n              \"    config.headVis[i] = !config.headVis[i];\\n\",\n              \"    updateCheckboxes();\\n\",\n              \"    renderAttention(svg, attentionHeads);\\n\",\n              \"  });\\n\",\n              \"\\n\",\n              \"  checkbox.on(\\\"dblclick\\\", function(d, i) {\\n\",\n              \"    // If we double click on the only active head then reset\\n\",\n              \"    if (config.headVis[i] && activeHeads() == 1) {\\n\",\n              \"      config.headVis = new Array(config.nHeads).fill(true);\\n\",\n              \"    } else {\\n\",\n              \"      config.headVis = new Array(config.nHeads).fill(false);\\n\",\n              \"      config.headVis[i] = true;\\n\",\n              \"    }\\n\",\n              \"    updateCheckboxes();\\n\",\n              \"    renderAttention(svg, attentionHeads);\\n\",\n              \"  });\\n\",\n              \"}\\n\",\n              \"\\n\",\n              \"function initialize() {\\n\",\n              \"  config.attention = 
params['attention'];\\n\",\n              \"  config.filter = params['default_filter'];\\n\",\n              \"  config.nLayers = config.attention[config.filter]['attn'].length;\\n\",\n              \"  console.log('num layers')\\n\",\n              \"  console.log(config.nLayers)\\n\",\n              \"  config.nHeads = config.attention[config.filter]['attn'][0].length;\\n\",\n              \"  config.headVis  = new Array(config.nHeads).fill(true);\\n\",\n              \"  config.layer = 0;\\n\",\n              \"  config.initialTextLength = config.attention[config.filter].right_text.length;\\n\",\n              \"  console.log('initial text length')\\n\",\n              \"  console.log(config.initialTextLength)\\n\",\n              \"}\\n\",\n              \"\\n\",\n              \"$(\\\"#layer\\\").empty();\\n\",\n              \"for(var i=0; i<config.nLayers; i++) {\\n\",\n              \"  $(\\\"#layer\\\").append($(\\\"<option />\\\").val(i).text(i));\\n\",\n              \"}\\n\",\n              \"\\n\",\n              \"$(\\\"#layer\\\").on('change', function(e) {\\n\",\n              \"  config.layer = +e.currentTarget.value;\\n\",\n              \"  render();\\n\",\n              \"});\\n\",\n              \"\\n\",\n              \"$(\\\"#filter\\\").on('change', function(e) {\\n\",\n              \"  config.filter = e.currentTarget.value;\\n\",\n              \"  render();\\n\",\n              \"});\\n\",\n              \"\\n\",\n              \"render();\\n\",\n              \"\\n\",\n              \"});\"\n            ],\n            \"text/plain\": [\n              \"<IPython.core.display.Javascript object>\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        }\n      ]\n    }\n  ]\n}"
  },
  {
    "path": "Chapter07/Summarizing_Text_with_T5.ipynb",
    "content": "{\n  \"nbformat\": 4,\n  \"nbformat_minor\": 0,\n  \"metadata\": {\n    \"colab\": {\n      \"name\": \"Summarizing_Text_with_T5.ipynb\",\n      \"provenance\": [],\n      \"collapsed_sections\": []\n    },\n    \"kernelspec\": {\n      \"name\": \"python3\",\n      \"display_name\": \"Python 3\"\n    },\n    \"widgets\": {\n      \"application/vnd.jupyter.widget-state+json\": {\n        \"311dd5614ff1456fa608b06c6a486333\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HBoxModel\",\n          \"state\": {\n            \"_view_name\": \"HBoxView\",\n            \"_dom_classes\": [],\n            \"_model_name\": \"HBoxModel\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"box_style\": \"\",\n            \"layout\": \"IPY_MODEL_483e8e39248842cca6fdb69497417033\",\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"children\": [\n              \"IPY_MODEL_898bb243539e4693a1a024e54dac42e2\",\n              \"IPY_MODEL_ab283be8197e4a638fd83dc475b57486\"\n            ]\n          }\n        },\n        \"483e8e39248842cca6fdb69497417033\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": 
\"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"898bb243539e4693a1a024e54dac42e2\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"FloatProgressModel\",\n          \"state\": {\n            \"_view_name\": \"ProgressView\",\n            \"style\": \"IPY_MODEL_af7cb58500024f7cac6545a44045bdd1\",\n            \"_dom_classes\": [],\n            \"description\": \"Downloading: 100%\",\n            \"_model_name\": \"FloatProgressModel\",\n            \"bar_style\": \"success\",\n            \"max\": 1200,\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": 1200,\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"orientation\": \"horizontal\",\n            \"min\": 0,\n            \"description_tooltip\": null,\n          
  \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_d252ead0d3fc4885bed21c592dc9394c\"\n          }\n        },\n        \"ab283be8197e4a638fd83dc475b57486\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HTMLModel\",\n          \"state\": {\n            \"_view_name\": \"HTMLView\",\n            \"style\": \"IPY_MODEL_8ed15a3ae7c64fdea663832082e80529\",\n            \"_dom_classes\": [],\n            \"description\": \"\",\n            \"_model_name\": \"HTMLModel\",\n            \"placeholder\": \"​\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": \" 1.20k/1.20k [03:27&lt;00:00, 5.78B/s]\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_6a4014978e96442789889fef7919d87f\"\n          }\n        },\n        \"af7cb58500024f7cac6545a44045bdd1\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"ProgressStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"ProgressStyleModel\",\n            \"description_width\": \"initial\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"bar_color\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"d252ead0d3fc4885bed21c592dc9394c\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": 
null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"8ed15a3ae7c64fdea663832082e80529\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"DescriptionStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"DescriptionStyleModel\",\n            \"description_width\": \"\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            
\"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"6a4014978e96442789889fef7919d87f\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n     
       \"display\": null,\n            \"left\": null\n          }\n        },\n        \"173d4c16ef284a6085b183da2be8574e\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HBoxModel\",\n          \"state\": {\n            \"_view_name\": \"HBoxView\",\n            \"_dom_classes\": [],\n            \"_model_name\": \"HBoxModel\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"box_style\": \"\",\n            \"layout\": \"IPY_MODEL_15d71a8c88ab4dc2bfb20853b5bac3d3\",\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"children\": [\n              \"IPY_MODEL_fef9e2aa436040c0822605be0df0c359\",\n              \"IPY_MODEL_9b0393c7c87e42838da0e38fb72b2941\"\n            ]\n          }\n        },\n        \"15d71a8c88ab4dc2bfb20853b5bac3d3\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            
\"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"fef9e2aa436040c0822605be0df0c359\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"FloatProgressModel\",\n          \"state\": {\n            \"_view_name\": \"ProgressView\",\n            \"style\": \"IPY_MODEL_3707cb9b877c451bb245d051c19af8fa\",\n            \"_dom_classes\": [],\n            \"description\": \"Downloading: 100%\",\n            \"_model_name\": \"FloatProgressModel\",\n            \"bar_style\": \"success\",\n            \"max\": 2950825948,\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": 2950825948,\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"orientation\": \"horizontal\",\n            \"min\": 0,\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_28ac8f3a26ca4c51ab913c482234f3ab\"\n          }\n        },\n        \"9b0393c7c87e42838da0e38fb72b2941\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HTMLModel\",\n         
 \"state\": {\n            \"_view_name\": \"HTMLView\",\n            \"style\": \"IPY_MODEL_3ef057edd14f4fafbbb0b6fb81e9ba97\",\n            \"_dom_classes\": [],\n            \"description\": \"\",\n            \"_model_name\": \"HTMLModel\",\n            \"placeholder\": \"​\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": \" 2.95G/2.95G [03:27&lt;00:00, 14.2MB/s]\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_8ff347600d6f42e5bf198232909fe086\"\n          }\n        },\n        \"3707cb9b877c451bb245d051c19af8fa\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"ProgressStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"ProgressStyleModel\",\n            \"description_width\": \"initial\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"bar_color\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"28ac8f3a26ca4c51ab913c482234f3ab\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            
\"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"3ef057edd14f4fafbbb0b6fb81e9ba97\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"DescriptionStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"DescriptionStyleModel\",\n            \"description_width\": \"\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"8ff347600d6f42e5bf198232909fe086\": {\n          \"model_module\": 
\"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"31a90525042744a49f8d48305354b9ec\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HBoxModel\",\n          \"state\": {\n            \"_view_name\": 
\"HBoxView\",\n            \"_dom_classes\": [],\n            \"_model_name\": \"HBoxModel\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"box_style\": \"\",\n            \"layout\": \"IPY_MODEL_55811622761d49559291f3e74f072d31\",\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"children\": [\n              \"IPY_MODEL_3c68af00a0784efca7a3dce29c060eef\",\n              \"IPY_MODEL_07acd3ae43eb4dfb893cc65e72320367\"\n            ]\n          }\n        },\n        \"55811622761d49559291f3e74f072d31\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n    
        \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"3c68af00a0784efca7a3dce29c060eef\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"FloatProgressModel\",\n          \"state\": {\n            \"_view_name\": \"ProgressView\",\n            \"style\": \"IPY_MODEL_4a1b68c322254a53a77e4d05e9ee7631\",\n            \"_dom_classes\": [],\n            \"description\": \"Downloading: 100%\",\n            \"_model_name\": \"FloatProgressModel\",\n            \"bar_style\": \"success\",\n            \"max\": 791656,\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": 791656,\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"orientation\": \"horizontal\",\n            \"min\": 0,\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_e7c516f923f74666a8a4b11e17c333bd\"\n          }\n        },\n        \"07acd3ae43eb4dfb893cc65e72320367\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HTMLModel\",\n          \"state\": {\n            \"_view_name\": \"HTMLView\",\n            \"style\": \"IPY_MODEL_0be43a9c00324bb9abeedeb2a67b883a\",\n            \"_dom_classes\": [],\n            \"description\": \"\",\n            \"_model_name\": \"HTMLModel\",\n            \"placeholder\": \"​\",\n         
   \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": \" 792k/792k [00:00&lt;00:00, 1.68MB/s]\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_369a4af884f3436683d68ec126d7b6ec\"\n          }\n        },\n        \"4a1b68c322254a53a77e4d05e9ee7631\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"ProgressStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"ProgressStyleModel\",\n            \"description_width\": \"initial\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"bar_color\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"e7c516f923f74666a8a4b11e17c333bd\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": 
null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"0be43a9c00324bb9abeedeb2a67b883a\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"DescriptionStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"DescriptionStyleModel\",\n            \"description_width\": \"\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"369a4af884f3436683d68ec126d7b6ec\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            
\"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        }\n      }\n    }\n  },\n  \"cells\": [\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"RcdcqBkV0MTU\"\n      },\n      \"source\": [\n        \"#Summarizing Text with T5\\n\",\n        \"Copyright 2020, Denis Rothman. MIT License. 
Hugging Face usage example was modified for educational purposes.\\n\",\n        \"\\n\",\n        \"[Hugging Face Models](https://huggingface.co/transformers/model_doc/t5.html)\\n\",\n        \"\\n\",\n        \"[Hugging Face Framework Usage](https://huggingface.co/transformers/usage.html)\\n\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"06QFZGxsf_KJ\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"4bb34ef6-44f5-4ae5-d334-8fa716bf9656\"\n      },\n      \"source\": [\n        \"!pip install transformers==4.0.0\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Collecting transformers==4.0.0\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/99/84/7bc03215279f603125d844bf81c3fb3f2d50fe8e511546eb4897e4be2067/transformers-4.0.0-py3-none-any.whl (1.4MB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 1.4MB 9.1MB/s \\n\",\n            \"\\u001b[?25hRequirement already satisfied: filelock in /usr/local/lib/python3.6/dist-packages (from transformers==4.0.0) (3.0.12)\\n\",\n            \"Requirement already satisfied: regex!=2019.12.17 in /usr/local/lib/python3.6/dist-packages (from transformers==4.0.0) (2019.12.20)\\n\",\n            \"Requirement already satisfied: dataclasses; python_version < \\\"3.7\\\" in /usr/local/lib/python3.6/dist-packages (from transformers==4.0.0) (0.8)\\n\",\n            \"Requirement already satisfied: tqdm>=4.27 in /usr/local/lib/python3.6/dist-packages (from transformers==4.0.0) (4.41.1)\\n\",\n            \"Collecting tokenizers==0.9.4\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/0f/1c/e789a8b12e28be5bc1ce2156cf87cb522b379be9cadc7ad8091a4cc107c4/tokenizers-0.9.4-cp36-cp36m-manylinux2010_x86_64.whl (2.9MB)\\n\",\n       
     \"\\u001b[K     |████████████████████████████████| 2.9MB 26.2MB/s \\n\",\n            \"\\u001b[?25hRequirement already satisfied: requests in /usr/local/lib/python3.6/dist-packages (from transformers==4.0.0) (2.23.0)\\n\",\n            \"Requirement already satisfied: packaging in /usr/local/lib/python3.6/dist-packages (from transformers==4.0.0) (20.4)\\n\",\n            \"Collecting sacremoses\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/7d/34/09d19aff26edcc8eb2a01bed8e98f13a1537005d31e95233fd48216eed10/sacremoses-0.0.43.tar.gz (883kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 890kB 41.5MB/s \\n\",\n            \"\\u001b[?25hRequirement already satisfied: numpy in /usr/local/lib/python3.6/dist-packages (from transformers==4.0.0) (1.18.5)\\n\",\n            \"Requirement already satisfied: chardet<4,>=3.0.2 in /usr/local/lib/python3.6/dist-packages (from requests->transformers==4.0.0) (3.0.4)\\n\",\n            \"Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.6/dist-packages (from requests->transformers==4.0.0) (2020.11.8)\\n\",\n            \"Requirement already satisfied: idna<3,>=2.5 in /usr/local/lib/python3.6/dist-packages (from requests->transformers==4.0.0) (2.10)\\n\",\n            \"Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in /usr/local/lib/python3.6/dist-packages (from requests->transformers==4.0.0) (1.24.3)\\n\",\n            \"Requirement already satisfied: six in /usr/local/lib/python3.6/dist-packages (from packaging->transformers==4.0.0) (1.15.0)\\n\",\n            \"Requirement already satisfied: pyparsing>=2.0.2 in /usr/local/lib/python3.6/dist-packages (from packaging->transformers==4.0.0) (2.4.7)\\n\",\n            \"Requirement already satisfied: click in /usr/local/lib/python3.6/dist-packages (from sacremoses->transformers==4.0.0) (7.1.2)\\n\",\n            \"Requirement already satisfied: joblib in 
/usr/local/lib/python3.6/dist-packages (from sacremoses->transformers==4.0.0) (0.17.0)\\n\",\n            \"Building wheels for collected packages: sacremoses\\n\",\n            \"  Building wheel for sacremoses (setup.py) ... \\u001b[?25l\\u001b[?25hdone\\n\",\n            \"  Created wheel for sacremoses: filename=sacremoses-0.0.43-cp36-none-any.whl size=893257 sha256=cd41d8610aac8d370f5d6a1c57d73172911ecb039b55e1a3e3f8780a2420a753\\n\",\n            \"  Stored in directory: /root/.cache/pip/wheels/29/3c/fd/7ce5c3f0666dab31a50123635e6fb5e19ceb42ce38d4e58f45\\n\",\n            \"Successfully built sacremoses\\n\",\n            \"Installing collected packages: tokenizers, sacremoses, transformers\\n\",\n            \"Successfully installed sacremoses-0.0.43 tokenizers-0.9.4 transformers-4.0.0\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"id\": \"3tYFf-cEIkKL\",\n        \"outputId\": \"6061799b-9e78-42c5-9902-d268217273e4\"\n      },\n      \"source\": [\n        \"!pip install sentencepiece==0.1.94\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Collecting sentencepiece==0.1.94\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/e5/2d/6d4ca4bef9a67070fa1cac508606328329152b1df10bdf31fb6e4e727894/sentencepiece-0.1.94-cp36-cp36m-manylinux2014_x86_64.whl (1.1MB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 1.1MB 9.5MB/s \\n\",\n            \"\\u001b[?25hInstalling collected packages: sentencepiece\\n\",\n            \"Successfully installed sentencepiece-0.1.94\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        
\"id\": \"FEQO4tDl7xH_\"\n      },\n      \"source\": [\n        \"display_architecture=True\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"q8suV48O07TW\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 235,\n          \"referenced_widgets\": [\n            \"311dd5614ff1456fa608b06c6a486333\",\n            \"483e8e39248842cca6fdb69497417033\",\n            \"898bb243539e4693a1a024e54dac42e2\",\n            \"ab283be8197e4a638fd83dc475b57486\",\n            \"af7cb58500024f7cac6545a44045bdd1\",\n            \"d252ead0d3fc4885bed21c592dc9394c\",\n            \"8ed15a3ae7c64fdea663832082e80529\",\n            \"6a4014978e96442789889fef7919d87f\",\n            \"173d4c16ef284a6085b183da2be8574e\",\n            \"15d71a8c88ab4dc2bfb20853b5bac3d3\",\n            \"fef9e2aa436040c0822605be0df0c359\",\n            \"9b0393c7c87e42838da0e38fb72b2941\",\n            \"3707cb9b877c451bb245d051c19af8fa\",\n            \"28ac8f3a26ca4c51ab913c482234f3ab\",\n            \"3ef057edd14f4fafbbb0b6fb81e9ba97\",\n            \"8ff347600d6f42e5bf198232909fe086\",\n            \"31a90525042744a49f8d48305354b9ec\",\n            \"55811622761d49559291f3e74f072d31\",\n            \"3c68af00a0784efca7a3dce29c060eef\",\n            \"07acd3ae43eb4dfb893cc65e72320367\",\n            \"4a1b68c322254a53a77e4d05e9ee7631\",\n            \"e7c516f923f74666a8a4b11e17c333bd\",\n            \"0be43a9c00324bb9abeedeb2a67b883a\",\n            \"369a4af884f3436683d68ec126d7b6ec\"\n          ]\n        },\n        \"outputId\": \"cb65b089-f9f1-423c-8c35-f5f39bc7c139\"\n      },\n      \"source\": [\n        \"import torch\\n\",\n        \"import json \\n\",\n        \"from transformers import T5Tokenizer, T5ForConditionalGeneration, T5Config\\n\",\n        \"\\n\",\n        \"model = 
T5ForConditionalGeneration.from_pretrained('t5-large')\\n\",\n        \"tokenizer = T5Tokenizer.from_pretrained('t5-large')\\n\",\n        \"device = torch.device('cpu')\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"application/vnd.jupyter.widget-view+json\": {\n              \"model_id\": \"311dd5614ff1456fa608b06c6a486333\",\n              \"version_minor\": 0,\n              \"version_major\": 2\n            },\n            \"text/plain\": [\n              \"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=1200.0, style=ProgressStyle(description…\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"application/vnd.jupyter.widget-view+json\": {\n              \"model_id\": \"173d4c16ef284a6085b183da2be8574e\",\n              \"version_minor\": 0,\n              \"version_major\": 2\n            },\n            \"text/plain\": [\n              \"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=2950825948.0, style=ProgressStyle(descr…\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Some weights of the model checkpoint at t5-large were not used when initializing T5ForConditionalGeneration: ['decoder.block.0.layer.1.EncDecAttention.relative_attention_bias.weight']\\n\",\n            \"- This IS expected if you are 
initializing T5ForConditionalGeneration from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).\\n\",\n            \"- This IS NOT expected if you are initializing T5ForConditionalGeneration from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\\n\"\n          ],\n          \"name\": \"stderr\"\n        },\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"application/vnd.jupyter.widget-view+json\": {\n              \"model_id\": \"31a90525042744a49f8d48305354b9ec\",\n              \"version_minor\": 0,\n              \"version_major\": 2\n            },\n            \"text/plain\": [\n              \"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=791656.0, style=ProgressStyle(descripti…\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"Q6zHDK7I1GsY\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"21dc9317-47da-414e-b6fb-a24f60e1cda6\"\n      },\n      \"source\": [\n        \"if display_architecture==True:\\n\",\n        \" print(model.config)\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"T5Config {\\n\",\n            \"  \\\"_name_or_path\\\": \\\"t5-large\\\",\\n\",\n            \"  \\\"architectures\\\": [\\n\",\n            \"    \\\"T5WithLMHeadModel\\\"\\n\",\n            \"  ],\\n\",\n 
           \"  \\\"d_ff\\\": 4096,\\n\",\n            \"  \\\"d_kv\\\": 64,\\n\",\n            \"  \\\"d_model\\\": 1024,\\n\",\n            \"  \\\"decoder_start_token_id\\\": 0,\\n\",\n            \"  \\\"dropout_rate\\\": 0.1,\\n\",\n            \"  \\\"eos_token_id\\\": 1,\\n\",\n            \"  \\\"feed_forward_proj\\\": \\\"relu\\\",\\n\",\n            \"  \\\"initializer_factor\\\": 1.0,\\n\",\n            \"  \\\"is_encoder_decoder\\\": true,\\n\",\n            \"  \\\"layer_norm_epsilon\\\": 1e-06,\\n\",\n            \"  \\\"model_type\\\": \\\"t5\\\",\\n\",\n            \"  \\\"n_positions\\\": 512,\\n\",\n            \"  \\\"num_decoder_layers\\\": 24,\\n\",\n            \"  \\\"num_heads\\\": 16,\\n\",\n            \"  \\\"num_layers\\\": 24,\\n\",\n            \"  \\\"output_past\\\": true,\\n\",\n            \"  \\\"pad_token_id\\\": 0,\\n\",\n            \"  \\\"relative_attention_num_buckets\\\": 32,\\n\",\n            \"  \\\"task_specific_params\\\": {\\n\",\n            \"    \\\"summarization\\\": {\\n\",\n            \"      \\\"early_stopping\\\": true,\\n\",\n            \"      \\\"length_penalty\\\": 2.0,\\n\",\n            \"      \\\"max_length\\\": 200,\\n\",\n            \"      \\\"min_length\\\": 30,\\n\",\n            \"      \\\"no_repeat_ngram_size\\\": 3,\\n\",\n            \"      \\\"num_beams\\\": 4,\\n\",\n            \"      \\\"prefix\\\": \\\"summarize: \\\"\\n\",\n            \"    },\\n\",\n            \"    \\\"translation_en_to_de\\\": {\\n\",\n            \"      \\\"early_stopping\\\": true,\\n\",\n            \"      \\\"max_length\\\": 300,\\n\",\n            \"      \\\"num_beams\\\": 4,\\n\",\n            \"      \\\"prefix\\\": \\\"translate English to German: \\\"\\n\",\n            \"    },\\n\",\n            \"    \\\"translation_en_to_fr\\\": {\\n\",\n            \"      \\\"early_stopping\\\": true,\\n\",\n            \"      \\\"max_length\\\": 300,\\n\",\n            \"      \\\"num_beams\\\": 4,\\n\",\n   
         \"      \\\"prefix\\\": \\\"translate English to French: \\\"\\n\",\n            \"    },\\n\",\n            \"    \\\"translation_en_to_ro\\\": {\\n\",\n            \"      \\\"early_stopping\\\": true,\\n\",\n            \"      \\\"max_length\\\": 300,\\n\",\n            \"      \\\"num_beams\\\": 4,\\n\",\n            \"      \\\"prefix\\\": \\\"translate English to Romanian: \\\"\\n\",\n            \"    }\\n\",\n            \"  },\\n\",\n            \"  \\\"use_cache\\\": true,\\n\",\n            \"  \\\"vocab_size\\\": 32128\\n\",\n            \"}\\n\",\n            \"\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"5LaWN15NPIPC\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"623480ce-b408-462b-a8c5-5c12959f724c\"\n      },\n      \"source\": [\n        \"if(display_architecture==True):\\n\",\n        \"  print(model)\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"T5ForConditionalGeneration(\\n\",\n            \"  (shared): Embedding(32128, 1024)\\n\",\n            \"  (encoder): T5Stack(\\n\",\n            \"    (embed_tokens): Embedding(32128, 1024)\\n\",\n            \"    (block): ModuleList(\\n\",\n            \"      (0): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, 
out_features=1024, bias=False)\\n\",\n            \"              (relative_attention_bias): Embedding(32, 16)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (1): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"    
          (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (2): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (3): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): 
T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (4): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): 
T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (5): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n   
         \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (6): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (7): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"     
         (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (8): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            
(DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (9): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n     
       \"      )\\n\",\n            \"      (10): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (11): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            
  (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (12): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, 
out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (13): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (14): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"          
  (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (15): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            
(dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (16): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): 
T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (17): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (18): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): 
Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (19): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): 
T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (20): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      
)\\n\",\n            \"      (21): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (22): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): 
Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (23): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, 
out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (final_layer_norm): T5LayerNorm()\\n\",\n            \"    (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"  )\\n\",\n            \"  (decoder): T5Stack(\\n\",\n            \"    (embed_tokens): Embedding(32128, 1024)\\n\",\n            \"    (block): ModuleList(\\n\",\n            \"      (0): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (relative_attention_bias): Embedding(32, 16)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n  
          \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (1): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): 
Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (2): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            
(EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (3): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            
(dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (4): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, 
out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (5): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, 
bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (6): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            
\"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (7): T5Block(\\n\",\n            \"        
(layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, 
inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (8): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, 
inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (9): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, 
bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (10): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): 
T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (11): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            
(layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (12): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, 
bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (13): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            
\"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (14): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            
\"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (15): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"    
        (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (16): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): 
Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (17): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, 
out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (18): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): 
T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (19): 
T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            
(dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (20): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              
(dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (21): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, 
out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (22): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \" 
         (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (23): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"         
   (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (final_layer_norm): T5LayerNorm()\\n\",\n            \"    (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"  )\\n\",\n            \"  (lm_head): Linear(in_features=1024, out_features=32128, bias=False)\\n\",\n            \")\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"DS2twf1P1UYI\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"e060a702-cf27-41b4-d162-ffb21872b81c\"\n      },\n      \"source\": [\n        \"if display_architecture:\\n\",\n        \"  print(model.encoder)\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"T5Stack(\\n\",\n            \"  (embed_tokens): Embedding(32128, 1024)\\n\",\n            \"  (block): ModuleList(\\n\",\n            \"    (0): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): 
T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (relative_attention_bias): Embedding(32, 16)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (1): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          
(dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (2): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, 
inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (3): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (4): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n   
         \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (5): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n          
  \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (6): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (7): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, 
bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (8): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): 
T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (9): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (10): T5Block(\\n\",\n            \"    
  (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (11): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): 
T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (12): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n     
       \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (13): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (14): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): 
Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (15): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): 
Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (16): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (17): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n    
        \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (18): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): 
T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (19): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    
)\\n\",\n            \"    (20): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (21): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n       
     \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (22): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          
)\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (23): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"  )\\n\",\n            \"  (final_layer_norm): T5LayerNorm()\\n\",\n            \"  (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \")\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"MCwdhX9U1MA5\",\n        \"colab\": {\n         
 \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"ae39b023-77ec-483d-fb33-2bc59d2c0996\"\n      },\n      \"source\": [\n        \"if display_architecture==True:\\n\",\n        \"  print(model.decoder)\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"T5Stack(\\n\",\n            \"  (embed_tokens): Embedding(32128, 1024)\\n\",\n            \"  (block): ModuleList(\\n\",\n            \"    (0): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (relative_attention_bias): Embedding(32, 16)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerCrossAttention(\\n\",\n            \"          (EncDecAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): 
Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (2): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (1): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerCrossAttention(\\n\",\n            \"          (EncDecAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n 
           \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (2): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (2): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerCrossAttention(\\n\",\n            \"          (EncDecAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): 
Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (2): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (3): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerCrossAttention(\\n\",\n            \"          (EncDecAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): 
Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (2): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    ... (4)-(20): seventeen further T5Block modules, identical in structure to the block above ...\\n\",\n            \"    (21): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerCrossAttention(\\n\",\n            \"          (EncDecAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \" 
         (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (2): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (22): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerCrossAttention(\\n\",\n            \"          (EncDecAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): 
Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (2): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (23): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerCrossAttention(\\n\",\n            \"          (EncDecAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): 
Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (2): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"  )\\n\",\n            \"  (final_layer_norm): T5LayerNorm()\\n\",\n            \"  (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \")\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"GmrCDtcL1hPn\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"d93ba363-7336-4514-b693-b4c33dd8cb07\"\n      },\n      \"source\": [\n        \"if display_architecture==True:\\n\",\n        \"  print(model.forward)\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"<bound method T5ForConditionalGeneration.forward of T5ForConditionalGeneration(\\n\",\n            \"  (shared): Embedding(32128, 1024)\\n\",\n            \"  (encoder): T5Stack(\\n\",\n            \"    (embed_tokens): Embedding(32128, 1024)\\n\",\n            
\"    (block): ModuleList(\\n\",\n            \"      (0): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (relative_attention_bias): Embedding(32, 16)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (1): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): 
Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (2): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, 
out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (3): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (4): T5Block(\\n\",\n            \"        (layer): 
ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (5): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            
)\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (6): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, 
inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (7): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (8): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): 
Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (9): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          
)\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (10): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, 
inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (11): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (12): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              
(v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (13): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, 
out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (14): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (15): T5Block(\\n\",\n            \"        (layer): 
ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (16): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            
)\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (17): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, 
inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (18): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (19): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): 
Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (20): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          
)\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (21): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, 
inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (22): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (23): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              
(v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (final_layer_norm): T5LayerNorm()\\n\",\n            \"    (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"  )\\n\",\n            \"  (decoder): T5Stack(\\n\",\n            \"    (embed_tokens): Embedding(32128, 1024)\\n\",\n            \"    (block): ModuleList(\\n\",\n            \"      (0): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (relative_attention_bias): 
Embedding(32, 16)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (1): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n        
    \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (2): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): 
Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (3): T5Block(\\n\",\n            \"        (layer): 
ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n      
      \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (4): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"   
         )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (5): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              
(wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (6): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            
(DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (7): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"       
     (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (8): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): 
Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (9): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, 
out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (10): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): 
T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (11): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): 
Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (12): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, 
out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (13): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, 
bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (14): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            
\"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (15): T5Block(\\n\",\n            \"        
(layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, 
inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (16): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, 
inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (17): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, 
bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (18): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): 
T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (19): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            
(layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (20): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, 
bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (21): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            
\"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (22): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            
\"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (23): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"    
        (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (final_layer_norm): T5LayerNorm()\\n\",\n            \"    (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"  )\\n\",\n            \"  (lm_head): Linear(in_features=1024, out_features=32128, bias=False)\\n\",\n            \")>\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"S5KfhCrifP01\"\n      },\n      \"source\": [\n        \"\\n\",\n        \"def 
summarize(text, ml):\\n\",\n        \"  \\\"\\\"\\\"Summarize text with T5; ml caps the summary length in tokens.\\\"\\\"\\\"\\n\",\n        \"  # collapse newlines and prepend the T5 summarization task prefix\\n\",\n        \"  preprocess_text = text.strip().replace(\\\"\\\\n\\\",\\\"\\\")\\n\",\n        \"  t5_prepared_Text = \\\"summarize: \\\"+preprocess_text\\n\",\n        \"  print(\\\"Preprocessed and prepared text: \\\\n\\\", t5_prepared_Text)\\n\",\n        \"\\n\",\n        \"  tokenized_text = tokenizer.encode(t5_prepared_Text, return_tensors=\\\"pt\\\").to(device)\\n\",\n        \"\\n\",\n        \"  # summarize: beam search decoding with no repeated bigrams\\n\",\n        \"  summary_ids = model.generate(tokenized_text,\\n\",\n        \"                                      num_beams=4,\\n\",\n        \"                                      no_repeat_ngram_size=2,\\n\",\n        \"                                      min_length=30,\\n\",\n        \"                                      max_length=ml,\\n\",\n        \"                                      early_stopping=True)\\n\",\n        \"\\n\",\n        \"  output = tokenizer.decode(summary_ids[0], skip_special_tokens=True)\\n\",\n        \"  return output\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"vqiTNoDc7pOv\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"d46bc587-b39d-4e8e-bd5e-3d4d6d529168\"\n      },\n      \"source\": [\n        \"text=\\\"\\\"\\\"\\n\",\n        \"The United States Declaration of Independence was the first Etext\\n\",\n        \"released by Project Gutenberg, early in 1971.  The title was stored\\n\",\n        \"in an emailed instruction set which required a tape or diskpack be\\n\",\n        \"hand mounted for retrieval.  The diskpack was the size of a large\\n\",\n        \"cake in a cake carrier, cost $1500, and contained 5 megabytes, of\\n\",\n        \"which this file took 1-2%.  Two tape backups were kept plus one on\\n\",\n        \"paper tape.  
The 10,000 files we hope to have online by the end of\\n\",\n        \"2001 should take about 1-2% of a comparably priced drive in 2001.\\n\",\n        \"\\\"\\\"\\\"\\n\",\n        \"print(\\\"Number of characters:\\\",len(text))\\n\",\n        \"summary=summarize(text,50)\\n\",\n        \"print (\\\"\\\\n\\\\nSummarized text: \\\\n\\\",summary)\\n\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Number of characters: 534\\n\",\n            \"Preprocessed and prepared text: \\n\",\n            \" summarize: The United States Declaration of Independence was the first Etextreleased by Project Gutenberg, early in 1971.  The title was storedin an emailed instruction set which required a tape or diskpack behand mounted for retrieval.  The diskpack was the size of a largecake in a cake carrier, cost $1500, and contained 5 megabytes, ofwhich this file took 1-2%.  Two tape backups were kept plus one onpaper tape.  The 10,000 files we hope to have online by the end of2001 should take about 1-2% of a comparably priced drive in 2001.\\n\",\n            \"\\n\",\n            \"\\n\",\n            \"Summarized text: \\n\",\n            \" the united states declaration of independence was the first etext published by project gutenberg, early in 1971. 
the 10,000 files we hope to have online by the end of2001 should take about 1-2% of a comparably priced drive in\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"2321zS1Q3jPX\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"fccf0c6e-926f-4f65-b3ba-b0e19715f87b\"\n      },\n      \"source\": [\n        \"# Bill of Rights, Amendment V\\n\",\n        \"text =\\\"\\\"\\\"\\n\",\n        \"No person shall be held to answer for a capital, or otherwise infamous crime,\\n\",\n        \"unless on a presentment or indictment of a Grand Jury, except in cases arising\\n\",\n        \"in the land or naval forces, or in the Militia, when in actual service\\n\",\n        \"in time of War or public danger; nor shall any person be subject for\\n\",\n        \"the same offense to be twice put in jeopardy of life or limb;\\n\",\n        \"nor shall be compelled in any criminal case to be a witness against himself,\\n\",\n        \"nor be deprived of life, liberty, or property, without due process of law;\\n\",\n        \"nor shall private property be taken for public use without just compensation.\\n\",\n        \"\\n\",\n        \"\\\"\\\"\\\"\\n\",\n        \"print(\\\"Number of characters:\\\",len(text))\\n\",\n        \"summary=summarize(text,50)\\n\",\n        \"print (\\\"\\\\n\\\\nSummarized text: \\\\n\\\",summary)\\n\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Number of characters: 591\\n\",\n            \"Preprocessed and prepared text: \\n\",\n            \" summarize: No person shall be held to answer for a capital, or otherwise infamous crime,unless on a presentment or indictment of a Grand Jury, except in cases arisingin the land or naval forces, or in the Militia, when in actual 
servicein time of War or public danger; nor shall any person be subject forthe same offense to be twice put in jeopardy of life or limb;nor shall be compelled in any criminal case to be a witness against himself,nor be deprived of life, liberty, or property, without due process of law;nor shall private property be taken for public use without just compensation.\\n\",\n            \"\\n\",\n            \"\\n\",\n            \"Summarized text: \\n\",\n            \" no person shall be held to answer for a capital, or otherwise infamous crime, unless ona presentment or indictment ofa Grand Jury. nor shall any person be subject for the same offense to be twice put\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"k_h8oQ55_zr5\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"7c957d5a-6711-4169-9244-51562a6cc9cd\"\n      },\n      \"source\": [\n        \"#Montana Corporate Law\\n\",\n        \"#https://corporations.uslegal.com/state-corporation-law/montana-corporation-law/#:~:text=Montana%20Corporation%20Law,carrying%20out%20its%20business%20activities.\\n\",\n        \"\\n\",\n        \"text =\\\"\\\"\\\"The law regarding corporations prescribes that a corporation can be incorporated in the state of Montana to serve any lawful purpose.  In the state of Montana, a corporation has all the powers of a natural person for carrying out its business activities.  The corporation can sue and be sued in its corporate name.  It has perpetual succession.  The corporation can buy, sell or otherwise acquire an interest in a real or personal property.  It can conduct business, carry on operations, and have offices and exercise the powers in a state, territory or district in possession of the U.S., or in a foreign country.  
It can appoint officers and agents of the corporation for various duties and fix their compensation.\\n\",\n        \"The name of a corporation must contain the word “corporation” or its abbreviation “corp.”  The name of a corporation should not be deceptively similar to the name of another corporation incorporated in the same state.  It should not be deceptively identical to the fictitious name adopted by a foreign corporation having business transactions in the state.\\n\",\n        \"The corporation is formed by one or more natural persons by executing and filing articles of incorporation to the secretary of state of filing.  The qualifications for directors are fixed either by articles of incorporation or bylaws.  The names and addresses of the initial directors and purpose of incorporation should be set forth in the articles of incorporation.  The articles of incorporation should contain the corporate name, the number of shares authorized to issue, a brief statement of the character of business carried out by the corporation, the names and addresses of the directors until successors are elected, and name and addresses of incorporators.  The shareholders have the power to change the size of board of directors.\\n\",\n        \"\\\"\\\"\\\"\\n\",\n        \"print(\\\"Number of characters:\\\",len(text))\\n\",\n        \"summary=summarize(text,50)\\n\",\n        \"print (\\\"\\\\n\\\\nSummarized text: \\\\n\\\",summary)\\n\",\n        \" \"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Number of characters: 1816\\n\",\n            \"Preprocessed and prepared text: \\n\",\n            \" summarize: The law regarding corporations prescribes that a corporation can be incorporated in the state of Montana to serve any lawful purpose.  In the state of Montana, a corporation has all the powers of a natural person for carrying out its business activities.  
The corporation can sue and be sued in its corporate name.  It has perpetual succession.  The corporation can buy, sell or otherwise acquire an interest in a real or personal property.  It can conduct business, carry on operations, and have offices and exercise the powers in a state, territory or district in possession of the U.S., or in a foreign country.  It can appoint officers and agents of the corporation for various duties and fix their compensation.The name of a corporation must contain the word “corporation” or its abbreviation “corp.”  The name of a corporation should not be deceptively similar to the name of another corporation incorporated in the same state.  It should not be deceptively identical to the fictitious name adopted by a foreign corporation having business transactions in the state.The corporation is formed by one or more natural persons by executing and filing articles of incorporation to the secretary of state of filing.  The qualifications for directors are fixed either by articles of incorporation or bylaws.  The names and addresses of the initial directors and purpose of incorporation should be set forth in the articles of incorporation.  The articles of incorporation should contain the corporate name, the number of shares authorized to issue, a brief statement of the character of business carried out by the corporation, the names and addresses of the directors until successors are elected, and name and addresses of incorporators.  The shareholders have the power to change the size of board of directors.\\n\",\n            \"\\n\",\n            \"\\n\",\n            \"Summarized text: \\n\",\n            \" a corporation can be incorporated in the state of Montana to serve any lawful purpose. the corporation has perpetual succession and can sue and be sued in its corporate name. it can conduct business, carry on operations, and have offices\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    }\n  ]\n}\n"
  },
  {
    "path": "Chapter08/Summarizing_Text_V2.ipynb",
    "content": "{\n  \"nbformat\": 4,\n  \"nbformat_minor\": 0,\n  \"metadata\": {\n    \"colab\": {\n      \"name\": \"Summarizing Text V2.ipynb\",\n      \"provenance\": [],\n      \"collapsed_sections\": []\n    },\n    \"kernelspec\": {\n      \"name\": \"python3\",\n      \"display_name\": \"Python 3\"\n    }\n  },\n  \"cells\": [\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"RcdcqBkV0MTU\"\n      },\n      \"source\": [\n        \"#Summarizing Text with T5\\n\",\n        \"Copyright 2020, Denis Rothman. MIT License. Hugging Face usage example was modified for educational purposes.\\n\",\n        \"\\n\",\n        \"[Hugging Face Models](https://huggingface.co/transformers/model_doc/t5.html)\\n\",\n        \"\\n\",\n        \"[Hugging Face Framework Usage](https://huggingface.co/transformers/usage.html)\\n\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"06QFZGxsf_KJ\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"931f8431-e4c2-4144-bad7-60a4df9e6ae8\"\n      },\n      \"source\": [\n        \"!pip install transformers\"\n      ],\n      \"execution_count\": 13,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Requirement already satisfied: transformers in /usr/local/lib/python3.6/dist-packages (4.1.1)\\n\",\n            \"Requirement already satisfied: tqdm>=4.27 in /usr/local/lib/python3.6/dist-packages (from transformers) (4.41.1)\\n\",\n            \"Requirement already satisfied: requests in /usr/local/lib/python3.6/dist-packages (from transformers) (2.23.0)\\n\",\n            \"Requirement already satisfied: sacremoses in /usr/local/lib/python3.6/dist-packages (from transformers) (0.0.43)\\n\",\n            \"Requirement already satisfied: filelock in /usr/local/lib/python3.6/dist-packages (from transformers) (3.0.12)\\n\",\n            
\"Requirement already satisfied: tokenizers==0.9.4 in /usr/local/lib/python3.6/dist-packages (from transformers) (0.9.4)\\n\",\n            \"Requirement already satisfied: dataclasses; python_version < \\\"3.7\\\" in /usr/local/lib/python3.6/dist-packages (from transformers) (0.8)\\n\",\n            \"Requirement already satisfied: regex!=2019.12.17 in /usr/local/lib/python3.6/dist-packages (from transformers) (2019.12.20)\\n\",\n            \"Requirement already satisfied: packaging in /usr/local/lib/python3.6/dist-packages (from transformers) (20.8)\\n\",\n            \"Requirement already satisfied: numpy in /usr/local/lib/python3.6/dist-packages (from transformers) (1.19.4)\\n\",\n            \"Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.6/dist-packages (from requests->transformers) (2020.12.5)\\n\",\n            \"Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in /usr/local/lib/python3.6/dist-packages (from requests->transformers) (1.24.3)\\n\",\n            \"Requirement already satisfied: idna<3,>=2.5 in /usr/local/lib/python3.6/dist-packages (from requests->transformers) (2.10)\\n\",\n            \"Requirement already satisfied: chardet<4,>=3.0.2 in /usr/local/lib/python3.6/dist-packages (from requests->transformers) (3.0.4)\\n\",\n            \"Requirement already satisfied: six in /usr/local/lib/python3.6/dist-packages (from sacremoses->transformers) (1.15.0)\\n\",\n            \"Requirement already satisfied: joblib in /usr/local/lib/python3.6/dist-packages (from sacremoses->transformers) (1.0.0)\\n\",\n            \"Requirement already satisfied: click in /usr/local/lib/python3.6/dist-packages (from sacremoses->transformers) (7.1.2)\\n\",\n            \"Requirement already satisfied: pyparsing>=2.0.2 in /usr/local/lib/python3.6/dist-packages (from packaging->transformers) (2.4.7)\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n 
     \"metadata\": {\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"id\": \"3tYFf-cEIkKL\",\n        \"outputId\": \"f72a822c-f875-40e3-cc3f-eb7b37e5e47b\"\n      },\n      \"source\": [\n        \"!pip install sentencepiece==0.1.94\"\n      ],\n      \"execution_count\": 14,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Requirement already satisfied: sentencepiece==0.1.94 in /usr/local/lib/python3.6/dist-packages (0.1.94)\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"FEQO4tDl7xH_\"\n      },\n      \"source\": [\n        \"display_architecture=True\"\n      ],\n      \"execution_count\": 15,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"q8suV48O07TW\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"0743f9df-e4a8-46c3-9d8c-595698fa78a5\"\n      },\n      \"source\": [\n        \"import torch\\n\",\n        \"import json \\n\",\n        \"from transformers import T5Tokenizer, T5ForConditionalGeneration, T5Config\\n\",\n        \"\\n\",\n        \"model = T5ForConditionalGeneration.from_pretrained('t5-large')\\n\",\n        \"tokenizer = T5Tokenizer.from_pretrained('t5-large')\\n\",\n        \"device = torch.device('cpu')\"\n      ],\n      \"execution_count\": 16,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Some weights of the model checkpoint at t5-large were not used when initializing T5ForConditionalGeneration: ['decoder.block.0.layer.1.EncDecAttention.relative_attention_bias.weight']\\n\",\n            \"- This IS expected if you are initializing T5ForConditionalGeneration from the checkpoint of a model trained on another task or with another 
architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).\\n\",\n            \"- This IS NOT expected if you are initializing T5ForConditionalGeneration from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\\n\"\n          ],\n          \"name\": \"stderr\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"Q6zHDK7I1GsY\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"c40c5620-12c6-4a99-9f86-7eef28ec0871\"\n      },\n      \"source\": [\n        \"if display_architecture==True:\\n\",\n        \" print(model.config)\"\n      ],\n      \"execution_count\": 17,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"T5Config {\\n\",\n            \"  \\\"_name_or_path\\\": \\\"t5-large\\\",\\n\",\n            \"  \\\"architectures\\\": [\\n\",\n            \"    \\\"T5WithLMHeadModel\\\"\\n\",\n            \"  ],\\n\",\n            \"  \\\"d_ff\\\": 4096,\\n\",\n            \"  \\\"d_kv\\\": 64,\\n\",\n            \"  \\\"d_model\\\": 1024,\\n\",\n            \"  \\\"decoder_start_token_id\\\": 0,\\n\",\n            \"  \\\"dropout_rate\\\": 0.1,\\n\",\n            \"  \\\"eos_token_id\\\": 1,\\n\",\n            \"  \\\"feed_forward_proj\\\": \\\"relu\\\",\\n\",\n            \"  \\\"initializer_factor\\\": 1.0,\\n\",\n            \"  \\\"is_encoder_decoder\\\": true,\\n\",\n            \"  \\\"layer_norm_epsilon\\\": 1e-06,\\n\",\n            \"  \\\"model_type\\\": \\\"t5\\\",\\n\",\n            \"  \\\"n_positions\\\": 512,\\n\",\n            \"  \\\"num_decoder_layers\\\": 24,\\n\",\n            \"  \\\"num_heads\\\": 16,\\n\",\n            \"  \\\"num_layers\\\": 24,\\n\",\n            \"  \\\"output_past\\\": true,\\n\",\n          
  \"  \\\"pad_token_id\\\": 0,\\n\",\n            \"  \\\"relative_attention_num_buckets\\\": 32,\\n\",\n            \"  \\\"task_specific_params\\\": {\\n\",\n            \"    \\\"summarization\\\": {\\n\",\n            \"      \\\"early_stopping\\\": true,\\n\",\n            \"      \\\"length_penalty\\\": 2.0,\\n\",\n            \"      \\\"max_length\\\": 200,\\n\",\n            \"      \\\"min_length\\\": 30,\\n\",\n            \"      \\\"no_repeat_ngram_size\\\": 3,\\n\",\n            \"      \\\"num_beams\\\": 4,\\n\",\n            \"      \\\"prefix\\\": \\\"summarize: \\\"\\n\",\n            \"    },\\n\",\n            \"    \\\"translation_en_to_de\\\": {\\n\",\n            \"      \\\"early_stopping\\\": true,\\n\",\n            \"      \\\"max_length\\\": 300,\\n\",\n            \"      \\\"num_beams\\\": 4,\\n\",\n            \"      \\\"prefix\\\": \\\"translate English to German: \\\"\\n\",\n            \"    },\\n\",\n            \"    \\\"translation_en_to_fr\\\": {\\n\",\n            \"      \\\"early_stopping\\\": true,\\n\",\n            \"      \\\"max_length\\\": 300,\\n\",\n            \"      \\\"num_beams\\\": 4,\\n\",\n            \"      \\\"prefix\\\": \\\"translate English to French: \\\"\\n\",\n            \"    },\\n\",\n            \"    \\\"translation_en_to_ro\\\": {\\n\",\n            \"      \\\"early_stopping\\\": true,\\n\",\n            \"      \\\"max_length\\\": 300,\\n\",\n            \"      \\\"num_beams\\\": 4,\\n\",\n            \"      \\\"prefix\\\": \\\"translate English to Romanian: \\\"\\n\",\n            \"    }\\n\",\n            \"  },\\n\",\n            \"  \\\"use_cache\\\": true,\\n\",\n            \"  \\\"vocab_size\\\": 32128\\n\",\n            \"}\\n\",\n            \"\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"5LaWN15NPIPC\",\n        \"colab\": {\n          \"base_uri\": 
\"https://localhost:8080/\"\n        },\n        \"outputId\": \"d746c169-acc4-4e6a-e704-b7c4f635adc2\"\n      },\n      \"source\": [\n        \"if(display_architecture==True):\\n\",\n        \"  print(model)\"\n      ],\n      \"execution_count\": 18,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"T5ForConditionalGeneration(\\n\",\n            \"  (shared): Embedding(32128, 1024)\\n\",\n            \"  (encoder): T5Stack(\\n\",\n            \"    (embed_tokens): Embedding(32128, 1024)\\n\",\n            \"    (block): ModuleList(\\n\",\n            \"      (0): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (relative_attention_bias): Embedding(32, 16)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            
(dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (1): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (2): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n       
     \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (3): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): 
Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (4): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (5): T5Block(\\n\",\n            
\"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (6): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n         
   \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (7): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): 
Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (8): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (9): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): 
Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (10): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          
)\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (11): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, 
inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (12): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (13): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              
(v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (14): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, 
out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (15): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (16): T5Block(\\n\",\n            \"        (layer): 
ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (17): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            
)\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (18): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, 
inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (19): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (20): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): 
Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (21): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          
)\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (22): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, 
inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (23): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (final_layer_norm): T5LayerNorm()\\n\",\n            \"    (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"  )\\n\",\n            \"  (decoder): T5Stack(\\n\",\n            \"    (embed_tokens): Embedding(32128, 1024)\\n\",\n            \"    (block): ModuleList(\\n\",\n            \"      (0): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \" 
         (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (relative_attention_bias): Embedding(32, 16)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            
(dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (1): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): 
Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (2): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, 
out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (3): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"  
        (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (4): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"           
 (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (5): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, 
bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (6): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            
\"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (7): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            
\"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (8): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"     
       (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (9): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): 
Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (10): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, 
out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (11): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): 
T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (12): 
T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            
(dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (13): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              
(dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (14): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, 
out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (15): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \" 
         (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (16): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"         
   (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (17): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, 
bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (18): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            
\"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (19): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            
\"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (20): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"    
        (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (21): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): 
Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (22): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, 
out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (23): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): 
T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n      
      \"    (final_layer_norm): T5LayerNorm()\\n\",\n            \"    (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"  )\\n\",\n            \"  (lm_head): Linear(in_features=1024, out_features=32128, bias=False)\\n\",\n            \")\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"DS2twf1P1UYI\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"29de71ca-6f91-4d94-90f6-215b47468d66\"\n      },\n      \"source\": [\n        \"if display_architecture==True:\\n\",\n        \"  print(model.encoder)\"\n      ],\n      \"execution_count\": 19,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"T5Stack(\\n\",\n            \"  (embed_tokens): Embedding(32128, 1024)\\n\",\n            \"  (block): ModuleList(\\n\",\n            \"    (0): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (relative_attention_bias): Embedding(32, 16)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, 
out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (1): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (2): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): 
T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (3): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): 
Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (4): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, 
inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (5): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (6): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n   
         \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (7): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n          
  \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (8): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (9): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, 
bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (10): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): 
T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (11): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (12): T5Block(\\n\",\n            \"   
   (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (13): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): 
T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (14): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n     
       \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (15): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (16): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): 
Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (17): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): 
Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (18): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (19): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n    
        \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (20): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): 
T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (21): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    
)\\n\",\n            \"    (22): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (23): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n       
     \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"  )\\n\",\n            \"  (final_layer_norm): T5LayerNorm()\\n\",\n            \"  (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \")\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"MCwdhX9U1MA5\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"f2d4dfd6-c313-415b-e997-da0b0bedd7df\"\n      },\n      \"source\": [\n        \"if display_architecture==True:\\n\",\n        \"  print(model.decoder)\"\n      ],\n      \"execution_count\": 20,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"T5Stack(\\n\",\n            \"  (embed_tokens): Embedding(32128, 1024)\\n\",\n            \"  (block): ModuleList(\\n\",\n            \"    (0): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n        
    \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (relative_attention_bias): Embedding(32, 16)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerCrossAttention(\\n\",\n            \"          (EncDecAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (2): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (1): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): 
T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerCrossAttention(\\n\",\n            \"          (EncDecAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (2): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    
(2): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerCrossAttention(\\n\",\n            \"          (EncDecAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (2): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            
\"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (3): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerCrossAttention(\\n\",\n            \"          (EncDecAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (2): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): 
T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (4): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerCrossAttention(\\n\",\n            \"          (EncDecAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (2): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, 
inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (5): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerCrossAttention(\\n\",\n            \"          (EncDecAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (2): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): 
Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (6): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerCrossAttention(\\n\",\n            \"          (EncDecAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (2): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): 
Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (7): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerCrossAttention(\\n\",\n            \"          (EncDecAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (2): T5LayerFF(\\n\",\n      
      \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (8): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerCrossAttention(\\n\",\n            \"          (EncDecAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, 
inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (2): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (9): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerCrossAttention(\\n\",\n            \"          (EncDecAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"  
        (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (2): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (10): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerCrossAttention(\\n\",\n            \"          (EncDecAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): 
Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (2): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (11): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerCrossAttention(\\n\",\n            \"          (EncDecAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): 
Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (2): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (12): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerCrossAttention(\\n\",\n            \"          (EncDecAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): 
Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (2): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (13): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerCrossAttention(\\n\",\n            \"          (EncDecAttention): T5Attention(\\n\",\n            \"            (q): 
Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (2): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (14): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): 
T5LayerCrossAttention(\\n\",\n            \"          (EncDecAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (2): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (15): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): 
Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerCrossAttention(\\n\",\n            \"          (EncDecAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (2): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (16): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          
)\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerCrossAttention(\\n\",\n            \"          (EncDecAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (2): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (17): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          
  (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerCrossAttention(\\n\",\n            \"          (EncDecAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (2): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (18): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          
  (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerCrossAttention(\\n\",\n            \"          (EncDecAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (2): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (19): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          
  (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerCrossAttention(\\n\",\n            \"          (EncDecAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (2): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (20): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"          
  (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerCrossAttention(\\n\",\n            \"          (EncDecAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (2): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (21): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): 
T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerCrossAttention(\\n\",\n            \"          (EncDecAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (2): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    
(22): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerCrossAttention(\\n\",\n            \"          (EncDecAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (2): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n           
 \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (23): T5Block(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): T5LayerSelfAttention(\\n\",\n            \"          (SelfAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (1): T5LayerCrossAttention(\\n\",\n            \"          (EncDecAttention): T5Attention(\\n\",\n            \"            (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"        (2): T5LayerFF(\\n\",\n            \"          (DenseReluDense): T5DenseReluDense(\\n\",\n            \"            (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"            (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (layer_norm): 
T5LayerNorm()\\n\",\n            \"          (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"  )\\n\",\n            \"  (final_layer_norm): T5LayerNorm()\\n\",\n            \"  (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \")\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"GmrCDtcL1hPn\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"3a8d557d-0e1f-4174-f58e-4e0e708434fe\"\n      },\n      \"source\": [\n        \"if display_architecture==True:\\n\",\n        \"  print(model.forward)\"\n      ],\n      \"execution_count\": 21,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"<bound method T5ForConditionalGeneration.forward of T5ForConditionalGeneration(\\n\",\n            \"  (shared): Embedding(32128, 1024)\\n\",\n            \"  (encoder): T5Stack(\\n\",\n            \"    (embed_tokens): Embedding(32128, 1024)\\n\",\n            \"    (block): ModuleList(\\n\",\n            \"      (0): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (relative_attention_bias): Embedding(32, 16)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): 
T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (1): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n   
         \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (2): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (3): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"     
         (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (4): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            
(DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (5): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n     
       \"      )\\n\",\n            \"      (6): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (7): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              
(o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (8): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, 
out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (9): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (10): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"           
 (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (11): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            
(dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (12): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): 
T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (13): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (14): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): 
Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (15): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): 
T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (16): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      
)\\n\",\n            \"      (17): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (18): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): 
Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (19): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, 
out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (20): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (21): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"          
  (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (22): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            
(dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (23): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): 
T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (final_layer_norm): T5LayerNorm()\\n\",\n            \"    (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"  )\\n\",\n            \"  (decoder): T5Stack(\\n\",\n            \"    (embed_tokens): Embedding(32128, 1024)\\n\",\n            \"    (block): ModuleList(\\n\",\n            \"      (0): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (relative_attention_bias): Embedding(32, 16)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): 
T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (1): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            
\"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (2): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): 
Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (3): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            
(EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (4): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            
(dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      ... [output truncated: blocks (5) through (21) are T5Blocks identical in structure to the block above] ...\\n\",\n            \"      (22): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, 
out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"      (23): T5Block(\\n\",\n            \"        (layer): ModuleList(\\n\",\n            \"          (0): T5LayerSelfAttention(\\n\",\n            \"            (SelfAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"          (1): T5LayerCrossAttention(\\n\",\n            \"            (EncDecAttention): T5Attention(\\n\",\n            \"              (q): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (k): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (v): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"              (o): Linear(in_features=1024, out_features=1024, bias=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \" 
         (2): T5LayerFF(\\n\",\n            \"            (DenseReluDense): T5DenseReluDense(\\n\",\n            \"              (wi): Linear(in_features=1024, out_features=4096, bias=False)\\n\",\n            \"              (wo): Linear(in_features=4096, out_features=1024, bias=False)\\n\",\n            \"              (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            )\\n\",\n            \"            (layer_norm): T5LayerNorm()\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"          )\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"    (final_layer_norm): T5LayerNorm()\\n\",\n            \"    (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"  )\\n\",\n            \"  (lm_head): Linear(in_features=1024, out_features=32128, bias=False)\\n\",\n            \")>\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"S5KfhCrifP01\"\n      },\n      \"source\": [\n        \"\\n\",\n        \"def summarize(text,ml):\\n\",\n        \"  preprocess_text = text.strip().replace(\\\"\\\\n\\\",\\\"\\\")\\n\",\n        \"  t5_prepared_Text = \\\"summarize: \\\"+preprocess_text\\n\",\n        \"  print (\\\"Preprocessed and prepared text: \\\\n\\\", t5_prepared_Text)\\n\",\n        \"\\n\",\n        \"  tokenized_text = tokenizer.encode(t5_prepared_Text, return_tensors=\\\"pt\\\").to(device)\\n\",\n        \"\\n\",\n        \"  # summarize \\n\",\n        \"  summary_ids = model.generate(tokenized_text,\\n\",\n        \"                                      num_beams=4,\\n\",\n        \"                                      no_repeat_ngram_size=2,\\n\",\n        \"                                      min_length=30,\\n\",\n        \"                                      max_length=ml,\\n\",\n        \"                             
         early_stopping=True)\\n\",\n        \"\\n\",\n        \"  output = tokenizer.decode(summary_ids[0], skip_special_tokens=True)\\n\",\n        \"  return output\"\n      ],\n      \"execution_count\": 22,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"vqiTNoDc7pOv\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"c83c8762-48bd-47fe-ee95-0da2acf99ddc\"\n      },\n      \"source\": [\n        \"text=\\\"\\\"\\\"\\n\",\n        \"The United States Declaration of Independence was the first Etext\\n\",\n        \"released by Project Gutenberg, early in 1971.  The title was stored\\n\",\n        \"in an emailed instruction set which required a tape or diskpack be\\n\",\n        \"hand mounted for retrieval.  The diskpack was the size of a large\\n\",\n        \"cake in a cake carrier, cost $1500, and contained 5 megabytes, of\\n\",\n        \"which this file took 1-2%.  Two tape backups were kept plus one on\\n\",\n        \"paper tape.  The 10,000 files we hope to have online by the end of\\n\",\n        \"2001 should take about 1-2% of a comparably priced drive in 2001.\\n\",\n        \"\\\"\\\"\\\"\\n\",\n        \"print(\\\"Number of characters:\\\",len(text))\\n\",\n        \"summary=summarize(text,50)\\n\",\n        \"print (\\\"\\\\n\\\\nSummarized text: \\\\n\\\",summary)\\n\"\n      ],\n      \"execution_count\": 23,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Number of characters: 534\\n\",\n            \"Preprocessed and prepared text: \\n\",\n            \" summarize: The United States Declaration of Independence was the first Etextreleased by Project Gutenberg, early in 1971.  The title was storedin an emailed instruction set which required a tape or diskpack behand mounted for retrieval.  
The diskpack was the size of a largecake in a cake carrier, cost $1500, and contained 5 megabytes, ofwhich this file took 1-2%.  Two tape backups were kept plus one onpaper tape.  The 10,000 files we hope to have online by the end of2001 should take about 1-2% of a comparably priced drive in 2001.\\n\",\n            \"\\n\",\n            \"\\n\",\n            \"Summarized text: \\n\",\n            \" the united states declaration of independence was the first etext published by project gutenberg, early in 1971. the 10,000 files we hope to have online by the end of2001 should take about 1-2% of a comparably priced drive in\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"nvB2NenCBfO4\"\n      },\n      \"source\": [\n        \"Summarizing the Bill of Rights, Version 1\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"2321zS1Q3jPX\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"6c78b355-8d54-4553-b6e7-d43f8d2b770c\"\n      },\n      \"source\": [\n        \"#Bill of Rights,V\\n\",\n        \"text =\\\"\\\"\\\"\\n\",\n        \"No person shall be held to answer for a capital, or otherwise infamous crime,\\n\",\n        \"unless on a presentment or indictment of a Grand Jury, except in cases arising\\n\",\n        \"in the land or naval forces, or in the Militia, when in actual service\\n\",\n        \"in time of War or public danger; nor shall any person be subject for\\n\",\n        \"the same offense to be twice put in jeopardy of life or limb;\\n\",\n        \"nor shall be compelled in any criminal case to be a witness against himself,\\n\",\n        \"nor be deprived of life, liberty, or property, without due process of law;\\n\",\n        \"nor shall private property be taken for public use without just compensation.\\n\",\n        
\"\\n\",\n        \"\\\"\\\"\\\"\\n\",\n        \"print(\\\"Number of characters:\\\",len(text))\\n\",\n        \"summary=summarize(text,50)\\n\",\n        \"print (\\\"\\\\n\\\\nSummarized text: \\\\n\\\",summary)\\n\",\n        \" \"\n      ],\n      \"execution_count\": 24,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Number of characters: 591\\n\",\n            \"Preprocessed and prepared text: \\n\",\n            \" summarize: No person shall be held to answer for a capital, or otherwise infamous crime,unless on a presentment or indictment of a Grand Jury, except in cases arisingin the land or naval forces, or in the Militia, when in actual servicein time of War or public danger; nor shall any person be subject forthe same offense to be twice put in jeopardy of life or limb;nor shall be compelled in any criminal case to be a witness against himself,nor be deprived of life, liberty, or property, without due process of law;nor shall private property be taken for public use without just compensation.\\n\",\n            \"\\n\",\n            \"\\n\",\n            \"Summarized text: \\n\",\n            \" no person shall be held to answer for a capital, or otherwise infamous crime, unless ona presentment or indictment ofa Grand Jury. 
nor shall any person be subject for the same offense to be twice put\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"zr2A49TDBkZz\"\n      },\n      \"source\": [\n        \"Summarizing the Bill of Rights, Version 2\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"HWMvLGyahPFP\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"0354d7f5-ec21-476d-eb93-dd66204c30b3\"\n      },\n      \"source\": [\n        \"#Bill of Rights,V\\n\",\n        \"text =\\\"\\\"\\\"\\n\",\n        \"A person must be indicted by a Grand Jury for a capital or infamous crime.\\n\",\n        \"There are exceptions in time of war for a person in the army, navy, or national guard.\\n\",\n        \"A person can not be judged twice for the same offense or put in a situation of double jeopardy of life.\\n\",\n        \"A person can not be asked to be a witness against herself or himself.\\n\",\n        \"A person cannot be deprived of life, liberty or property without due process of law.\\n\",\n        \"A person must be compensated for property taken for public use.\\n\",\n        \"\\\"\\\"\\\"\\n\",\n        \"print(\\\"Number of characters:\\\",len(text))\\n\",\n        \"summary=summarize(text,50)\\n\",\n        \"print (\\\"\\\\n\\\\nSummarized text: \\\\n\\\",summary)\\n\",\n        \" \"\n      ],\n      \"execution_count\": 25,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Number of characters: 486\\n\",\n            \"Preprocessed and prepared text: \\n\",\n            \" summarize: A person must be indicted by a Grand Jury for a capital or infamous crime.There are exceptions in time of war for a person in the army, navy, or national guard.A person can not be judged twice for the same offense or 
put in a situation of double jeopardy of life.A person can not be asked to be a witness against herself or himself.A person cannot be deprived of life, liberty or property without due process of law.A person must be compensated for property taken for public use.\\n\",\n            \"\\n\",\n            \"\\n\",\n            \"Summarized text: \\n\",\n            \" a person cannot be deprived of life, liberty or property without due process of law.A person must be compensated for property taken for public use.\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"k_h8oQ55_zr5\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"4bc5bef5-139a-436f-c820-5a5851cfde36\"\n      },\n      \"source\": [\n        \"#Montana Corporate Law\\n\",\n        \"#https://corporations.uslegal.com/state-corporation-law/montana-corporation-law/#:~:text=Montana%20Corporation%20Law,carrying%20out%20its%20business%20activities.\\n\",\n        \"\\n\",\n        \"text =\\\"\\\"\\\"The law regarding corporations prescribes that a corporation can be incorporated in the state of Montana to serve any lawful purpose.  In the state of Montana, a corporation has all the powers of a natural person for carrying out its business activities.  The corporation can sue and be sued in its corporate name.  It has perpetual succession.  The corporation can buy, sell or otherwise acquire an interest in a real or personal property.  It can conduct business, carry on operations, and have offices and exercise the powers in a state, territory or district in possession of the U.S., or in a foreign country.  
It can appoint officers and agents of the corporation for various duties and fix their compensation.\\n\",\n        \"The name of a corporation must contain the word “corporation” or its abbreviation “corp.”  The name of a corporation should not be deceptively similar to the name of another corporation incorporated in the same state.  It should not be deceptively identical to the fictitious name adopted by a foreign corporation having business transactions in the state.\\n\",\n        \"The corporation is formed by one or more natural persons by executing and filing articles of incorporation to the secretary of state of filing.  The qualifications for directors are fixed either by articles of incorporation or bylaws.  The names and addresses of the initial directors and purpose of incorporation should be set forth in the articles of incorporation.  The articles of incorporation should contain the corporate name, the number of shares authorized to issue, a brief statement of the character of business carried out by the corporation, the names and addresses of the directors until successors are elected, and name and addresses of incorporators.  The shareholders have the power to change the size of board of directors.\\n\",\n        \"\\\"\\\"\\\"\\n\",\n        \"print(\\\"Number of characters:\\\",len(text))\\n\",\n        \"summary=summarize(text,50)\\n\",\n        \"print (\\\"\\\\n\\\\nSummarized text: \\\\n\\\",summary)\\n\",\n        \" \"\n      ],\n      \"execution_count\": 26,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Number of characters: 1816\\n\",\n            \"Preprocessed and prepared text: \\n\",\n            \" summarize: The law regarding corporations prescribes that a corporation can be incorporated in the state of Montana to serve any lawful purpose.  In the state of Montana, a corporation has all the powers of a natural person for carrying out its business activities.  
The corporation can sue and be sued in its corporate name.  It has perpetual succession.  The corporation can buy, sell or otherwise acquire an interest in a real or personal property.  It can conduct business, carry on operations, and have offices and exercise the powers in a state, territory or district in possession of the U.S., or in a foreign country.  It can appoint officers and agents of the corporation for various duties and fix their compensation.The name of a corporation must contain the word “corporation” or its abbreviation “corp.”  The name of a corporation should not be deceptively similar to the name of another corporation incorporated in the same state.  It should not be deceptively identical to the fictitious name adopted by a foreign corporation having business transactions in the state.The corporation is formed by one or more natural persons by executing and filing articles of incorporation to the secretary of state of filing.  The qualifications for directors are fixed either by articles of incorporation or bylaws.  The names and addresses of the initial directors and purpose of incorporation should be set forth in the articles of incorporation.  The articles of incorporation should contain the corporate name, the number of shares authorized to issue, a brief statement of the character of business carried out by the corporation, the names and addresses of the directors until successors are elected, and name and addresses of incorporators.  The shareholders have the power to change the size of board of directors.\\n\",\n            \"\\n\",\n            \"\\n\",\n            \"Summarized text: \\n\",\n            \" a corporation can be incorporated in the state of Montana to serve any lawful purpose. the corporation has perpetual succession and can sue and be sued in its corporate name. it can conduct business, carry on operations, and have offices\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    }\n  ]\n}"
  },
  {
    "path": "Chapter08/Tokenizer.ipynb",
    "content": "{\n  \"nbformat\": 4,\n  \"nbformat_minor\": 0,\n  \"metadata\": {\n    \"colab\": {\n      \"name\": \"Tokenizer.ipynb\",\n      \"provenance\": [],\n      \"collapsed_sections\": [],\n      \"toc_visible\": true\n    },\n    \"kernelspec\": {\n      \"name\": \"python3\",\n      \"display_name\": \"Python 3\"\n    }\n  },\n  \"cells\": [\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"7fjcTlyE3WvR\"\n      },\n      \"source\": [\n        \"#Tokenizers\\n\",\n        \"Copyright 2020 Denis Rothman, MIT License\\n\",\n        \"\\n\",\n        \"Reference 1 for word embedding:\\n\",\n        \"https://www.geeksforgeeks.org/python-word-embedding-using-word2vec/\\n\",\n        \"\\n\",\n        \"Reference 2 for cosine similarity:\\n\",\n        \"SciKit Learn cosine similarity documentation\\n\",\n        \"\\n\",\n        \"***Upload text.txt before running the Notebook***\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"JKJ8Saf6vR9b\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"e329b785-128d-447a-b97d-eeaeb740e9e4\"\n      },\n      \"source\": [\n        \"#@title Pre-Requisistes\\n\",\n        \"!pip install gensim==3.8.3\\n\",\n        \"import nltk\\n\",\n        \"nltk.download('punkt')\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Collecting gensim==3.8.3\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/5c/4e/afe2315e08a38967f8a3036bbe7e38b428e9b7a90e823a83d0d49df1adf5/gensim-3.8.3-cp37-cp37m-manylinux1_x86_64.whl (24.2MB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 24.2MB 1.5MB/s \\n\",\n            \"\\u001b[?25hRequirement already satisfied: scipy>=0.18.1 in /usr/local/lib/python3.7/dist-packages (from 
gensim==3.8.3) (1.4.1)\\n\",\n            \"Requirement already satisfied: numpy>=1.11.3 in /usr/local/lib/python3.7/dist-packages (from gensim==3.8.3) (1.19.5)\\n\",\n            \"Requirement already satisfied: smart-open>=1.8.1 in /usr/local/lib/python3.7/dist-packages (from gensim==3.8.3) (5.0.0)\\n\",\n            \"Requirement already satisfied: six>=1.5.0 in /usr/local/lib/python3.7/dist-packages (from gensim==3.8.3) (1.15.0)\\n\",\n            \"Installing collected packages: gensim\\n\",\n            \"  Found existing installation: gensim 4.0.1\\n\",\n            \"    Uninstalling gensim-4.0.1:\\n\",\n            \"      Successfully uninstalled gensim-4.0.1\\n\",\n            \"Successfully installed gensim-3.8.3\\n\",\n            \"[nltk_data] Downloading package punkt to /root/nltk_data...\\n\",\n            \"[nltk_data]   Package punkt is already up-to-date!\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"execute_result\",\n          \"data\": {\n            \"text/plain\": [\n              \"True\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          },\n          \"execution_count\": 1\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"7o7EeDUUu0Sh\"\n      },\n      \"source\": [\n        \"import math\\n\",\n        \"import numpy as np\\n\",\n        \"from nltk.tokenize import sent_tokenize, word_tokenize \\n\",\n        \"import gensim \\n\",\n        \"from gensim.models import Word2Vec \\n\",\n        \"import numpy as np\\n\",\n        \"from sklearn.metrics.pairwise import cosine_similarity\\n\",\n        \"import matplotlib.pyplot as plt\\n\",\n        \"import warnings \\n\",\n        \"warnings.filterwarnings(action = 'ignore') \"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": 
\"1NRomrXEJOxJ\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"360af319-8259-469e-babd-6eaa4cd6c714\"\n      },\n      \"source\": [\n        \"#@title Word2Vec Tokenization\\n\",\n        \"#‘text.txt’ file \\n\",\n        \"sample = open(\\\"text.txt\\\", \\\"r\\\") \\n\",\n        \"s = sample.read() \\n\",\n        \"\\n\",\n        \"# processing escape characters \\n\",\n        \"f = s.replace(\\\"\\\\n\\\", \\\" \\\") \\n\",\n        \"\\n\",\n        \"data = [] \\n\",\n        \"# sentence parsing\\n\",\n        \"for i in sent_tokenize(f): \\n\",\n        \"\\ttemp = [] \\n\",\n        \"\\t# tokenize the sentence into words \\n\",\n        \"\\tfor j in word_tokenize(i): \\n\",\n        \"\\t\\ttemp.append(j.lower())\\n\",\n        \"\\tdata.append(temp)\\n\",\n        \"\\n\",\n        \"# Creating Skip Gram model \\n\",\n        \"model2 = gensim.models.Word2Vec(data, min_count = 1, size = 512,window = 5, sg = 1) \\n\",\n        \"print(model2)\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Word2Vec(vocab=11822, size=512, alpha=0.025)\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"YcC_3JLcJTgw\"\n      },\n      \"source\": [\n        \"#@title Cosine Similarity\\n\",\n        \"def similarity(word1,word2):\\n\",\n        \"        cosine=False #default value\\n\",\n        \"        try:\\n\",\n        \"                a=model2[word1]\\n\",\n        \"                cosine=True\\n\",\n        \"        except KeyError:     #The KeyError exception is raised\\n\",\n        \"                print(word1, \\\":[unk] key not found in dictionary\\\")#False implied\\n\",\n        \"\\n\",\n        \"        try:\\n\",\n        \"                b=model2[word2]#a=True 
implied\\n\",\n        \"        except KeyError:       #The KeyError exception is raised\\n\",\n        \"                cosine=False   #both a and b must be true\\n\",\n        \"                print(word2, \\\":[unk] key not found in dictionary\\\")\\n\",\n        \"\\n\",\n        \"        if(cosine==True):\\n\",\n        \"                b=model2[word2]\\n\",\n        \"                # compute cosine similarity\\n\",\n        \"                dot = np.dot(a, b)\\n\",\n        \"                norma = np.linalg.norm(a)\\n\",\n        \"                normb = np.linalg.norm(b)\\n\",\n        \"                cos = dot / (norma * normb)\\n\",\n        \"\\n\",\n        \"                aa = a.reshape(1,512) \\n\",\n        \"                ba = b.reshape(1,512)\\n\",\n        \"                #print(\\\"Word1\\\",aa)\\n\",\n        \"                #print(\\\"Word2\\\",ba)\\n\",\n        \"                cos_lib = cosine_similarity(aa, ba)\\n\",\n        \"                #print(cos_lib,\\\"word similarity\\\")\\n\",\n        \"          \\n\",\n        \"        if(cosine==False):cos_lib=0;\\n\",\n        \"        return cos_lib\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"fMfgbogHJVh-\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"3bdcc464-75fb-4f60-be5b-86b59af7809e\"\n      },\n      \"source\": [\n        \"#@title Case 0: Words in text and dictionary\\n\",\n        \"word1=\\\"freedom\\\";word2=\\\"liberty\\\"\\n\",\n        \"print(\\\"Similarity\\\",similarity(word1,word2),word1,word2)\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Similarity [[0.38632965]] freedom liberty\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n   
 {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"4B7vvKxOLbYC\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"837c3d9f-f64c-43be-f689-0c8ea9284c25\"\n      },\n      \"source\": [\n        \"#@title Word(s) Case 1: Word not in text or dictionary\\n\",\n        \"word1=\\\"corporations\\\";word2=\\\"rights\\\"\\n\",\n        \"print(\\\"Similarity\\\",similarity(word1,word2),word1,word2)\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"corporations :[unk] key not found in dictionary\\n\",\n            \"Similarity 0 corporations rights\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"qkFIC79JCQJp\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"2f53a434-ce87-47c2-c37c-375f9afac846\"\n      },\n      \"source\": [\n        \"#@title Case 2: Noisy Relationship \\n\",\n        \"word1=\\\"etext\\\";word2=\\\"declaration\\\"\\n\",\n        \"print(\\\"Similarity\\\",similarity(word1,word2),word1,word2)\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Similarity [[0.51544815]] etext declaration\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"mKVPiEi-GZtf\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"f2b3a8b0-63f1-4e0c-a242-ca709140bcb3\"\n      },\n      \"source\": [\n        \"#@title Case 3: Rare words\\n\",\n        \"word1=\\\"justiciar\\\";word2=\\\"judgement\\\"\\n\",\n        
\"print(\\\"Similarity\\\",similarity(word1,word2),word1,word2)\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Similarity [[0.2304948]] justiciar judgement\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"0xZtAm3DHGJg\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"7ac7a0f5-3509-4254-a7f2-c55b6fb8b46c\"\n      },\n      \"source\": [\n        \"#@title Case 4: Replacing words\\n\",\n        \"word1=\\\"judge\\\";word2=\\\"judgement\\\"\\n\",\n        \"print(\\\"Similarity\\\",similarity(word1,word2),word1,word2)\\n\",\n        \"\\n\",\n        \"word1=\\\"justiciar\\\";word2=\\\"judge\\\"\\n\",\n        \"print(\\\"Similarity\\\",similarity(word1,word2),word1,word2)\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Similarity [[0.20353234]] judge judgement\\n\",\n            \"Similarity [[0.37659135]] justiciar judge\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"wOSID8kXHXWt\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"1a60f440-4d9c-4b68-e30f-60fe474cf458\"\n      },\n      \"source\": [\n        \"#@title Case 5: Entailment\\n\",\n        \"word1=\\\"pay\\\";word2=\\\"debt\\\"\\n\",\n        \"print(\\\"Similarity\\\",similarity(word1,word2),word1,word2)\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Similarity [[0.54338676]] pay debt\\n\"\n          ],\n          \"name\": 
\"stdout\"\n        }\n      ]\n    }\n  ]\n}"
  },
  {
    "path": "Chapter08/Training_OpenAI_GPT_2_CH08.ipynb",
    "content": "{\n  \"nbformat\": 4,\n  \"nbformat_minor\": 0,\n  \"metadata\": {\n    \"colab\": {\n      \"name\": \"Training OpenAI GPT-2-CH08.ipynb\",\n      \"provenance\": [],\n      \"collapsed_sections\": [],\n      \"toc_visible\": true\n    },\n    \"kernelspec\": {\n      \"name\": \"python3\",\n      \"display_name\": \"Python 3\"\n    },\n    \"accelerator\": \"GPU\"\n  },\n  \"cells\": [\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"LH2YgC7LfzJZ\",\n        \"colab_type\": \"text\"\n      },\n      \"source\": [\n        \"#Training OpenAI GTP-2\\n\",\n        \"Copyright 2020, Denis Rothman MIT License. Denis Rothman created the Colab notebook using the OpenAI repository, adding title steps for educational purposes only.\\n\",\n        \"\\n\",\n        \"***Code References***\\n\",\n        \"\\n\",\n        \"[Reference: OpenAI Repository](https://github.com/openai/gpt-2)\\n\",\n        \"The repository was cloned and adapted to N Shepperd's repository.\\n\",\n        \"\\n\",\n        \"[Reference: N Shepperd Repository](https://github.com/nshepperd/gpt-2)\\n\",\n        \"The repository was not cloned. N Shepperd's training programs were inserted into the OpenAI Repository. The list of N Shepperd's programs are cited in the 'N Shepperd' section of the notebook. 
Some programs were modified for educational purposes only to work with this notebook.\\n\",\n        \"\\n\",\n        \"***Model Reference Paper***\\n\",\n        \"\\n\",\n        \"[Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei, Ilya Sutskever, 2019, 'Language Models are Unsupervised Multitask Learners'](https://d4mucfpksywv.cloudfront.net/better-language-models/language-models.pdf)\\n\",\n        \"\\n\",\n        \"\\n\",\n        \"***Step 1: Prerequisites***\\n\",\n        \"\\n\",\n        \"a) Activate the GPU in the notebook settings (Runtime menu) <br>\\n\",\n        \"b) Upload the following program files and mdset.txt (the dataset) with the file manager: train.py, load_dataset.py, encode.py, accumulate.py, memory_saving_gradients.py, mdset.txt\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"isqdu1fpfmqM\",\n        \"colab_type\": \"code\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 102\n        },\n        \"outputId\": \"0662d019-7248-4642-c840-7b87c08e7ce7\"\n      },\n      \"source\": [\n        \"#@title Step 2: Cloning the OpenAI GPT-2 Repository \\n\",\n        \"#!git clone https://github.com/nshepperd/gpt-2.git\\n\",\n        \"!git clone https://github.com/openai/gpt-2.git\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Cloning into 'gpt-2'...\\n\",\n            \"remote: Enumerating objects: 230, done.\\u001b[K\\n\",\n            \"remote: Total 230 (delta 0), reused 0 (delta 0), pack-reused 230\\u001b[K\\n\",\n            \"Receiving objects: 100% (230/230), 4.38 MiB | 7.37 MiB/s, done.\\n\",\n            \"Resolving deltas: 100% (119/119), done.\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"7RHOjN-TjUbj\",\n        
\"colab_type\": \"code\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 887\n        },\n        \"outputId\": \"cc45d116-e7a5-4ff8-e41b-7d440317c9a8\"\n      },\n      \"source\": [\n        \"#@title Step 3: Installing the requirements\\n\",\n        \"import os                     # when the VM restarts import os necessary\\n\",\n        \"os.chdir(\\\"/content/gpt-2\\\")    \\n\",\n        \"!pip3 install -r requirements.txt\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Collecting fire>=0.1.3\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/34/a7/0e22e70778aca01a52b9c899d9c145c6396d7b613719cd63db97ffa13f2f/fire-0.3.1.tar.gz (81kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 81kB 2.5MB/s \\n\",\n            \"\\u001b[?25hCollecting regex==2017.4.5\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/36/62/c0c0d762ffd4ffaf39f372eb8561b8d491a11ace5a7884610424a8b40f95/regex-2017.04.05.tar.gz (601kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 604kB 8.9MB/s \\n\",\n            \"\\u001b[?25hCollecting requests==2.21.0\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/7d/e3/20f3d364d6c8e5d2353c72a67778eb189176f08e873c9900e10c0287b84b/requests-2.21.0-py2.py3-none-any.whl (57kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 61kB 6.5MB/s \\n\",\n            \"\\u001b[?25hCollecting tqdm==4.31.1\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/6c/4b/c38b5144cf167c4f52288517436ccafefe9dc01b8d1c190e18a6b154cd4a/tqdm-4.31.1-py2.py3-none-any.whl (48kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 51kB 5.7MB/s \\n\",\n            \"\\u001b[?25hRequirement already 
satisfied: six in /usr/local/lib/python3.6/dist-packages (from fire>=0.1.3->-r requirements.txt (line 1)) (1.12.0)\\n\",\n            \"Requirement already satisfied: termcolor in /usr/local/lib/python3.6/dist-packages (from fire>=0.1.3->-r requirements.txt (line 1)) (1.1.0)\\n\",\n            \"Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.6/dist-packages (from requests==2.21.0->-r requirements.txt (line 3)) (2020.6.20)\\n\",\n            \"Requirement already satisfied: chardet<3.1.0,>=3.0.2 in /usr/local/lib/python3.6/dist-packages (from requests==2.21.0->-r requirements.txt (line 3)) (3.0.4)\\n\",\n            \"Requirement already satisfied: urllib3<1.25,>=1.21.1 in /usr/local/lib/python3.6/dist-packages (from requests==2.21.0->-r requirements.txt (line 3)) (1.24.3)\\n\",\n            \"Collecting idna<2.9,>=2.5\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl (58kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 61kB 6.0MB/s \\n\",\n            \"\\u001b[?25hBuilding wheels for collected packages: fire, regex\\n\",\n            \"  Building wheel for fire (setup.py) ... \\u001b[?25l\\u001b[?25hdone\\n\",\n            \"  Created wheel for fire: filename=fire-0.3.1-py2.py3-none-any.whl size=111005 sha256=3310fe2adb427d9c42d252d7a50303321e9db5a10c95bd0083efc4df204f9703\\n\",\n            \"  Stored in directory: /root/.cache/pip/wheels/c1/61/df/768b03527bf006b546dce284eb4249b185669e65afc5fbb2ac\\n\",\n            \"  Building wheel for regex (setup.py) ... 
\\u001b[?25l\\u001b[?25hdone\\n\",\n            \"  Created wheel for regex: filename=regex-2017.4.5-cp36-cp36m-linux_x86_64.whl size=533204 sha256=410a1a2649a21cad83bbd2d67acd95e54704541f49ca03c2ac08574a44ff5985\\n\",\n            \"  Stored in directory: /root/.cache/pip/wheels/75/07/38/3c16b529d50cb4e0cd3dbc7b75cece8a09c132692c74450b01\\n\",\n            \"Successfully built fire regex\\n\",\n            \"\\u001b[31mERROR: spacy 2.2.4 has requirement tqdm<5.0.0,>=4.38.0, but you'll have tqdm 4.31.1 which is incompatible.\\u001b[0m\\n\",\n            \"\\u001b[31mERROR: google-colab 1.0.0 has requirement requests~=2.23.0, but you'll have requests 2.21.0 which is incompatible.\\u001b[0m\\n\",\n            \"\\u001b[31mERROR: datascience 0.10.6 has requirement folium==0.2.1, but you'll have folium 0.8.3 which is incompatible.\\u001b[0m\\n\",\n            \"Installing collected packages: fire, regex, idna, requests, tqdm\\n\",\n            \"  Found existing installation: regex 2019.12.20\\n\",\n            \"    Uninstalling regex-2019.12.20:\\n\",\n            \"      Successfully uninstalled regex-2019.12.20\\n\",\n            \"  Found existing installation: idna 2.9\\n\",\n            \"    Uninstalling idna-2.9:\\n\",\n            \"      Successfully uninstalled idna-2.9\\n\",\n            \"  Found existing installation: requests 2.23.0\\n\",\n            \"    Uninstalling requests-2.23.0:\\n\",\n            \"      Successfully uninstalled requests-2.23.0\\n\",\n            \"  Found existing installation: tqdm 4.41.1\\n\",\n            \"    Uninstalling tqdm-4.41.1:\\n\",\n            \"      Successfully uninstalled tqdm-4.41.1\\n\",\n            \"Successfully installed fire-0.3.1 idna-2.8 regex-2017.4.5 requests-2.21.0 tqdm-4.31.1\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"application/vnd.colab-display-data+json\": {\n              
\"pip_warning\": {\n                \"packages\": [\n                  \"idna\",\n                  \"requests\",\n                  \"tqdm\"\n                ]\n              }\n            }\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"q9vV73Opw68m\",\n        \"colab_type\": \"code\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 105\n        },\n        \"outputId\": \"8d3e336b-7385-4a51-f054-bf3a1ffd3b6a\"\n      },\n      \"source\": [\n        \"!pip install toposort\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Collecting toposort\\n\",\n            \"  Downloading https://files.pythonhosted.org/packages/e9/8a/321cd8ea5f4a22a06e3ba30ef31ec33bea11a3443eeb1d89807640ee6ed4/toposort-1.5-py2.py3-none-any.whl\\n\",\n            \"Installing collected packages: toposort\\n\",\n            \"Successfully installed toposort-1.5\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"_kpNCnh9fyYD\",\n        \"colab_type\": \"code\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 51\n        },\n        \"outputId\": \"6915ef8b-a48f-4a27-c6d4-43fda10b0e82\"\n      },\n      \"source\": [\n        \"#@title Step 4: Checking TensorFlow version \\n\",\n        \"#Colab has tf 1.x and tf 2.x installed\\n\",\n        \"#Restart runtime using 'Runtime' -> 'Restart runtime...'\\n\",\n        \"%tensorflow_version 1.x\\n\",\n        \"import tensorflow as tf\\n\",\n        \"print(tf.__version__)\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n      
    \"text\": [\n            \"TensorFlow 1.x selected.\\n\",\n            \"1.15.2\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"jvVj0cLVkaPL\",\n        \"colab_type\": \"code\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 136\n        },\n        \"outputId\": \"12f91649-5661-4323-887a-bed1456ce370\"\n      },\n      \"source\": [\n        \"#@title Step 5: Downloading 117M parameter GPT-2 Model\\n\",\n        \"# run code and send argument\\n\",\n        \"import os # after runtime is restarted\\n\",\n        \"os.chdir(\\\"/content/gpt-2\\\")\\n\",\n        \"!python3 download_model.py '117M' #creates model directory\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\rFetching checkpoint:   0%|                                              | 0.00/77.0 [00:00<?, ?it/s]\\rFetching checkpoint: 1.00kit [00:00, 781kit/s]                                                      \\n\",\n            \"\\rFetching encoder.json:   0%|                                           | 0.00/1.04M [00:00<?, ?it/s]\\rFetching encoder.json: 1.04Mit [00:00, 34.0Mit/s]                                                   \\n\",\n            \"Fetching hparams.json: 1.00kit [00:00, 1.04Mit/s]                                                   \\n\",\n            \"Fetching model.ckpt.data-00000-of-00001: 498Mit [00:10, 47.8Mit/s]                                  \\n\",\n            \"Fetching model.ckpt.index: 6.00kit [00:00, 4.75Mit/s]                                               \\n\",\n            \"Fetching model.ckpt.meta: 472kit [00:00, 35.2Mit/s]                                                 \\n\",\n            \"Fetching vocab.bpe: 457kit [00:00, 35.1Mit/s]                                                       
\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"aV5K8rvD1b-r\",\n        \"colab_type\": \"code\",\n        \"colab\": {}\n      },\n      \"source\": [\n        \"#@title Step 6: Copying the Project Resources to scr\\n\",\n        \"!cp /content/mdset.txt /content/gpt-2/src/\\n\",\n        \"!cp -r /content/gpt-2/models/ /content/gpt-2/src/\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"dTUxDwtWlOLf\",\n        \"colab_type\": \"code\",\n        \"colab\": {}\n      },\n      \"source\": [\n        \"#@title Step 7: Copying the N Shepperd Training Files\\n\",\n        \"#Referfence GitHub repository: https://github.com/nshepperd/gpt-2\\n\",\n        \"import os # import after runtime is restarted\\n\",\n        \"!cp /content/train.py /content/gpt-2/src/\\n\",\n        \"!cp /content/load_dataset.py /content/gpt-2/src/\\n\",\n        \"!cp /content/encode.py /content/gpt-2/src/\\n\",\n        \"!cp /content/accumulate.py /content/gpt-2/src/\\n\",\n        \"!cp /content/memory_saving_gradients.py /content/gpt-2/src/\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"B6T2OrWoOvG0\",\n        \"colab_type\": \"code\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 68\n        },\n        \"outputId\": \"cb7f7b63-837e-4d19-b594-c73a535492c3\"\n      },\n      \"source\": [\n        \"#@title Step 8:Encoding dataset\\n\",\n        \"import os # import after runtime is restarted\\n\",\n        \"os.chdir(\\\"/content/gpt-2/src/\\\")\\n\",\n        \"model_name=\\\"117M\\\"\\n\",\n        \"!python /content/gpt-2/src/encode.py mdset.txt out.npz \"\n      ],\n      \"execution_count\": null,\n   
   \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Reading files\\n\",\n            \"100% 1/1 [00:00<00:00,  3.98it/s]\\n\",\n            \"Writing out.npz\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"UzlkNGbAkDBk\",\n        \"colab_type\": \"code\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 1000\n        },\n        \"outputId\": \"5cbf0d29-6d2d-4630-9c30-07930c37e6dd\"\n      },\n      \"source\": [\n        \"#@title Step 9:Training the Model\\n\",\n        \"#Model saved after 1000 steps\\n\",\n        \"import os # import after runtime is restarted\\n\",\n        \"os.chdir(\\\"/content/gpt-2/src/\\\")\\n\",\n        \"!python train.py --dataset out.npz\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"WARNING:tensorflow:\\n\",\n            \"The TensorFlow contrib module will not be included in TensorFlow 2.0.\\n\",\n            \"For more information, please see:\\n\",\n            \"  * https://github.com/tensorflow/community/blob/master/rfcs/20180907-contrib-sunset.md\\n\",\n            \"  * https://github.com/tensorflow/addons\\n\",\n            \"  * https://github.com/tensorflow/io (for I/O related ops)\\n\",\n            \"If you depend on functionality not listed there, please file an issue.\\n\",\n            \"\\n\",\n            \"WARNING:tensorflow:From /content/gpt-2/src/memory_saving_gradients.py:13: The name tf.GraphKeys is deprecated. Please use tf.compat.v1.GraphKeys instead.\\n\",\n            \"\\n\",\n            \"WARNING:tensorflow:From train.py:89: The name tf.ConfigProto is deprecated. 
Please use tf.compat.v1.ConfigProto instead.\\n\",\n            \"\\n\",\n            \"WARNING:tensorflow:From train.py:92: The name tf.Session is deprecated. Please use tf.compat.v1.Session instead.\\n\",\n            \"\\n\",\n            \"2020-06-29 09:16:57.805692: I tensorflow/core/platform/profile_utils/cpu_utils.cc:94] CPU Frequency: 2299995000 Hz\\n\",\n            \"2020-06-29 09:16:57.806095: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x2c1b2c0 initialized for platform Host (this does not guarantee that XLA will be used). Devices:\\n\",\n            \"2020-06-29 09:16:57.806138: I tensorflow/compiler/xla/service/service.cc:176]   StreamExecutor device (0): Host, Default Version\\n\",\n            \"2020-06-29 09:16:57.812896: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1\\n\",\n            \"2020-06-29 09:16:57.997614: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:983] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero\\n\",\n            \"2020-06-29 09:16:57.998629: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x2c1b480 initialized for platform CUDA (this does not guarantee that XLA will be used). 
Devices:\\n\",\n            \"2020-06-29 09:16:57.998671: I tensorflow/compiler/xla/service/service.cc:176]   StreamExecutor device (0): Tesla K80, Compute Capability 3.7\\n\",\n            \"2020-06-29 09:16:58.000243: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:983] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero\\n\",\n            \"2020-06-29 09:16:58.001082: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1639] Found device 0 with properties: \\n\",\n            \"name: Tesla K80 major: 3 minor: 7 memoryClockRate(GHz): 0.8235\\n\",\n            \"pciBusID: 0000:00:04.0\\n\",\n            \"2020-06-29 09:16:58.001644: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1\\n\",\n            \"2020-06-29 09:16:58.333957: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10\\n\",\n            \"2020-06-29 09:16:58.555071: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10\\n\",\n            \"2020-06-29 09:16:58.593295: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10\\n\",\n            \"2020-06-29 09:16:58.915089: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10\\n\",\n            \"2020-06-29 09:16:58.956129: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10\\n\",\n            \"2020-06-29 09:16:59.738702: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7\\n\",\n            \"2020-06-29 09:16:59.738968: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:983] successful NUMA node read from SysFS had negative 
value (-1), but there must be at least one NUMA node, so returning NUMA node zero\\n\",\n            \"2020-06-29 09:16:59.740099: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:983] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero\\n\",\n            \"2020-06-29 09:16:59.740924: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1767] Adding visible gpu devices: 0\\n\",\n            \"2020-06-29 09:16:59.745598: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1\\n\",\n            \"2020-06-29 09:16:59.747346: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1180] Device interconnect StreamExecutor with strength 1 edge matrix:\\n\",\n            \"2020-06-29 09:16:59.747386: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1186]      0 \\n\",\n            \"2020-06-29 09:16:59.747406: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1199] 0:   N \\n\",\n            \"2020-06-29 09:16:59.749272: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:983] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero\\n\",\n            \"2020-06-29 09:16:59.750200: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:983] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero\\n\",\n            \"2020-06-29 09:16:59.750941: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1325] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 10805 MB memory) -> physical GPU (device: 0, name: Tesla K80, pci bus id: 0000:00:04.0, compute capability: 3.7)\\n\",\n            \"WARNING:tensorflow:From train.py:93: The name tf.placeholder is deprecated. 
Please use tf.compat.v1.placeholder instead.\\n\",\n            \"\\n\",\n            \"WARNING:tensorflow:From /content/gpt-2/src/model.py:148: The name tf.variable_scope is deprecated. Please use tf.compat.v1.variable_scope instead.\\n\",\n            \"\\n\",\n            \"WARNING:tensorflow:From /content/gpt-2/src/model.py:152: The name tf.get_variable is deprecated. Please use tf.compat.v1.get_variable instead.\\n\",\n            \"\\n\",\n            \"WARNING:tensorflow:From /content/gpt-2/src/model.py:36: The name tf.rsqrt is deprecated. Please use tf.math.rsqrt instead.\\n\",\n            \"\\n\",\n            \"WARNING:tensorflow:From /content/gpt-2/src/sample.py:51: The name tf.AUTO_REUSE is deprecated. Please use tf.compat.v1.AUTO_REUSE instead.\\n\",\n            \"\\n\",\n            \"WARNING:tensorflow:From /content/gpt-2/src/sample.py:64: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.\\n\",\n            \"Instructions for updating:\\n\",\n            \"Use `tf.cast` instead.\\n\",\n            \"WARNING:tensorflow:From /content/gpt-2/src/sample.py:16: where (from tensorflow.python.ops.array_ops) is deprecated and will be removed in a future version.\\n\",\n            \"Instructions for updating:\\n\",\n            \"Use tf.where in 2.0, which has the same broadcast rule as np.where\\n\",\n            \"WARNING:tensorflow:From /content/gpt-2/src/sample.py:67: multinomial (from tensorflow.python.ops.random_ops) is deprecated and will be removed in a future version.\\n\",\n            \"Instructions for updating:\\n\",\n            \"Use `tf.random.categorical` instead.\\n\",\n            \"WARNING:tensorflow:From train.py:118: The name tf.trainable_variables is deprecated. Please use tf.compat.v1.trainable_variables instead.\\n\",\n            \"\\n\",\n            \"WARNING:tensorflow:From train.py:122: The name tf.train.AdamOptimizer is deprecated. 
Please use tf.compat.v1.train.AdamOptimizer instead.\\n\",\n            \"\\n\",\n            \"WARNING:tensorflow:From train.py:145: The name tf.summary.scalar is deprecated. Please use tf.compat.v1.summary.scalar instead.\\n\",\n            \"\\n\",\n            \"WARNING:tensorflow:From train.py:148: The name tf.summary.merge is deprecated. Please use tf.compat.v1.summary.merge instead.\\n\",\n            \"\\n\",\n            \"WARNING:tensorflow:From train.py:150: The name tf.summary.FileWriter is deprecated. Please use tf.compat.v1.summary.FileWriter instead.\\n\",\n            \"\\n\",\n            \"WARNING:tensorflow:From train.py:153: The name tf.train.Saver is deprecated. Please use tf.compat.v1.train.Saver instead.\\n\",\n            \"\\n\",\n            \"WARNING:tensorflow:From train.py:157: The name tf.global_variables_initializer is deprecated. Please use tf.compat.v1.global_variables_initializer instead.\\n\",\n            \"\\n\",\n            \"Loading checkpoint models/117M/model.ckpt\\n\",\n            \"Loading dataset...\\n\",\n            \"100% 1/1 [00:00<00:00, 260.74it/s]\\n\",\n            \"dataset has 29379 tokens\\n\",\n            \"Training...\\n\",\n            \"2020-06-29 09:17:29.007668: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10\\n\",\n            \"[1 | 7.03] loss=3.18 avg=3.18\\n\",\n            \"[2 | 7.96] loss=2.67 avg=2.92\\n\",\n            \"[3 | 8.90] loss=2.92 avg=2.92\\n\",\n            \"[4 | 9.82] loss=3.00 avg=2.94\\n\",\n            \"[5 | 10.76] loss=2.65 avg=2.88\\n\",\n            \"[6 | 11.69] loss=2.88 avg=2.88\\n\",\n            \"[7 | 12.63] loss=2.80 avg=2.87\\n\",\n            \"[8 | 13.57] loss=2.68 avg=2.84\\n\",\n            \"[9 | 14.52] loss=2.88 avg=2.85\\n\",\n            \"[10 | 15.46] loss=3.93 avg=2.96\\n\",\n            \"[11 | 16.40] loss=3.06 avg=2.97\\n\",\n            \"[12 | 17.34] loss=2.48 avg=2.93\\n\",\n         
   \"[13 | 18.28] loss=2.69 avg=2.91\\n\",\n            \"[14 | 19.22] loss=3.19 avg=2.93\\n\",\n            \"[15 | 20.16] loss=2.29 avg=2.88\\n\",\n            \"[16 | 21.11] loss=2.28 avg=2.84\\n\",\n            \"[17 | 22.04] loss=2.91 avg=2.85\\n\",\n            \"[18 | 22.97] loss=2.67 avg=2.84\\n\",\n            \"[19 | 23.91] loss=2.14 avg=2.80\\n\",\n            \"[20 | 24.85] loss=2.00 avg=2.75\\n\",\n            \"[21 | 25.78] loss=2.58 avg=2.75\\n\",\n            \"[22 | 26.73] loss=2.66 avg=2.74\\n\",\n            \"[23 | 27.67] loss=2.80 avg=2.74\\n\",\n            \"[24 | 28.60] loss=3.18 avg=2.76\\n\",\n            \"[25 | 29.54] loss=2.95 avg=2.77\\n\",\n            \"[26 | 30.47] loss=3.41 avg=2.80\\n\",\n            \"[27 | 31.41] loss=2.92 avg=2.81\\n\",\n            \"[28 | 32.34] loss=2.33 avg=2.79\\n\",\n            \"[29 | 33.27] loss=2.17 avg=2.76\\n\",\n            \"[30 | 34.20] loss=1.87 avg=2.73\\n\",\n            \"[31 | 35.13] loss=2.60 avg=2.72\\n\",\n            \"[32 | 36.06] loss=2.71 avg=2.72\\n\",\n            \"[33 | 37.00] loss=2.82 avg=2.73\\n\",\n            \"[34 | 37.95] loss=2.26 avg=2.71\\n\",\n            \"[35 | 38.89] loss=2.20 avg=2.69\\n\",\n            \"[36 | 39.83] loss=2.48 avg=2.69\\n\",\n            \"[37 | 40.76] loss=2.03 avg=2.66\\n\",\n            \"[38 | 41.70] loss=2.15 avg=2.65\\n\",\n            \"[39 | 42.64] loss=2.57 avg=2.65\\n\",\n            \"[40 | 43.57] loss=2.42 avg=2.64\\n\",\n            \"[41 | 44.50] loss=2.20 avg=2.63\\n\",\n            \"[42 | 45.43] loss=3.01 avg=2.64\\n\",\n            \"[43 | 46.37] loss=2.74 avg=2.64\\n\",\n            \"[44 | 47.30] loss=3.33 avg=2.66\\n\",\n            \"[45 | 48.24] loss=3.14 avg=2.67\\n\",\n            \"[46 | 49.17] loss=2.40 avg=2.67\\n\",\n            \"[47 | 50.11] loss=2.58 avg=2.66\\n\",\n            \"[48 | 51.04] loss=1.93 avg=2.64\\n\",\n            \"[49 | 51.97] loss=3.22 avg=2.66\\n\",\n            \"[50 | 52.91] loss=2.56 
avg=2.66\\n\",\n            \"[51 | 53.84] loss=1.95 avg=2.64\\n\",\n            \"[52 | 54.77] loss=2.18 avg=2.63\\n\",\n            \"[53 | 55.70] loss=2.65 avg=2.63\\n\",\n            \"[54 | 56.63] loss=2.29 avg=2.62\\n\",\n            \"[55 | 57.55] loss=2.21 avg=2.61\\n\",\n            \"[56 | 58.49] loss=1.98 avg=2.60\\n\",\n            \"[57 | 59.41] loss=2.47 avg=2.59\\n\",\n            \"[58 | 60.34] loss=1.95 avg=2.58\\n\",\n            \"[59 | 61.26] loss=2.40 avg=2.57\\n\",\n            \"[60 | 62.19] loss=2.22 avg=2.57\\n\",\n            \"[61 | 63.12] loss=3.16 avg=2.58\\n\",\n            \"[62 | 64.05] loss=2.25 avg=2.57\\n\",\n            \"[63 | 64.99] loss=3.32 avg=2.59\\n\",\n            \"[64 | 65.93] loss=2.44 avg=2.59\\n\",\n            \"[65 | 66.86] loss=2.39 avg=2.58\\n\",\n            \"[66 | 67.79] loss=2.23 avg=2.57\\n\",\n            \"[67 | 68.73] loss=2.21 avg=2.57\\n\",\n            \"[68 | 69.66] loss=2.45 avg=2.56\\n\",\n            \"[69 | 70.58] loss=3.28 avg=2.58\\n\",\n            \"[70 | 71.52] loss=2.22 avg=2.57\\n\",\n            \"[71 | 72.45] loss=1.76 avg=2.56\\n\",\n            \"[72 | 73.38] loss=3.01 avg=2.56\\n\",\n            \"[73 | 74.31] loss=2.04 avg=2.55\\n\",\n            \"[74 | 75.25] loss=2.20 avg=2.55\\n\",\n            \"[75 | 76.18] loss=2.43 avg=2.54\\n\",\n            \"[76 | 77.10] loss=3.45 avg=2.56\\n\",\n            \"[77 | 78.03] loss=2.40 avg=2.56\\n\",\n            \"[78 | 78.96] loss=2.34 avg=2.55\\n\",\n            \"[79 | 79.89] loss=2.09 avg=2.55\\n\",\n            \"[80 | 80.82] loss=2.17 avg=2.54\\n\",\n            \"[81 | 81.75] loss=2.27 avg=2.53\\n\",\n            \"[82 | 82.69] loss=2.17 avg=2.53\\n\",\n            \"[83 | 83.62] loss=2.19 avg=2.52\\n\",\n            \"[84 | 84.56] loss=2.73 avg=2.53\\n\",\n            \"[85 | 85.49] loss=2.96 avg=2.53\\n\",\n            \"[86 | 86.43] loss=2.20 avg=2.53\\n\",\n            \"[87 | 87.37] loss=2.10 avg=2.52\\n\",\n            \"[88 | 
88.31] loss=2.91 avg=2.53\\n\",\n            \"[89 | 89.24] loss=2.91 avg=2.53\\n\",\n            \"[90 | 90.17] loss=2.07 avg=2.53\\n\",\n            \"[91 | 91.10] loss=2.84 avg=2.53\\n\",\n            \"[92 | 92.03] loss=1.77 avg=2.52\\n\",\n            \"[93 | 92.96] loss=2.68 avg=2.52\\n\",\n            \"[94 | 93.88] loss=2.36 avg=2.52\\n\",\n            \"[95 | 94.81] loss=2.65 avg=2.52\\n\",\n            \"[96 | 95.74] loss=1.89 avg=2.51\\n\",\n            \"[97 | 96.68] loss=2.37 avg=2.51\\n\",\n            \"[98 | 97.60] loss=1.99 avg=2.50\\n\",\n            \"[99 | 98.53] loss=2.62 avg=2.50\\n\",\n            \"Generating samples...\\n\",\n            \"======== SAMPLE 1 ========\\n\",\n            \"ive, and the two are not related.\\n\",\n            \"\\n\",\n            \"The second is the same as the first, but the two are not related.\\n\",\n            \"\\n\",\n            \"The third is the same as the first, but the two are not related.\\n\",\n            \"\\n\",\n            \"The fourth is the same as the first, but the two are not related.\\n\",\n            \"\\n\",\n            \"The fifth is the same as the first, but the two are not related.\\n\",\n            \"\\n\",\n            \"The sixth is the same as the first, but the two are not related.\\n\",\n            \"\\n\",\n            \"The seventh is the same as the first, but the two are not related.\\n\",\n            \"\\n\",\n            \"The eighth is the same as the first, but the two are not related.\\n\",\n            \"\\n\",\n            \"The ninth is the same as the first, but the two are not related.\\n\",\n            \"\\n\",\n            \"The tenth is the same as the first, but the two are not related.\\n\",\n            \"\\n\",\n            \"The eleventh is the same as the first, but the two are not related.\\n\",\n            \"\\n\",\n            \"The twelfth is the same as the first, but the two are not related.\\n\",\n            \"\\n\",\n            \"The 
thirteenth is the same as the first, but the two are not related.\n",\n            "\n",\n            "The thirteenth is the same as the first, but the 
two are not related.\\n\",\n            \"\\n\",\n            \"The thirteenth is the same as the first, but the two are not related.\\n\",\n            \"\\n\",\n            \"The thirteenth is the same as the first, but the two are not related.\\n\",\n            \"\\n\",\n            \"The thirteenth is the same as the first, but the two are not related.\\n\",\n            \"\\n\",\n            \"The thirteenth is the same as the first, but the two are not related.\\n\",\n            \"\\n\",\n            \"The thirteenth is the same as the first, but the\\n\",\n            \"\\n\",\n            \"[100 | 121.68] loss=1.76 avg=2.49\\n\",\n            \"[101 | 122.61] loss=2.08 avg=2.48\\n\",\n            \"[102 | 123.55] loss=2.05 avg=2.48\\n\",\n            \"[103 | 124.49] loss=2.38 avg=2.48\\n\",\n            \"[104 | 125.43] loss=2.39 avg=2.47\\n\",\n            \"[105 | 126.36] loss=2.23 avg=2.47\\n\",\n            \"[106 | 127.31] loss=2.02 avg=2.46\\n\",\n            \"[107 | 128.24] loss=2.95 avg=2.47\\n\",\n            \"[108 | 129.17] loss=1.90 avg=2.46\\n\",\n            \"[109 | 130.11] loss=2.49 avg=2.46\\n\",\n            \"[110 | 131.04] loss=2.15 avg=2.46\\n\",\n            \"[111 | 131.97] loss=2.17 avg=2.45\\n\",\n            \"[112 | 132.90] loss=2.15 avg=2.45\\n\",\n            \"[113 | 133.83] loss=2.10 avg=2.44\\n\",\n            \"[114 | 134.75] loss=2.61 avg=2.45\\n\",\n            \"[115 | 135.68] loss=2.62 avg=2.45\\n\",\n            \"[116 | 136.61] loss=2.28 avg=2.45\\n\",\n            \"[117 | 137.54] loss=2.04 avg=2.44\\n\",\n            \"[118 | 138.47] loss=1.96 avg=2.43\\n\",\n            \"[119 | 139.40] loss=1.84 avg=2.42\\n\",\n            \"[120 | 140.33] loss=2.49 avg=2.43\\n\",\n            \"[121 | 141.26] loss=1.63 avg=2.41\\n\",\n            \"[122 | 142.19] loss=2.49 avg=2.42\\n\",\n            \"[123 | 143.12] loss=2.08 avg=2.41\\n\",\n            \"[124 | 144.05] loss=1.63 avg=2.40\\n\",\n            \"[125 | 144.97] 
loss=2.10 avg=2.40\\n\",\n            \"[126 | 145.91] loss=3.43 avg=2.41\\n\",\n            \"[127 | 146.84] loss=2.68 avg=2.41\\n\",\n            \"[128 | 147.78] loss=1.55 avg=2.40\\n\",\n            \"[129 | 148.72] loss=2.65 avg=2.41\\n\",\n            \"[130 | 149.66] loss=1.87 avg=2.40\\n\",\n            \"[131 | 150.59] loss=3.37 avg=2.41\\n\",\n            \"[132 | 151.52] loss=1.48 avg=2.40\\n\",\n            \"[133 | 152.44] loss=2.43 avg=2.40\\n\",\n            \"[134 | 153.37] loss=3.28 avg=2.41\\n\",\n            \"[135 | 154.31] loss=1.49 avg=2.40\\n\",\n            \"[136 | 155.24] loss=1.95 avg=2.39\\n\",\n            \"[137 | 156.17] loss=2.05 avg=2.39\\n\",\n            \"[138 | 157.10] loss=2.05 avg=2.38\\n\",\n            \"[139 | 158.04] loss=2.11 avg=2.38\\n\",\n            \"[140 | 158.97] loss=1.66 avg=2.37\\n\",\n            \"[141 | 159.90] loss=1.82 avg=2.36\\n\",\n            \"[142 | 160.82] loss=2.41 avg=2.36\\n\",\n            \"[143 | 161.75] loss=1.53 avg=2.35\\n\",\n            \"[144 | 162.68] loss=2.33 avg=2.35\\n\",\n            \"[145 | 163.62] loss=1.95 avg=2.35\\n\",\n            \"[146 | 164.56] loss=1.88 avg=2.34\\n\",\n            \"[147 | 165.50] loss=1.91 avg=2.34\\n\",\n            \"[148 | 166.43] loss=1.93 avg=2.33\\n\",\n            \"[149 | 167.36] loss=1.72 avg=2.32\\n\",\n            \"[150 | 168.31] loss=2.56 avg=2.33\\n\",\n            \"[151 | 169.25] loss=2.28 avg=2.32\\n\",\n            \"[152 | 170.19] loss=1.94 avg=2.32\\n\",\n            \"[153 | 171.12] loss=2.83 avg=2.33\\n\",\n            \"[154 | 172.05] loss=1.50 avg=2.32\\n\",\n            \"[155 | 172.98] loss=1.85 avg=2.31\\n\",\n            \"[156 | 173.92] loss=1.74 avg=2.30\\n\",\n            \"[157 | 174.84] loss=1.63 avg=2.29\\n\",\n            \"[158 | 175.78] loss=1.65 avg=2.29\\n\",\n            \"[159 | 176.71] loss=2.11 avg=2.28\\n\",\n            \"[160 | 177.64] loss=1.82 avg=2.28\\n\",\n            \"[161 | 178.57] loss=1.92 
avg=2.27\\n\",\n            \"[162 | 179.49] loss=1.85 avg=2.27\\n\",\n            \"[163 | 180.41] loss=2.33 avg=2.27\\n\",\n            \"[164 | 181.34] loss=1.66 avg=2.26\\n\",\n            \"[165 | 182.27] loss=1.46 avg=2.25\\n\",\n            \"[166 | 183.19] loss=1.62 avg=2.24\\n\",\n            \"[167 | 184.12] loss=1.62 avg=2.24\\n\",\n            \"[168 | 185.04] loss=2.43 avg=2.24\\n\",\n            \"[169 | 185.97] loss=1.23 avg=2.23\\n\",\n            \"[170 | 186.89] loss=1.78 avg=2.22\\n\",\n            \"[171 | 187.83] loss=2.42 avg=2.22\\n\",\n            \"[172 | 188.76] loss=1.61 avg=2.22\\n\",\n            \"[173 | 189.70] loss=1.67 avg=2.21\\n\",\n            \"[174 | 190.63] loss=2.53 avg=2.21\\n\",\n            \"[175 | 191.56] loss=1.82 avg=2.21\\n\",\n            \"[176 | 192.49] loss=1.53 avg=2.20\\n\",\n            \"[177 | 193.43] loss=1.21 avg=2.19\\n\",\n            \"[178 | 194.35] loss=2.13 avg=2.19\\n\",\n            \"[179 | 195.28] loss=2.07 avg=2.19\\n\",\n            \"[180 | 196.21] loss=1.44 avg=2.18\\n\",\n            \"[181 | 197.14] loss=2.44 avg=2.18\\n\",\n            \"[182 | 198.07] loss=2.22 avg=2.18\\n\",\n            \"[183 | 199.00] loss=1.86 avg=2.18\\n\",\n            \"[184 | 199.94] loss=2.09 avg=2.18\\n\",\n            \"[185 | 200.87] loss=2.00 avg=2.17\\n\",\n            \"[186 | 201.81] loss=2.12 avg=2.17\\n\",\n            \"[187 | 202.74] loss=1.32 avg=2.16\\n\",\n            \"[188 | 203.67] loss=2.10 avg=2.16\\n\",\n            \"[189 | 204.59] loss=1.52 avg=2.15\\n\",\n            \"[190 | 205.52] loss=1.69 avg=2.15\\n\",\n            \"[191 | 206.46] loss=2.13 avg=2.15\\n\",\n            \"[192 | 207.39] loss=2.10 avg=2.15\\n\",\n            \"[193 | 208.32] loss=2.32 avg=2.15\\n\",\n            \"[194 | 209.25] loss=2.89 avg=2.16\\n\",\n            \"[195 | 210.18] loss=1.48 avg=2.15\\n\",\n            \"[196 | 211.12] loss=1.45 avg=2.14\\n\",\n            \"[197 | 212.05] loss=2.73 avg=2.15\\n\",\n    
        \"[198 | 212.97] loss=1.91 avg=2.15\\n\",\n            \"[199 | 213.90] loss=1.58 avg=2.14\\n\",\n            \"Generating samples...\\n\",\n            \"======== SAMPLE 1 ========\\n\",\n            \"002\\n\",\n            \"\\n\",\n            \"(1)\\n\",\n            \"\\n\",\n            \"where\\n\",\n            \"\\n\",\n            \"S(x, v, vˆ) =\\n\",\n            \"\\n\",\n            \"(x)\\n\",\n            \"\\n\",\n            \"and\\n\",\n            \"\\n\",\n            \"(v) = 0.\\n\",\n            \"\\n\",\n            \"In this case, we can perform the\\n\",\n            \"\\n\",\n            \"transition operator\\n\",\n            \"\\n\",\n            \"where\\n\",\n            \"\\n\",\n            \"S(x, v, vˆ) =\\n\",\n            \"\\n\",\n            \"(x)\\n\",\n            \"\\n\",\n            \"and\\n\",\n            \"\\n\",\n            \"(v) = 0.\\n\",\n            \"\\n\",\n            \"In this case, we can perform the\\n\",\n            \"\\n\",\n            \"transition operator\\n\",\n            \"\\n\",\n            \"where\\n\",\n            \"\\n\",\n            \"S(x, v, vˆ) =\\n\",\n            \"\\n\",\n            \"(x)\\n\",\n            \"\\n\",\n            \"and\\n\",\n            \"\\n\",\n            \"(v) = 0.\\n\",\n            \"\\n\",\n            \"In this case, we can perform the\\n\",\n            \"\\n\",\n            \"transition operator\\n\",\n            \"\\n\",\n            \"where\\n\",\n            \"S(x, v, vˆ) =\\n\",\n            \"\\n\",\n            \"(x)\\n\",\n            \"\\n\",\n            \"and\\n\",\n            \"\\n\",\n            \"(v) = 0.\\n\",\n            \"\\n\",\n            \"In this case, we can perform the\\n\",\n            \"\\n\",\n            \"transition operator\\n\",\n            \"\\n\",\n            \"where\\n\",\n            \"S(x, v, vˆ) =\\n\",\n            \"\\n\",\n            \"(x)\\n\",\n            \"\\n\",\n            \"and\\n\",\n           
 \"\\n\",\n            \"(v) = 0.\\n\",\n            \"\\n\",\n            \"In this case, we can perform the\\n\",\n            \"\\n\",\n            \"transition operator\\n\",\n            \"\\n\",\n            \"where\\n\",\n            \"S(x, v, vˆ) =\\n\",\n            \"\\n\",\n            \"(x)\\n\",\n            \"\\n\",\n            \"and\\n\",\n            \"(v) = 0.\\n\",\n            \"\\n\",\n            \"In this case, we can perform the\\n\",\n            \"\\n\",\n            \"transition operator\\n\",\n            \"\\n\",\n            \"where\\n\",\n            \"S(x, v, vˆ) =\\n\",\n            \"\\n\",\n            \"(x)\\n\",\n            \"\\n\",\n            \"and\\n\",\n            \"(v) = 0.\\n\",\n            \"\\n\",\n            \"In this case, we can perform the\\n\",\n            \"\\n\",\n            \"transition operator\\n\",\n            \"\\n\",\n            \"where\\n\",\n            \"S(x, v, vˆ) =\\n\",\n            \"\\n\",\n            \"(x)\\n\",\n            \"\\n\",\n            \"and\\n\",\n            \"(v) = 0.\\n\",\n            \"\\n\",\n            \"In this case, we can perform the\\n\",\n            \"\\n\",\n            \"transition operator\\n\",\n            \"\\n\",\n            \"where\\n\",\n            \"S(x, v, vˆ) =\\n\",\n            \"\\n\",\n            \"(x)\\n\",\n            \"\\n\",\n            \"and\\n\",\n            \"(v) = 0.\\n\",\n            \"\\n\",\n            \"In this case, we can perform the\\n\",\n            \"\\n\",\n            \"transition operator\\n\",\n            \"\\n\",\n            \"where\\n\",\n            \"S(x, v, vˆ) =\\n\",\n            \"\\n\",\n            \"(x)\\n\",\n            \"\\n\",\n            \"and\\n\",\n            \"(v) = 0.\\n\",\n            \"\\n\",\n            \"In this case, we can perform the\\n\",\n            \"\\n\",\n            \"transition operator\\n\",\n            \"\\n\",\n            \"where\\n\",\n            \"S(x, v, vˆ) =\\n\",\n 
           \"\\n\",\n            \"(x)\\n\",\n            \"\\n\",\n            \"and\\n\",\n            \"(v) = 0.\\n\",\n            \"\\n\",\n            \"In this case, we can perform the\\n\",\n            \"\\n\",\n            \"transition operator\\n\",\n            \"\\n\",\n            \"where\\n\",\n            \"S(x, v, vˆ) =\\n\",\n            \"\\n\",\n            \"(x)\\n\",\n            \"\\n\",\n            \"and\\n\",\n            \"(v) = 0.\\n\",\n            \"\\n\",\n            \"In this case, we can perform the\\n\",\n            \"\\n\",\n            \"transition operator\\n\",\n            \"\\n\",\n            \"where\\n\",\n            \"S(x, v, vˆ) =\\n\",\n            \"\\n\",\n            \"(x)\\n\",\n            \"\\n\",\n            \"and\\n\",\n            \"(v) = 0.\\n\",\n            \"\\n\",\n            \"In this case, we can perform the\\n\",\n            \"\\n\",\n            \"transition operator\\n\",\n            \"\\n\",\n            \"where\\n\",\n            \"S(x, v, vˆ) =\\n\",\n            \"\\n\",\n            \"(x)\\n\",\n            \"\\n\",\n            \"and\\n\",\n            \"(v) = 0.\\n\",\n            \"\\n\",\n            \"In this case, we can perform the\\n\",\n            \"\\n\",\n            \"transition operator\\n\",\n            \"\\n\",\n            \"where\\n\",\n            \"S(x, v, vˆ) =\\n\",\n            \"\\n\",\n            \"(x)\\n\",\n            \"\\n\",\n            \"and\\n\",\n            \"(v) = 0.\\n\",\n            \"\\n\",\n            \"In this case, we can perform the\\n\",\n            \"\\n\",\n            \"transition operator\\n\",\n            \"\\n\",\n            \"where\\n\",\n            \"S(x, v, vˆ) =\\n\",\n            \"\\n\",\n            \"(x)\\n\",\n            \"\\n\",\n            \"and\\n\",\n            \"(v) = 0.\\n\",\n            \"\\n\",\n            \"In this case, we can perform the\\n\",\n            \"\\n\",\n            \"transition operator\\n\",\n   
         \"\\n\",\n            \"where\\n\",\n            \"S(x, v, vˆ) =\\n\",\n            \"\\n\",\n            \"(x)\\n\",\n            \"\\n\",\n            \"and\\n\",\n            \"(v) = 0.\\n\",\n            \"\\n\",\n            \"In this case, we can perform the\\n\",\n            \"\\n\",\n            \"transition operator\\n\",\n            \"\\n\",\n            \"where\\n\",\n            \"S(x, v, vˆ) =\\n\",\n            \"\\n\",\n            \"(x)\\n\",\n            \"\\n\",\n            \"and\\n\",\n            \"(v) = 0.\\n\",\n            \"\\n\",\n            \"In this case, we can perform the\\n\",\n            \"\\n\",\n            \"transition operator\\n\",\n            \"\\n\",\n            \"where\\n\",\n            \"S(x, v, vˆ) =\\n\",\n            \"\\n\",\n            \"(x)\\n\",\n            \"\\n\",\n            \"and\\n\",\n            \"(v) = 0.\\n\",\n            \"\\n\",\n            \"In this case, we can perform the\\n\",\n            \"\\n\",\n            \"transition operator\\n\",\n            \"\\n\",\n            \"where\\n\",\n            \"S(x, v, vˆ) =\\n\",\n            \"\\n\",\n            \"(x)\\n\",\n            \"\\n\",\n            \"and\\n\",\n            \"(v) = 0.\\n\",\n            \"\\n\",\n            \"In this case, we can perform the\\n\",\n            \"\\n\",\n            \"transition operator\\n\",\n            \"\\n\",\n            \"where\\n\",\n            \"S(x, v, vˆ) =\\n\",\n            \"\\n\",\n            \"(x)\\n\",\n            \"\\n\",\n            \"and\\n\",\n            \"(v) = 0.\\n\",\n            \"\\n\",\n            \"In this case, we can perform the\\n\",\n            \"\\n\",\n            \"transition operator\\n\",\n            \"\\n\",\n            \"where\\n\",\n            \"S(x, v, vˆ) =\\n\",\n            \"\\n\",\n            \"(x)\\n\",\n            \"\\n\",\n            \"and\\n\",\n            \"(v) = 0.\\n\",\n            \"\\n\",\n            \"In this case, we can 
perform the\\n\",\n            \"\\n\",\n            \"transition operator\\n\",\n            \"\\n\",\n            \"where\\n\",\n            \"S(x, v, vˆ) =\\n\",\n            \"\\n\",\n            \"(x)\\n\",\n            \"\\n\",\n            \"[200 | 235.36] loss=2.05 avg=2.14\\n\",\n            \"[201 | 236.29] loss=2.65 avg=2.15\\n\",\n            \"[202 | 237.21] loss=1.34 avg=2.14\\n\",\n            \"[203 | 238.14] loss=1.23 avg=2.13\\n\",\n            \"[204 | 239.06] loss=1.69 avg=2.12\\n\",\n            \"[205 | 240.00] loss=1.38 avg=2.11\\n\",\n            \"[206 | 240.94] loss=1.44 avg=2.10\\n\",\n            \"[207 | 241.87] loss=2.10 avg=2.10\\n\",\n            \"[208 | 242.81] loss=1.89 avg=2.10\\n\",\n            \"[209 | 243.74] loss=2.23 avg=2.10\\n\",\n            \"[210 | 244.67] loss=1.67 avg=2.10\\n\",\n            \"[211 | 245.60] loss=1.49 avg=2.09\\n\",\n            \"[212 | 246.53] loss=1.76 avg=2.09\\n\",\n            \"[213 | 247.46] loss=1.46 avg=2.08\\n\",\n            \"[214 | 248.39] loss=1.55 avg=2.07\\n\",\n            \"[215 | 249.32] loss=1.73 avg=2.07\\n\",\n            \"[216 | 250.25] loss=1.28 avg=2.06\\n\",\n            \"[217 | 251.19] loss=2.06 avg=2.06\\n\",\n            \"[218 | 252.11] loss=1.38 avg=2.05\\n\",\n            \"[219 | 253.04] loss=1.70 avg=2.05\\n\",\n            \"[220 | 253.96] loss=1.93 avg=2.05\\n\",\n            \"[221 | 254.90] loss=1.72 avg=2.05\\n\",\n            \"[222 | 255.83] loss=1.43 avg=2.04\\n\",\n            \"[223 | 256.77] loss=1.31 avg=2.03\\n\",\n            \"[224 | 257.70] loss=1.37 avg=2.02\\n\",\n            \"[225 | 258.64] loss=1.23 avg=2.01\\n\",\n            \"[226 | 259.58] loss=1.39 avg=2.01\\n\",\n            \"[227 | 260.51] loss=1.38 avg=2.00\\n\",\n            \"[228 | 261.45] loss=1.91 avg=2.00\\n\",\n            \"[229 | 262.38] loss=1.49 avg=1.99\\n\",\n            \"[230 | 263.31] loss=2.82 avg=2.00\\n\",\n            \"[231 | 264.25] loss=1.32 avg=1.99\\n\",\n     
       \"[232 | 265.17] loss=1.44 avg=1.99\\n\",\n            \"[233 | 266.10] loss=1.64 avg=1.98\\n\",\n            \"[234 | 267.03] loss=1.49 avg=1.98\\n\",\n            \"[235 | 267.96] loss=1.15 avg=1.97\\n\",\n            \"[236 | 268.90] loss=1.86 avg=1.97\\n\",\n            \"[237 | 269.83] loss=1.50 avg=1.96\\n\",\n            \"[238 | 270.76] loss=1.42 avg=1.96\\n\",\n            \"[239 | 271.70] loss=1.60 avg=1.95\\n\",\n            \"[240 | 272.61] loss=1.37 avg=1.95\\n\",\n            \"[241 | 273.55] loss=1.34 avg=1.94\\n\",\n            \"[242 | 274.48] loss=1.20 avg=1.93\\n\",\n            \"[243 | 275.40] loss=1.24 avg=1.93\\n\",\n            \"[244 | 276.32] loss=1.64 avg=1.92\\n\",\n            \"[245 | 277.25] loss=1.20 avg=1.91\\n\",\n            \"[246 | 278.18] loss=1.91 avg=1.91\\n\",\n            \"[247 | 279.11] loss=1.71 avg=1.91\\n\",\n            \"[248 | 280.04] loss=1.19 avg=1.90\\n\",\n            \"[249 | 280.96] loss=1.61 avg=1.90\\n\",\n            \"[250 | 281.90] loss=1.61 avg=1.90\\n\",\n            \"[251 | 282.83] loss=1.36 avg=1.89\\n\",\n            \"[252 | 283.76] loss=1.63 avg=1.89\\n\",\n            \"[253 | 284.69] loss=2.02 avg=1.89\\n\",\n            \"[254 | 285.64] loss=1.33 avg=1.88\\n\",\n            \"[255 | 286.58] loss=1.04 avg=1.88\\n\",\n            \"[256 | 287.51] loss=1.20 avg=1.87\\n\",\n            \"[257 | 288.45] loss=1.43 avg=1.86\\n\",\n            \"[258 | 289.38] loss=1.03 avg=1.85\\n\",\n            \"[259 | 290.30] loss=1.04 avg=1.85\\n\",\n            \"[260 | 291.24] loss=1.84 avg=1.85\\n\",\n            \"[261 | 292.17] loss=2.42 avg=1.85\\n\",\n            \"[262 | 293.11] loss=1.92 avg=1.85\\n\",\n            \"[263 | 294.04] loss=1.78 avg=1.85\\n\",\n            \"[264 | 294.97] loss=1.89 avg=1.85\\n\",\n            \"[265 | 295.90] loss=1.04 avg=1.84\\n\",\n            \"[266 | 296.83] loss=1.08 avg=1.84\\n\",\n            \"[267 | 297.76] loss=2.00 avg=1.84\\n\",\n            \"[268 | 
298.71] loss=1.56 avg=1.83\\n\",\n            \"[269 | 299.64] loss=1.78 avg=1.83\\n\",\n            \"[270 | 300.58] loss=2.13 avg=1.84\\n\",\n            \"[271 | 301.52] loss=1.21 avg=1.83\\n\",\n            \"[272 | 302.45] loss=1.03 avg=1.82\\n\",\n            \"[273 | 303.39] loss=2.25 avg=1.83\\n\",\n            \"[274 | 304.33] loss=1.13 avg=1.82\\n\",\n            \"[275 | 305.26] loss=1.66 avg=1.82\\n\",\n            \"[276 | 306.18] loss=1.40 avg=1.81\\n\",\n            \"[277 | 307.11] loss=1.11 avg=1.80\\n\",\n            \"[278 | 308.04] loss=1.41 avg=1.80\\n\",\n            \"[279 | 308.98] loss=2.19 avg=1.80\\n\",\n            \"[280 | 309.91] loss=1.21 avg=1.80\\n\",\n            \"[281 | 310.84] loss=0.96 avg=1.79\\n\",\n            \"[282 | 311.77] loss=1.13 avg=1.78\\n\",\n            \"[283 | 312.70] loss=0.89 avg=1.77\\n\",\n            \"[284 | 313.64] loss=1.72 avg=1.77\\n\",\n            \"[285 | 314.57] loss=1.03 avg=1.76\\n\",\n            \"[286 | 315.50] loss=2.07 avg=1.77\\n\",\n            \"[287 | 316.43] loss=0.93 avg=1.76\\n\",\n            \"[288 | 317.36] loss=1.32 avg=1.75\\n\",\n            \"[289 | 318.28] loss=0.93 avg=1.75\\n\",\n            \"[290 | 319.21] loss=1.28 avg=1.74\\n\",\n            \"[291 | 320.14] loss=2.47 avg=1.75\\n\",\n            \"[292 | 321.06] loss=1.72 avg=1.75\\n\",\n            \"[293 | 321.99] loss=0.88 avg=1.74\\n\",\n            \"[294 | 322.92] loss=1.20 avg=1.73\\n\",\n            \"[295 | 323.86] loss=0.93 avg=1.72\\n\",\n            \"[296 | 324.80] loss=2.07 avg=1.73\\n\",\n            \"[297 | 325.74] loss=0.84 avg=1.72\\n\",\n            \"[298 | 326.67] loss=1.90 avg=1.72\\n\",\n            \"[299 | 327.60] loss=1.64 avg=1.72\\n\",\n            \"Generating samples...\\n\",\n            \"======== SAMPLE 1 ========\\n\",\n            \"S.\\n\",\n            \"The first step in the identification of the chemotactic regime is to compare the two regimes. 
In the first case, we compare the two regimes, while in the second case, we compare the chemotactic regime. 
In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we\\n\",\n            \"\\n\",\n            \"[300 | 348.94] loss=1.11 avg=1.71\\n\",\n            \"[301 | 349.86] loss=1.62 avg=1.71\\n\",\n            \"[302 | 350.79] loss=1.19 avg=1.71\\n\",\n            \"[303 | 351.72] loss=0.70 avg=1.70\\n\",\n            \"[304 | 352.65] loss=1.82 avg=1.70\\n\",\n            \"[305 | 353.58] loss=0.90 avg=1.69\\n\",\n            \"[306 | 354.50] loss=0.91 avg=1.68\\n\",\n            \"[307 | 355.43] loss=1.17 avg=1.68\\n\",\n            \"[308 | 356.35] loss=0.75 avg=1.67\\n\",\n            \"[309 | 357.28] loss=2.11 avg=1.67\\n\",\n            \"[310 | 358.21] loss=0.94 avg=1.66\\n\",\n            \"[311 | 359.13] loss=1.06 avg=1.66\\n\",\n            \"[312 | 360.06] loss=1.33 avg=1.65\\n\",\n            \"[313 | 360.99] loss=1.52 avg=1.65\\n\",\n            \"[314 | 361.93] loss=1.02 avg=1.65\\n\",\n            \"[315 | 362.86] loss=0.63 avg=1.63\\n\",\n            \"[316 | 363.79] loss=1.17 avg=1.63\\n\",\n            \"[317 | 364.74] loss=0.87 avg=1.62\\n\",\n            \"[318 | 365.68] loss=1.61 avg=1.62\\n\",\n            \"[319 | 366.61] loss=1.14 avg=1.62\\n\",\n            \"[320 | 367.54] loss=0.88 avg=1.61\\n\",\n            \"[321 | 368.47] loss=1.66 avg=1.61\\n\",\n            \"[322 | 369.40] loss=0.88 avg=1.60\\n\",\n            \"[323 | 370.32] loss=0.77 avg=1.59\\n\",\n            \"[324 | 371.26] loss=1.39 avg=1.59\\n\",\n            \"[325 | 372.19] loss=1.60 avg=1.59\\n\",\n            \"[326 | 373.12] loss=0.89 
avg=1.58\\n\",\n            \"[327 | 374.05] loss=1.57 avg=1.58\\n\",\n            \"[328 | 374.98] loss=1.62 avg=1.58\\n\",\n            \"[329 | 375.91] loss=2.22 avg=1.59\\n\",\n            \"[330 | 376.84] loss=1.21 avg=1.59\\n\",\n            \"[331 | 377.77] loss=1.09 avg=1.58\\n\",\n            \"[332 | 378.70] loss=1.68 avg=1.58\\n\",\n            \"[333 | 379.64] loss=0.57 avg=1.57\\n\",\n            \"[334 | 380.57] loss=0.94 avg=1.57\\n\",\n            \"[335 | 381.51] loss=0.59 avg=1.56\\n\",\n            \"[336 | 382.44] loss=1.25 avg=1.55\\n\",\n            \"[337 | 383.38] loss=1.40 avg=1.55\\n\",\n            \"[338 | 384.31] loss=0.87 avg=1.54\\n\",\n            \"[339 | 385.24] loss=0.54 avg=1.53\\n\",\n            \"[340 | 386.17] loss=1.17 avg=1.53\\n\",\n            \"[341 | 387.10] loss=0.98 avg=1.52\\n\",\n            \"[342 | 388.04] loss=1.51 avg=1.52\\n\",\n            \"[343 | 388.96] loss=0.44 avg=1.51\\n\",\n            \"[344 | 389.89] loss=1.37 avg=1.51\\n\",\n            \"[345 | 390.81] loss=1.65 avg=1.51\\n\",\n            \"[346 | 391.75] loss=1.73 avg=1.51\\n\",\n            \"[347 | 392.67] loss=1.36 avg=1.51\\n\",\n            \"[348 | 393.61] loss=1.15 avg=1.51\\n\",\n            \"[349 | 394.54] loss=0.94 avg=1.50\\n\",\n            \"[350 | 395.47] loss=1.27 avg=1.50\\n\",\n            \"[351 | 396.39] loss=1.38 avg=1.50\\n\",\n            \"[352 | 397.32] loss=0.92 avg=1.49\\n\",\n            \"[353 | 398.25] loss=1.13 avg=1.49\\n\",\n            \"[354 | 399.17] loss=1.38 avg=1.49\\n\",\n            \"[355 | 400.10] loss=0.82 avg=1.48\\n\",\n            \"[356 | 401.03] loss=1.94 avg=1.49\\n\",\n            \"[357 | 401.95] loss=0.82 avg=1.48\\n\",\n            \"[358 | 402.87] loss=0.41 avg=1.47\\n\",\n            \"[359 | 403.80] loss=2.16 avg=1.48\\n\",\n            \"[360 | 404.73] loss=2.05 avg=1.48\\n\",\n            \"[361 | 405.66] loss=0.86 avg=1.48\\n\",\n            \"[362 | 406.60] loss=1.46 avg=1.48\\n\",\n    
        \"[363 | 407.53] loss=1.14 avg=1.47\\n\",\n            \"[364 | 408.46] loss=1.03 avg=1.47\\n\",\n            \"[365 | 409.40] loss=1.86 avg=1.47\\n\",\n            \"[366 | 410.33] loss=1.84 avg=1.48\\n\",\n            \"[367 | 411.27] loss=1.29 avg=1.47\\n\",\n            \"[368 | 412.19] loss=0.92 avg=1.47\\n\",\n            \"[369 | 413.12] loss=2.56 avg=1.48\\n\",\n            \"[370 | 414.05] loss=0.87 avg=1.47\\n\",\n            \"[371 | 414.98] loss=1.09 avg=1.47\\n\",\n            \"[372 | 415.90] loss=0.86 avg=1.46\\n\",\n            \"[373 | 416.83] loss=1.37 avg=1.46\\n\",\n            \"[374 | 417.77] loss=1.08 avg=1.46\\n\",\n            \"[375 | 418.70] loss=1.24 avg=1.46\\n\",\n            \"[376 | 419.63] loss=1.53 avg=1.46\\n\",\n            \"[377 | 420.56] loss=1.00 avg=1.45\\n\",\n            \"[378 | 421.50] loss=0.83 avg=1.45\\n\",\n            \"[379 | 422.42] loss=1.16 avg=1.44\\n\",\n            \"[380 | 423.35] loss=1.40 avg=1.44\\n\",\n            \"[381 | 424.27] loss=1.45 avg=1.44\\n\",\n            \"[382 | 425.21] loss=1.42 avg=1.44\\n\",\n            \"[383 | 426.14] loss=1.52 avg=1.44\\n\",\n            \"[384 | 427.07] loss=0.55 avg=1.43\\n\",\n            \"[385 | 428.00] loss=0.55 avg=1.42\\n\",\n            \"[386 | 428.94] loss=2.22 avg=1.43\\n\",\n            \"[387 | 429.88] loss=0.53 avg=1.42\\n\",\n            \"[388 | 430.81] loss=0.76 avg=1.42\\n\",\n            \"[389 | 431.75] loss=0.72 avg=1.41\\n\",\n            \"[390 | 432.68] loss=1.05 avg=1.41\\n\",\n            \"[391 | 433.61] loss=0.59 avg=1.40\\n\",\n            \"[392 | 434.54] loss=1.73 avg=1.40\\n\",\n            \"[393 | 435.47] loss=0.94 avg=1.40\\n\",\n            \"[394 | 436.39] loss=1.34 avg=1.40\\n\",\n            \"[395 | 437.32] loss=0.58 avg=1.39\\n\",\n            \"[396 | 438.26] loss=0.55 avg=1.38\\n\",\n            \"[397 | 439.19] loss=1.17 avg=1.38\\n\",\n            \"[398 | 440.12] loss=1.58 avg=1.38\\n\",\n            \"[399 | 
441.04] loss=0.65 avg=1.37\\n\",\n            \"Generating samples...\\n\",\n            \"======== SAMPLE 1 ========\\n\",\n            \" orientation, and the direction of alignment of the fibers.\\n\",\n            \"In particular, we shall consider a diffusive model for chemotaxis. In particular, we shall consider a diffusion-advection model. In particular, we shall consider a drift-diffusion model. In particular, we shall consider a non-local sensing model. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a directional cue. In particular, we shall consider a directional cue. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a nonsharing non-local non-sensing kernel.\\n\",\n            \"In particular, we shall consider a non-local non-sensing kernel for which the kernel distribution is non-zero. In particular, we shall not consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. 
In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a nonsharing non-local non-sensing kernel. In particular, we shall not consider a non-local non-sensing kernel. In particular, we shall not consider a non-local non-sensing kernel. In particular, we shall not consider a nonsharing non-local non-sensing kernel. In particular, we shall not consider a nonsharing non-local non-sensing kernel. In particular, we shall not consider a nonsharing non-local non-sensing kernel. In particular, we shall not consider a nonsharing non-local non-sensing kernel. In particular, we shall not consider a nonsharing nonsharing kernel.\\n\",\n            \"In particular, we shall consider a non-local non-sensing kernel for which the kernel distribution is non-zero. In particular, we shall not consider a non-local non-sensing kernel. In particular, we shall not consider a non-local non-sensing kernel. In particular, we shall not consider a non-local non-sensing kernel. In particular, we shall not consider a nonsharing nonsharing kernel.\\n\",\n            \"In particular, we shall not consider a nonsharing nonsharing kernel. In particular, we shall not consider a nonsharing non-local non-sensing kernel. In particular, we shall not consider a non-local non-sensing kernel. In particular, we shall not consider a nonsharing nonsharing kernel.\\n\",\n            \"In particular, we shall not consider a nonsharing nonsharing kernel. In particular, we shall not consider a non-local non-sensing kernel. In particular, we shall not consider a nonsharing nonsharing kernel. In particular, we shall not consider a nonsharing nonsharing kernel.\\n\",\n            \"In particular, we shall not consider a nonsharing nonsharing kernel. In particular, we shall not consider a non-local non-sensing kernel. In particular, we shall not consider a nonsharing non-local non-sensing kernel. 
In particular, we shall not consider a nonsharing nonsharing kernel.\\n\",\n            \"In particular, we shall not consider a nonsharing non-local non-sensing kernel. In particular, we shall not consider a nonsharing non-local non-sensing kernel. In particular, we shall not consider a non-local non-sensing kernel. In particular, we shall not consider a nonsharing nonsharing kernel.\\n\",\n            \"In particular, we shall not consider a non-local non-sensing kernel. In particular, we shall not consider a nonsharing non-local non-sensing kernel. In particular, we shall not consider a nonsharing\\n\",\n            \"\\n\",\n            \"[400 | 462.48] loss=1.44 avg=1.37\\n\",\n            \"[401 | 463.41] loss=0.61 avg=1.36\\n\",\n            \"[402 | 464.34] loss=0.57 avg=1.36\\n\",\n            \"[403 | 465.27] loss=0.79 avg=1.35\\n\",\n            \"[404 | 466.20] loss=0.37 avg=1.34\\n\",\n            \"[405 | 467.13] loss=0.87 avg=1.34\\n\",\n            \"[406 | 468.06] loss=0.73 avg=1.33\\n\",\n            \"[407 | 468.99] loss=1.05 avg=1.33\\n\",\n            \"[408 | 469.93] loss=1.21 avg=1.33\\n\",\n            \"[409 | 470.86] loss=0.55 avg=1.32\\n\",\n            \"[410 | 471.80] loss=2.09 avg=1.33\\n\",\n            \"[411 | 472.74] loss=0.61 avg=1.32\\n\",\n            \"[412 | 473.66] loss=1.43 avg=1.32\\n\",\n            \"[413 | 474.59] loss=0.50 avg=1.31\\n\",\n            \"[414 | 475.53] loss=1.01 avg=1.31\\n\",\n            \"[415 | 476.46] loss=1.49 avg=1.31\\n\",\n            \"[416 | 477.38] loss=0.71 avg=1.30\\n\",\n            \"[417 | 478.31] loss=0.53 avg=1.30\\n\",\n            \"[418 | 479.24] loss=0.71 avg=1.29\\n\",\n            \"[419 | 480.17] loss=0.94 avg=1.29\\n\",\n            \"[420 | 481.11] loss=0.92 avg=1.28\\n\",\n            \"[421 | 482.03] loss=1.16 avg=1.28\\n\",\n            \"[422 | 482.96] loss=0.75 avg=1.28\\n\",\n            \"[423 | 483.89] loss=1.20 avg=1.28\\n\",\n            \"[424 | 484.82] loss=0.50 
avg=1.27\\n\",\n            \"[425 | 485.75] loss=0.49 avg=1.26\\n\",\n            \"[426 | 486.68] loss=0.81 avg=1.25\\n\",\n            \"[427 | 487.60] loss=1.84 avg=1.26\\n\",\n            \"[428 | 488.53] loss=0.55 avg=1.25\\n\",\n            \"[429 | 489.45] loss=0.38 avg=1.24\\n\",\n            \"[430 | 490.38] loss=1.23 avg=1.24\\n\",\n            \"[431 | 491.30] loss=0.91 avg=1.24\\n\",\n            \"[432 | 492.23] loss=0.77 avg=1.24\\n\",\n            \"[433 | 493.15] loss=0.70 avg=1.23\\n\",\n            \"[434 | 494.09] loss=1.00 avg=1.23\\n\",\n            \"[435 | 495.01] loss=1.86 avg=1.24\\n\",\n            \"[436 | 495.95] loss=1.14 avg=1.23\\n\",\n            \"[437 | 496.88] loss=0.73 avg=1.23\\n\",\n            \"[438 | 497.82] loss=0.51 avg=1.22\\n\",\n            \"[439 | 498.76] loss=0.62 avg=1.22\\n\",\n            \"[440 | 499.68] loss=1.04 avg=1.21\\n\",\n            \"[441 | 500.61] loss=1.39 avg=1.22\\n\",\n            \"[442 | 501.54] loss=0.60 avg=1.21\\n\",\n            \"[443 | 502.47] loss=0.99 avg=1.21\\n\",\n            \"[444 | 503.40] loss=1.10 avg=1.21\\n\",\n            \"[445 | 504.33] loss=1.03 avg=1.20\\n\",\n            \"[446 | 505.26] loss=0.93 avg=1.20\\n\",\n            \"[447 | 506.19] loss=0.89 avg=1.20\\n\",\n            \"[448 | 507.12] loss=0.77 avg=1.19\\n\",\n            \"[449 | 508.05] loss=1.22 avg=1.19\\n\",\n            \"[450 | 508.98] loss=0.78 avg=1.19\\n\",\n            \"[451 | 509.90] loss=0.68 avg=1.18\\n\",\n            \"[452 | 510.83] loss=0.34 avg=1.18\\n\",\n            \"[453 | 511.78] loss=0.82 avg=1.17\\n\",\n            \"[454 | 512.71] loss=0.49 avg=1.17\\n\",\n            \"[455 | 513.64] loss=0.92 avg=1.16\\n\",\n            \"[456 | 514.58] loss=1.10 avg=1.16\\n\",\n            \"[457 | 515.53] loss=0.68 avg=1.16\\n\",\n            \"[458 | 516.46] loss=0.36 avg=1.15\\n\",\n            \"[459 | 517.40] loss=0.37 avg=1.14\\n\",\n            \"[460 | 518.34] loss=1.00 avg=1.14\\n\",\n    
        \"[461 | 519.27] loss=0.76 avg=1.14\\n\",\n            \"[462 | 520.19] loss=0.32 avg=1.13\\n\",\n            \"[463 | 521.12] loss=0.35 avg=1.12\\n\",\n            \"[464 | 522.05] loss=0.32 avg=1.11\\n\",\n            \"[465 | 522.98] loss=0.93 avg=1.11\\n\",\n            \"[466 | 523.91] loss=0.78 avg=1.11\\n\",\n            \"[467 | 524.84] loss=0.55 avg=1.10\\n\",\n            \"[468 | 525.77] loss=0.70 avg=1.10\\n\",\n            \"[469 | 526.70] loss=0.80 avg=1.09\\n\",\n            \"[470 | 527.62] loss=0.68 avg=1.09\\n\",\n            \"[471 | 528.54] loss=0.68 avg=1.09\\n\",\n            \"[472 | 529.46] loss=0.42 avg=1.08\\n\",\n            \"[473 | 530.39] loss=0.43 avg=1.07\\n\",\n            \"[474 | 531.32] loss=0.63 avg=1.07\\n\",\n            \"[475 | 532.24] loss=0.56 avg=1.06\\n\",\n            \"[476 | 533.17] loss=0.63 avg=1.06\\n\",\n            \"[477 | 534.09] loss=1.12 avg=1.06\\n\",\n            \"[478 | 535.01] loss=0.33 avg=1.05\\n\",\n            \"[479 | 535.95] loss=0.87 avg=1.05\\n\",\n            \"[480 | 536.88] loss=0.43 avg=1.04\\n\",\n            \"[481 | 537.82] loss=1.60 avg=1.05\\n\",\n            \"[482 | 538.76] loss=0.51 avg=1.04\\n\",\n            \"[483 | 539.70] loss=0.76 avg=1.04\\n\",\n            \"[484 | 540.63] loss=0.46 avg=1.04\\n\",\n            \"[485 | 541.55] loss=0.71 avg=1.03\\n\",\n            \"[486 | 542.49] loss=1.91 avg=1.04\\n\",\n            \"[487 | 543.42] loss=1.45 avg=1.05\\n\",\n            \"[488 | 544.35] loss=0.51 avg=1.04\\n\",\n            \"[489 | 545.28] loss=0.66 avg=1.04\\n\",\n            \"[490 | 546.20] loss=0.39 avg=1.03\\n\",\n            \"[491 | 547.13] loss=0.88 avg=1.03\\n\",\n            \"[492 | 548.06] loss=0.40 avg=1.02\\n\",\n            \"[493 | 549.00] loss=0.44 avg=1.02\\n\",\n            \"[494 | 549.92] loss=0.51 avg=1.01\\n\",\n            \"[495 | 550.85] loss=1.18 avg=1.01\\n\",\n            \"[496 | 551.77] loss=0.48 avg=1.01\\n\",\n            \"[497 | 
552.71] loss=0.48 avg=1.00\\n\",\n            \"[498 | 553.64] loss=0.94 avg=1.00\\n\",\n            \"[499 | 554.57] loss=0.60 avg=1.00\\n\",\n            \"Generating samples...\\n\",\n            \"======== SAMPLE 1 ========\\n\",\n            \", the average speed of the fibers is given by\\n\",\n            \"the momentum\\n\",\n            \"T[q, S](x, v, vˆ) = c(x)\\n\",\n            \"Z\\n\",\n            \"R+\\n\",\n            \"γS (λ)S(x + λvˆ) dλ Z\\n\",\n            \"R+\\n\",\n            \"γq(λ) q(x + λvˆ, vˆ) dλ ψ(v).\\n\",\n            \"In particular, the velocity\\n\",\n            \"T[q, S] = T0(q, v, vˆ)\\n\",\n            \"Z\\n\",\n            \"R+\\n\",\n            \"γS (λ)S(x + λvˆ) q(x + λvˆ, vˆ) dλ =\\n\",\n            \"0\\n\",\n            \"Z\\n\",\n            \"R+\\n\",\n            \"γS (λ)S(x + λvˆ) dλ\\n\",\n            \"(1)\\n\",\n            \"and the average\\n\",\n            \"T0[q, S] = T0(q, v, vˆ)\\n\",\n            \"T0(k, v, vˆ)\\n\",\n            \"T0(a) =\\n\",\n            \"Γ\\n\",\n            \"Γ\\n\",\n            \"q\\n\",\n            \"k\\n\",\n            \"\\n\",\n            \"h\\n\",\n            \"\\n\",\n            \"| Γ\\n\",\n            \"S\\n\",\n            \"e\\n\",\n            \"i\\n\",\n            \"|\\n\",\n            \"Z\\n\",\n            \"R+\\n\",\n            \"γS (λ)S(x + λvˆ) dλ i = 1\\n\",\n            \"Γ\\n\",\n            \"Γq\\n\",\n            \"k\\n\",\n            \"h\\n\",\n            \"I\\n\",\n            \"(a) vˆ =\\n\",\n            \"I(a)\\n\",\n            \"Z\\n\",\n            \"R+\\n\",\n            \"γS (λ)S(x + λvˆ) dλ i = 1\\n\",\n            \"Γ\\n\",\n            \"Γq\\n\",\n            \"k\\n\",\n            \"\\n\",\n            \"I\\n\",\n            \"(b) vˆ =\\n\",\n            \"I(b)\\n\",\n            \"Z\\n\",\n            \"R+\\n\",\n            \"γS (λ)S(x + λvˆ) dλ i = 1\\n\",\n            \"Γ\\n\",\n            \"Γ\\n\",\n            \"q\\n\",\n    
        \"k\\n\",\n            \"\\n\",\n            \"h\\n\",\n            \"I\\n\",\n            \"(c) vˆ =\\n\",\n            \"I(c)\\n\",\n            \"Z\\n\",\n            \"R+\\n\",\n            \"γS (λ)S(x + λvˆ) dλ i = 1\\n\",\n            \"Γ\\n\",\n            \"Γ\\n\",\n            \"q\\n\",\n            \"k\\n\",\n            \"\\n\",\n            \"h\\n\",\n            \"I\\n\",\n            \"(d) vˆ =\\n\",\n            \"I(d)\\n\",\n            \"Z\\n\",\n            \"R+\\n\",\n            \"γS (λ)S(x + λvˆ) dλ i = 1\\n\",\n            \"Γ\\n\",\n            \"Γ\\n\",\n            \"q\\n\",\n            \"k\\n\",\n            \"h\\n\",\n            \"I\\n\",\n            \"(e) vˆ =\\n\",\n            \"I(e)\\n\",\n            \"Z\\n\",\n            \"R+\\n\",\n            \"γS (λ)S(x + λvˆ) dλ i = 1\\n\",\n            \"Γ\\n\",\n            \"Γ\\n\",\n            \"q\\n\",\n            \"k\\n\",\n            \"h\\n\",\n            \"I\\n\",\n            \"(f) vˆ =\\n\",\n            \"I(f)\\n\",\n            \"Z\\n\",\n            \"R+\\n\",\n            \"γS (λ)S(x + λvˆ) dλ i = 1\\n\",\n            \"Γ\\n\",\n            \"Γ\\n\",\n            \"q\\n\",\n            \"k\\n\",\n            \"h\\n\",\n            \"I\\n\",\n            \"(g) vˆ =\\n\",\n            \"I(g)\\n\",\n            \"Z\\n\",\n            \"R+\\n\",\n            \"γS (λ)S(x + λvˆ) dλ i = 1\\n\",\n            \"Γ\\n\",\n            \"Γ\\n\",\n            \"q\\n\",\n            \"k\\n\",\n            \"h\\n\",\n            \"I\\n\",\n            \"(h) vˆ =\\n\",\n            \"I(h)\\n\",\n            \"Z\\n\",\n            \"R+\\n\",\n            \"γS (λ)S(x + λvˆ) dλ i = 1\\n\",\n            \"Γ\\n\",\n            \"Γ\\n\",\n            \"q\\n\",\n            \"k\\n\",\n            \"h\\n\",\n            \"I\\n\",\n            \"(i) vˆ =\\n\",\n            \"I(i)\\n\",\n            \"Z\\n\",\n            \"R+\\n\",\n            \"γS (λ) S(x + λvˆ) dλ i = 1\\n\",\n            
\"Γ\\n\",\n            \"Γ\\n\",\n            \"q\\n\",\n            \"k\\n\",\n            \"h\\n\",\n            \"I\\n\",\n            \"(k) vˆ =\\n\",\n            \"I(k)\\n\",\n            \"Z\\n\",\n            \"R+\\n\",\n            \"γS (λ) S(x + λvˆ) dλ i = 1\\n\",\n            \"Γ\\n\",\n            \"Γ\\n\",\n            \"q\\n\",\n            \"k\\n\",\n            \"h\\n\",\n            \"I\\n\",\n            \"(l) vˆ =\\n\",\n            \"I(l)\\n\",\n            \"Z\\n\",\n            \"R+\\n\",\n            \"γS (λ) S(x + λvˆ) dλ i = 1\\n\",\n            \"Γ\\n\",\n            \"Γ\\n\",\n            \"q\\n\",\n            \"k\\n\",\n            \"h\\n\",\n            \"I\\n\",\n            \"(m) vˆ =\\n\",\n            \"\\n\",\n            \"I(m)\\n\",\n            \"Z\\n\",\n            \"R+\\n\",\n            \"γS (λ) S(x + λvˆ) dλ i = 1\\n\",\n            \"Γ\\n\",\n            \"Γ\\n\",\n            \"q\\n\",\n            \"k\\n\",\n            \"h\\n\",\n            \"I\\n\",\n            \"(n) vˆ =\\n\",\n            \"\\n\",\n            \"I(n)\\n\",\n            \"Z\\n\",\n            \"R+\\n\",\n            \"γS (λ) S(x + λvˆ) dλ i = 1\\n\",\n            \"Γ\\n\",\n            \"Γ\\n\",\n            \"q\\n\",\n            \"k\\n\",\n            \"h\\n\",\n            \"I\\n\",\n            \"(o) vˆ =\\n\",\n            \"\\n\",\n            \"I(o)\\n\",\n            \"Z\\n\",\n            \"R+\\n\",\n            \"γS (λ) S(x + λvˆ) dλ i = 1\\n\",\n            \"Γ\\n\",\n            \"Γ\\n\",\n            \"\\n\",\n            \"[500 | 576.77] loss=0.66 avg=0.99\\n\",\n            \"[501 | 577.70] loss=0.77 avg=0.99\\n\",\n            \"[502 | 578.62] loss=0.26 avg=0.98\\n\",\n            \"[503 | 579.54] loss=0.15 avg=0.98\\n\",\n            \"[504 | 580.47] loss=1.50 avg=0.98\\n\",\n            \"[505 | 581.40] loss=0.53 avg=0.98\\n\",\n            \"[506 | 582.33] loss=0.56 avg=0.97\\n\",\n            \"[507 | 583.26] loss=0.41 
avg=0.97\\n\",\n            \"[508 | 584.20] loss=0.32 avg=0.96\\n\",\n            \"[509 | 585.13] loss=0.35 avg=0.95\\n\",\n            \"[510 | 586.06] loss=0.74 avg=0.95\\n\",\n            \"[511 | 586.99] loss=0.46 avg=0.95\\n\",\n            \"[512 | 587.93] loss=0.78 avg=0.95\\n\",\n            \"[513 | 588.86] loss=0.67 avg=0.94\\n\",\n            \"[514 | 589.79] loss=0.45 avg=0.94\\n\",\n            \"[515 | 590.73] loss=0.97 avg=0.94\\n\",\n            \"[516 | 591.66] loss=0.94 avg=0.94\\n\",\n            \"[517 | 592.58] loss=0.36 avg=0.93\\n\",\n            \"[518 | 593.51] loss=0.69 avg=0.93\\n\",\n            \"[519 | 594.44] loss=1.25 avg=0.93\\n\",\n            \"[520 | 595.36] loss=0.78 avg=0.93\\n\",\n            \"[521 | 596.29] loss=0.58 avg=0.93\\n\",\n            \"[522 | 597.23] loss=0.42 avg=0.92\\n\",\n            \"[523 | 598.16] loss=0.43 avg=0.92\\n\",\n            \"[524 | 599.09] loss=0.30 avg=0.91\\n\",\n            \"[525 | 600.03] loss=0.62 avg=0.91\\n\",\n            \"[526 | 600.96] loss=0.21 avg=0.90\\n\",\n            \"[527 | 601.90] loss=0.57 avg=0.90\\n\",\n            \"[528 | 602.83] loss=0.47 avg=0.89\\n\",\n            \"[529 | 603.77] loss=0.61 avg=0.89\\n\",\n            \"[530 | 604.71] loss=0.69 avg=0.89\\n\",\n            \"[531 | 605.64] loss=0.57 avg=0.89\\n\",\n            \"[532 | 606.57] loss=0.34 avg=0.88\\n\",\n            \"[533 | 607.49] loss=0.95 avg=0.88\\n\",\n            \"[534 | 608.42] loss=0.93 avg=0.88\\n\",\n            \"[535 | 609.34] loss=1.00 avg=0.88\\n\",\n            \"[536 | 610.27] loss=0.45 avg=0.88\\n\",\n            \"[537 | 611.20] loss=0.53 avg=0.88\\n\",\n            \"[538 | 612.12] loss=0.29 avg=0.87\\n\",\n            \"[539 | 613.05] loss=0.27 avg=0.86\\n\",\n            \"[540 | 613.97] loss=0.60 avg=0.86\\n\",\n            \"[541 | 614.90] loss=0.42 avg=0.86\\n\",\n            \"[542 | 615.82] loss=0.22 avg=0.85\\n\",\n            \"[543 | 616.74] loss=0.41 avg=0.85\\n\",\n    
        \"[544 | 617.67] loss=0.17 avg=0.84\\n\",\n            \"[545 | 618.59] loss=0.43 avg=0.83\\n\",\n            \"[546 | 619.52] loss=0.47 avg=0.83\\n\",\n            \"[547 | 620.44] loss=0.69 avg=0.83\\n\",\n            \"[548 | 621.36] loss=0.27 avg=0.82\\n\",\n            \"[549 | 622.29] loss=0.65 avg=0.82\\n\",\n            \"[550 | 623.22] loss=1.12 avg=0.83\\n\",\n            \"[551 | 624.15] loss=0.54 avg=0.82\\n\",\n            \"[552 | 625.08] loss=0.46 avg=0.82\\n\",\n            \"[553 | 626.02] loss=0.62 avg=0.82\\n\",\n            \"[554 | 626.96] loss=0.27 avg=0.81\\n\",\n            \"[555 | 627.89] loss=0.35 avg=0.81\\n\",\n            \"[556 | 628.82] loss=0.25 avg=0.80\\n\",\n            \"[557 | 629.76] loss=0.41 avg=0.80\\n\",\n            \"[558 | 630.69] loss=0.26 avg=0.79\\n\",\n            \"[559 | 631.62] loss=0.68 avg=0.79\\n\",\n            \"[560 | 632.56] loss=0.24 avg=0.78\\n\",\n            \"[561 | 633.49] loss=0.21 avg=0.78\\n\",\n            \"[562 | 634.42] loss=0.30 avg=0.77\\n\",\n            \"[563 | 635.35] loss=0.32 avg=0.77\\n\",\n            \"[564 | 636.29] loss=0.31 avg=0.77\\n\",\n            \"[565 | 637.21] loss=0.36 avg=0.76\\n\",\n            \"[566 | 638.14] loss=0.52 avg=0.76\\n\",\n            \"[567 | 639.08] loss=0.16 avg=0.75\\n\",\n            \"[568 | 640.01] loss=0.50 avg=0.75\\n\",\n            \"[569 | 640.95] loss=0.73 avg=0.75\\n\",\n            \"[570 | 641.88] loss=0.50 avg=0.75\\n\",\n            \"[571 | 642.81] loss=0.43 avg=0.74\\n\",\n            \"[572 | 643.75] loss=0.68 avg=0.74\\n\",\n            \"[573 | 644.67] loss=0.61 avg=0.74\\n\",\n            \"[574 | 645.61] loss=0.13 avg=0.74\\n\",\n            \"[575 | 646.54] loss=0.21 avg=0.73\\n\",\n            \"[576 | 647.47] loss=0.34 avg=0.73\\n\",\n            \"[577 | 648.39] loss=0.33 avg=0.72\\n\",\n            \"[578 | 649.32] loss=0.22 avg=0.72\\n\",\n            \"[579 | 650.26] loss=0.52 avg=0.72\\n\",\n            \"[580 | 
651.19] loss=0.26 avg=0.71\\n\",\n            \"[581 | 652.12] loss=0.51 avg=0.71\\n\",\n            \"[582 | 653.05] loss=0.76 avg=0.71\\n\",\n            \"[583 | 653.97] loss=0.73 avg=0.71\\n\",\n            \"[584 | 654.91] loss=0.34 avg=0.71\\n\",\n            \"[585 | 655.84] loss=0.42 avg=0.70\\n\",\n            \"[586 | 656.76] loss=0.48 avg=0.70\\n\",\n            \"[587 | 657.69] loss=0.34 avg=0.70\\n\",\n            \"[588 | 658.62] loss=0.33 avg=0.69\\n\",\n            \"[589 | 659.54] loss=0.56 avg=0.69\\n\",\n            \"[590 | 660.47] loss=0.52 avg=0.69\\n\",\n            \"[591 | 661.40] loss=0.32 avg=0.69\\n\",\n            \"[592 | 662.32] loss=0.19 avg=0.68\\n\",\n            \"[593 | 663.25] loss=0.21 avg=0.68\\n\",\n            \"[594 | 664.16] loss=0.16 avg=0.67\\n\",\n            \"[595 | 665.10] loss=0.69 avg=0.67\\n\",\n            \"[596 | 666.02] loss=0.27 avg=0.67\\n\",\n            \"[597 | 666.95] loss=0.17 avg=0.66\\n\",\n            \"[598 | 667.88] loss=0.93 avg=0.67\\n\",\n            \"[599 | 668.82] loss=0.24 avg=0.66\\n\",\n            \"Generating samples...\\n\",\n            \"======== SAMPLE 1 ========\\n\",\n            \" values, and the kernel\\n\",\n            \"(x, v, vˆ) is the average of the variance of the\\n\",\n            \"choosing direction. 
The mean direction is given by the mean/(vˆ)\\n\",\n            \"distribution\\n\",\n            \"0, v, vˆ =\\n\",\n            \"20\\n\",\n            \"4\\n\",\n            \"4\\n\",\n            \"(1) (2) (3)\\n\",\n            \"(4)\\n\",\n            \"(5)\\n\",\n            \"and the variance, i.e., the\\n\",\n            \"turning kernel, given by the\\n\",\n            \"turning(k(x)) =\\n\",\n            \"90\\n\",\n            \"Γ\\n\",\n            \"0\\n\",\n            \"q(x, v, vˆ)\\n\",\n            \"0\\n\",\n            \"that is the turning operator, is given by\\n\",\n            \"UT\\n\",\n            \"dv = V\\n\",\n            \"0\\n\",\n            \"Γ\\n\",\n            \"q(x, v, vˆ)\\n\",\n            \"1\\n\",\n            \"that is the variance of the choice of the\\n\",\n            \"turning operator, given by UT\\n\",\n            \"dvˆ = UT\\n\",\n            \"2\\n\",\n            \"Γ\\n\",\n            \"q(x, v, vˆ)\\n\",\n            \"2\\n\",\n            \"that is the turning\\n\",\n            \"direction given by UT\\n\",\n            \"0\\n\",\n            \"Γ\\n\",\n            \"q\\n\",\n            \"(x, v, vˆ)\\n\",\n            \"(6) (7)\\n\",\n            \"and the mean velocity\\n\",\n            \"Vˆ\\n\",\n            \"0\\n\",\n            \"(max) =\\n\",\n            \"70\\n\",\n            \"Γ\\n\",\n            \"q\\n\",\n            \"(x, v, vˆ)\\n\",\n            \"(8)\\n\",\n            \"and the\\n\",\n            \"U1\\n\",\n            \"q\\n\",\n            \"(x, v, vˆ) =\\n\",\n            \"70\\n\",\n            \"Γ\\n\",\n            \"q\\n\",\n            \"(x, v, vˆ)\\n\",\n            \"(9)\\n\",\n            \"and the\\n\",\n            \"D\\n\",\n            \"0\\n\",\n            \"T\\n\",\n            \"(ξ) =\\n\",\n            \"U¯\\n\",\n            \"(ξ)\\n\",\n            \"∇ · Dq\\n\",\n            \"0\\n\",\n            \"T\\n\",\n            \"(ξ)\\n\",\n            \"· (v0, v1) + ξv (ξ) 
(v0, v1).\\n\",\n            \"Re-scaling the space variable as in (6), we have\\n\",\n            \"D\\n\",\n            \"0\\n\",\n            \"T\\n\",\n            \"(ξ) =\\n\",\n            \"U¯\\n\",\n            \"ξ\\n\",\n            \"Γ\\n\",\n            \"q\\n\",\n            \"(ξ)\\n\",\n            \"∇ · Dq\\n\",\n            \"0\\n\",\n            \"T\\n\",\n            \"(ξ)\\n\",\n            \"· (v0, v1) + ξv (ξ) (v0, v1).\\n\",\n            \"The mean direction is given by the variance of the\\n\",\n            \"turning direction given by\\n\",\n            \"UT (ξ) = UT\\n\",\n            \"Γ\\n\",\n            \"q\\n\",\n            \"(ξ)U¯\\n\",\n            \"Γ\\n\",\n            \"q\\n\",\n            \"(ξ)∇ · Dq\\n\",\n            \"0\\n\",\n            \"T\\n\",\n            \"(ξ)\\n\",\n            \". (10)\\n\",\n            \"As a consequence, the macroscopic behavior is strongly affected by the\\n\",\n            \"turning operator, that is\\n\",\n            \"D\\n\",\n            \"0\\n\",\n            \"T\\n\",\n            \"(ξ) =\\n\",\n            \"U¯\\n\",\n            \"ξ\\n\",\n            \"Γ\\n\",\n            \"q\\n\",\n            \"(ξ)∇ · Dq\\n\",\n            \"0\\n\",\n            \"T\\n\",\n            \"(ξ)\\n\",\n            \". (11)\\n\",\n            \"In particular, the sensing radius of the cells is given by\\n\",\n            \"S\\n\",\n            \"c(x, y) =\\n\",\n            \"Γ\\n\",\n            \"q\\n\",\n            \"(x, y)U¯\\n\",\n            \"Γ\\n\",\n            \"q\\n\",\n            \"(x, y)Γ\\n\",\n            \"both\\n\",\n            \"i\\n\",\n            \" and ii\\n\",\n            \", v\\n\",\n            \", dv\\n\",\n            \",\\n\",\n            \"dvˆ\\n\",\n            \",\\n\",\n            \"Γ\\n\",\n            \"q\\n\",\n            \",\\n\",\n            \"are given by (12) and (13). 
The chemoattractant has a Gaussian\\n\",\n            \"c = c(x, y)\\n\",\n            \"and on the left the two values of c both have to be in the\\n\",\n            \"same direction. Therefore, the sensing radius of the cells is given by\\n\",\n            \"the momentum\\n\",\n            \"T = S\\n\",\n            \"c(x, y) =\\n\",\n            \"v\\n\",\n            \"(x, y)\\n\",\n            \"Γ\\n\",\n            \"q\\n\",\n            \"(x, y)\\n\",\n            \"vˆ + Γ ii\\n\",\n            \"(x, y)\\n\",\n            \"iiˆ + Γ\\n\",\n            \"q\\n\",\n            \"(x, y)\\n\",\n            \"iiˆ + Γ\\n\",\n            \"v\\n\",\n            \"(x, y)\\n\",\n            \"(14)\\n\",\n            \"and the sensing function Γ =\\n\",\n            \"Γ\\n\",\n            \"q\\n\",\n            \"(x, y)U¯\\n\",\n            \"Γ\\n\",\n            \". (15)\\n\",\n            \"In particular, when the two sensing functions are independent,\\n\",\n            \"when Γ is equal to Γvˆ, we have that the weighted average\\n\",\n            \"for the two velocities is given by the momentum\\n\",\n            \"T = S\\n\",\n            \"c(x, y) =\\n\",\n            \"\\n\",\n            \"v\\n\",\n            \"(x, y)\\n\",\n            \"Γ\\n\",\n            \"q\\n\",\n            \"(x, y)\\n\",\n            \"Γ\\n\",\n            \"\\n\",\n            \"i\\n\",\n            \"and\\n\",\n            \"k\\n\",\n            \":= vˆ k(x, y)\\n\",\n            \". (16)\\n\",\n            \"This translates into\\n\",\n            \"k(x, y) =\\n\",\n            \"vˆ(x)\\n\",\n            \",\\n\",\n            \"that is the kurtosis\\n\",\n            \"T\\n\",\n            \"(ξ) = vˆ(x)\\n\",\n            \",\\n\",\n            \"that is the tach statistic\\n\",\n            \"DT\\n\",\n            \"(ξ) = u\\n\",\n            \"T\\n\",\n            \"(ξ)\\n\",\n            \"∇T\\n\",\n            \"(ξ)\\n\",\n            \". 
(\\n\",\n            \"\\n\",\n            \"[600 | 690.44] loss=0.29 avg=0.66\\n\",\n            \"[601 | 691.37] loss=0.45 avg=0.66\\n\",\n            \"[602 | 692.29] loss=0.27 avg=0.65\\n\",\n            \"[603 | 693.22] loss=0.38 avg=0.65\\n\",\n            \"[604 | 694.15] loss=0.33 avg=0.65\\n\",\n            \"[605 | 695.09] loss=0.89 avg=0.65\\n\",\n            \"[606 | 696.01] loss=0.59 avg=0.65\\n\",\n            \"[607 | 696.94] loss=0.27 avg=0.64\\n\",\n            \"[608 | 697.86] loss=0.61 avg=0.64\\n\",\n            \"[609 | 698.80] loss=0.33 avg=0.64\\n\",\n            \"[610 | 699.72] loss=0.80 avg=0.64\\n\",\n            \"[611 | 700.65] loss=0.49 avg=0.64\\n\",\n            \"[612 | 701.58] loss=0.35 avg=0.64\\n\",\n            \"[613 | 702.51] loss=0.26 avg=0.63\\n\",\n            \"[614 | 703.43] loss=0.47 avg=0.63\\n\",\n            \"[615 | 704.36] loss=0.41 avg=0.63\\n\",\n            \"[616 | 705.28] loss=0.42 avg=0.63\\n\",\n            \"[617 | 706.20] loss=0.69 avg=0.63\\n\",\n            \"[618 | 707.13] loss=0.41 avg=0.63\\n\",\n            \"[619 | 708.06] loss=0.44 avg=0.62\\n\",\n            \"[620 | 709.00] loss=0.23 avg=0.62\\n\",\n            \"[621 | 709.94] loss=0.46 avg=0.62\\n\",\n            \"[622 | 710.88] loss=0.34 avg=0.62\\n\",\n            \"[623 | 711.82] loss=0.34 avg=0.61\\n\",\n            \"[624 | 712.74] loss=0.26 avg=0.61\\n\",\n            \"[625 | 713.67] loss=0.23 avg=0.61\\n\",\n            \"[626 | 714.59] loss=0.35 avg=0.60\\n\",\n            \"[627 | 715.52] loss=0.50 avg=0.60\\n\",\n            \"[628 | 716.45] loss=0.31 avg=0.60\\n\",\n            \"[629 | 717.38] loss=0.36 avg=0.60\\n\",\n            \"[630 | 718.31] loss=0.13 avg=0.59\\n\",\n            \"[631 | 719.24] loss=0.29 avg=0.59\\n\",\n            \"[632 | 720.17] loss=0.20 avg=0.59\\n\",\n            \"[633 | 721.10] loss=0.23 avg=0.58\\n\",\n            \"[634 | 722.03] loss=0.16 avg=0.58\\n\",\n            \"[635 | 722.96] loss=0.14 
avg=0.57\\n\",\n            \"[636 | 723.90] loss=0.31 avg=0.57\\n\",\n            \"[637 | 724.83] loss=0.80 avg=0.57\\n\",\n            \"[638 | 725.78] loss=0.31 avg=0.57\\n\",\n            \"[639 | 726.72] loss=0.38 avg=0.57\\n\",\n            \"[640 | 727.65] loss=0.23 avg=0.57\\n\",\n            \"[641 | 728.59] loss=0.24 avg=0.56\\n\",\n            \"[642 | 729.53] loss=0.35 avg=0.56\\n\",\n            \"[643 | 730.46] loss=0.19 avg=0.56\\n\",\n            \"[644 | 731.39] loss=0.16 avg=0.55\\n\",\n            \"[645 | 732.32] loss=0.24 avg=0.55\\n\",\n            \"[646 | 733.25] loss=0.10 avg=0.54\\n\",\n            \"[647 | 734.18] loss=0.40 avg=0.54\\n\",\n            \"[648 | 735.11] loss=0.28 avg=0.54\\n\",\n            \"[649 | 736.04] loss=0.36 avg=0.54\\n\",\n            \"[650 | 736.97] loss=0.27 avg=0.54\\n\",\n            \"[651 | 737.90] loss=0.78 avg=0.54\\n\",\n            \"interrupted\\n\",\n            \"Saving checkpoint/run1/model-652\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"z-zAFd2hLQ2V\",\n        \"colab_type\": \"code\",\n        \"colab\": {}\n      },\n      \"source\": [\n        \"#@title Step 10: Creating a Training Model directory\\n\",\n        \"#Creating a Training Model directory named 'tgmodel'\\n\",\n        \"import os\\n\",\n        \"run_dir = '/content/gpt-2/models/tgmodel'\\n\",\n        \"if not os.path.exists(run_dir):\\n\",\n        \"  os.makedirs(run_dir)\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"-POx-g1Ql76C\",\n        \"colab_type\": \"code\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 68\n        },\n        \"outputId\": \"4a14528a-cb8e-4b8c-9b8a-9a5cfab4c6fe\"\n      },\n      \"source\": [\n        \"#@title Step 10A: 
Copying training Files\n\",\n        \"# Note: use the latest checkpoint actually saved in checkpoint/run1 -- this run was interrupted at step 652 (model-652), so the model-1000 files referenced below do not exist yet\\n\",\n        \"!cp /content/gpt-2/src/checkpoint/run1/model-1000.data-00000-of-00001 /content/gpt-2/models/tgmodel\\n\",\n        \"!cp /content/gpt-2/src/checkpoint/run1/checkpoint /content/gpt-2/models/tgmodel\\n\",\n        \"!cp /content/gpt-2/src/checkpoint/run1/model-1000.index /content/gpt-2/models/tgmodel\\n\",\n        \"!cp /content/gpt-2/src/checkpoint/run1/model-1000.meta /content/gpt-2/models/tgmodel\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"cp: cannot stat '/content/gpt-2/src/checkpoint/run1/model-1000.data-00000-of-00001': No such file or directory\\n\",\n            \"cp: cannot stat '/content/gpt-2/src/checkpoint/run1/model-1000.index': No such file or directory\\n\",\n            \"cp: cannot stat '/content/gpt-2/src/checkpoint/run1/model-1000.meta': No such file or directory\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"hdE9nNH8m7VD\",\n        \"colab_type\": \"code\",\n        \"colab\": {}\n      },\n      \"source\": [\n        \"#@title Step 10B: Copying the OpenAI GPT-2 117M Model files\\n\",\n        \"!cp /content/gpt-2/models/117M/encoder.json /content/gpt-2/models/tgmodel\\n\",\n        \"!cp /content/gpt-2/models/117M/hparams.json /content/gpt-2/models/tgmodel\\n\",\n        \"!cp /content/gpt-2/models/117M/vocab.bpe /content/gpt-2/models/tgmodel\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"3G8NOUXjMq4u\",\n        \"colab_type\": \"code\",\n        \"colab\": {}\n      },\n      \"source\": [\n        \"#@title Step 10C: Renaming the model directories\\n\",\n        \"import os\\n\",\n        \"!mv /content/gpt-2/models/117M  /content/gpt-2/models/117M_OpenAI\\n\",\n        
\"!mv /content/gpt-2/models/tgmodel  /content/gpt-2/models/117M\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"h3uexz_e4d18\",\n        \"colab_type\": \"code\",\n        \"colab\": {}\n      },\n      \"source\": [\n        \"#@title Step 11: Generating Unconditional Samples\\n\",\n        \"import os # import after runtime is restarted\\n\",\n        \"os.chdir(\\\"/content/gpt-2/src\\\")\\n\",\n        \"!python generate_unconditional_samples.py --model_name '117M'\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"6HI7DuBK4iSU\",\n        \"colab_type\": \"code\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 1000\n        },\n        \"outputId\": \"aced43aa-7fcc-4bac-99c7-8bfd7ded0028\"\n      },\n      \"source\": [\n        \"#@title Step 12: Interactive Context and Completion Examples\\n\",\n        \"import os # import after runtime is restarted\\n\",\n        \"os.chdir(\\\"/content/gpt-2/src\\\")\\n\",\n        \"!python interactive_conditional_samples.py --temperature 0.8 --top_k 40 --model_name '117M' --length 50\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"WARNING:tensorflow:From interactive_conditional_samples.py:57: The name tf.Session is deprecated. 
Please use tf.compat.v1.Session instead.\\n\",\n            \"\\n\",\n            \"2020-06-29 09:30:02.273624: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1\\n\",\n            \"2020-06-29 09:30:02.292947: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:983] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero\\n\",\n            \"2020-06-29 09:30:02.293714: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1639] Found device 0 with properties: \\n\",\n            \"name: Tesla K80 major: 3 minor: 7 memoryClockRate(GHz): 0.8235\\n\",\n            \"pciBusID: 0000:00:04.0\\n\",\n            \"2020-06-29 09:30:02.294023: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1\\n\",\n            \"2020-06-29 09:30:02.295631: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10\\n\",\n            \"2020-06-29 09:30:02.297301: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10\\n\",\n            \"2020-06-29 09:30:02.297699: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10\\n\",\n            \"2020-06-29 09:30:02.299362: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10\\n\",\n            \"2020-06-29 09:30:02.300174: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10\\n\",\n            \"2020-06-29 09:30:02.303450: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7\\n\",\n            \"2020-06-29 09:30:02.303619: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:983] 
successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero\\n\",\n            \"2020-06-29 09:30:02.304415: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:983] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero\\n\",\n            \"2020-06-29 09:30:02.305120: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1767] Adding visible gpu devices: 0\\n\",\n            \"2020-06-29 09:30:02.310474: I tensorflow/core/platform/profile_utils/cpu_utils.cc:94] CPU Frequency: 2299995000 Hz\\n\",\n            \"2020-06-29 09:30:02.310737: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x1426d80 initialized for platform Host (this does not guarantee that XLA will be used). Devices:\\n\",\n            \"2020-06-29 09:30:02.310775: I tensorflow/compiler/xla/service/service.cc:176]   StreamExecutor device (0): Host, Default Version\\n\",\n            \"2020-06-29 09:30:02.360414: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:983] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero\\n\",\n            \"2020-06-29 09:30:02.361376: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x1426f40 initialized for platform CUDA (this does not guarantee that XLA will be used). 
Devices:\\n\",\n            \"2020-06-29 09:30:02.361416: I tensorflow/compiler/xla/service/service.cc:176]   StreamExecutor device (0): Tesla K80, Compute Capability 3.7\\n\",\n            \"2020-06-29 09:30:02.361699: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:983] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero\\n\",\n            \"2020-06-29 09:30:02.362523: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1639] Found device 0 with properties: \\n\",\n            \"name: Tesla K80 major: 3 minor: 7 memoryClockRate(GHz): 0.8235\\n\",\n            \"pciBusID: 0000:00:04.0\\n\",\n            \"2020-06-29 09:30:02.362622: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1\\n\",\n            \"2020-06-29 09:30:02.362681: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10\\n\",\n            \"2020-06-29 09:30:02.362735: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10\\n\",\n            \"2020-06-29 09:30:02.362790: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10\\n\",\n            \"2020-06-29 09:30:02.362854: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10\\n\",\n            \"2020-06-29 09:30:02.362922: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10\\n\",\n            \"2020-06-29 09:30:02.362980: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7\\n\",\n            \"2020-06-29 09:30:02.363153: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:983] successful NUMA node read from SysFS had negative 
value (-1), but there must be at least one NUMA node, so returning NUMA node zero\\n\",\n            \"2020-06-29 09:30:02.364047: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:983] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero\\n\",\n            \"2020-06-29 09:30:02.364759: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1767] Adding visible gpu devices: 0\\n\",\n            \"2020-06-29 09:30:02.364834: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1\\n\",\n            \"2020-06-29 09:30:02.366467: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1180] Device interconnect StreamExecutor with strength 1 edge matrix:\\n\",\n            \"2020-06-29 09:30:02.366509: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1186]      0 \\n\",\n            \"2020-06-29 09:30:02.366530: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1199] 0:   N \\n\",\n            \"2020-06-29 09:30:02.366754: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:983] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero\\n\",\n            \"2020-06-29 09:30:02.367607: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:983] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero\\n\",\n            \"2020-06-29 09:30:02.368323: W tensorflow/core/common_runtime/gpu/gpu_bfc_allocator.cc:39] Overriding allow_growth setting because the TF_FORCE_GPU_ALLOW_GROWTH environment variable is set. 
Original config value was 0.\\n\",\n            \"2020-06-29 09:30:02.368380: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1325] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 10805 MB memory) -> physical GPU (device: 0, name: Tesla K80, pci bus id: 0000:00:04.0, compute capability: 3.7)\\n\",\n            \"WARNING:tensorflow:From interactive_conditional_samples.py:58: The name tf.placeholder is deprecated. Please use tf.compat.v1.placeholder instead.\\n\",\n            \"\\n\",\n            \"WARNING:tensorflow:From interactive_conditional_samples.py:60: The name tf.set_random_seed is deprecated. Please use tf.compat.v1.set_random_seed instead.\\n\",\n            \"\\n\",\n            \"WARNING:tensorflow:From /content/gpt-2/src/sample.py:51: The name tf.AUTO_REUSE is deprecated. Please use tf.compat.v1.AUTO_REUSE instead.\\n\",\n            \"\\n\",\n            \"WARNING:tensorflow:From /content/gpt-2/src/model.py:148: The name tf.variable_scope is deprecated. Please use tf.compat.v1.variable_scope instead.\\n\",\n            \"\\n\",\n            \"WARNING:tensorflow:From /content/gpt-2/src/model.py:152: The name tf.get_variable is deprecated. Please use tf.compat.v1.get_variable instead.\\n\",\n            \"\\n\",\n            \"WARNING:tensorflow:From /content/gpt-2/src/model.py:36: The name tf.rsqrt is deprecated. 
Please use tf.math.rsqrt instead.\\n\",\n            \"\\n\",\n            \"WARNING:tensorflow:From /content/gpt-2/src/sample.py:64: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.\\n\",\n            \"Instructions for updating:\\n\",\n            \"Use `tf.cast` instead.\\n\",\n            \"WARNING:tensorflow:From /content/gpt-2/src/sample.py:16: where (from tensorflow.python.ops.array_ops) is deprecated and will be removed in a future version.\\n\",\n            \"Instructions for updating:\\n\",\n            \"Use tf.where in 2.0, which has the same broadcast rule as np.where\\n\",\n            \"WARNING:tensorflow:From /content/gpt-2/src/sample.py:67: multinomial (from tensorflow.python.ops.random_ops) is deprecated and will be removed in a future version.\\n\",\n            \"Instructions for updating:\\n\",\n            \"Use `tf.random.categorical` instead.\\n\",\n            \"WARNING:tensorflow:From interactive_conditional_samples.py:68: The name tf.train.Saver is deprecated. Please use tf.compat.v1.train.Saver instead.\\n\",\n            \"\\n\",\n            \"Model prompt >>> During such processes, cells sense the environment and respond to external factors that induce a certain direction of motion towards specific targets (taxis): this results in a persistent migration in a certain preferential direction. The guidance cues leading to directed migration may be biochemical or biophysical. Biochemical cues can be, for example, soluble factors or growth factors that give rise to chemotaxis, which involves a mono-directional stimulus. 
Other cues generating mono-directional stimuli include, for instance, bound ligands to the substratum that induce haptotaxis, durotaxis, that involves migration towards regions with an increasing stiffness of the ECM, electrotaxis, also known as galvanotaxis, that prescribes a directed motion guided by an electric field or current, or phototaxis, referring to the movement oriented by a stimulus of light [34]. Important biophysical cues are some of the properties of the extracellular matrix (ECM), first among all the alignment of collagen fibers and its stiffness. In particular, the fiber alignment is shown to stimulate contact guidance [22, 21]. TL;DR:\\n\",\n            \"2020-06-29 09:31:30.405327: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10\\n\",\n            \"======================================== SAMPLE 1 ========================================\\n\",\n            \" the ECM of a single tissue is the ECM that is the most effective.\\n\",\n            \"\\n\",\n            \"To address this concern, we developed a novel imaging and immunostaining scheme that, when activated, induces the conversion of a protein to its exogenous target\\n\",\n            \"================================================================================\\n\",\n            \"Model prompt >>> Traceback (most recent call last):\\n\",\n            \"  File \\\"/usr/lib/python3.6/contextlib.py\\\", line 99, in __exit__\\n\",\n            \"    self.gen.throw(type, value, traceback)\\n\",\n            \"  File \\\"/tensorflow-1.15.2/python3.6/tensorflow_core/python/framework/ops.py\\\", line 5480, in get_controller\\n\",\n            \"    yield g\\n\",\n            \"  File \\\"interactive_conditional_samples.py\\\", line 73, in interact_model\\n\",\n            \"    raw_text = input(\\\"Model prompt >>> \\\")\\n\",\n            \"KeyboardInterrupt\\n\",\n            \"\\n\",\n            \"During handling of 
the above exception, another exception occurred:\\n\",\n            \"\\n\",\n            \"Traceback (most recent call last):\\n\",\n            \"  File \\\"interactive_conditional_samples.py\\\", line 91, in <module>\\n\",\n            \"    fire.Fire(interact_model)\\n\",\n            \"  File \\\"/usr/local/lib/python3.6/dist-packages/fire/core.py\\\", line 138, in Fire\\n\",\n            \"    component_trace = _Fire(component, args, parsed_flag_args, context, name)\\n\",\n            \"  File \\\"/usr/local/lib/python3.6/dist-packages/fire/core.py\\\", line 468, in _Fire\\n\",\n            \"    target=component.__name__)\\n\",\n            \"  File \\\"/usr/local/lib/python3.6/dist-packages/fire/core.py\\\", line 672, in _CallAndUpdateTrace\\n\",\n            \"    component = fn(*varargs, **kwargs)\\n\",\n            \"  File \\\"interactive_conditional_samples.py\\\", line 88, in interact_model\\n\",\n            \"    print(\\\"=\\\" * 80)\\n\",\n            \"  File \\\"/tensorflow-1.15.2/python3.6/tensorflow_core/python/client/session.py\\\", line 1633, in __exit__\\n\",\n            \"    close_thread.start()\\n\",\n            \"  File \\\"/usr/lib/python3.6/threading.py\\\", line 851, in start\\n\",\n            \"    self._started.wait()\\n\",\n            \"  File \\\"/usr/lib/python3.6/threading.py\\\", line 551, in wait\\n\",\n            \"    signaled = self._cond.wait(timeout)\\n\",\n            \"  File \\\"/usr/lib/python3.6/threading.py\\\", line 295, in wait\\n\",\n            \"    waiter.acquire()\\n\",\n            \"KeyboardInterrupt\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"ihVnmXFYB-E7\",\n        \"colab_type\": \"code\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 1000\n        },\n        \"outputId\": \"bbe7802e-2e06-4c70-debb-bab34bfb0c2e\"\n      },\n      
\"source\": [\n        \"#@title Additional Tools: Controlling Tokenized Data\\n\",\n        \"#Unzip out.npz\\n\",\n        \"import zipfile\\n\",\n        \"with zipfile.ZipFile('/content/gpt-2/src/out.npz', 'r') as zip_ref:\\n\",\n        \"    zip_ref.extractall('/content/gpt-2/src/')\\n\",\n        \"\\n\",\n        \"#Load arr_0.npy which contains encoded dset\\n\",\n        \"import numpy as np\\n\",\n        \"f=np.load('/content/gpt-2/src/arr_0.npy')\\n\",\n        \"print(f)\\n\",\n        \"print(f.shape)\\n\",\n        \"for i in range(0,10):\\n\",\n        \"    print(f[i])\\n\",\n        \"     \\n\",\n        \"#We first import encoder.json\\n\",\n        \"import json\\n\",\n        \"i=0\\n\",\n        \"with open(\\\"/content/gpt-2/models/117M/encoder.json\\\", \\\"r\\\") as read_file:\\n\",\n        \"    print(\\\"Converting the JSON encoded data into a Python dictionary\\\")\\n\",\n        \"    developer = json.load(read_file) #converts the encoded data into a Python dictionary\\n\",\n        \"    for key, value in developer.items(): #we parse the decoded json data\\n\",\n        \"        i+=1\\n\",\n        \"        if(i>10):\\n\",\n        \"            break;\\n\",\n        \"        print(key, \\\":\\\", value)\\n\",\n        \"\\n\",\n        \"#We will now search for the key and value for each encoded token\\n\",\n        \"    for i in range(0,500):\\n\",\n        \"        for key, value in developer.items():\\n\",\n        \"            if f[i]==value:\\n\",\n        \"                print(key, \\\":\\\", value)\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"[1212 5644  326 ...   
13  198 2682]\\n\",\n            \"(29379,)\\n\",\n            \"1212\\n\",\n            \"5644\\n\",\n            \"326\\n\",\n            \"11\\n\",\n            \"355\\n\",\n            \"716\\n\",\n            \"78\\n\",\n            \"1765\\n\",\n            \"1868\\n\",\n            \"4778\\n\",\n            \"Converting JSON encoded data into Python dictionary\\n\",\n            \"! : 0\\n\",\n            \"\\\" : 1\\n\",\n            \"# : 2\\n\",\n            \"$ : 3\\n\",\n            \"% : 4\\n\",\n            \"& : 5\\n\",\n            \"' : 6\\n\",\n            \"( : 7\\n\",\n            \") : 8\\n\",\n            \"* : 9\\n\",\n            \"This : 1212\\n\",\n            \"Ġsuggests : 5644\\n\",\n            \"Ġthat : 326\\n\",\n            \", : 11\\n\",\n            \"Ġas : 355\\n\",\n            \"Ġam : 716\\n\",\n            \"o : 78\\n\",\n            \"eb : 1765\\n\",\n            \"oid : 1868\\n\",\n            \"Ġcells : 4778\\n\",\n            \"Ġare : 389\\n\",\n            \"Ġless : 1342\\n\",\n            \"Ġcontract : 2775\\n\",\n            \"ile : 576\\n\",\n            \", : 11\\n\",\n            \"Ġwhile : 981\\n\",\n            \"Ġmes : 18842\\n\",\n            \"ench : 24421\\n\",\n            \"ym : 4948\\n\",\n            \"al : 282\\n\",\n            \"Ċ : 198\\n\",\n            \"cells : 46342\\n\",\n            \"Ġare : 389\\n\",\n            \"Ġmore : 517\\n\",\n            \"Ġcontract : 2775\\n\",\n            \"ile : 576\\n\",\n            \", : 11\\n\",\n            \"Ġand : 290\\n\",\n            \"Ġthere : 612\\n\",\n            \"Ġmay : 743\\n\",\n            \"Ġbe : 307\\n\",\n            \"Ġa : 257\\n\",\n            \"Ġswitching : 15430\\n\",\n            \"Ġbetween : 1022\\n\",\n            \"Ġam : 716\\n\",\n            \"o : 78\\n\",\n            \"eb : 1765\\n\",\n            \"oid : 1868\\n\",\n            \"Ġand : 290\\n\",\n            \"Ġmes : 18842\\n\",\n            \"ench : 24421\\n\",\n            \"ym : 
4948\\n\",\n            \"al : 282\\n\",\n            \"Ċ : 198\\n\",\n            \"m : 76\\n\",\n            \"igration : 4254\\n\",\n            \", : 11\\n\",\n            \"Ġperhaps : 3737\\n\",\n            \"Ġthere : 612\\n\",\n            \"Ġcan : 460\\n\",\n            \"Ġalso : 635\\n\",\n            \"Ġbe : 307\\n\",\n            \"Ġa : 257\\n\",\n            \"Ġswitching : 15430\\n\",\n            \"Ġbetween : 1022\\n\",\n            \"Ġthe : 262\\n\",\n            \"Ġdominance : 18648\\n\",\n            \"Ġof : 286\\n\",\n            \"Ġchem : 4607\\n\",\n            \"ot : 313\\n\",\n            \"axis : 22704\\n\",\n            \"Ġ( : 357\\n\",\n            \"amo : 18811\\n\",\n            \"eb : 1765\\n\",\n            \"oid : 1868\\n\",\n            \"Ċ : 198\\n\",\n            \"m : 76\\n\",\n            \"igration : 4254\\n\",\n            \") : 8\\n\",\n            \"Ġand : 290\\n\",\n            \"Ġcontact : 2800\\n\",\n            \"Ġguidance : 11154\\n\",\n            \"Ġ( : 357\\n\",\n            \"mes : 6880\\n\",\n            \"ench : 24421\\n\",\n            \"ym : 4948\\n\",\n            \"al : 282\\n\",\n            \"Ġmigration : 13472\\n\",\n            \") : 8\\n\",\n            \"Ġ[ : 685\\n\",\n            \"60 : 1899\\n\",\n            \"]. 
: 4083\\n\",\n            \"ĠOne : 1881\\n\",\n            \"Ġof : 286\\n\",\n            \"Ġthe : 262\\n\",\n            \"Ġmost : 749\\n\",\n            \"Ġinteresting : 3499\\n\",\n            \"Ġ2 : 362\\n\",\n            \"D : 35\\n\",\n            \"Ċ : 198\\n\",\n            \"platform : 24254\\n\",\n            \"s : 82\\n\",\n            \", : 11\\n\",\n            \"Ġallowing : 5086\\n\",\n            \"Ġto : 284\\n\",\n            \"Ġstudy : 2050\\n\",\n            \"Ġcontact : 2800\\n\",\n            \"Ġguidance : 11154\\n\",\n            \"Ġand : 290\\n\",\n            \"Ġchem : 4607\\n\",\n            \"ot : 313\\n\",\n            \"axis : 22704\\n\",\n            \", : 11\\n\",\n            \"Ġwas : 373\\n\",\n            \"Ġproposed : 5150\\n\",\n            \"Ġin : 287\\n\",\n            \"Ġ[ : 685\\n\",\n            \"57 : 3553\\n\",\n            \"], : 4357\\n\",\n            \"Ġin : 287\\n\",\n            \"Ġwhich : 543\\n\",\n            \"Ġthe : 262\\n\",\n            \"Ċ : 198\\n\",\n            \"authors : 41617\\n\",\n            \"Ġdemonstrated : 9555\\n\",\n            \"Ġan : 281\\n\",\n            \"Ġadditive : 38298\\n\",\n            \"Ġeffect : 1245\\n\",\n            \"Ġof : 286\\n\",\n            \"Ġchemical : 5931\\n\",\n            \"Ġgrad : 3915\\n\",\n            \"ients : 2334\\n\",\n            \"Ġand : 290\\n\",\n            \"Ġfiber : 13608\\n\",\n            \"Ġalignment : 19114\\n\",\n            \"Ġby : 416\\n\",\n            \"Ġmeasuring : 15964\\n\",\n            \"Ċ : 198\\n\",\n            \"the : 1169\\n\",\n            \"Ġpersistence : 30802\\n\",\n            \"Ġtime : 640\\n\",\n            \"; : 26\\n\",\n            \"Ġthey : 484\\n\",\n            \"Ġalso : 635\\n\",\n            \"Ġobserved : 6515\\n\",\n            \"Ġthat : 326\\n\",\n            \"Ġcells : 4778\\n\",\n            \"Ġwere : 547\\n\",\n            \"Ġdirected : 7924\\n\",\n            \"Ġby : 416\\n\",\n            \"Ġfiber : 13608\\n\",\n   
         \"Ġalignment : 19114\\n\",\n            \"Ġand : 290\\n\",\n            \"Ġthere : 612\\n\",\n            \"Ġwas : 373\\n\",\n            \"Ċ : 198\\n\",\n            \"no : 3919\\n\",\n            \"Ġeffect : 1245\\n\",\n            \"Ġof : 286\\n\",\n            \"Ġthe : 262\\n\",\n            \"Ġchemical : 5931\\n\",\n            \"Ġgradient : 31312\\n\",\n            \"Ġwhen : 618\\n\",\n            \"Ġfibers : 26742\\n\",\n            \"Ġwere : 547\\n\",\n            \"Ġaligned : 19874\\n\",\n            \"Ġperpendicular : 47190\\n\",\n            \"Ġto : 284\\n\",\n            \"Ġit : 340\\n\",\n            \". : 13\\n\",\n            \"ĠA : 317\\n\",\n            \"Ġsimilar : 2092\\n\",\n            \"Ġsetting : 4634\\n\",\n            \"Ċ : 198\\n\",\n            \"was : 9776\\n\",\n            \"Ġalso : 635\\n\",\n            \"Ġused : 973\\n\",\n            \"Ġfor : 329\\n\",\n            \"Ġstudying : 11065\\n\",\n            \"Ġthe : 262\\n\",\n            \"Ġdependence : 21403\\n\",\n            \"Ġof : 286\\n\",\n            \"Ġcontact : 2800\\n\",\n            \"Ġguidance : 11154\\n\",\n            \"Ġon : 319\\n\",\n            \"Ġthe : 262\\n\",\n            \"Ġcell : 2685\\n\",\n            \"Ġcycle : 6772\\n\",\n            \"Ġ[ : 685\\n\",\n            \"48 : 2780\\n\",\n            \"]. 
: 4083\\n\",\n            \"ĠHowever : 2102\\n\",\n            \", : 11\\n\",\n            \"ĠIn : 554\\n\",\n            \"Ċ : 198\\n\",\n            \"the : 1169\\n\",\n            \"Ġcase : 1339\\n\",\n            \"Ġof : 286\\n\",\n            \"Ġdifferent : 1180\\n\",\n            \"Ġmulti : 5021\\n\",\n            \"- : 12\\n\",\n            \"direction : 37295\\n\",\n            \"al : 282\\n\",\n            \"Ġcues : 25288\\n\",\n            \", : 11\\n\",\n            \"Ġtotally : 6635\\n\",\n            \"Ġdifferent : 1180\\n\",\n            \"Ġscenarios : 13858\\n\",\n            \"Ġmay : 743\\n\",\n            \"Ġhappen : 1645\\n\",\n            \", : 11\\n\",\n            \"Ġe : 304\\n\",\n            \". : 13\\n\",\n            \"g : 70\\n\",\n            \". : 13\\n\",\n            \"Ġin : 287\\n\",\n            \"Ġ[ : 685\\n\",\n            \"51 : 4349\\n\",\n            \"] : 60\\n\",\n            \"Ġit : 340\\n\",\n            \"Ġis : 318\\n\",\n            \"Ċ : 198\\n\",\n            \"shown : 42579\\n\",\n            \"Ġthat : 326\\n\",\n            \"Ġfor : 329\\n\",\n            \"Ġcontact : 2800\\n\",\n            \"Ġguidance : 11154\\n\",\n            \"Ġand : 290\\n\",\n            \"Ġelect : 1742\\n\",\n            \"rot : 10599\\n\",\n            \"axis : 22704\\n\",\n            \"Ġin : 287\\n\",\n            \"Ġthe : 262\\n\",\n            \"Ġcor : 1162\\n\",\n            \"nea : 39718\\n\",\n            \", : 11\\n\",\n            \"Ġelect : 1742\\n\",\n            \"rot : 10599\\n\",\n            \"axis : 22704\\n\",\n            \"Ġwins : 7864\\n\",\n            \"Ġwhen : 618\\n\",\n            \"Ġcompeting : 11780\\n\",\n            \"Ċ : 198\\n\",\n            \"with : 4480\\n\",\n            \"Ġthe : 262\\n\",\n            \"Ġdirection : 4571\\n\",\n            \"Ġof : 286\\n\",\n            \"Ġalignment : 19114\\n\",\n            \"Ġof : 286\\n\",\n            \"Ġthe : 262\\n\",\n            \"Ġfibers : 26742\\n\",\n            
\". : 13\\n\",\n            \"Ċ : 198\\n\",\n            \"Multi : 29800\\n\",\n            \"- : 12\\n\",\n            \"cue : 15509\\n\",\n            \"Ġkinetic : 37892\\n\",\n            \"Ġmodel : 2746\\n\",\n            \"Ġwith : 351\\n\",\n            \"Ġnon : 1729\\n\",\n            \"- : 12\\n\",\n            \"local : 12001\\n\",\n            \"Ġsensing : 34244\\n\",\n            \"Ġfor : 329\\n\",\n            \"Ġcell : 2685\\n\",\n            \"Ċ : 198\\n\",\n            \"m : 76\\n\",\n            \"igration : 4254\\n\",\n            \"Ġon : 319\\n\",\n            \"Ġa : 257\\n\",\n            \"Ġfibers : 26742\\n\",\n            \"Ġnetwork : 3127\\n\",\n            \"Ġwith : 351\\n\",\n            \"Ġchem : 4607\\n\",\n            \"ot : 313\\n\",\n            \"axis : 22704\\n\",\n            \"Ċ : 198\\n\",\n            \"Mart : 13143\\n\",\n            \"ina : 1437\\n\",\n            \"ĠCon : 1482\\n\",\n            \"te : 660\\n\",\n            \"ĠâĪ : 18872\\n\",\n            \"Ĺ : 245\\n\",\n            \"ĠNad : 21877\\n\",\n            \"ia : 544\\n\",\n            \"ĠL : 406\\n\",\n            \"oy : 726\\n\",\n            \"ĠâĢ : 564\\n\",\n            \"ł : 254\\n\",\n            \"âĢ : 447\\n\",\n            \"¡ : 94\\n\",\n            \"Ċ : 198\\n\",\n            \"June : 15749\\n\",\n            \"Ġ18 : 1248\\n\",\n            \", : 11\\n\",\n            \"Ġ2020 : 12131\\n\",\n            \"Ċ : 198\\n\",\n            \"Abstract : 23839\\n\",\n            \"Ċ : 198\\n\",\n            \"C : 34\\n\",\n            \"ells : 19187\\n\",\n            \"Ġperform : 1620\\n\",\n            \"Ġdirected : 7924\\n\",\n            \"Ġmotion : 6268\\n\",\n            \"Ġin : 287\\n\",\n            \"Ġresponse : 2882\\n\",\n            \"Ġto : 284\\n\",\n            \"Ġexternal : 7097\\n\",\n            \"Ġstimuli : 25973\\n\",\n            \"Ġthat : 326\\n\",\n            \"Ġthey : 484\\n\",\n            \"Ġdetect : 4886\\n\",\n            \"Ġby : 
416\\n\",\n            \"Ġsensing : 34244\\n\",\n            \"Ċ : 198\\n\",\n            \"the : 1169\\n\",\n            \"Ġenvironment : 2858\\n\",\n            \"Ġwith : 351\\n\",\n            \"Ġtheir : 511\\n\",\n            \"Ġmembrane : 25019\\n\",\n            \"Ġprot : 1237\\n\",\n            \"rus : 14932\\n\",\n            \"ions : 507\\n\",\n            \". : 13\\n\",\n            \"ĠIn : 554\\n\",\n            \"Ġparticular : 1948\\n\",\n            \", : 11\\n\",\n            \"Ġseveral : 1811\\n\",\n            \"Ġbiochemical : 47685\\n\",\n            \"Ġand : 290\\n\",\n            \"Ġbi : 3182\\n\",\n            \"ophysical : 41789\\n\",\n            \"Ġcues : 25288\\n\",\n            \"Ġgive : 1577\\n\",\n            \"Ġrise : 4485\\n\",\n            \"Ġto : 284\\n\",\n            \"Ġtactic : 18543\\n\",\n            \"Ġmigration : 13472\\n\",\n            \"Ġin : 287\\n\",\n            \"Ġthe : 262\\n\",\n            \"Ġdirection : 4571\\n\",\n            \"Ġof : 286\\n\",\n            \"Ġtheir : 511\\n\",\n            \"Ġspecific : 2176\\n\",\n            \"Ġtargets : 6670\\n\",\n            \". : 13\\n\",\n            \"ĠThis : 770\\n\",\n            \"Ġdefines : 15738\\n\",\n            \"Ċ : 198\\n\",\n            \"a : 64\\n\",\n            \"Ġmulti : 5021\\n\",\n            \"- : 12\\n\",\n            \"cue : 15509\\n\",\n            \"Ġenvironment : 2858\\n\",\n            \"Ġin : 287\\n\",\n            \"Ġwhich : 543\\n\",\n            \"Ġcells : 4778\\n\",\n            \"Ġhave : 423\\n\",\n            \"Ġto : 284\\n\",\n            \"Ġsort : 3297\\n\",\n            \"Ġand : 290\\n\",\n            \"Ġcombine : 12082\\n\",\n            \"Ġdifferent : 1180\\n\",\n            \", : 11\\n\",\n            \"Ġand : 290\\n\",\n            \"Ġpotentially : 6196\\n\",\n            \"Ċ : 198\\n\",\n            \"competitive : 46131\\n\",\n            \", : 11\\n\",\n            \"Ġstimuli : 25973\\n\",\n            \". 
: 13\\n\",\n            \"ĠWe : 775\\n\",\n            \"Ġpropose : 18077\\n\",\n            \"Ġa : 257\\n\",\n            \"Ġnon : 1729\\n\",\n            \"- : 12\\n\",\n            \"local : 12001\\n\",\n            \"Ġkinetic : 37892\\n\",\n            \"Ġmodel : 2746\\n\",\n            \"Ġfor : 329\\n\",\n            \"Ġcell : 2685\\n\",\n            \"Ġmigration : 13472\\n\",\n            \"Ġin : 287\\n\",\n            \"Ġpresence : 4931\\n\",\n            \"Ġof : 286\\n\",\n            \"Ċ : 198\\n\",\n            \"two : 11545\\n\",\n            \"Ġexternal : 7097\\n\",\n            \"Ġfactors : 5087\\n\",\n            \"Ġboth : 1111\\n\",\n            \"Ġinfluencing : 32596\\n\",\n            \"Ġcell : 2685\\n\",\n            \"Ġpolarization : 42704\\n\",\n            \": : 25\\n\",\n            \"Ġcontact : 2800\\n\",\n            \"Ġguidance : 11154\\n\",\n            \"Ġand : 290\\n\",\n            \"Ġchem : 4607\\n\",\n            \"ot : 313\\n\",\n            \"axis : 22704\\n\",\n            \". 
: 13\\n\",\n            \"ĠWe : 775\\n\",\n            \"Ċ : 198\\n\",\n            \"pro : 1676\\n\",\n            \"pose : 3455\\n\",\n            \"Ġtwo : 734\\n\",\n            \"Ġdifferent : 1180\\n\",\n            \"Ġsensing : 34244\\n\",\n            \"Ġstrategies : 10064\\n\",\n            \"Ġand : 290\\n\",\n            \"Ġwe : 356\\n\",\n            \"Ġanalyze : 16602\\n\",\n            \"Ġthe : 262\\n\",\n            \"Ġtwo : 734\\n\",\n            \"Ġresulting : 7186\\n\",\n            \"Ġmodels : 4981\\n\",\n            \"Ġby : 416\\n\",\n            \"Ġrecovering : 20222\\n\",\n            \"Ċ : 198\\n\",\n            \"the : 1169\\n\",\n            \"Ġappropriate : 5035\\n\",\n            \"Ġmacro : 15021\\n\",\n            \"sc : 1416\\n\",\n            \"opic : 16603\\n\",\n            \"Ġlimit : 4179\\n\",\n            \"Ġin : 287\\n\",\n            \"Ġdifferent : 1180\\n\",\n            \"Ġregimes : 25879\\n\",\n            \", : 11\\n\",\n            \"Ġin : 287\\n\",\n            \"Ġorder : 1502\\n\",\n            \"Ġto : 284\\n\",\n            \"Ġsee : 766\\n\",\n            \"Ġhow : 703\\n\",\n            \"Ġthe : 262\\n\",\n            \"Ġsize : 2546\\n\",\n            \"Ġof : 286\\n\",\n            \"Ġthe : 262\\n\",\n            \"Ġcell : 2685\\n\",\n            \", : 11\\n\",\n            \"Ċ : 198\\n\",\n            \"with : 4480\\n\",\n            \"Ġrespect : 2461\\n\",\n            \"Ġto : 284\\n\",\n            \"Ġthe : 262\\n\",\n            \"Ġvariation : 12291\\n\",\n            \"Ġof : 286\\n\",\n            \"Ġboth : 1111\\n\",\n            \"Ġexternal : 7097\\n\",\n            \"Ġfields : 7032\\n\",\n            \", : 11\\n\",\n            \"Ġinfluences : 16717\\n\",\n            \"Ġthe : 262\\n\",\n            \"Ġoverall : 4045\\n\",\n            \"Ġbehavior : 4069\\n\",\n            \". 
: 13\\n\",\n            \"ĠMoreover : 10968\\n\",\n            \", : 11\\n\",\n            \"Ċ : 198\\n\",\n            \"we : 732\\n\",\n            \"Ġintegrate : 19386\\n\",\n            \"Ġnumer : 5470\\n\",\n            \"ically : 1146\\n\",\n            \"Ġthe : 262\\n\",\n            \"Ġkinetic : 37892\\n\",\n            \"Ġtransport : 4839\\n\",\n            \"Ġequation : 16022\\n\",\n            \"Ġin : 287\\n\",\n            \"Ġa : 257\\n\",\n            \"Ġtwo : 734\\n\",\n            \"- : 12\\n\",\n            \"dimensional : 19577\\n\",\n            \"Ġsetting : 4634\\n\",\n            \"Ġin : 287\\n\",\n            \"Ġorder : 1502\\n\",\n            \"Ċ : 198\\n\",\n            \"to : 1462\\n\",\n            \"Ġinvestigate : 9161\\n\",\n            \"Ġqual : 4140\\n\",\n            \"itatively : 48668\\n\",\n            \"Ġvarious : 2972\\n\",\n            \"Ġscenarios : 13858\\n\",\n            \". : 13\\n\",\n            \"Ċ : 198\\n\",\n            \"Key : 9218\\n\",\n            \"word : 4775\\n\",\n            \". : 13\\n\",\n            \"ĠKin : 16645\\n\",\n            \"etic : 5139\\n\",\n            \"Ġequations : 27490\\n\",\n            \", : 11\\n\",\n            \"Ġmult : 1963\\n\",\n            \"isc : 2304\\n\",\n            \"ale : 1000\\n\",\n            \"Ġmodeling : 21128\\n\",\n            \", : 11\\n\",\n            \"Ġmulti : 5021\\n\",\n            \"- : 12\\n\",\n            \"cue : 15509\\n\",\n            \", : 11\\n\",\n            \"Ġnon : 1729\\n\",\n            \"- : 12\\n\",\n            \"local : 12001\\n\",\n            \", : 11\\n\",\n            \"Ġhyd : 7409\\n\",\n            \"rod : 14892\\n\",\n            \"ynamic : 28995\\n\",\n            \"Ġlimit : 4179\\n\",\n            \", : 11\\n\",\n            \"Ċ : 198\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    }\n  ]\n}"
  },
  {
    "path": "Chapter08/gpt-2-train_files/accumulate.py",
    "content": "import argparse\nimport json\nimport os\nimport numpy as np\nimport tensorflow as tf\nimport time\n\n\nclass AccumulatingOptimizer(object):\n    def __init__(self, opt, var_list):\n        self.opt = opt\n        self.var_list = var_list\n        self.accum_vars = {tv : tf.Variable(tf.zeros_like(tv.initialized_value()), trainable=False)\n                           for tv in var_list}\n        self.total_loss = tf.Variable(tf.zeros(shape=[], dtype=tf.float32))\n        self.count_loss = tf.Variable(tf.zeros(shape=[], dtype=tf.float32))\n\n    def reset(self):\n        updates = [tv.assign(tf.zeros_like(tv)) for tv in self.accum_vars.values()]\n        updates.append(self.total_loss.assign(tf.zeros(shape=[], dtype=tf.float32)))\n        updates.append(self.count_loss.assign(tf.zeros(shape=[], dtype=tf.float32)))\n        with tf.control_dependencies(updates):\n            return tf.no_op()\n\n    def compute_gradients(self, loss):\n        grads = self.opt.compute_gradients(loss, self.var_list)\n        updates = [self.accum_vars[v].assign_add(g) for (g,v) in grads]\n        updates.append(self.total_loss.assign_add(loss))\n        updates.append(self.count_loss.assign_add(1.0))\n        with tf.control_dependencies(updates):\n            return tf.no_op()\n\n    def apply_gradients(self):\n        grads = [(g,v) for (v,g) in self.accum_vars.items()]\n        with tf.control_dependencies([self.opt.apply_gradients(grads)]):\n            return self.total_loss / self.count_loss\n"
  },
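The `AccumulatingOptimizer` above relies on TF1 graph-mode control dependencies, which makes the arithmetic easy to miss. The sketch below mirrors the same accumulate-then-apply cycle in plain NumPy; the class and method names echo the original but this is an illustrative stand-in, not part of the book's code. Note that, like the TF1 version, it applies the *summed* gradients while reporting the *mean* loss:

```python
import numpy as np


class GradAccumulator:
    """Minimal NumPy sketch of the accumulate-then-apply pattern (illustrative)."""

    def __init__(self, shapes):
        self.accum = [np.zeros(s) for s in shapes]  # one accumulator per variable
        self.total_loss = 0.0
        self.count = 0

    def reset(self):
        # Zero the accumulators and the loss statistics.
        for a in self.accum:
            a.fill(0.0)
        self.total_loss = 0.0
        self.count = 0

    def compute_gradients(self, loss, grads):
        # Sum this micro-batch's gradients instead of applying them immediately.
        for a, g in zip(self.accum, grads):
            a += g
        self.total_loss += loss
        self.count += 1

    def apply_gradients(self, params, lr=0.1):
        # Apply the summed gradients (as the TF1 version does) and
        # return the mean loss over the accumulated micro-batches.
        for p, g in zip(params, self.accum):
            p -= lr * g
        return self.total_loss / self.count


acc = GradAccumulator([(2,)])
params = [np.array([1.0, 1.0])]
acc.compute_gradients(2.0, [np.array([1.0, 0.0])])
acc.compute_gradients(4.0, [np.array([1.0, 0.0])])
mean_loss = acc.apply_gradients(params, lr=0.5)
```

This lets an optimizer step see an effective batch of several micro-batches, which is how `train.py` fits larger batch sizes into limited GPU memory.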
  {
    "path": "Chapter08/gpt-2-train_files/encode.py",
    "content": "#!/usr/bin/env python3\n# Usage:\n#  PYTHONPATH=src ./encode.py <file|directory|glob> /path/to/output.npz\n#  PYTHONPATH=src ./train --dataset /path/to/output.npz\n\nimport argparse\nimport numpy as np\n\nimport encoder\nfrom load_dataset import load_dataset\n\nparser = argparse.ArgumentParser(\n    description='Pre-encode text files into tokenized training set.',\n    formatter_class=argparse.ArgumentDefaultsHelpFormatter)\nparser.add_argument('--model_name', metavar='MODEL', type=str, default='117M', help='Pretrained model name')\nparser.add_argument('--combine', metavar='CHARS', type=int, default=50000, help='Concatenate files with <|endoftext|> separator into chunks of this minimum size')\nparser.add_argument('--encoding', type=str, default='utf-8', help='Set the encoding for reading and writing files.')\nparser.add_argument('in_text', metavar='PATH', type=str, help='Input file, directory, or glob pattern (utf-8 text).')\nparser.add_argument('out_npz', metavar='OUT.npz', type=str, help='Output file path')\n\ndef main():\n    models_dir='/content/gpt-2/src/models'\n    args = parser.parse_args()\n    enc = encoder.get_encoder(args.model_name,models_dir)\n    print('Reading files')\n    chunks = load_dataset(enc, args.in_text, args.combine, encoding=args.encoding)\n    print('Writing', args.out_npz)\n    np.savez_compressed(args.out_npz, *chunks)\n\n\nif __name__ == '__main__':\n    main()\n"
  },
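`encode.py` stores the token chunks with `np.savez_compressed`, one array per chunk, which is the format the training script's `--dataset` flag expects and `load_dataset.py` reads back by iterating `npz.files`. A minimal round-trip of that format (the token values here are made up for illustration):

```python
import os
import tempfile

import numpy as np

# Two made-up token chunks standing in for encoder output.
chunks = [np.array([50256, 11, 13]), np.array([198, 765])]

out_path = os.path.join(tempfile.mkdtemp(), 'out.npz')
np.savez_compressed(out_path, *chunks)  # same call encode.py makes

# load_dataset.py restores the chunks by iterating npz.files.
with np.load(out_path) as npz:
    restored = [npz[name] for name in npz.files]
```

Pre-encoding once and training from the `.npz` avoids re-tokenizing the raw text on every run.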
  {
    "path": "Chapter08/gpt-2-train_files/load_dataset.py",
    "content": "import glob\nimport numpy as np\nimport os\nimport tensorflow as tf\nimport tqdm\n\n\ndef load_dataset(enc, path, combine, encoding=None):\n    paths = []\n    if os.path.isfile(path):\n        # Simple file\n        paths.append(path)\n    elif os.path.isdir(path):\n        # Directory\n        for (dirpath, _, fnames) in os.walk(path):\n            for fname in fnames:\n                paths.append(os.path.join(dirpath, fname))\n    else:\n        # Assume glob\n        paths = glob.glob(path)\n\n    token_chunks = []\n    raw_text = ''\n    for path in tqdm.tqdm(paths):\n        if path.endswith('.npz'):\n            # Pre-encoded\n            with np.load(path) as npz:\n                for item in npz.files:\n                    token_chunks.append(npz[item])\n        else:\n            # Plain text\n            with open(path, 'r', encoding=encoding) as fp:\n                raw_text += fp.read()\n            if len(raw_text) >= combine:\n                tokens = np.stack(enc.encode(raw_text))\n                token_chunks.append(tokens)\n                raw_text = ''\n            else:\n                raw_text += '<|endoftext|>'\n    if raw_text:\n        tokens = np.stack(enc.encode(raw_text))\n        token_chunks.append(tokens)\n    return token_chunks\n\n\ndef binary_search(f, lo, hi):\n    if f(lo) or not f(hi):\n        return None\n    while hi > lo + 1:\n        mid = (lo + hi) // 2\n        if f(mid):\n            hi = mid\n        else:\n            lo = mid\n    return hi\n\n\nclass Sampler(object):\n    \"\"\"Fairly samples a slice from a set of variable sized chunks.\n\n    'Fairly' means that the distribution is the same as sampling from one concatenated chunk,\n    but without crossing chunk boundaries.\"\"\"\n\n    def __init__(self, chunks, seed=None):\n        self.chunks = chunks\n        self.total_size = sum(chunk.shape[0] for chunk in chunks)\n        self.boundaries = [0]\n        for i in range(len(chunks)):\n           
 self.boundaries.append(self.boundaries[-1] + chunks[i].shape[0])\n        self.rs = np.random.RandomState(seed=seed)\n\n    def sample(self, length):\n        assert length < self.total_size // len(\n            self.chunks\n        ), \"Dataset files are too small to sample {} tokens at a time\".format(\n            length)\n        while True:\n            index = self.rs.randint(0, self.total_size - length - 1)\n            i = binary_search(lambda j: self.boundaries[j] > index, 0,\n                              len(self.boundaries) - 1) - 1\n            if self.boundaries[i + 1] > index + length:\n                within_chunk = index - self.boundaries[i]\n                return self.chunks[i][within_chunk:within_chunk + length]\n"
  },
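The `Sampler` above keeps a list of cumulative chunk boundaries and uses `binary_search` to map a global token index back to the chunk containing it: the helper bisects for the first boundary strictly greater than the index (assuming the predicate is `False` at `lo` and `True` at `hi`), so subtracting 1 gives the chunk number. A small worked example with illustrative chunk sizes:

```python
def binary_search(f, lo, hi):
    # Same logic as load_dataset.py: find the smallest j in (lo, hi]
    # where f(j) is True, given f(lo) is False and f(hi) is True.
    if f(lo) or not f(hi):
        return None
    while hi > lo + 1:
        mid = (lo + hi) // 2
        if f(mid):
            hi = mid
        else:
            lo = mid
    return hi


# Chunks of 5, 7, and 8 tokens give these cumulative boundaries.
boundaries = [0, 5, 12, 20]
index = 7  # a global token position drawn by the sampler

# First boundary strictly above the index, minus one, is the chunk id.
i = binary_search(lambda j: boundaries[j] > index, 0, len(boundaries) - 1) - 1
within_chunk = index - boundaries[i]
print(i, within_chunk)  # 1 2
```

`Sampler.sample` then rejects any draw whose slice would cross the next boundary, which is what makes the sampling equivalent to sampling from one concatenated chunk without ever stitching chunks together.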
  {
    "path": "Chapter08/gpt-2-train_files/mdset.txt",
    "content": "This suggests that, as amoeboid cells are less contractile, while mesenchymal\ncells are more contractile, and there may be a switching between amoeboid and mesenchymal\nmigration, perhaps there can also be a switching between the dominance of chemotaxis (amoeboid\nmigration) and contact guidance (mesenchymal migration) [60]. One of the most interesting 2D\nplatforms, allowing to study contact guidance and chemotaxis, was proposed in [57], in which the\nauthors demonstrated an additive effect of chemical gradients and fiber alignment by measuring\nthe persistence time; they also observed that cells were directed by fiber alignment and there was\nno effect of the chemical gradient when fibers were aligned perpendicular to it. A similar setting\nwas also used for studying the dependence of contact guidance on the cell cycle [48]. However, In\nthe case of different multi-directional cues, totally different scenarios may happen, e.g. in [51] it is\nshown that for contact guidance and electrotaxis in the cornea, electrotaxis wins when competing\nwith the direction of alignment of the fibers.\nMulti-cue kinetic model with non-local sensing for cell\nmigration on a fibers network with chemotaxis\nMartina Conte ∗ Nadia Loy †‡\nJune 18, 2020\nAbstract\nCells perform directed motion in response to external stimuli that they detect by sensing\nthe environment with their membrane protrusions. In particular, several biochemical and biophysical cues give rise to tactic migration in the direction of their specific targets. This defines\na multi-cue environment in which cells have to sort and combine different, and potentially\ncompetitive, stimuli. We propose a non-local kinetic model for cell migration in presence of\ntwo external factors both influencing cell polarization: contact guidance and chemotaxis. 
We\npropose two different sensing strategies and we analyze the two resulting models by recovering\nthe appropriate macroscopic limit in different regimes, in order to see how the size of the cell,\nwith respect to the variation of both external fields, influences the overall behavior. Moreover,\nwe integrate numerically the kinetic transport equation in a two-dimensional setting in order\nto investigate qualitatively various scenarios.\nKeyword. Kinetic equations, multiscale modeling, multi-cue, non-local, hydrodynamic limit,\ncell migration, contact guidance, chemotaxis\nAMS subject classifications. 35Q20, 35Q92, 92B05, 45K05, 92C17\n1 Introduction\nCell migration is a fundamental mechanism in a huge variety of processes, such as embryogenesis,\nwound healing, angiogenesis, immune response and tumor stroma formation and metastasis.\nDuring such processes, cells sense the environment and respond to external factors that induce\na certain direction of motion towards specific targets (taxis): this results in a persistent migration\nin a certain preferential direction. The guidance cues leading to directed migration may be biochemical or biophysical. Biochemical cues can be, for example, soluble factors or growth factors\nthat give rise to chemotaxis, which involves a mono-directional stimulus. Other cues generating mono-directional stimuli include, for instance, bound ligands to the substratum that induce\nhaptotaxis, durotaxis, that involves migration towards regions with an increasing stiffness of the\nECM, electrotaxis, also known as galvanotaxis, that prescribes a directed motion guided by an\nelectric field or current, or phototaxis, referring to the movement oriented by a stimulus of light\n[34]. Important biophysical cues are some of the properties of the extracellular matrix (ECM),\nfirst among all the alignment of collagen fibers and its stiffness. In particular, the fiber alignment is shown to stimulate contact guidance [22, 21]. 
Contact guidance is a key mechanism in a\nnumber of in vivo situations in which cells tend to migrate crawling on the fibers, thus following\nthe directions imposed by the network structure of the ECM.\n∗BCAM - Basque Center for Applied Mathematics, Alameda de Mazarredo, 14, 48009 Bilbao, Spain\n(mconte@bcamath.org)\n†Department of Mathematical Sciences “G. L. Lagrange”, Politecnico di Torino, Corso Duca degli Abruzzi\n24, 10129 Torino, Italy, and Department of Mathematics “G. Peano”, Via Carlo Alberto 10, 10123 Torino, Italy\n(nadia.loy@polito.it)\n‡Corresponding author: nadia.loy@polito.it\narXiv:2006.09707v1 [q-bio.CB] 17 Jun 2020\nThis is a bi-directional cue, as, if\nthe fibers network is not polarized, there is no preferential sense of migration along them. For\nexample, during wound healing fibroblasts migrate efficiently along collagen or fibronectin fibers\nin connective tissues; in cancer spread and metastasis formation, cancer cells migrate through the\nstromal tissue and are thus facilitated to reach blood and lymphatic vessels [55, 49, 50].\nIn many processes there are several directional cues that may induce different simultaneous\nstimuli. While the cell response to each of them has been largely studied, from both an intracellular\nand a migrative point of view, cell responses to a multi-cue environment are much less understood.\nThe fundamental issue is the way cells rank, integrate or hierarchize multiple cues, in particular\nwhen these give conflicting stimuli, because, for example, they are not co-aligned [51]. Some\nstudies have shown that there may be competition or cooperation between different stimuli in\nthe directional response of a cell in a multi-cue environment. Considering the angle between the\nrelative orientation of the directional cues, in the mono-directional case they compete when this\nangle is π, whereas they collaborate when this angle is 0. Bi-directional cues, such as contact\nguidance, compete when the angle is π/2. 
Then, many intermediate scenarios may happen and\nguidance stimuli submit or prevail according to other factors, among all their average concentration\nand intensity, that relates to the steepness of the gradient for taxis processes and to the degree\nof alignment for contact guidance. In particular, regarding the external environment, the average\nvalue of the directional cue (fiber density, molecule concentration, etc.) and the steepness of\nthe gradient, or the degree of fiber alignment, are fundamental parameters that can be quantified.\nWhile, for cell migration, the angle between the polarization direction and the preferential direction\nimposed by the guidance cue can be measured, as well as the displacement, the mean squared\ndisplacement and the persistence time [15]. However, in general, when cues are aligned, a simple\nadditive mechanism is not what governs multi-cue migration [34], even if it is weighted by the\naverage cue concentrations or intensities.\nIn the framework of kinetic models, in the present paper we will focus on how the environmental\nsensing of two different stimuli over a finite radius can influence the choice of the direction of\nmotion of a cell. In particular, we combine chemotaxis, a mono-directional biochemical cue, with\ncontact guidance, defining the new orientation of the cells as a result of the sensing of the two\ncues over a finite neighborhood, that gives a non-local character to the model. In particular,\nthe combination of chemotaxis and contact-guidance happens in vivo in a variety of situations,\nfor example in wound healing and in breast cancer. In wound healing, fibers guide cells towards\nthe provisional clot, whilst in breast cancer cells follow the aligned fibers at the tumor-stroma\ninterface for migrating out of the primary tumor. Chemotaxis accelerates and enhances these\nprocesses [34, 6, 49, 50]. 
Therefore, a deep understanding of multi-cue migrational responses is a\nkey step for the comprehension of both physiologic and pathologic processes, but also for building\nengineered tissues, as their structure is realized for guiding cell migration in a focused way [34].\nThere are not many experimental studies concerning chemotaxis and contact guidance, as well\nas other combinations of directional guidance cues [34]. One of the main reasons is the difficulty in\ndesigning environments for controlling multiple directional cues, in particular soluble factors and\naligned fibers and fibrous materials. For example, in one of the first works studying in vitro contact\nguidance of neutrophil leukocytes on fibrils of collagen [59], it is shown that migration is more\nefficient in the direction of alignment, instead of in the perpendicular direction; in the presence\nof chemotaxis, obtained by adding a chemoattractant, they observe that these cues cooperate or\ncompete in dependence on their relative orientation. In particular, the chemotactic response is\nlower for cells trying to cross fibers in the perpendicular direction. In [6], it is shown that alignment\nalong the fibers is greater in presence of a co-aligned chemoattractant. In [38], the authors study\nhow multiple uniformly distributed cues quantitatively regulate random cell migration. One of the\nlatest works concerning the competition between chemotaxis and contact guidance shows that less\ncontractile cells are dominated by chemotaxis, while contact guidance might dominate in more\ncontractile cells [52]. This suggests that, as amoeboid cells are less contractile, while mesenchymal\ncells are more contractile, and there may be a switching between amoeboid and mesenchymal\nmigration, perhaps there can also be a switching between the dominance of chemotaxis (amoeboid\nmigration) and contact guidance (mesenchymal migration) [60]. 
One of the most interesting 2D\nplatforms, allowing to study contact guidance and chemotaxis, was proposed in [57], in which the\nauthors demonstrated an additive effect of chemical gradients and fiber alignment by measuring\nthe persistence time; they also observed that cells were directed by fiber alignment and there was\nno effect of the chemical gradient when fibers were aligned perpendicular to it. A similar setting\nwas also used for studying the dependence of contact guidance on the cell cycle [48]. However, in\nthe case of different multi-directional cues, totally different scenarios may happen, e.g. in [51] it is\nshown that for contact guidance and electrotaxis in the cornea, electrotaxis wins when competing\nwith the direction of alignment of the fibers.\nThere is a huge variety of mathematical models concerning cell migration. They range from\nmicroscopic models (also called individuals based models), that describe migration at the cell\nlevel, up to macroscopic ones, that describe collective cell-migration at a tissue level. There are\nmany examples of individual based models regarding chemotaxis ([14, 23] and references therein)\nand migration on the ECM [11, 54, 53]. Concerning macroscopic models, first among all the\nfamous Keller and Segel model is a drift-diffusion model postulated at the macroscopic level\n[29]. Many efforts were made in order to encompass the defects of the Keller and Segel model,\nas well as for deriving it from lower scale models (see [30, 27, 40, 41] and references therein).\nBetween microscopic and macroscopic models there are mesoscopic models that are an intermediate\nrepresentative scale, as they include microscopic dynamics and describe the statistical distribution\nof the individuals. They also allow, for instance in the case of kinetic theory, to recover the\nappropriate macroscopic regime which inherits some details of the microscopic dynamics, thus\ngiving more significance to some of the parameters [40]. 
Some examples are [12, 7, 17]. The two\nmajor models for contact guidance at the mesoscopic level were proposed in [24] and [16], both\nlocal models in the physical space. Concerning multiple cues, not many models exist. In [31], the\nauthors propose a macroscopic drift-diffusion model derived from a space jump process in which\nthey include the response to multiple chemicals. A recent review for macroscopic PDEs including\nmultiple-taxis has been proposed in [32]. In [58], the authors propose one of the first models for\nboth contact guidance and chemotaxis, derived from a microscopic dynamics description. In a\nrecent work [1], the authors propose a microscopic stochastic model for studying contact guidance\nand add chemotaxis in order to study migration at the tumor-stroma interface for classifying TACS\n(tumor associated collagen signature). In [8], a kinetic model for cell-cell interactions on a fibers\nnetwork in presence of a tactic cue is considered. In [36, 37], the authors propose a non-local\nkinetic model with a double biasing cue: the first one affecting the choice of the direction and\nthe second one affecting the speed, including, through the non-locality, the sensing of macroscopic\nquantities performed by the cell, that depends on the cell size, i.e., on its maximum protrusion\nlength.\nAs already stated, in this paper we want to include chemotaxis and contact guidance as directional cues guiding cell polarization. In particular, we analyze two possible sensing strategies\nthat a cell could apply for exploring the neighborhood around, and that determine the choice\nfor the transition probability for the transport model. The cell can measure the guidance cues\nindependently, and, then, choose the new orientation using the collected information, eventually\nweighted in different ways. Otherwise, it can measure the two directional stimuli, weighting them\nequally, and assuming a conditioning of one cue on the other. 
Therefore, cell response is related to\nthe choice of the sensing strategy, and the macroscopic overall effect of the two cues would also be\naffected. Moreover, we shall consider for the first time a non-local sensing of the fibers distribution\ndefined at a mesoscopic level; this allows for many intermediate scenarios in the analysis about\nthe collaborative or competitive effect of the cues. For a better understanding, we discuss how\nthe choices made on the transition probability, together with the size of the sampling volume and\nthe characteristics of the two cues determine the macroscopic behavior. Specifically, in section\n2, we shall present the mathematical framework, while in section 3 we shall introduce the two\nclasses of models, that describe the different strategies for the sensing of a double cue, along with\nthe corresponding macroscopic limits in various regimes, depending on the cell size and on the\nvariability of the external cues. In section 4, some numerical simulations of the kinetic models will\nbe presented for investigating qualitatively various scenarios in a two-dimensional setting.\n3\n2 Mathematical framework\n2.1 The transport model\nThe cell population will be described at a mesoscopic level through the distribution density p =\np(t, x, v, vˆ) that, for every time t > 0 and position x ∈ Ω ⊆ R\nd\n, gives the statistical distribution\nof the speeds v ∈ [0, U], where U is the maximal speed a cell can achieve, and of the polarization\ndirections vˆ ∈ S\nd−1\n, being S\nd−1\nthe unit sphere boundary in R\nd\n. The velocity vector, thus, will\nbe given by v = vvˆ.\nThen, a macroscopic description for the cell population can be classically recovered through\nthe definition of moments of the distribution function p. 
In particular, we recover the cell number\ndensity ρ(t, x)\nρ(t, x) = Z\nS\nd−1\nZ U\n0\np(t, x, v, vˆ) dv dvˆ (1)\nthe momentum\nρ(t, x)U(t, x) = Z\nS\nd−1\nZ U\n0\nv p(t, x, v, vˆ) dv dvˆ (2)\nthe cell mean velocity\nU(t, x) = 1\nρ(t, x)\nZ\nS\nd−1\nZ U\n0\nv p(t, x, v, vˆ) dv dvˆ (3)\nand the energy tensor\nD(t, x) = Z\nS\nd−1\nZ U\n0\n(v − U) ⊗ (v − U) p(t, x, v, vˆ) dv dvˆ. (4)\nThe mesoscopic model consists in the transport equation for the cell distribution\n∂p\n∂t(t, x, v, vˆ) + v · ∇p(t, x, v, vˆ) = J [p](t, x, v, vˆ) (5)\nwhere the operator ∇ denotes the spatial gradient, so that the term v·∇p takes into account the free\nparticle transport. The term J [p](t, x, v, vˆ) is the turning operator that describes the scattering of\nthe microscopic velocity in direction and speed. This is related to the typical microscopic dynamics\nof the cell, that is the run and tumble [5, 2]. The run and tumble prescribes an alternation of\nruns over straight lines and re-orientations: the choice of the new direction may be random or it\nmay be biased by the presence of external factors, that may attract or repel the cell as well as\nincrease the time spent in a run. The run and tumble is classically modeled by a scattering of the\nmicroscopic velocity called velocity jump process [56], characterized by a turning frequency µ and\na transition probability T. The general form of the turning operator which implements a velocity\njump process at a kinetic level is given by\nJ [p](x, v, vˆ) =µ(x)\nZ\nS\nd−1\nZ U\n0\nh\nT(x, v, vˆ|v\n0\n, vˆ\n0\n)p(t, x, v0\n, vˆ\n0\n) − T(x, v0\n, vˆ\n0\n|v, vˆ)p(t, x, v, vˆ)\ni\ndv0\ndvˆ\n0\n(6)\nwhere we assumed that the turning frequency does not depend on the microscopic velocity. The\ntransition probability T(x, v, vˆ|v\n0\n, vˆ\n0\n) is also called turning kernel and it is a conditional probability\nsatisfying, ∀x ∈ Ω,\nZ\nS\nd−1\nZ U\n0\nT(x, v, vˆ|v\n0\n, vˆ\n0\n)dvdvˆ = 1 , ∀v\n0 ∈ [0, U], vˆ\n0 ∈ S\nd−1\n. 
(7)\nThanks to this property, the operator (6) reads\nJ [p](t, x, v, vˆ) = µ(x)\n Z\nS\nd−1\nZ U\n0\nT(x, v, vˆ|v\n0\n, vˆ\n0\n)p(t, x, v0\n, vˆ\n0\n) dv0\ndvˆ\n0 − p(t, x, v, vˆ)\n!\n.\n4\nFor our purposes, we shall assume that the transition probability only depends on the posttumbling velocity\nT(x, v, vˆ|v\n0\n, vˆ\n0\n) = T(x, v, vˆ) (8)\nas classically done in the pioneering work concerning kinetic equations for velocity jump processes\n[56, 42, 24]. This assumption, along with the assumption on the turning frequency, is due to the\nfact that we shall consider directional cues which are sensed non-locally, and, therefore, the most\nrelevant aspect will be the measured preferential direction instead than the incoming velocity. The\nlatter (8) allows to write the turning operator as\nJ [p](t, x, v, vˆ) = µ(x)\n\u0010\nρ(t, x)T(x, v, vˆ) − p(t, x, v, vˆ)\n\u0011\n. (9)\nThe mean macroscopic velocity after a tumble is given by the average of T\nUT (x) = Z\nS\nd−1\nZ U\n0\nv T(x, v, vˆ) dv dvˆ (10)\nand the diffusion tensor by the variance-covariance matrix\nDT (x) = Z\nS\nd−1\nZ U\n0\nT(x, v, vˆ)(v − UT ) ⊗ (v − UT )dv dvˆ. (11)\nArguing as in [46, 4], we can prove a linear version of the classical H-Theorem for the linear\nBoltzmann equation (5)-(9) with p\n0 = p(0, x, v, vˆ) ∈ L\n1\n(Ω × [0, U] × S\nd−1\n). In particular the\nMaxwellian\nM(x, v, vˆ) = ρ\n∞(x)T(x, v, vˆ),\nmaking the turning operator vanish, is the local asymptotic stable equilibrium of the system. As\nalready remarked by [36], this implies that T is the local asymptotic equilibrium steady state of\nthe system. Therefore UT and DT are the mean velocity and diffusion tensor of the cell population\nat equilibrium.\n2.2 Boundary conditions\nSince we are going to consider two-dimensional bounded domains without loss of cells and no\ncells coming in, we shall assume conservation of mass. 
Therefore, we will require that the chosen\nboundary condition is no-flux [47]\nZ\nS\nd−1\nZ U\n0\np(t, x, v, vˆ)vˆ · n(x) dv dvˆ = 0, ∀x ∈ ∂Ω, t > 0 , (12)\nbeing n(x) the outward normal to the boundary ∂Ω in the point x. This class of boundary\nconditions is part of the wider class of non-absorbing boundary conditions. Denoting the boundary\noperator as\nR[p](t, x, v, vˆ) = p(t, x, v0\n, vˆ\n0\n),\nthere are two important classes of kinetic boundary conditions which satisfy (12): the regular\nreflection boundary operators and the non-local (in velocity) boundary operators of diffusive type.\nWe address the reader to the works [45] and [35] for the definition of these boundary operators.\nIn the present work, we shall consider specular reflection boundary conditions\np(t, x, v0\n, vˆ\n0\n) = p\n\u0012\nt, x, v,\nvˆ − 2(vˆ · n)n\n|vˆ − 2(vˆ · n)n|\n\u0013\n, n · vˆ ≤ 0, (13)\nthat means that cells are reflected with an angle of π/2 when they hit the wall.\n5\n2.3 Macroscopic limits\nIn order to investigate the overall trend of the system, the macroscopic behavior is typically\nanalyzed. By integrating Eq. (5) with (9) on S\nd−1 × [0, U], thanks to Eq. (7), we have that\n∂tρ(t, x) + ∇ · (ρ(t, x)U(t, x)) = 0 ,\ni.e., the mass is conserved pointwise and in the entire domain, because of no-flux boundary\nconditions (after integration on Ω). If we multiply Eq. (5) with (9) by vvˆ, and we then integrate\nthe result on S\nd−1 × [0, U], we see that the momentum is not conserved\n∂tρ(t, x)U(t, x) + ∇ · (ρ(t, x)DT (t, x)) = µ(x) (ρ(t, x)UT (x) − ρ(t, x)U(t, x)).\nWe can observe that, if we multiply the transport equations by increasing orders n of power of\nv and, then, we integrate on the velocity space, we obtain a non-closed system of macroscopic\nequations, since the equations describing the evolution of n\nth moment of p contain the (n + 1)th\nmoment. 
Therefore, we need some procedures to obtain a closed evolution equation (or system of\nequations) for the macroscopic quantities. In particular, we are interested in the evolution of ρ(t, x)\nin the emerging regime of the system. Therefore, we shall consider a diffusive or a hydrodynamic\nscaling of the transport equation (5) with (9), resulting from a proper non-dimensionalization\nof the system. Diffusive and hydrodynamic limits for transport equations with velocity jump\nprocesses have been widely treated in [26, 40, 24, 36]. Formally, we introduce a small parameter\n\u000f \u001c 1 and we re-scale the spatial variable as\nξ = \u000fx, (14)\nbeing ξ the macroscopic spatial variable. According to the other characteristic quantities of the\nsystem of study, the macroscopic time scale τ will be\nτ = \u000f\n2\nt, (15)\nthat is the parabolic scaling representing a diffusion dominated phenomenon, or\nτ = \u000ft, (16)\nthat is the hyperbolic scaling that represents a drift driven phenomenon. Up to the spatial scaling\n(14), we have that the transition probability may be expanded as\nT(ξ, v, vˆ) = T0(ξ, v, vˆ) + \u000fT1(ξ, v, vˆ) + O(\u000f\n2\n).\nTherefore, the corresponding means and diffusion tensors will be given by\nUi\nT\n(ξ) = Z\nS\nd−1\nZ U\n0\nTi(ξ, v, vˆ)v dvdvˆ (17)\nand\nD\ni\nT\n(ξ) = Z\nS\nd−1\nZ U\n0\nTi(ξ, v, vˆ)(v − Ui\nT\n) ⊗ (v − Ui\nT\n)dv dvˆ . (18)\nConsidering a Hilbert expansion of the distribution function p\np = p0 + \u000fp1 + O(\u000f\n2\n), (19)\nif there is conservation of mass, we have that all the mass is in p0 [26], i.e.,\nρ0 = ρ, ρi = 0 ∀i ≥ 1 , (20)\nwhere ρi =\nZ\nS\nd−1\nZ U\n0\npi dv dvˆ. 
Furthermore, for performing the diffusive limit we shall assume\nthat Z\nS\nd−1\nZ U\n0\npi v dv dvˆ = 0 ∀i ≥ 2 [26].\n6\nThe functional solvability condition that is necessary for performing a diffusive limit (i.e., for\nchoosing τ = \u000f\n2\nt) is\nU0\nT = 0, (21)\nmeaning that the leading order of the drift vanishes, which is coherent with the fact that the time\nscale τ = \u000f\n2\nt is chosen because the phenomenon macroscopically is diffusion-driven. The diffusive\nlimit procedure prescribes to re-scale (5)-(9) with (14)-(15) and to insert (19) in the re-scaled\nequation. By comparing equal order of \u000f, we obtain the macroscopic diffusive limit, given by\n(dropping the dependencies)\n∂\n∂τ ρ + ∇ ·\nU1\nT ρ\n\u0001\n= ∇ · \u0014\n1\nµ\n∇ ·\nD\n0\nT ρ\n\u0001\n\u0015\n, (22)\nbeing\nD\n0\nT\n(ξ) = Z\nS\nd−1\nZ U\n0\nT0(ξ, v, vˆ)v ⊗ v dvdvˆ\nthe diffusion motility tensor. Equation (22) is a diffusion-advection equation, where U1\nT\nis the\ndrift velocity of first order. If (21) does not hold, a hyperbolic scaling is required, that gives\n∂\n∂τ ρ + ∇ ·\nρU0\nT\n\u0001\n= 0 . (23)\nThis is an advection equation modeling a drift driven phenomenon. We address the reader to [36]\nfor further details.\nConcerning the boundary conditions, at the macroscopic level (12) gives [47]\n\u0010\nDT ∇ρ − ρU1\nT\n\u0011\n· n = 0, on ∂Ω,\nfor the diffusive limit, whilst for the hyperbolic limit the corresponding boundary condition is\nU0\nT\n· n = 0, on ∂Ω .\n3 A mathematical model for chemotaxis on a fibers network\nIn this section, we shall introduce the transition probability modeling a decision process of a cell in\npresence of a double directional guidance cue: a fibrous ECM and a chemoattractant. In particular,\nwe shall consider amoeboid cells [60] moving by contact guidance without proteolysis: cells hit\nthe fiber and then move along the direction of the fiber itself. 
It has been shown experimentally, for example in the case of glioma cancer cells [28], that randomly disposed fibers imply isotropic diffusion of cells, while aligned fibers cause anisotropic diffusion of cells along the preferential direction of the fibers themselves. The first transport model for contact guidance was proposed in [24], further studied and developed in [43, 8, 9], and applied to the study of glioma in [44, 20, 19, 13, 18]. The model proposed in [24] prescribes a distribution of fibers on the space of directions, given by the unit sphere in $\mathbb{R}^d$,
$$q = q(x, \hat{v}), \qquad x \in \Omega, \ \hat{v} \in S^{d-1}, \tag{24}$$
that satisfies

Q1: $q(x, \hat{v}) > 0, \quad \forall x \in \Omega, \ \hat{v} \in S^{d-1}$;

Q2: $\int_{S^{d-1}} q(x, \hat{v})\, d\hat{v} = 1, \quad \forall x \in \Omega$;

Q3: $q(x, \hat{v}) = q(x, -\hat{v}), \quad \forall x \in \Omega, \ \hat{v} \in S^{d-1}$,

where the last condition means that we are considering a non-polarized network of fibers, so that cells are able to travel along every direction in both senses. Since $q(x, \hat{v})$ is a probability density, we can define the mean direction of the fibers
$$E_q(x) = \int_{S^{d-1}} q(x, \hat{v})\, \hat{v}\, d\hat{v}, \tag{25}$$
and the diffusion tensor of the fibers, given by the variance-covariance matrix of $q$,
$$D_q(x) = \int_{S^{d-1}} q(x, \hat{v})\, (\hat{v} - E_q) \otimes (\hat{v} - E_q)\, d\hat{v}. \tag{26}$$
As we consider a non-polarized fiber network, we have
$$E_q(x) = 0, \tag{27}$$
meaning that there is no mean direction in the dynamics. The tensor (26) is symmetric and positive definite when $q$ is a regular probability distribution and, thus, it is diagonalizable. Each eigenvalue represents the diffusivity in the direction of the corresponding eigenvector: if the eigenvalues are equal, there is isotropic diffusion, while, if they are different, there is a preferential direction of motion, i.e., anisotropy. 
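Properties (25)-(27) are easy to check numerically. The following sketch (a minimal 2D case, $d = 2$, with a hypothetical symmetric fiber density chosen only for illustration; it is not the distribution used later in the paper) approximates $E_q$ and $D_q$ by quadrature and verifies that a non-polarized $q$ gives $E_q = 0$ while its diffusion tensor has unequal eigenvalues, i.e., anisotropy:

```python
import numpy as np

# Minimal 2D sketch (d = 2): directions v = (cos t, sin t) on S^1.
# We pick a symmetric two-peaked density q(t) ~ exp(k*cos(2*(t - t0)))
# (a hypothetical choice satisfying Q1-Q3) and compute eqs. (25)-(26).
k, t0 = 2.0, np.pi / 2                  # assumed alignment strength and angle
N = 4000
dt = 2 * np.pi / N
t = np.arange(N) * dt
w = np.exp(k * np.cos(2 * (t - t0)))
q = w / (w.sum() * dt)                  # normalization Q2: integral of q = 1

vhat = np.stack([np.cos(t), np.sin(t)])         # shape (2, N)
E_q = (q * vhat).sum(axis=1) * dt               # mean direction, eq. (25)
D_q = (q * vhat) @ vhat.T * dt                  # eq. (26), using E_q = 0

print(E_q)                                      # ~ (0, 0): eq. (27)
print(np.linalg.eigvalsh(D_q))                  # unequal eigenvalues: anisotropy
```

The eigenvalues of `D_q` sum to one (the trace of $\int q\, \hat v \otimes \hat v\, d\hat v$), and the larger one corresponds to the eigenvector along the mean fiber angle, illustrating the eigenvalue interpretation given above.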
Therefore, the model introduced in [24], as shown in [43], allows one to reproduce isotropic/anisotropic diffusion on a non-polarized fiber network.
Concerning chemotaxis, we shall consider a chemoattractant in the region $\Omega$ defined by a strictly positive function
$$S = S(x) : \Omega \longmapsto \mathbb{R}_+. \tag{28}$$
We consider that the sensing performed by the cells is non-local, as they may extend their protrusions, through which they sense the environment, up to several cell diameters [3]. The maximum length $R$ of a protrusion is called the sensing radius; it was first introduced in [40] for modeling a non-local gradient of a chemical and then used in a number of works (see [10] for a review and references therein) for describing the sensing of macroscopic quantities. In particular, in [36] and, later, in [37], the authors propose a double bias model, in which two cues are sensed non-locally and affect cell polarization and speed. In the present work we shall drop the sensing of a cue affecting the speed, which will be unbiased, and we will extend the model proposed in [36] to a double sensing of cues affecting the polarization of the cell.
Therefore, in the model both $S$ and $q$ will be sensed non-locally by a cell that, starting from its position $x$, extends its protrusions in every direction $\hat{v} \in S^{d-1}$ up to the distance $R$ given by the sensing radius. In particular, assuming a non-local sensing of the fiber network allows us to reproduce a wider range of migration strategies, which a cell can perform in order to cleverly reach the chemoattractant, with respect to a local sensing. 
Therefore, we shall consider the quantities
$$S(x + \lambda\hat{v}), \quad q(x + \lambda\hat{v}, \hat{v}), \qquad \forall x \in \Omega, \ \forall \hat{v} \in S^{d-1}, \ \lambda \le R.$$
Of course, next to the border of the domain $\Omega$, we shall always consider $\lambda$ such that $x + \lambda\hat{v} \in \Omega$.
In order to analyze qualitatively the impact of the non-locality at the macroscopic level, we study, as previously done in [36, 37], the impact of the directional cues $S$ and $q$ with respect to the size of the cell, which is related to its sensing radius $R$. Thus, we introduce the characteristic length of variation of $S$ as
$$l_S := \frac{1}{\displaystyle\max_{x \in \Omega} \frac{|\nabla S|}{S}}. \tag{29}$$
It allows us to approximate $S(x + \lambda\hat{v})$ with a positive quantity
$$S(x + \lambda\hat{v}) \sim S(x) + \lambda \nabla S \cdot \hat{v} \ge 0 \qquad \forall \lambda \le R \ \text{ if } R < l_S, \tag{30}$$
where we neglected higher-order terms in $\lambda$. Besides the above-defined characteristic length of variation of the chemoattractant, $l_S$, we define an analogous quantity for the fiber distribution. We choose
$$l_q := \frac{1}{\displaystyle\max_{x \in \Omega} \max_{\hat{v} \in S^{d-1}} \frac{|\nabla q \cdot \hat{v}|}{q}}. \tag{31}$$
In this case, we can approximate $q(x + \lambda\hat{v}, \hat{v})$ with a positive quantity
$$q(x + \lambda\hat{v}, \hat{v}) \sim q(x, \hat{v}) + \lambda \nabla q \cdot \hat{v} \ge 0 \qquad \forall \lambda < R \ \text{ if } R < l_q. \tag{32}$$
In particular, this definition of $l_q$ takes into account the variation of directionality of the fibers in space, which is what actually influences the cell orientation, more than the spatial variation of the density of the extracellular matrix. We analyze the possible scenarios depending on the relation between $R$, $l_S$ and $l_q$.
In analogy with [36], let us now introduce the parameters
$$\eta_q := \frac{R}{l_q} \tag{33}$$
and
$$\eta_S := \frac{R}{l_S}, \tag{34}$$
which quantify the sensing capability of the cell with respect to the characteristic lengths of variation of the sensed guidance cues $q$ and $S$. 
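As a concrete illustration of definition (29), the characteristic length of a Gaussian chemoattractant can be estimated on a grid. The sketch below uses parameter values of the same order as those appearing in the numerical tests of Section 4 (an assumption, chosen only to make the numbers concrete) and compares the grid estimate with the analytic value $l_S = \sigma_S^2 / r_{\max}$, where $r_{\max}$ is the largest distance from the Gaussian's center within $\Omega$:

```python
import numpy as np

# Sketch: grid estimate of the characteristic length l_S of eq. (29) for a
# Gaussian chemoattractant centered at (xs, ys) (assumed parameter values).
xs, ys, sig2, mS = 4.0, 4.0, 0.25, 10.0
x = np.linspace(0.0, 5.0, 1001)
X, Y = np.meshgrid(x, x, indexing="ij")
S = mS / np.sqrt(2 * np.pi * sig2) * np.exp(
    -((X - xs) ** 2 + (Y - ys) ** 2) / (2 * sig2)
)

gx, gy = np.gradient(S, x, x)
rel_grad = np.sqrt(gx ** 2 + gy ** 2) / S     # |grad S| / S
l_S = 1.0 / rel_grad.max()                    # eq. (29)
eta_S = 0.5 / l_S                             # eta_S = R / l_S with R = 0.5, eq. (34)
print(l_S, eta_S)

# Analytic check: |grad S|/S = r / sigma^2, maximal at the corner of the
# domain farthest from (xs, ys).
r_max = np.sqrt(xs ** 2 + ys ** 2)
print(sig2 / r_max)                           # close to l_S
```

For this choice, $l_S$ is much smaller than a sensing radius $R = 0.5$, so $\eta_S \gg 1$: a steeply varying chemoattractant pushes the system toward the drift-driven regimes discussed next.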
In particular, $\eta_i < 1$, $i = q, S$, means that the sensing radius is smaller than the characteristic length of variation of $q$ ($S$, respectively), and the idea is that a single instantaneous sensing of the cell is not capable of catching the total spatial variability of $q$ ($S$, respectively), while if $\eta_i > 1$, $i = q, S$, the sensing radius is large enough to capture the spatial variability of $q$ ($S$, respectively). If we consider the two cues separately, in the first case we expect that the sensing of $q$ ($S$, respectively) induces a diffusive behavior, while in the second scenario the overall behavior induced by $q$ ($S$, respectively) is drift-driven.
As we are considering the two guidance cues simultaneously affecting cell polarization, we now take into account four limit cases:

i) $\eta_q, \eta_S \gg 1$;

ii) $\eta_q, \eta_S \ll 1$;

iii) $\eta_S \ll 1$, $\eta_q \gg 1$;

iv) $\eta_S \gg 1$, $\eta_q \ll 1$.

In case i), a Taylor expansion cannot be used, since there is no guarantee that the first-order approximations are positive; the same holds in cases iii) and iv) for $q$ and $S$, respectively.
In order to quantify the relative contribution of chemotaxis to contact guidance, we may introduce the parameter
$$\eta = \frac{\eta_q}{\eta_S} \tag{35}$$
that is larger than 1 if contact guidance prevails, whilst it is smaller than 1 if chemotaxis is stronger. Due to (33) and (34), we have that, despite its definition, $\eta$ does not depend on the size and sensing capability of the cell, as $\eta = \eta_q/\eta_S = l_S/l_q$. In particular, if $l_S$ is larger than $l_q$, i.e. $\eta > 1$, the gradient of $q$ is steeper than that of $S$, thus enhancing a stronger effect of contact guidance on the dynamics. We may also observe that in case iii) we always have $\eta > 1$, while in case iv) we always have $\eta < 1$, i.e. 
contact guidance is weaker than chemotaxis.
We shall propose two different transition probabilities describing two different sensing strategies: in the first model the sensings of $q$ and $S$ are independent, while in the second model a unique sensing is performed. In the first model, we shall introduce a transition probability that is the product of two different independent sensings:
$$T[q, S](x, v, \hat{v}) = c(x) \int_{\mathbb{R}_+} \gamma_S(\lambda)\, S(x + \lambda\hat{v})\, d\lambda \int_{\mathbb{R}_+} \gamma_q(\lambda)\, q(x + \lambda\hat{v}, \hat{v})\, d\lambda\; \psi(v). \tag{36}$$
In this case the cell located in position $x$ measures along the direction $\hat{v}$ the field $S(x + \lambda\hat{v})$, weighted by $\gamma_S$, and, independently, the quantity $q(x + \lambda\hat{v}, \hat{v})$, weighted by $\gamma_q$. The sensing functions $\gamma_S$ and $\gamma_q$ have compact support in $[0, R]$; they may be Dirac deltas centered in $R$, if the cell only measures the guidance cues on its membrane (only at $x + R\hat{v}$ for every $\hat{v}$), or Heaviside functions, if the cell measures and gives the same weight to $q$ and $S$ from $x$ to $x + R\hat{v}$ in every direction. Formally, the transition probability might be seen as the product of the independent probabilities of $q$ and $S$, i.e. $T[q, S] = \hat{T}[q]\, \hat{T}[S]$.
The second model prescribes a simultaneous averaging of the guidance cues $S$ and $q$, i.e.,
$$T[q, S](x, v, \hat{v}) = c(x) \int_{\mathbb{R}_+} \gamma(\lambda)\, S(x + \lambda\hat{v})\, q(x + \lambda\hat{v}, \hat{v})\, d\lambda\; \psi(v). \tag{37}$$
This transition probability describes a cell in position $x$ that measures in the direction $\hat{v}$ the two quantities $S(x + \lambda\hat{v})$ and $q(x + \lambda\hat{v}, \hat{v})$, weighting both with the sensing function $\gamma$. Formally, as the two sensings are not independent and, therefore, not factorized, we have a conditioning of $S$ given $q$ and vice versa, i.e., $T[q, S] = \tilde{T}[S|q]\, \tilde{T}[q] = \tilde{T}[q|S]\, \tilde{T}[S]$.
In (36) and (37), $c(x)$ is a normalization coefficient. 
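The structural difference between (36) and (37) is that the first multiplies two separate averages while the second averages the product. A small numerical sketch (with hypothetical one-dimensional profiles of $S$ and $q$ along a fixed direction $\hat{v}$, chosen only for illustration) shows that with Heaviside kernels the two directional weights differ, whereas with $\gamma = \delta(\lambda - R)$ both reduce to the pointwise product $S(x + R\hat{v})\, q(x + R\hat{v}, \hat{v})$:

```python
import numpy as np

# Sketch: directional weights (before normalization by c(x)) of the
# independent sensing (36) and the dependent sensing (37) along one
# fixed direction, with Heaviside kernels H(R - lambda) on [0, R].
R = 0.5
lam = np.linspace(0.0, R, 2001)
S = 1.0 + lam              # hypothetical chemoattractant profile along v
q = 0.2 + 0.5 * lam        # hypothetical fiber density profile along v

w_ind = np.trapz(S, lam) * np.trapz(q, lam)   # product of averages, as in (36)
w_dep = np.trapz(S * q, lam)                  # average of the product, as in (37)
print(w_ind, w_dep)        # differ: the two cues are correlated in (37)
```

With $\gamma_q = \gamma_S = \gamma = \delta(\lambda - R)$, instead, both integrals collapse to $S(x + R\hat{v})\, q(x + R\hat{v}, \hat{v})$ and the two transport models coincide, as summarized later in Section 3.2.1.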
Moreover, the probability density $\psi$ is the distribution of the speeds on the interval $[0, U]$ and satisfies
$$\int_0^U \psi(v)\, dv = 1.$$
We introduce its mean speed
$$\bar{U} = \int_0^U v\, \psi(v)\, dv \tag{38}$$
and the second moment
$$D = \int_0^U v^2\, \psi(v)\, dv, \tag{39}$$
such that the variance of $\psi$ is given by $\sigma^2_\psi = \frac{1}{2}(D - \bar{U}^2)$.
We shall refer to the transport model (5)-(9) with (36) as the non-local independent sensing model, in which the cell averages the two cues independently according to two different sensing functions $\gamma_q, \gamma_S$. On the other hand, the transport model (5)-(9) with (37) is defined as the non-local dependent sensing model, describing cells that sense the two cues at the same time and average them with a unique sensing kernel $\gamma$. In the next sections we shall analyze the macroscopic limits for the two models in the scenarios i)-iv) and we shall compare the two models.

3.1 Amoeboid motion and chemotaxis: non-local independent sensing

We first consider the non-local independent sensing case (5)-(9) with (36). We recall the expression of the transition probability
$$T[q, S](x, v, \hat{v}) = c(x) \int_{\mathbb{R}_+} \gamma_S(\lambda)\, S(x + \lambda\hat{v})\, d\lambda \int_{\mathbb{R}_+} \gamma_q(\lambda)\, q(x + \lambda\hat{v}, \hat{v})\, d\lambda\; \psi(v).$$
The average of $T$, which will be the equilibrium velocity of the cell population, is given by
$$U_T(x) = c(x)\, \bar{U} \int_{S^{d-1}} \hat{v} \left( \int_{\mathbb{R}_+} \gamma_S(\lambda)\, S(x + \lambda\hat{v})\, d\lambda \int_{\mathbb{R}_+} \gamma_q(\lambda)\, q(x + \lambda\hat{v}, \hat{v})\, d\lambda \right) d\hat{v}. \tag{40}$$

Case i) In this case, we shall choose
$$\epsilon = \min\left\{ \frac{1}{\eta_q}, \frac{1}{\eta_S} \right\}.$$
As a consequence of the fact that $T$ cannot be expanded in powers of $\epsilon$ after re-scaling with (14), we have that $U^0_T = U_T$ given by (40). Therefore, we have to perform a hyperbolic scaling that leads to the following macroscopic equation for the macroscopic cell density:
$$\frac{\partial}{\partial \tau}\rho(\tau, \xi) + \nabla \cdot \left( \rho(\tau, \xi)\, U_T(\xi) \right) = 0, \tag{41}$$
with $U_T(\xi)$ given by the re-scaling of (40) with (14).

Case ii) In this case, we can expand both $S(x + \lambda\hat{v})$ and $q(x + \lambda\hat{v}, \hat{v})$ and consider the approximations (30) and (32) for $\lambda < \min\{l_q, l_S\}$. 
Therefore, we approximate the transition probability by substituting (30) and (32) in (36), and, thus, we obtain the following approximation for the turning kernel $T[q, S]$, which reads
$$T[q, S](x, v, \hat{v}) = c(x) \left[ \Gamma^S_0 \Gamma^q_0\, S(x)\, q(x, \hat{v}) + \Gamma^S_0 \Gamma^q_1\, S(x)\, \nabla q \cdot \hat{v} + \Gamma^S_1 \Gamma^q_0\, q(x, \hat{v})\, \nabla S \cdot \hat{v} \right] \psi(v), \tag{42}$$
where we neglected higher-order terms in $\lambda$. In the latter,
$$c(x) = \frac{1}{S(x)\, \Gamma^S_0 \Gamma^q_0}$$
and
$$\Gamma^S_i := \int_{\mathbb{R}_+} \lambda^i\, \gamma_S(\lambda)\, d\lambda, \qquad \Gamma^q_i := \int_{\mathbb{R}_+} \lambda^i\, \gamma_q(\lambda)\, d\lambda, \qquad i = 0, 1.$$
The quantities $\Gamma^q_0, \Gamma^S_0$ are the weighted (by $\gamma_q, \gamma_S$) measures of the sensed linear tracts in every direction, whilst $\Gamma^q_1, \Gamma^S_1$ are the first moments of $\gamma_q, \gamma_S$ on $[0, R]$.
We can, then, introduce the small parameter
$$\epsilon = \min\{\eta_q, \eta_S\} \tag{43}$$
and re-scale the space variable as $\xi = \epsilon x$, getting
$$T_0[q, S](\xi, v, \hat{v}) = q(\xi, \hat{v})\, \psi(v), \tag{44}$$
meaning that the equilibrium is determined by the fiber distribution, and
$$T_1[q, S](\xi, v, \hat{v}) = \left[ \Gamma^q\, \nabla q \cdot \hat{v} + \Gamma^S\, q(\xi, \hat{v})\, \frac{\nabla S}{S(\xi)} \cdot \hat{v} \right] \psi(v),$$
where
$$\Gamma^S := \frac{\Gamma^S_1}{\Gamma^S_0}, \qquad \Gamma^q := \frac{\Gamma^q_1}{\Gamma^q_0}.$$
Because of (27) and (44), we have $U^0_T(\xi) = 0$, meaning that we are in a diffusive regime, and the diffusive limit leads to the advection-diffusion equation (22). The explicit form of the zero-order macroscopic diffusion tensor is
$$D^0_T(\xi) = D \int_{S^{d-1}} q(\xi, \hat{v})\, \hat{v} \otimes \hat{v}\, d\hat{v} = D\, D_q(\xi), \tag{45}$$
and of the macroscopic first-order velocity is
$$U^1_T(\xi) = \bar{U} \int_{S^{d-1}} \left( \Gamma^q\, \nabla q \cdot \hat{v} + \Gamma^S\, \frac{\nabla S}{S(\xi)} \cdot \hat{v}\; q(\xi, \hat{v}) \right) \hat{v}\, d\hat{v} = \bar{U}\, \Gamma^q \int_{S^{d-1}} (\nabla q \cdot \hat{v})\, \hat{v}\, d\hat{v} + \bar{U}\, \Gamma^S \int_{S^{d-1}} \hat{v} \otimes \hat{v}\; q(\xi, \hat{v})\, d\hat{v}\; \frac{\nabla S}{S} = \bar{U} \left[ \Gamma^q\, \nabla \cdot D_q + \Gamma^S\, D_q \frac{\nabla S}{S} \right]. \tag{46}$$
Therefore, the diffusion-advection equation (22) reads (dropping the dependencies)
$$\frac{\partial}{\partial \tau}\rho + \nabla \cdot \left[ \left( \chi^S D_q \nabla S + \chi^q\, \nabla \cdot D_q \right) \rho \right] = \nabla \cdot \left[ \frac{1}{\mu}\, \nabla \cdot \left( D\, D_q\, \rho \right) \right], \tag{47}$$
where
$$\chi^S(\xi) := \frac{\bar{U}\, \Gamma^S}{S(\xi)}, \qquad \chi^q := \bar{U}\, \Gamma^q \tag{48}$$
are the sensitivities. 
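The ratios $\Gamma^S = \Gamma^S_1/\Gamma^S_0$ and $\Gamma^q = \Gamma^q_1/\Gamma^q_0$ entering the sensitivities (48) are easy to evaluate for the two kernels used throughout the paper; a quick sketch (assuming $R = 0.5$, the value used in the numerical tests):

```python
import numpy as np

# Sketch: moments Gamma_0, Gamma_1 and the ratio Gamma = Gamma_1/Gamma_0
# entering the sensitivities (48), for a Heaviside kernel H(R - lambda).
R = 0.5
lam = np.linspace(0.0, R, 2001)
gamma = np.ones_like(lam)              # H(R - lambda) restricted to [0, R]

G0 = np.trapz(gamma, lam)              # Gamma_0 = R
G1 = np.trapz(lam * gamma, lam)        # Gamma_1 = R^2 / 2
print(G0, G1, G1 / G0)                 # Gamma = R/2 for the Heaviside kernel

# For a Dirac delta gamma(lambda) = delta(lambda - R): Gamma_0 = 1,
# Gamma_1 = R, hence Gamma = R, twice the Heaviside value: membrane-only
# sensing weighs the cue at the full distance R instead of averaging
# it over the whole tract [0, R].
```

This factor of two between the Dirac and Heaviside sensitivities is consistent with the slower transient dynamics reported for Heaviside kernels in the simulations of Section 4.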
The diffusion, represented by the motility tensor of the cells (45), only depends on the fiber distribution, while the advective term has two contributions, differently weighted by the sensitivities (48). We remark that, in this regime, we obtain the same macroscopic behavior postulated by Keller and Segel [29], with the logarithmic chemotactic sensitivity $\chi^S$ given in (48). The term $D_q \nabla S$ depends on both the fiber distribution and the chemotactic field; it never vanishes if $\nabla S$ is not the null vector, since it may be proved that $D_q$ is invertible. In the case of randomly disposed fibers, corresponding to the isotropic case, i.e., when $D_q$ is proportional to the identity matrix, $D_q \nabla S$ is parallel to $\nabla S$, which, thus, represents the anisotropy direction. On the other hand, when $D_q$ is anisotropic, if $\nabla S$ is not parallel to the eigenvector corresponding to the highest eigenvalue of $D_q$, then the migration does not follow the dominant direction of the fibers, but rather its projection on $\nabla S$. Moreover, the second contribution to the drift term, i.e., $\nabla \cdot D_q$, is a measure of the velocity field induced by the spatial variation of the distribution of the fiber directions, which determines the microscopic velocities of the cells. This term vanishes if the fiber distribution is homogeneous in space. Therefore, if $q$ is homogeneous in space, even in the case of competing cues, i.e., $E_q \perp \nabla S$, in general the advective term $U^1_T$ does not vanish, while in the case of cooperating cues, i.e., $\nabla S$ an eigenvector of $D_q$ with eigenvalue $D_{\nabla S}$, migration is in the direction $\nabla S$ with a kinetic factor $\chi^S D_{\nabla S}$. 
In intermediate scenarios, migration happens along the projection $D_q \nabla S$, but, if $q$ is not homogeneous, the dynamics is more complex and, even in the case of cooperation, we cannot conclude anything about additivity effects.

Case iii) In this case, we can only expand in Taylor series the chemoattractant, as in (30), and the turning kernel (36) may be approximated as
$$T[q, S](x, v, \hat{v}) = c(x) \left[ S(x)\, \Gamma^S_0 \int_{\mathbb{R}_+} \gamma_q(\lambda)\, q(x + \lambda\hat{v}, \hat{v})\, d\lambda + \Gamma^S_1\, (\nabla S \cdot \hat{v}) \int_{\mathbb{R}_+} \gamma_q(\lambda)\, q(x + \lambda\hat{v}, \hat{v})\, d\lambda \right] \psi(v), \tag{49}$$
where we neglected higher-order terms in $\lambda$. Here, the normalization coefficient reduces to
$$c(x) = \frac{1}{\Gamma^S_0 \Gamma^q_0\, S(x)}.$$
In this case we may choose
$$\epsilon = \min\left\{ \frac{1}{\eta_q}, \eta_S \right\},$$
and, re-scaling the space variable as in (14), we get
$$T_0[q, S](\xi, v, \hat{v}) = \frac{1}{\Gamma^q_0} \int_{\mathbb{R}_+} \gamma_q(\lambda)\, q(\xi + \lambda\hat{v}, \hat{v})\, d\lambda\; \psi(v) \tag{50}$$
and
$$T_1[q, S](\xi, v, \hat{v}) = \frac{\Gamma^S}{\Gamma^q_0} \left( \frac{\nabla S}{S} \cdot \hat{v} \right) \int_{\mathbb{R}_+} \gamma_q(\lambda)\, q(\xi + \lambda\hat{v}, \hat{v})\, d\lambda\; \psi(v).$$
Equation (50) indicates that the equilibrium distribution is a non-local average of the fiber distribution according to the sensing kernel $\gamma_q$, normalized by the measure $\Gamma^q_0$ of the sensed linear tract over the direction $\hat{v}$. Its average is
$$U^0_T(\xi) = \frac{\bar{U}}{\Gamma^q_0} \int_{\mathbb{R}_+} \gamma_q(\lambda)\, E_q(\xi + \lambda\hat{v})\, d\lambda,$$
which vanishes since $\xi + \lambda\hat{v} \in \Omega$ and (27) holds true. 
Therefore, we perform the diffusive limit that leads to (22) with
$$D^0_T(\xi) = D \int_{S^{d-1}} \frac{1}{\Gamma^q_0} \int_{\mathbb{R}_+} \gamma_q(\lambda)\, q(\xi + \lambda\hat{v}, \hat{v})\, d\lambda\; \hat{v} \otimes \hat{v}\, d\hat{v}.$$
Let us now define
$$D^\lambda_q(\xi) = \int_{S^{d-1}} q(\xi + \lambda\hat{v}, \hat{v})\, \hat{v} \otimes \hat{v}\, d\hat{v}, \tag{51}$$
which, for each point $\xi$, is the diffusion tensor of the fibers on a circle of radius $\lambda$, and
$$\bar{D}^0_q = \frac{1}{\Gamma^q_0} \int_{\mathbb{R}_+} \gamma_q(\lambda)\, D^\lambda_q\, d\lambda, \tag{52}$$
which is a weighted diffusion tensor of the fibers in the whole neighborhood sensed by the cells, so that
$$D^0_T(\xi) = D\, \bar{D}^0_q(\xi) \tag{53}$$
and
$$U^1_T(\xi) = \bar{U} c(\xi) \int_{S^{d-1}} \left( \Gamma^S_1\, (\nabla S \cdot \hat{v}) \int_{\mathbb{R}_+} \gamma_q(\lambda)\, q(\xi + \lambda\hat{v}, \hat{v})\, d\lambda \right) \hat{v}\, d\hat{v} = \bar{U} c(\xi)\, \Gamma^S_1 \int_{\mathbb{R}_+} \gamma_q(\lambda) \int_{S^{d-1}} \hat{v} \otimes \hat{v}\; q(\xi + \lambda\hat{v}, \hat{v})\, d\hat{v}\, d\lambda\; \nabla S = \bar{U}\, \Gamma^S\, \bar{D}^0_q(\xi)\, \frac{\nabla S}{S(\xi)} = \chi^S(\xi)\, \bar{D}^0_q(\xi)\, \nabla S. \tag{54}$$
We have defined the chemotactic sensitivity as
$$\chi^S(\xi) := \frac{\bar{U}\, \Gamma^S}{S(\xi)},$$
which is a function of the chemical alone, as it is the cue inducing a diffusive behavior. Here, the advection velocity is related to a non-local average of the diffusion tensor of the fibers, $\bar{D}^0_q$, projected on $\nabla S$, and it cannot be decomposed into two contributions, because of the large size of the cell with respect to the spatial variability of the fiber distribution. Therefore, in this case the additivity effect of the two cues is not evident and the possible scenarios are many more.

Remark. If we consider $\gamma_q = \delta(\lambda - 0)$, we obtain a local sensing of fibers. Without chemotaxis we would have the classical model for contact guidance [24], which gives rise, at the macroscopic level, to a fully anisotropic diffusion equation. The presence of a non-local chemoattractant, even when $R < l_S$, gives rise to a drift correction term proportional to $D_q \nabla S$.

Case iv) The last case allows only for the Taylor expansion of the distribution function $q$, as in (32). 
Therefore, the turning kernel may be approximated as
$$T[q, S](x, v, \hat{v}) = \left[ c_0(x)\, \Gamma^q_0\, q(x, \hat{v}) \int_{\mathbb{R}_+} \gamma_S(\lambda)\, S(x + \lambda\hat{v})\, d\lambda + c_1(x)\, \Gamma^q_1\, (\nabla q \cdot \hat{v}) \int_{\mathbb{R}_+} \gamma_S(\lambda)\, S(x + \lambda\hat{v})\, d\lambda \right] \psi(v), \tag{55}$$
where
$$c_0(x)^{-1} := 2 \int_{S^{d-1}} \Gamma^q_0\, q(x, \hat{v}) \int_{\mathbb{R}_+} \gamma_S(\lambda)\, S(x + \lambda\hat{v})\, d\lambda\, d\hat{v}$$
and
$$c_1(x)^{-1} := 2 \int_{S^{d-1}} \Gamma^q_1\, (\nabla q \cdot \hat{v}) \int_{\mathbb{R}_+} \gamma_S(\lambda)\, S(x + \lambda\hat{v})\, d\lambda\, d\hat{v},$$
both different from zero. In this case we may choose
$$\epsilon = \min\left\{ \frac{1}{\eta_S}, \eta_q \right\}$$
and, by re-scaling (55) with (14), we get $T[q, S] = T_0[q, S]$. Hence $U^0_T(\xi)$ does not vanish in $\Omega$, as it is given by
$$U^0_T(\xi) = \bar{U}\, \Gamma^q_0\, c_0(\xi) \int_{S^{d-1}} \hat{v}\; q(\xi, \hat{v}) \int_{\mathbb{R}_+} \gamma_S(\lambda)\, S(\xi + \lambda\hat{v})\, d\lambda\, d\hat{v} + \bar{U}\, \Gamma^q_1\, c_1(\xi) \int_{S^{d-1}} \hat{v} \otimes \hat{v}\; \nabla q \int_{\mathbb{R}_+} \gamma_S(\lambda)\, S(\xi + \lambda\hat{v})\, d\lambda\, d\hat{v}, \tag{56}$$
and the macroscopic equation is given by (23). The mean velocity (56) is a linear combination of a non-local measure of the chemoattractant $S$ over the fibers network and a non-local measure of $S$ weighted by the directional average of the spatial variability of the fiber direction.

Remark. If we consider a local sensing for the chemoattractant, i.e. $\gamma_S = \delta(\lambda - 0)$, we obtain a macroscopic advection-diffusion equation, where the macroscopic velocity is induced by the spatial variation of the distribution of fiber directions, $\nabla \cdot D_q$, and the measure of $S$ does not affect the choice of the direction. In this case, if $\nabla q$ vanishes, the model reduces to a fully anisotropic diffusion equation [24].

3.2 Amoeboid motion and chemotaxis: non-local dependent sensing

Concerning the non-local dependent sensing case (5)-(9) with (37), we recall the expression of the transition probability
$$T[q, S](x, v, \hat{v}) = c(x) \int_{\mathbb{R}_+} \gamma(\lambda)\, S(x + \lambda\hat{v})\, q(x + \lambda\hat{v}, \hat{v})\, d\lambda\; \psi(v),$$
with
$$c(x)^{-1} := \int_{S^{d-1}} \int_{\mathbb{R}_+} \gamma(\lambda)\, S(x + \lambda\hat{v})\, q(x + \lambda\hat{v}, \hat{v})\, d\lambda\, d\hat{v}.$$
The macroscopic velocity is here given by
$$U_T(x) = c(x)\, \bar{U} \int_{S^{d-1}} \hat{v} \int_{\mathbb{R}_+} \gamma(\lambda)\, S(x + \lambda\hat{v})\, q(x + \lambda\hat{v}, \hat{v})\, d\lambda\, d\hat{v}. \tag{57}$$
The macroscopic limits can be performed as in the previous section, and the choice of the parameter $\epsilon$ will be the same for the cases i)-iv), since it does not depend on the kind of model (independent or dependent sensing), but only on $\eta_S$ and $\eta_q$.

Case i) In this case we cannot consider the expansions (32) and (30), and, thus, we cannot expand the turning kernel, whose non-vanishing average is given by (57). Therefore, we perform a hyperbolic limit leading to (23) with macroscopic velocity (57).

Case ii) When, instead, the maximum sensing radius $R$ is smaller than both characteristic lengths, we may consider the positive expansions (32) and (30) and substitute them in (37). Neglecting the higher-order terms in $\lambda$, we get the approximation
$$T[q, S](x, v, \hat{v}) = c(x) \left[ S(x)\, \Gamma_0\, q(x, \hat{v}) + S(x)\, \Gamma_1\, \nabla q \cdot \hat{v} + \Gamma_1\, q(x, \hat{v})\, \nabla S \cdot \hat{v} \right] \psi(v) \tag{58}$$
with
$$c(x) = \frac{1}{S(x)\, \Gamma_0}$$
and
$$\Gamma_i := \int_0^R \lambda^i\, \gamma(\lambda)\, d\lambda, \qquad i = 0, 1.$$
Re-scaling the space variable as in (14), we find
$$T_0[q, S](\xi, v, \hat{v}) = q(\xi, \hat{v})\, \psi(v)$$
and
$$T_1[q, S](\xi, v, \hat{v}) = \Gamma \left[ \nabla q \cdot \hat{v} + q(\xi, \hat{v})\, \frac{\nabla S}{S} \cdot \hat{v} \right] \psi(v)$$
with
$$\Gamma := \frac{\Gamma_1}{\Gamma_0}.$$
Therefore, $U^0_T(\xi) = 0$ because of (27), and we can perform a diffusive scaling that leads to the zero-order macroscopic diffusion tensor
$$D^0_T(\xi) = D\, D_q(\xi) \tag{59}$$
and to the macroscopic first-order velocity
$$U^1_T(\xi) = \bar{U}\, \Gamma\, \nabla \cdot D_q(\xi) + \bar{U}\, \Gamma\, D_q(\xi)\, \frac{\nabla S}{S}. \tag{60}$$
The macroscopic advection-diffusion equation (22) now reads (dropping the dependencies)
$$\frac{\partial}{\partial \tau}\rho + \nabla \cdot \left[ \chi \left( \nabla \cdot D_q + D_q \frac{\nabla S}{S} \right) \rho \right] = \nabla \cdot \left[ \frac{1}{\mu}\, \nabla \cdot \left( D\, D_q\, \rho \right) \right], \tag{61}$$
where
$$\chi := \bar{U}\, \Gamma.$$
Considerations similar to those for case ii) of the non-local independent sensing model apply, except that there is a unique sensitivity $\chi$ that weights the two contributions to the advection term (60) equally.

Case iii) In this case, we expand only the chemoattractant $S(x + \lambda\hat{v})$, as in (30), and the turning kernel (37) can be approximated as
$$T[q, S](x, v, \hat{v}) = c(x) \left[ S(x) \int_{\mathbb{R}_+} \gamma(\lambda)\, q(x + \lambda\hat{v}, \hat{v})\, d\lambda + (\nabla S \cdot \hat{v}) \int_{\mathbb{R}_+} \lambda\, \gamma(\lambda)\, q(x + \lambda\hat{v}, \hat{v})\, d\lambda \right] \psi(v) \tag{62}$$
with
$$c(x) := \frac{1}{\Gamma_0\, S(x)}.$$
Re-scaling the space variable as in (14), we find
$$T_0[q, S](\xi, v, \hat{v}) = \frac{1}{\Gamma_0} \int_{\mathbb{R}_+} \gamma(\lambda)\, q(\xi + \lambda\hat{v}, \hat{v})\, d\lambda\; \psi(v)$$
and
$$T_1[q, S](\xi, v, \hat{v}) = \frac{1}{\Gamma_0} \left( \frac{\nabla S}{S} \cdot \hat{v} \right) \int_{\mathbb{R}_+} \lambda\, \gamma(\lambda)\, q(\xi + \lambda\hat{v}, \hat{v})\, d\lambda\; \psi(v).$$
The macroscopic velocity of zero order is then
$$U^0_T(\xi) = \frac{\bar{U}}{\Gamma_0} \int_{S^{d-1}} \int_{\mathbb{R}_+} \gamma(\lambda)\, q(\xi + \lambda\hat{v}, \hat{v})\, d\lambda\; \hat{v}\, d\hat{v}, \tag{63}$$
and, again, it vanishes because $\xi + \lambda\hat{v} \in \Omega$ and (27) holds. 
Therefore, the macroscopic diffusion-advection equation is given by (22) with
$$D^0_T(\xi) = \frac{D}{\Gamma_0} \int_{\mathbb{R}_+} D^\lambda_q(\xi)\, \gamma(\lambda)\, d\lambda = D\, \bar{D}^0_q \tag{64}$$
and
$$U^1_T(\xi) = \frac{\bar{U}}{\Gamma_0} \int_{\mathbb{R}_+} \lambda\, D^\lambda_q(\xi)\, \gamma(\lambda)\, d\lambda\; \frac{\nabla S}{S(\xi)} = \bar{U}\, \bar{D}^1_q(\xi)\, \frac{\nabla S}{S(\xi)}, \tag{65}$$
where we defined
$$\bar{D}^1_q(\xi) = \frac{1}{\Gamma_0} \int_{\mathbb{R}_+} \lambda\, D^\lambda_q(\xi)\, \gamma(\lambda)\, d\lambda \tag{66}$$
as an average of the weighted diffusion tensor of the fibers in the whole neighborhood sensed by the cells, differently from case iii) of the non-local independent model.

Case iv) In this case, again, we can only consider the positive approximation (32), and the transition probability rewrites as
$$T[q, S](x, v, \hat{v}) = \left[ c_0(x)\, q(x, \hat{v}) \int_{\mathbb{R}_+} \gamma(\lambda)\, S(x + \lambda\hat{v})\, d\lambda + c_1(x)\, \nabla q \cdot \hat{v} \int_{\mathbb{R}_+} \lambda\, \gamma(\lambda)\, S(x + \lambda\hat{v})\, d\lambda \right] \psi(v), \tag{67}$$
where
$$c_0(x)^{-1} := 2 \int_{S^{d-1}} q(x, \hat{v}) \int_{\mathbb{R}_+} \gamma(\lambda)\, S(x + \lambda\hat{v})\, d\lambda\, d\hat{v}$$
and
$$c_1(x)^{-1} := 2 \int_{S^{d-1}} (\nabla q \cdot \hat{v}) \int_{\mathbb{R}_+} \lambda\, \gamma(\lambda)\, S(x + \lambda\hat{v})\, d\lambda\, d\hat{v},$$
both different from zero. As before, by re-scaling (67) with (14), we get $T[q, S] = T_0[q, S]$ and the average velocity $U^0_T = U_T \neq 0$. In particular, it is given by
$$U_T(\xi) := \bar{U}\, c_0(\xi) \int_{S^{d-1}} \hat{v}\; q(\xi, \hat{v}) \int_{\mathbb{R}_+} \gamma(\lambda)\, S(\xi + \lambda\hat{v})\, d\lambda\, d\hat{v} + \bar{U}\, c_1(\xi) \int_{S^{d-1}} \hat{v} \otimes \hat{v}\; \nabla q(\xi, \hat{v}) \int_{\mathbb{R}_+} \lambda\, \gamma(\lambda)\, S(\xi + \lambda\hat{v})\, d\lambda\, d\hat{v}, \tag{68}$$
and, thus, we perform a hyperbolic limit leading to (23). 
The mean velocity (68) is a linear combination of a non-local measure of the chemoattractant $S$ over the fibers network and a non-local average of $S$ weighted by the directional average of the spatial variability of the fiber direction.

Table 1: Summary of the models (dropping the local dependencies on $\xi$).

Case i), drift dominated in both models:
- non-local independent sensing (5)-(9)-(36): $U_T = c\,\bar{U} \int_{S^{d-1}} \hat{v} \int_0^R \gamma_S(\lambda)\, S(\xi + \lambda\hat{v})\, d\lambda \int_0^R \gamma_q(\lambda)\, q(\xi + \lambda\hat{v}, \hat{v})\, d\lambda\, d\hat{v}$;
- non-local dependent sensing (5)-(9)-(37): $U_T = c\,\bar{U} \int_{S^{d-1}} \hat{v} \int_0^R \gamma(\lambda)\, S(\xi + \lambda\hat{v})\, q(\xi + \lambda\hat{v}, \hat{v})\, d\lambda\, d\hat{v}$.

Case ii), drift-diffusion in both models, with $D^0_T = D\, D_q$:
- independent sensing: $U^1_T = \bar{U} \left[ \Gamma^q\, \nabla \cdot D_q + \Gamma^S\, D_q \frac{\nabla S}{S} \right]$;
- dependent sensing: $U^1_T = \bar{U}\,\Gamma \left[ \nabla \cdot D_q + D_q \frac{\nabla S}{S} \right]$.

Case iii), drift-diffusion in both models, with $D^0_T = D\, \bar{D}^0_q$:
- independent sensing: $U^1_T = \bar{U}\, \Gamma^S\, \bar{D}^0_q \frac{\nabla S}{S}$;
- dependent sensing: $U^1_T = \bar{U}\, \bar{D}^1_q \frac{\nabla S}{S}$.

Case iv), drift dominated in both models:
- independent sensing: $U_T = \bar{U}\, \Gamma^q_0\, c_0 \int_{S^{d-1}} \hat{v}\, q \int_0^R \gamma_S(\lambda)\, S(\xi + \lambda\hat{v})\, d\lambda\, d\hat{v} + \bar{U}\, \Gamma^q_1\, c_1 \int_{S^{d-1}} \hat{v} \otimes \hat{v}\, \nabla q \int_0^R \gamma_S(\lambda)\, S(\xi + \lambda\hat{v})\, d\lambda\, d\hat{v}$;
- dependent sensing: $U_T := \bar{U}\, c_0 \int_{S^{d-1}} \hat{v}\, q \int_0^R \gamma(\lambda)\, S(\xi + \lambda\hat{v})\, d\lambda\, d\hat{v} + \bar{U}\, c_1 \int_{S^{d-1}} \hat{v} \otimes \hat{v}\, \nabla q \int_0^R \lambda\, \gamma(\lambda)\, S(\xi + \lambda\hat{v})\, d\lambda\, d\hat{v}$.

3.2.1 Comments

We can observe that, if $\gamma_q = \gamma_S = \gamma = \delta(\lambda - R)$, the two non-local transport models for independent and dependent sensing coincide, while, if the sensing kernels are not Dirac deltas (even if $\gamma_q = \gamma_S = \gamma$), the transport models are always different. At the macroscopic level, instead, for any choice of the sensing functions the models coincide only in case ii). In this case, in fact, the macroscopic limits differ only if $\gamma_q \neq \gamma_S$, while in cases iii) and iv) they differ whenever the sensing kernels are not Dirac deltas (even if $\gamma_S = \gamma_q = \gamma$). The relevant difference concerns the macroscopic transport velocities (see (54) and (65) for case iii), and (56) and (68) for case iv)). 
In fact, in cases iii) and iv), for the non-local dependent sensing model, as only one cue is considered non-locally and both cues are averaged with the same sensing function $\gamma$, we have a weighted average on $\lambda$ of the non-local quantities, resulting in the weighted averages (65) and the second term of (68). These remarks are summarized in Table 2.

|  | $\gamma_q = \gamma_S = \gamma = \delta$ | $\gamma_q = \gamma_S = \gamma \neq \delta$ | $\gamma_q \neq \gamma_S$ |
|---|---|---|---|
| Meso models (5)-(9)-(36) and (5)-(9)-(37) | = | ≠ | ≠ |
| Macro models case i) | = | ≠ | ≠ |
| Macro models case ii) | = | = | ≠ |
| Macro models case iii) | = | ≠ | ≠ |
| Macro models case iv) | = | ≠ | ≠ |

Table 2: Summary of the comparison of the models for different choices of the sensing functions. "=" indicates the cases in which the models coincide, while "≠" the ones in which the models are different.

4 Numerical simulations

We shall now propose two-dimensional numerical simulations in order to illustrate the behavior of the kinetic transport models for non-local independent sensing and non-local dependent sensing. In particular, we shall integrate the transport equation numerically as in [36] and then compute the macroscopic density (1). Concerning the fiber network, a classically used distribution is the von Mises distribution [39]
$$\tilde{q}(x, \hat{v}) = \frac{1}{2\pi I_0(k(x))}\, e^{k(x)\, u(x) \cdot \hat{v}},$$
where $I_\nu(k)$ is the modified Bessel function of the first kind of order $\nu$ and
$$u(x) = (\cos(\theta_q(x)), \sin(\theta_q(x))).$$
It can be proved that $E_{\tilde{q}}(x) = u(x)$ [25] and, therefore, $\theta_q(x)$ is the mean direction in the space $[0, 2\pi)$ of the fibers located at point $x$. As we are dealing with cells migrating on a non-polarized network of fibers, we shall consider the symmetric version, namely the bimodal von Mises distribution
$$q(x, \hat{v}) = \frac{1}{4\pi I_0(k(x))} \left( e^{k(x)\, u(x) \cdot \hat{v}} + e^{-k(x)\, u(x) \cdot \hat{v}} \right),$$
which also satisfies Q3; its variance-covariance matrix is [25]
$$D_q(x) = \frac{1}{2} \left( 1 - \frac{I_2(k)}{I_0(k)} \right) \mathbb{I}_2 + \frac{I_2(k)}{I_0(k)}\, u \otimes u,$$
where $\mathbb{I}_2$ is the identity tensor in $\mathbb{R}^{2 \times 2}$, while $k$ and $u$ are functions of $x$. 
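The stated variance-covariance matrix can be checked by quadrature; the sketch below (assuming illustrative values of $k$ and $\theta_q$, and using SciPy's modified Bessel function `iv`) compares the closed form against a direct numerical integration of $\int q\, \hat{v} \otimes \hat{v}\, d\theta$:

```python
import numpy as np
from scipy.special import iv

# Sketch: verify D_q for the bimodal von Mises distribution by quadrature
# (k and theta_q are assumed illustrative values).
k, tq = 3.0, np.pi / 2
u = np.array([np.cos(tq), np.sin(tq)])

N = 100000
dt = 2 * np.pi / N
t = np.arange(N) * dt
vhat = np.stack([np.cos(t), np.sin(t)])
c = u @ vhat
q = (np.exp(k * c) + np.exp(-k * c)) / (4 * np.pi * iv(0, k))

Dq_num = (q * vhat) @ vhat.T * dt          # integral of q v x v (E_q = 0)
r = iv(2, k) / iv(0, k)
Dq_closed = 0.5 * (1.0 - r) * np.eye(2) + r * np.outer(u, u)
print(np.abs(Dq_num - Dq_closed).max())    # close to machine precision
```

The larger eigenvalue of `Dq_closed` sits along $u$, and it grows toward 1 as $k \to \infty$, reproducing the strongly aligned fiber bundles used in the tests below.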
Moreover, the variance in the space $[0, 2\pi)$ is the scalar
$$D_q(x) = \frac{1}{2} \int_0^{2\pi} q(\theta)\, (\theta - \theta_q)^2\, d\theta = \frac{1}{2} \left( 1 - \frac{I_1(k)}{I_0(k)} \right),$$
which represents the degree of alignment of the fibers at point $x$.

4.1 Test 1: local ECM sensing and non-local chemotaxis

As a first example, we shall present the particular case in which the sensing of $q$ is local. This illustrates the effect of a second directional cue on a cell population migrating by contact guidance and evaluating the local alignment of the fibers over a non-polarized network. Formally, we are dealing with (36) in which $\gamma_q = \delta(\lambda - 0)$. In particular, we shall consider a region
$$\Omega_q = \{ x = (x, y) \in \Omega \ \text{s.t.} \ x_1 \le x \le x_2 \} \tag{69}$$
with $x_1 = 1.8$ and $x_2 = 3.2$, in which the fibers are strongly aligned along the direction identified by $\theta_q = \pi/2$. In particular, for $(x, y) \in \Omega_q$, $k(x, y) = 700$, such that $D_q = 5 \cdot 10^{-3}$. In the rest of the domain $\Omega - \Omega_q$ fibers are uniformly distributed. The chemoattractant has a Gaussian profile
$$S(x, y) = \frac{m_S}{\sqrt{2\pi\sigma_S^2}}\, e^{-\frac{((x, y) - (x_S, y_S))^2}{2\sigma_S^2}}. \tag{70}$$
In particular, in Test 1 (see Fig. 1) we choose $(x_S, y_S) = (4, 4)$, $m_S = 10$, $\sigma_S^2 = 0.1$. The initial condition for the cell population is a Gaussian
$$\rho_0(x, y) = r_0\, e^{-\frac{((x, y) - (x_0, y_0))^2}{2\sigma_0^2}} \tag{71}$$
with $r_0 = 0.1$ and $\sigma_0^2 = 0.1$. In this first test, the initial condition for the cell population is centered in $(x_0, y_0) = (2.5, 2.5)$, i.e., the center of the region $\Omega_q$ (see Fig. 1a). Without a chemoattractant, because of the presence of highly aligned fibers, we would expect cells to diffuse anisotropically in the preferential direction of the fibers $\pm\pi/2$, forming the well-known ellipse [43], which represents cells moving with the same probability along the directions $\pi/2$ and $-\pi/2$. In the present case, due to the presence of a chemoattractant, the symmetry is broken and, even though $q$ describes a non-polarized fiber network, there is a preferential sense of motion (see Figs. 1d-1f). 
In particular, cells migrate along the fibers in the direction identified by $\theta_q = \pi/2$, corresponding to the preferential sense imposed by the presence of the chemoattractant in the upper-right corner of the domain $\Omega$. Given this directional setting, the cell population dynamics is also greatly affected by the strength of the chemoattractant, which depends on $m_S$ and $\sigma_S^2$, by the degree of alignment $D_q$, which depends on $k(x, y)$, and by the sensing radius $R$. Another important aspect is the sensing function $\gamma_S$, which influences the transient dynamics and, especially, the relaxation time: the latter appears to be double when $\gamma_S$ is a Heaviside function instead of a Dirac delta (see also [36]).

Figure 1: Test 1. Evolution of the initial distribution given in (a) for the case of local $q$ and non-local chemoattractant $S$ with sensing function $\gamma_S = \delta(\lambda - R)$. In (b), $S$ is a Gaussian centred in $(4, 4)$ with $m_S = 10$ and $\sigma_S^2 = 0.1$. The sensing radius of the cells is set to $R = 0.5$. Panels: (a) initial cell distribution; (b) initial average polarization; (c) trajectory of the center of mass of the cell population, where each black dot is plotted every $\Delta t = 1$; (d)-(f) evolution of the macroscopic density at $t = 1.25, 3.75, 12.5$; (g)-(i) polarizations of the cells at the same times.

We also analyzed the average polarization of the cells at every position $x$, which is given by the momentum (2). The microscopic directions of the cells are initially randomly distributed, and cells start from a vanishing initial speed (see Fig. 1b). Then, they start to align along the fibers and to migrate upward in the direction identified by the angle $\pi/2$, since cells sense the chemoattractant (see Figs. 1g-1h). Eventually, when cells reach the level $y = 4$, the microscopic directions polarize towards the chemoattractant (see Fig. 1i). 
The center of mass, plotted in Fig. 1c, stays in the region $\Omega_q$ during the migration of the cells along the fiber bundle in $\Omega_q$, and it moves out of $\Omega_q$ only when it reaches $y = 4$. The black dots are plotted every $\Delta t = 1$, and it is clear that the highest acceleration occurs when cells are on the bundle of fibers, while they slow down when they start to move out of the fiber stripe $\Omega_q$.

4.2 Test 2: non-local ECM sensing and chemotaxis

As a second test, we present both the non-local independent sensing model and the non-local dependent sensing model. We shall now consider a non-local sensing of the distribution of fibers. In particular, we assume fibers distributed similarly to the previous test, i.e., fibers are highly aligned in $\Omega_q$, given this time by $x_1 = 2.1$ and $x_2 = 2.9$ (see Fig. 2b). Here, for $(x, y) \in \Omega_q$, $k(x, y) = 100$, which corresponds to $D_q = 0.0025$, and $\theta_q(x, y) = \pi/2$. In the region $\Omega - \Omega_q$ fibers are uniformly distributed. The initial condition of the cell population is (71) with $(x_0, y_0) = (1, 0.5)$ (see Fig. 2a), while the chemoattractant is located as in Test 1, with $m_S = 10$ and $\sigma_S^2 = 0.05$. We shall compare the dynamics of the cells in four settings:

1. local fiber distribution and non-local chemoattractant, as in Test 1, i.e., (36) with $\gamma_q = \delta(\lambda - 0)$ and $\gamma_S = \delta(\lambda - R)$;

2. non-local sensing with a Dirac delta for both $q$ and $S$; this corresponds to both (36) and (37) with $\gamma_q = \gamma_S = \gamma = \delta(\lambda - R)$;

3. non-local independent sensing with Heaviside sensing functions for both $S$ and $q$, i.e., (36) with $\gamma_q = \gamma_S = H(R - \lambda)$;

4. non-local dependent sensing for $q$ and $S$, dealing with (37) and $\gamma = H(R - \lambda)$.

Results of these simulations are shown in Fig. 2. We can observe that, in settings 1-4, cells start from $(1, 0.5)$, are attracted by the chemoattractant and, on their way towards $S$, cross the aligned fiber region $\Omega_q$ and climb up this region in the direction $\pi/2$. 
Eventually, in all the cases, cells reach the chemoattractant, but the dynamics, as well as the transient time, is influenced by the different sensing kernels (even though the differences are not dramatic) and by the local or non-local sensing strategy. Although settings 3 and 4 in Fig. 2, which correspond to the case of independent and dependent cues, respectively, do not show very strong differences, in setting 3 (see Figs. 2k-2n) the tendency to move both in the direction π/2, determined by the fibers, and in the direction π/4, determined by the chemoattractant, appears more marked because of the independent sensing. In contrast, this behavior is least evident in the case in which cells perform a local sensing of the fibers (setting 1), which also results in a general slowdown of the dynamics.

4.3 Test 3: non-local independent sensing model, comparison of the cases i)-iv)

We now present a comparison of the macroscopic behaviors of the cells depending on the relation between R, lS and lq, i.e., we compare the cases i), ii), iii) and iv). In particular, we shall do this for the non-local independent sensing model with γq = γS = H(R − λ), as this is the case in which the transport model differs from the dependent sensing model. Additionally, the independence of the two sensings makes it easier to visualize the two distinct directional effects (contact guidance and chemotaxis).

Figure 2: Test 2. Time evolution of the initial distribution given in Fig. 2a in the four settings 1-4; panels (a)-(b) show the initial cell and fiber distributions, and each row shows snapshots at t = 1.25, 3.75, 5, 6.25. The sensing radius of the cells is R = 0.5 and the chemoattractant is (70) with mS = 10, σS² = 0.05 and (xS, yS) = (4, 4). Setting 1 is represented in Figs.
(c)-(f): local q and non-local chemoattractant, γS = δ(λ − R). Setting 2 is represented in Figs. (g)-(j): non-local q and S with sensing functions γq = γS = δ(λ − R). Setting 3 is represented in Figs. (k)-(n): non-local q and S, independent sensing with γq = γS = H(R − λ). Setting 4 is represented in Figs. (o)-(r): non-local q and S, dependent sensing with γ = H(R − λ).

We shall consider the turning kernel describing contact guidance led by a q with mean direction θq(x, y) = 3π/4 for all (x, y) ∈ Ω and coefficient k(x, y), modulating the strength of the alignment, given by the Gaussian distribution

k(x, y) = mk exp( −‖(x, y) − (xk, yk)‖² / (2σk²) ),   (72)

where (xk, yk) = (2.5, 2.5) and σk² = 0.15 (Fig. 3d). This mimics a situation in which the fibers are more aligned in the central circular region and uniformly disposed in the rest of the domain. We shall consider different values of mk in order to obtain different values of lq: mk = 10 corresponds to lq ≈ 0.031 and mk = 100 corresponds to lq ≈ 0.0031. Details about the estimation of lq for a bimodal von Mises distribution of fibers q are given in Appendix A. The chemoattractant is (70) with (xS, yS) = (4.5, 4.5) and mS = 10. In the simulations, we shall consider three different values for the variance of the chemoattractant σS² in order to obtain different values of lS: σS² = 0.05, corresponding to lS = 0.002 (Fig. 3a); σS² = 0.25, corresponding to lS = 0.055 (Fig. 3b); and σS² = 1.8, corresponding to lS = 0.25 (Fig. 3c). The initial distribution of cells for all the tests presented in Figs.
4, 5, 6, 7 and 8 is given by (71) with (x0, y0) = (1.5, 1.5), r0 = 0.1, σ0² = 0.1. In particular, we present five sets of simulations, summarized in Table 3.

lS       lq       R      Case    η      Figure
0.002    0.0031   0.7    i)      < 1    4
0.25     0.0031   0.7    i)      ≫ 1    5
0.055    0.031    0.02   ii)     > 1    6
0.25     0.0031   0.02   iii)    ≫ 1    7
0.002    0.031    0.02   iv)     < 1    8

Table 3: Summary of the simulations presented in Test 3.

In Fig. 4, we consider the case in which ηS, ηq ≫ 1, i.e., we are dealing with case i). The macroscopic behavior is strongly hyperbolic, with macroscopic velocity given by (40). In fact, in Fig. 4 we can observe that the behavior is not diffusive and the cluster of cells stays quite compact. Moreover, when the cells reach the region in which fibers are strongly aligned in the direction 3π/4 (as shown in Fig. 3d), which is perpendicular to the favorable direction π/4 induced by the chemoattractant, they surround that region, which induces strong alignment, and move on towards the chemoattractant. In this setting, the parameter defined in (35) is slightly smaller than 1 and, in fact, chemotaxis prevails in the overall dynamics, as the stationary state is clearly peaked on the chemoattractant profile, but the fiber structure influences the transient.

In Fig. 5, we consider S with σS² = 1.8 and, consequently, lS = 0.25 (see Fig. 3c). Concerning the fibers, we have mk = 100, so that lq ≈ 0.0031, and the sensing radius is R = 0.7. This setting falls again in case i), but the behavior differs from the previous simulation in Fig. 4. The chemoattractant in Fig. 3c, in fact, is spread over the whole domain: the quantity lS is almost 10² times the lS considered in Fig. 3a and used for the simulation in Fig. 4. Even though we are still in a strongly hyperbolic case and cells are guided by the strong drift (40), as R is slightly larger than lS and lS is large, the cell cluster diffuses a bit more in the domain.
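The entries of Table 3 can be cross-checked with a short script (a sketch, not the paper's code): the case labels follow the comparisons of R with lS and lq described in the text, and lq itself follows from the Appendix A estimate lq ≈ 1/(2 maxΩ ‖∇k‖), whose maximum for the Gaussian modulation (72) is attained one standard deviation from the center. The function names below are illustrative.

```python
import numpy as np

# Sketch cross-checking Table 3 (not the paper's code). l_q uses the
# Appendix A estimate l_q ~ 1 / (2 max ||grad k||): for the Gaussian
# modulation (72), the gradient magnitude peaks one standard deviation
# from the center, where it equals m_k * exp(-1/2) / sigma_k.
def l_q_estimate(mk, sigma2k):
    grad_max = mk * np.exp(-0.5) / np.sqrt(sigma2k)
    return 1.0 / (2.0 * grad_max)

def case_label(lS, lq, R):
    """Regime classification by comparing R with both characteristic lengths."""
    if R > lS and R > lq:
        return "i"    # drift-driven (hyperbolic) regime
    if R < lS and R < lq:
        return "ii"   # diffusive regime
    if R < lS:
        return "iii"  # lq < R < lS
    return "iv"       # lS < R < lq

# m_k = 100 and m_k = 10 with sigma_k^2 = 0.15, as in Test 3:
# close to the values l_q ~ 0.0031 and 0.031 quoted in the text.
print(l_q_estimate(100, 0.15), l_q_estimate(10, 0.15))

rows = [(0.002, 0.0031, 0.7), (0.25, 0.0031, 0.7),
        (0.055, 0.031, 0.02), (0.25, 0.0031, 0.02), (0.002, 0.031, 0.02)]
print([case_label(*r) for r in rows])  # ['i', 'i', 'ii', 'iii', 'iv']
```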
When it reaches the region of strongly aligned fibers, it starts to surround that region (see Figs. 5a-5c) but, as ηS = 2.8 = O(1), some cells that do not surround the region are slowed down and partially tend to align along the fibers. In Fig. 5d, for instance, we have a high density of cells both in the strongly aligned fiber region and in the region of high chemoattractant density. Eventually, cells manage to overcome the area of highly aligned fibers and tend to converge to the chemoattractant profile (see Figs. 5e-5f). Now the overall dynamics is greatly affected by the fibers and, in fact, η ≫ 1.

Figure 3: Test 3. Three different chemoattractants used for comparing models i)-iv). The chemoattractant profile is given by (70) with mS = 10 and (a) σS² = 0.05, corresponding to lS = 0.002, (b) σS² = 0.25, corresponding to lS = 0.055, and (c) σS² = 1.8, corresponding to lS = 0.25. The fiber distribution is sketched in (d).

The second scenario, illustrated in Fig. 6, refers to case ii), since the sensing radius R = 0.02 is smaller than both lS = 0.055 and lq ≈ 0.031. At the macroscopic level, the behavior of the system is described by the diffusion-advection equation (47) with macroscopic velocity (46). Indeed, in Fig. 6 we can observe a highly diffusive behavior, as the macroscopic density of cells has invaded almost half of the domain before even starting to be influenced by the fibers. If we compare the same time step in Figs. 6b and 5b, we see that in both cases the cells are reaching the fibers and sensing the region in which the fibers are most aligned. However, in Fig. 5b the cell cluster is much more compact than in Fig.
6b, where, instead, cells have already occupied half of the domain because of diffusion, and there is a high density of cells both close to the strongly aligned fiber region and around the initial position. Therefore, cells start surrounding the central region of strongly aligned fibers, because they already sense the chemoattractant, and, once they have overcome this area, they tend to the chemoattractant profile (see Figs. 6c-6f). In particular, during the transient, cells accumulate the most at the sides of the region with highly aligned fibers. In this specific setting, η > 1 and, in fact, contact guidance strongly affects the dynamics.

The third scenario, illustrated in Fig. 7, refers to case iii), since the sensing radius R = 0.02 is smaller than lS = 0.25 but larger than lq ≈ 0.0031. The macroscopic setting is described by a diffusion-advection equation with diffusion tensor and drift velocity given by (53) and (54), respectively. As ηS < 1, the chemoattractant induces a strong diffusivity but, being ηq > 1, the alignment of the fibers strongly affects the dynamics (see Figs. 7c-7d). Comparing, in addition, Figs. 6b and 7b, the highest cell concentration is now found along the mean fiber direction θq = 3π/4 in the region surrounding the center of the domain, where the fibers are aligned with a higher degree. As already observed in Section 3, this scenario prescribes η ≫ 1 and, in fact, contact guidance again dominates the dynamics.

Finally, for a sensing radius R = 0.02 smaller than lq ≈ 0.031 but larger than lS = 0.002, the macroscopic behavior is approximated by a hyperbolic equation with drift velocity given in (56). Results of the simulation are presented in Fig. 8. Here, the chemoattractant has the profile shown in Fig. 3a. Cells diffuse in the domain because ηq is smaller than 1, and they start moving in a region with randomly disposed fibers (see Fig. 8a). Then, they mainly follow the preferential direction π/4 thanks to the presence of the chemoattractant. In fact, it induces a strong drift because of the high non-locality, determining ηS ≫ 1. Here chemotaxis slightly dominates the dynamics and, in fact, η < 1.

Figure 4: Test 3. Case i) with non-local q and S, sensed with an independent sensing through the kernels γq = γS = H(R − λ). S is given in Fig. 3a with mS = 10 and σS² = 0.05, so that lS = 0.002. The fiber distribution q has a space-dependent parameter k given by (72) with mk = 100, so that lq ≈ 0.0031. The sensing radius of the cells is R = 0.7. Panels (a)-(f): t = 1.25, 1.875, 2.5, 3.75, 5, 6.25.

Figure 5: Test 3. Case i) with non-local q and S, independent sensing with γq = γS = H(R − λ). S is given in Fig. 3c, which corresponds to lS = 0.25, while for the fiber distribution mk = 100, so that lq ≈ 0.0031. The sensing radius of the cells is R = 0.7. Panels (a)-(f): t = 2.5, 5, 10, 15, 22.5, 27.5.

Figure 6: Test 3. Case ii) with non-local q and S, independent sensing with γq = γS = H(R − λ). S is given in Fig. 3b, which corresponds to lS = 0.055, while mk = 10, so that lq ≈ 0.031. The sensing radius of the cells is R = 0.02. Panels (a)-(f): t = 2.5, 5, 7.5, 10, 15, 20.

Figure 7: Test 3. Case iii) with non-local q and S, independent sensing with γq = γS = H(R − λ). S is given in Fig. 3c, so that lS = 0.25, while for the fiber distribution mk = 100, corresponding to lq ≈ 0.0031. The sensing radius of the cells is set to R = 0.02. Panels (a)-(f): t = 2.5, 5, 10, 20, 30, 60.

Figure 8: Test 3. Case iv) with non-local q and S, independent sensing with γq = γS = H(R − λ). S is given in Fig. 3a, which corresponds to lS = 0.002, whilst mk = 10, so that lq ≈ 0.031. Panels (a)-(f): t = 1.25, 2.5, 5, 7.5, 10, 15.
The sensing radius of the cells is R = 0.02.

4.4 Test 4: heterogeneous ECM environment

We now consider a domain Ω divided into several regions, each characterized by a different average direction of the fibers. In particular, we shall do this for the independent sensing model with γq = γS = H(R − λ), as in Test 3; the independence of the two sensings, in fact, makes it easier to visualize the two distinct directional effects. As a first scenario, we shall consider the domain schematized in Fig. 9a; in each subdomain we have k(x, y) = 50, which corresponds to Dq = 0.005. The initial condition of the cells is represented in Fig. 9c, with initial density r0 = 0.1, while the chemoattractant has a Gaussian profile (70) centered in (xS, yS) = (4, 4), with mS = 10 and σS² = 0.5, as shown in Fig. 9b. We observe that the cells do not migrate collectively towards the chemoattractant, but divide into two main separate clusters (see Figs. 9f-9h): in fact, although the sensing radius R = 0.8 is quite large, the cells that are closer to the left boundary remain trapped in the first subdomain, showing a loss of adhesion with the rest of the cell population. As shown in Fig. 9i, even though the cells in the left subdomain horizontally align to the chemoattractant, the high degree of alignment of the fibers does not allow them to escape this region, even for large times.

As a second scenario, we shall consider the domain represented in Fig. 10a; in each subdomain, the parameter k(x, y) = 50. The initial condition of the cell population is (71) with (x0, y0) = (4, 0.5) and r0 = 0.1, while the chemoattractant has a Gaussian profile (70) centered in (xS, yS) = (2, 4.5) with mS = 10 and σS² = 0.05, as shown in Figs. 10c and 10b, respectively. We observe that the cells do not migrate directly towards the chemoattractant, as they sense the heterogeneous fibrous environment and, consequently, adapt their migration to it.
In particular, cells that are able to reach and sense the isotropic subdomain where the fibers are uniformly distributed (defined by 1 ≤ x ≤ 3 and 0 ≤ y ≤ 3) move in the direction imposed by the gradient of the chemoattractant. On the other hand, in the subdomain 3 ≤ x ≤ 5 and 1 ≤ y ≤ 2, they follow the direction of fiber alignment, that is π/4, perpendicular to the favorable direction imposed by S. However, the sensing radius R = 0.7 allows the cells that are closer to the right boundary to escape quite quickly from the disadvantageous (in terms of preferential direction) subdomains and, following first the direction π/2 in 2 ≤ y ≤ 3 and then 3π/4 in 3 ≤ y ≤ 4, to reach the chemoattractant.

Figure 9: Test 4. Migration of cells in a heterogeneous domain as illustrated in (a). The sensing radius of the cells is R = 0.8. The chemoattractant (b) is (70) with mS = 10 and σS² = 0.5. The initial cell profile (c) evolves in time as illustrated in (d)-(i), at t = 0.04, 1.6, 2.904, 18.4, 44, 67.2.

5 Conclusion

We have proposed a kinetic model for describing cell migration in a multi-cue environment. In particular, in the same spirit as [36], we have considered that cells, as they can extend protrusions up to several cell diameters long, perform a non-local sensing of the environment up to a distance R (named the sensing radius) from their nucleus. In the present model, there are two guidance cues affecting the polarization and, therefore, the direction of motion of the cells: contact guidance, which is a bi-directional cue, and a chemical gradient, which is a mono-directional cue. We remark that in this work, for the first time, a non-local sensing in the physical space of the mesoscopic distribution of fibers is considered.
In particular, we introduced two classes of models: in the first one, the cells perform an independent sensing of the fibers and of the chemical in their neighborhood, while in the second class the cells average the chemical and the fibers with the same sensing kernel.

In the two cases, particular attention was devoted to the identification of the proper macroscopic limit according to the properties of the turning operator. We identified two parameters, ηq and ηS, that measure the relation between the cell sensing radius and the characteristic lengths of variation, lS and lq, of the two cues, and that discriminate between a diffusion-driven regime with an advective correction and a drift-driven regime. In particular, when the sensing radius does not exceed the characteristic length of the chemoattractant, the bi-directional nature of the fibers allows for a diffusive regime; otherwise, the hyperbolic scaling leads to a macroscopic drift. A common feature in the different cases is the dependency of the macroscopic velocity on both the fiber network and the chemoattractant. This aspect highlights the non-trivial influence of contact guidance on the cell drift, although we considered a non-polarized fiber network. This interdependence is in accordance with the model proposed in [58]. Moreover, in the absence of a chemoattractant, this impact on the drift term could persist for spatially heterogeneous fiber distributions.

Figure 10: Test 4. Migration of cells in a heterogeneous domain as illustrated in (a). The sensing radius of the cells is R = 0.7. The chemoattractant (b) is (70) with mS = 10 and σS² = 0.05. The initial cell profile (c) evolves in time as illustrated in (d)-(i), at t = 0.5, 1, 1.5, 2.5, 3.5, 4.5.
This is in accordance with what is observed in [24], and it represents a step forward with respect to [58], in which the drift is a function of contact guidance only through the presence of a chemical gradient, i.e., without a chemoattractant there would be no drift.

The numerical simulations of the transport model pointed out the main features characterizing the two classes of models and the possible scenarios that they are able to capture. We observed that the presence of two cues influencing cell polarization, even when the fibers are sensed locally, ensures a preferential sense of motion for cells lying on regions of highly aligned non-oriented fibers. Test 3 allowed us to show the importance of deriving the macroscopic equations from an underlying microscopic dynamics and in the appropriate regime: a directly postulated drift-diffusion equation would not capture the exact dynamics in all the possible regimes. The competitive or collaborative effects of the cues depend, in the first instance, on the angle between their relative orientations, i.e., between the direction of fiber alignment θq and the gradient of the chemoattractant. Moreover, especially in the case of competitive cues, determining which one is dominant depends on their relative strengths, in terms of both concentration and intensity (degree of alignment of the fibers k(x) or steepness of the chemoattractive gradient). We introduced the parameter η = lS/lq that, independently of the cell size or its sensing capability, quantifies the relative contribution of contact guidance to chemotaxis and provides a first separation between the cases of fiber-dominated and chemotaxis-dominated dynamics (η ≫ 1 or η ≪ 1, respectively).
The presented framework also allows for the direct calculation of parameters that can be used to quantify directed cell migration and to assess its efficiency, such as the mean square displacement, persistence time, directional persistence and mean speed [42].

Additionally, the non-locality brings a further level of detail to the model, allowing one to obtain different macroscopic behaviors depending on the characteristics of the two sensings. In fact, we did not observe strong differences between the independent and the dependent sensing models when we assume in the former the same sensing kernel for fibers and chemoattractant, i.e., when γq = γS. However, if biological observations support the possibility that a cell implements different strategies for sensing the underlying fiber network and the chemoattractant, the proposed model, in its independent sensing version, could be used to investigate this scenario and to compare the outcomes of this sensing approach with the case of a unique, common sensing strategy.

Potentially, the case of competitive cues, combined with the non-local aspect of the model, could lead to interesting further analysis. As observed in the last numerical tests, the combination of heterogeneous fiber landscapes with chemoattractive agents shows how the cell density can divide and cross the domain using different migration strategies. This leads to natural questions about the deeper mechanisms driving the competition between the two cues, considering, for instance, the possible role of cell adhesion in recovering collective migration.

We remark that, even if the simulations were performed in a two-dimensional setting, the transport model (and, as a consequence, its macroscopic limits) is formulated in a general d-dimensional setting.
Hence, a possible future development is to perform simulations in the three-dimensional case, which would be much more realistic for mimicking in vivo migration of cells in the extracellular matrix. Moreover, the model that we proposed may be adapted to describe other directional cues, among others haptotactic, durotactic or electrotactic mechanisms. Furthermore, in the same spirit as [37], we could enrich this model with a non-constant sensing radius, as it may vary according to the spatial and directional variability of the external guidance cues. Lastly, this study was restricted to the case in which the cues affect only cell polarization, considering a uniform distribution of the speeds. However, similarly to what is done in [36, 37], it may be modified to model a multi-cue environment in which one of the signals also affects the speed of the cells.

A Estimation of lq

Let us consider the fiber density distribution q(x, v̂) defined by a bimodal von Mises-Fisher distribution

\[ q(x, \hat{v}) = \frac{1}{4\pi I_0(k(x))} \left( e^{k(x)\, u\cdot\hat{v}} + e^{-k(x)\, u\cdot\hat{v}} \right), \]

where k(x) ∈ C¹(Ω) and Iν(k(x)) denotes the modified Bessel function of the first kind of order ν. We now want to estimate the range of variability of the characteristic length lq, defined as

\[ l_q := \left( \max_{x\in\Omega}\, \max_{\hat{v}\in\mathbb{S}^{d-1}} \frac{|\nabla q\cdot\hat{v}|}{q} \right)^{-1}. \]

Since ∂I0/∂k = I1(k), we have that

\[ \nabla q = \frac{e^{k(x)\, u\cdot\hat{v}} - e^{-k(x)\, u\cdot\hat{v}}}{4\pi I_0(k(x))}\, (u\cdot\hat{v})\, \nabla k - \frac{e^{k(x)\, u\cdot\hat{v}} + e^{-k(x)\, u\cdot\hat{v}}}{4\pi I_0^2(k(x))}\, \frac{\partial I_0}{\partial k}\, \nabla k = \frac{e^{k(x)\, u\cdot\hat{v}} - e^{-k(x)\, u\cdot\hat{v}}}{4\pi I_0(k(x))}\, (u\cdot\hat{v})\, \nabla k - \frac{e^{k(x)\, u\cdot\hat{v}} + e^{-k(x)\, u\cdot\hat{v}}}{4\pi I_0(k(x))}\, \frac{I_1(k(x))}{I_0(k(x))}\, \nabla k. \]

Since q(x, v̂) > 0, we obtain

\[ \frac{\nabla q\cdot\hat{v}}{q} = \left( \frac{e^{k(x)\, u\cdot\hat{v}} - e^{-k(x)\, u\cdot\hat{v}}}{e^{k(x)\, u\cdot\hat{v}} + e^{-k(x)\, u\cdot\hat{v}}}\, (u\cdot\hat{v}) - \frac{I_1(k(x))}{I_0(k(x))} \right) \|\nabla k\| \cos\alpha, \]

where ‖·‖ denotes the L²-norm, α is the angle between ∇k and v̂, and we use the fact that ‖v̂‖ = 1.
Therefore,

\[ \left| \frac{\nabla q\cdot\hat{v}}{q} \right| = \left| \frac{e^{k(x)\, u\cdot\hat{v}} - e^{-k(x)\, u\cdot\hat{v}}}{e^{k(x)\, u\cdot\hat{v}} + e^{-k(x)\, u\cdot\hat{v}}}\, (u\cdot\hat{v}) - \frac{I_1(k(x))}{I_0(k(x))} \right| \, \|\nabla k\|\, |\cos\alpha|. \]

Recalling that |a − b| ≤ |a| + |b|, that

\[ -1 \le \frac{e^{k(x)\, u\cdot\hat{v}} - e^{-k(x)\, u\cdot\hat{v}}}{e^{k(x)\, u\cdot\hat{v}} + e^{-k(x)\, u\cdot\hat{v}}} \le 1, \]

and that |cos(·)| ≤ 1, we get

\[ \left| \frac{\nabla q\cdot\hat{v}}{q} \right| \le \left( 1 + \left| \frac{I_1(k(x))}{I_0(k(x))} \right| \right) \|\nabla k\|. \]

Considering Eq. (1.12) in [33] for ν = 1, we obtain |I1/I0| < 1 and, therefore,

\[ \left| \frac{\nabla q\cdot\hat{v}}{q} \right| < 2\, \|\nabla k\|, \]

which implies

\[ \max_{x\in\Omega}\, \max_{\hat{v}\in\mathbb{S}^{d-1}} \left| \frac{\nabla q\cdot\hat{v}}{q} \right| < 2 \max_{x\in\Omega} \|\nabla k\|. \]

This translates into

\[ l_q \ge \frac{1}{2 \max_{x\in\Omega} \|\nabla k\|}. \qquad (73) \]

In particular, if there exists x such that cos α = 1, i.e., v̂ is parallel to ∇k(x), and, at the same time, ∇k(x) ∥ u, then (73) holds with the equal sign. In particular, for the symmetry of (72) and (70), we shall consider

\[ l_q \approx \frac{1}{2 \max_{x\in\Omega} \|\nabla k\|}. \]

Acknowledgments. The authors would like to thank Prof. Luigi Preziosi for fruitful discussions and valuable comments. This work was partially supported by Istituto Nazionale di Alta Matematica and by the Ministry of Education, Universities and Research, through the MIUR grant Dipartimento di Eccellenza 2018-2022, Project no. E11G18000350001, and the Scientific Research Programmes of Relevant National Interest, project no. 2017KL4EF3. NL also acknowledges Compagnia di San Paolo. This research was also partially supported by the Basque Government through the BERC 2018-2021 program and by the Spanish State Research Agency through the BCAM Severo Ochoa excellence accreditation SEV-2017-0718. MC has received funding from the European Union's Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 713673.
The project that gave rise to these results received the support of a fellowship from la Caixa Foundation (ID 100010434). The fellowship code is LCF/BQ/IN17/116.

References

[1] Y. Azimzade, A. A. Saberi, and M. Sahimi. Regulation of migration of chemotactic tumor cells by the spatial distribution of collagen fiber orientation. Phys. Rev. E, 99:062414, 2019.
[2] H. C. Berg. Random Walks in Biology. Princeton University Press, revised edition, 1983.
[3] H. C. Berg and E. M. Purcell. Physics of chemoreception. Biophys. J., 20(2):193–219, 1977.
[4] M. Bisi, J. A. Carrillo, and B. Lods. Equilibrium solution to the inelastic Boltzmann equation driven by a particle bath. J. Stat. Phys., 133(5):841–870, 2008.
[5] S. M. Block, J. E. Segall, and H. C. Berg. Adaptation kinetics in bacterial chemotaxis. J. Bacteriol., 154(1):312–323, 1983.
[6] B. A. Bromberek, P. A. J. Enever, D. I. Shreiber, M. D. Caldwell, and R. T. Tranquillo. Macrophages influence a competition of contact guidance and chemotaxis for fibroblast alignment in a fibrin gel coculture assay. Exp. Cell Res., 275(2):230–242, 2002.
[7] F. A. C. C. Chalub, P. A. Markowich, B. Perthame, and C. Schmeiser. Kinetic models for chemotaxis and their drift-diffusion limits. Monatsh. Math., 142(1):123–141, 2004.
[8] A. Chauviere, T. Hillen, and L. Preziosi. Modeling cell movement in anisotropic and heterogeneous network tissues. Netw. Heterog. Media, 2(2):333–351, 2007.
[9] A. Chauviere, T. Hillen, and L. Preziosi. Modeling the motion of a cell population in the extracellular matrix. Discrete Cont. Dyn.-B, 2007(Supplemental volume):250–259, 2007.
[10] L. Chen, K. J. Painter, C. Surulescu, and A. Zhigun. Mathematical models for cell migration: a nonlocal perspective. arXiv preprint arXiv:1911.05200, 2019.
[11] A. Colombi, M. Scianna, and L. Preziosi. Coherent modelling switch between pointwise and distributed representations of cell aggregates. J. Math. Biol., 74(4):783–808, 2017.
[12] A. Colombi, M.
Scianna, and A. Tosin. Differentiated cell behavior: a multiscale approach using measure theory. J. Math. Biol., 71:1049–1079, 2015.
[13] M. Conte, L. Gerardo-Giorda, and M. Groppi. Glioma invasion and its interplay with nervous tissue and therapy: A multiscale model. J. Theor. Biol., 486:110088, 2020.
[14] E. Di Costanzo, M. Menci, E. Messina, R. Natalini, and A. Vecchio. A hybrid model of collective motion of discrete particles under alignment and continuum chemotaxis. Discrete Cont. Dyn.-B, 25:443–472, 2020.
[15] R. Dickinson and R. T. Tranquillo. Stochastic model of biased cell migration based on binding fluctuations of adhesion receptors. J. Math. Biol., 19:563–600, 1991.
[16] R. B. Dickinson. A generalized transport model for biased cell migration in an anisotropic environment. J. Math. Biol., 40(2):97–135, 2000.
[17] R. Eftimie. Hyperbolic and kinetic models for self-organized biological aggregations and movement: a brief review. J. Math. Biol., 65(1):35–75, 2012.
[18] C. Engwer, T. Hillen, M. Knappitsch, and C. Surulescu. Glioma follow white matter tracts: a multiscale DTI-based model. J. Math. Biol., 71(3):551–582, 2015.
[19] C. Engwer, M. Knappitsch, and C. Surulescu. A multiscale model for glioma spread including cell-tissue interactions and proliferation. Math. Biosci. Eng., 13:443–460, 2016.
[20] C. Engwer, C. Stinner, and C. Surulescu. On a structured multiscale model for acid-mediated tumor invasion: The effects of adhesion and proliferation. Math. Mod. Meth. Appl. S., 27:1355–1390, 2017.
[21] P. Friedl. Prespecification and plasticity: shifting mechanisms of cell migration. Curr. Opin. Cell Biol., 16:14–23, 2004.
[22] P. Friedl and E.-B. Bröcker. The biology of cell locomotion within three dimensional extracellular matrix. Cell Mol. Life Sci., 57:41–64, 2000.
[23] R. Giniūnaitė, R. E. Baker, P. M. Kulesa, and P. K. Maini. Modelling collective cell migration: neural crest as a model paradigm. J. Math.
Biol., 80:481–504, 2019.
[24] T. Hillen. M5 mesoscopic and macroscopic models for mesenchymal motion. J. Math. Biol., 53(4):585–616, 2006.
[25] T. Hillen, A. Murtha, K. J. Painter, and A. Swan. Moments of the von Mises and Fisher distributions and applications. Math. Biosci. Eng., 14(3):673–694, 2017.
[26] T. Hillen and H. G. Othmer. The diffusion limit of transport equations derived from velocity-jump processes. SIAM J. Appl. Math., 61:751–775, 2000.
[27] T. Hillen and K. J. Painter. A user's guide to PDE models for chemotaxis. J. Math. Biol., 58(1):183–217, 2008.
[28] J. Johnson, M. O. Nowicki, C. H. Lee, E. A. Chiocca, M. S. Viapiano, S. E. Lawler, and J. J. Lannutti. Quantitative analysis of complex glioma cell migration on electrospun polycaprolactone using time-lapse microscopy. Tissue Eng. Part C-Me., 15(4):531–540, 2009.
[29] E. F. Keller and L. A. Segel. Initiation of slime mold aggregation viewed as an instability. J. Theor. Biol., 26(3):399–415, 1970.
[30] K. J. Painter. Mathematical models for chemotaxis and their applications in self-organisation phenomena. J. Theor. Biol., 481:162–182, 2019.
[31] K. J. Painter, P. K. Maini, and H. G. Othmer. Development and applications of a model for cellular response to multiple chemotactic cues. J. Math. Biol., 41(4):285–314, 2000.
[32] N. Kolbe, N. Sfakianakis, C. Stinner, C. Surulescu, and J. Lenz. Modeling multiple taxis: tumor invasion with phenotypic heterogeneity, haptotaxis, and unilateral interspecies repellence. arXiv preprint arXiv:2005.01444, 2020.
[33] A. Laforgia and P. Natalini. Some inequalities for modified Bessel functions. J. Inequal. Appl., 2010(1):253035, 2010.
[34] L. Lara and I. Schneider. Directed cell migration in multi-cue environments. Integr. Biol., 5(11):1306–1323, 2013.
[35] B. Lods. Semigroup generation properties of streaming operators with noncontractive boundary conditions. Math. Comput. Model., 42:1441–1462, 2005.
[36] N. Loy and L. Preziosi.
Kinetic models with non-local sensing determining cell polarization and speed according to independent cues. J. Math. Biol., 80:373–421, 2019.
[37] N. Loy and L. Preziosi. Modelling physical limits of migration by a kinetic model with non-local sensing. J. Math. Biol., 2019. In press.
[38] G. Maheshwari, A. Wells, L. G. Griffith, and D. A. Lauffenburger. Biophysical integration of effects of epidermal growth factor and fibronectin on fibroblast migration. Biophys. J., 76(5):2814–2823, 1999.
[39] K. V. Mardia and P. E. Jupp. Directional Statistics, volume 494. John Wiley & Sons, 2009.
[40] H. Othmer and T. Hillen. The diffusion limit of transport equations II: Chemotaxis equations. SIAM J. Appl. Math., 62:1222–1250, 2002.
[41] H. Othmer and A. Stevens. Aggregation, blowup, and collapse: The ABC's of taxis in reinforced random walks. SIAM J. Appl. Math., 57:1044–1081, 1997.
[42] H. G. Othmer, S. R. Dunbar, and W. Alt. Models of dispersal in biological systems. J. Math. Biol., 26(3):263–298, 1988.
[43] K. J. Painter. Modelling cell migration strategies in the extracellular matrix. J. Math. Biol., 58(4):511–543, 2008.
[44] K. J. Painter and T. Hillen. Transport and anisotropic diffusion models for movement in oriented habitats. In Lecture Notes in Mathematics, volume 2071, pages 177–222. Springer, 2013.
[45] A. Palczewski. Velocity averaging for boundary value problems, pages 1–284. Ser. Adv. Math. Appl. Sci. World Scientific Publishing Company, 1992.
[46] R. Pettersson. On solutions to the linear Boltzmann equation for granular gases. Transport Theor. Stat., 33(5-7):527–543, 2004.
[47] R. G. Plaza. Derivation of a bacterial nutrient-taxis system with doubly degenerate cross-diffusion as the parabolic limit of a velocity-jump process. J. Math. Biol., 78(6):1681–1711, 2019.
[48] K. E. Pourfarhangi, E. Hoz, A. Cohen, and B. Gligorijevic. Contact guidance is cell cycle-dependent. APL Bioeng., 2:031904, 2018.
[49] P. P. Provenzano, K. W. Eliceiri, J.
M. Campbell, et al. Collagen reorganization at the\ntumor-stromal interface facilitates local invasion. BMC Med., 4(1):38, 2006.\n[50] P. P. Provenzano, K. W. Eliceiri, and P. J. Keely. Shining new light on 3D cell motility and\nthe metastatic process. Trends Cell Biol., 19(11):638–648, 2009.\n[51] A. M. Rajnicek, L. E. Foubister, and C. D. McCaig. Prioritising guidance cues: Directional migration induced by substratum contours and electrical gradients is controlled by a\nrho/cdc42 switch. Dev. Biol., 312(1):448–460, 2007.\n[52] S. W. Rhee, A. M. Taylor, C. H. Tu, D. H. Cribbs, C. Cotman, and N. Li Jeon. Patterned\ncell culture inside microfluidic devices. Lab Chip, 51:102–107, 2005.\n[53] D. Schlüter, I. Ramis-Conde, and M. Chaplain. Computational modeling of single-cell migration: The leading role of extracellular matrix fibers. Biophys. J., 103:1141–51, 2012.\n[54] M. Scianna, L. Preziosi, and K. Wolf. A cellular Potts model simulating cell migration on\nand in matrix environments. Math. Biosci. Eng., 10:235–261, 2013.\n[55] P. Steeg. Targeting metastasis. Nat. Rev. Cancer, 16:201–218, 2016.\n[56] D. W. Stroock. Some stochastic processes which arise from a model of the motion of a\nbacterium. Z. Wahrscheinlichkeit, 28(4):305–315, 1974.\n[57] H. Sundararaghavan, R. Saunders, D. Hammer, and J. Burdick. Fiber alignment directs cell\nmotility over chemotactic gradients. Biotechnol. Bioeng., 110(4):1249–1254, 2013.\n[58] M. A. Wagle and R. T. Tranquillo. A self-consistent cell flux expression for simultaneous\nchemotaxis and contact guidance in tissues. J. Math. Biol., 41(4):315–330, 2000.\n[59] P. C. Wilkinson and J. M. Lackie. The influence of contact guidance on chemotaxis of human\nneutrophil leukocytes. Exp. Cell Res., 145(2):255–264, 1983.\n[60] K. Wolf, I. Mazo, H. Leung, K. Engelke, U. H. von Andrian, E. I. Deryugina, A. Y. Strongin, E.-B. Bröcker, and P. Friedl. 
Compensation mechanism in tumor cell migration: mesenchymal-amoeboid transition after blocking of pericellular proteolysis. J. Cell Biol.,\n160(2):267–277, 2003."
  },
  {
    "path": "Chapter08/gpt-2-train_files/memory_saving_gradients.py",
    "content": "from toposort import toposort\nimport contextlib\nimport numpy as np\nimport tensorflow as tf\nimport tensorflow.contrib.graph_editor as ge\nimport time\nimport sys\nsys.setrecursionlimit(10000)\n# refers back to current module if we decide to split helpers out\nutil = sys.modules[__name__]\n\n# getting rid of \"WARNING:tensorflow:VARIABLES collection name is deprecated\"\nsetattr(tf.GraphKeys, \"VARIABLES\", \"variables\")\n\n# save original gradients since tf.gradient could be monkey-patched to point\n# to our version\nfrom tensorflow.python.ops import gradients as tf_gradients_lib\ntf_gradients = tf_gradients_lib.gradients\n\nMIN_CHECKPOINT_NODE_SIZE=1024    # use lower value during testing\n\n# specific versions we can use to do process-wide replacement of tf.gradients\ndef gradients_speed(ys, xs, grad_ys=None, **kwargs):\n    return gradients(ys, xs, grad_ys, checkpoints='speed', **kwargs)\n\ndef gradients_memory(ys, xs, grad_ys=None, **kwargs):\n    return gradients(ys, xs, grad_ys, checkpoints='memory', **kwargs)\n\ndef gradients_collection(ys, xs, grad_ys=None, **kwargs):\n    return gradients(ys, xs, grad_ys, checkpoints='collection', **kwargs)\n\ndef gradients(ys, xs, grad_ys=None, checkpoints='collection', **kwargs):\n    '''\n    Authors: Tim Salimans & Yaroslav Bulatov\n\n    memory efficient gradient implementation inspired by \"Training Deep Nets with Sublinear Memory Cost\"\n    by Chen et al. 
2016 (https://arxiv.org/abs/1604.06174)\n\n    ys,xs,grad_ys,kwargs are the arguments to standard tensorflow tf.gradients\n    (https://www.tensorflow.org/versions/r0.12/api_docs/python/train.html#gradients)\n\n    'checkpoints' can either be\n        - a list consisting of tensors from the forward pass of the neural net\n          that we should re-use when calculating the gradients in the backward pass\n          all other tensors that do not appear in this list will be re-computed\n        - a string specifying how this list should be determined. currently we support\n            - 'speed':  checkpoint all outputs of convolutions and matmuls. these ops are usually the most expensive,\n                        so checkpointing them maximizes the running speed\n                        (this is a good option if nonlinearities, concats, batchnorms, etc are taking up a lot of memory)\n            - 'memory': try to minimize the memory usage\n                        (currently using a very simple strategy that identifies a number of bottleneck tensors in the graph to checkpoint)\n            - 'collection': look for a tensorflow collection named 'checkpoints', which holds the tensors to checkpoint\n    '''\n\n    #    print(\"Calling memsaving gradients with\", checkpoints)\n    if not isinstance(ys,list):\n        ys = [ys]\n    if not isinstance(xs,list):\n        xs = [xs]\n\n    bwd_ops = ge.get_backward_walk_ops([y.op for y in ys],\n                                       inclusive=True)\n\n    debug_print(\"bwd_ops: %s\", bwd_ops)\n\n    # forward ops are all ops that are candidates for recomputation\n    fwd_ops = ge.get_forward_walk_ops([x.op for x in xs],\n                                      inclusive=True,\n                                      within_ops=bwd_ops)\n    debug_print(\"fwd_ops: %s\", fwd_ops)\n\n    # exclude ops with no inputs\n    fwd_ops = [op for op in fwd_ops if op.inputs]\n\n    # don't recompute xs, remove variables\n    xs_ops = 
_to_ops(xs)\n    fwd_ops = [op for op in fwd_ops if not op in xs_ops]\n    fwd_ops = [op for op in fwd_ops if not '/assign' in op.name]\n    fwd_ops = [op for op in fwd_ops if not '/Assign' in op.name]\n    fwd_ops = [op for op in fwd_ops if not '/read' in op.name]\n    ts_all = ge.filter_ts(fwd_ops, True) # get the tensors\n    ts_all = [t for t in ts_all if '/read' not in t.name]\n    ts_all = set(ts_all) - set(xs) - set(ys)\n\n    # construct list of tensors to checkpoint during forward pass, if not\n    # given as input\n    if type(checkpoints) is not list:\n        if checkpoints == 'collection':\n            checkpoints = tf.get_collection('checkpoints')\n\n        elif checkpoints == 'speed':\n            # checkpoint all expensive ops to maximize running speed\n            checkpoints = ge.filter_ts_from_regex(fwd_ops, 'conv2d|Conv|MatMul')\n\n        elif checkpoints == 'memory':\n\n            # remove very small tensors and some weird ops\n            def fixdims(t): # tf.Dimension values are not compatible with int, convert manually\n                try:\n                    return [int(e if e.value is not None else 64) for e in t]\n                except:\n                    return [0]  # unknown shape\n            ts_all = [t for t in ts_all if np.prod(fixdims(t.shape)) > MIN_CHECKPOINT_NODE_SIZE]\n            ts_all = [t for t in ts_all if 'L2Loss' not in t.name]\n            ts_all = [t for t in ts_all if 'entropy' not in t.name]\n            ts_all = [t for t in ts_all if 'FusedBatchNorm' not in t.name]\n            ts_all = [t for t in ts_all if 'Switch' not in t.name]\n            ts_all = [t for t in ts_all if 'dropout' not in t.name]\n            # DV: FP16_FIX - need to add 'Cast' layer here to make it work for FP16\n            ts_all = [t for t in ts_all if 'Cast' not in t.name]\n\n            # filter out all tensors that are inputs of the backward graph\n            with util.capture_ops() as bwd_ops:\n                tf_gradients(ys, 
xs, grad_ys, **kwargs)\n\n            bwd_inputs = [t for op in bwd_ops for t in op.inputs]\n            # list of tensors in forward graph that is in input to bwd graph\n            ts_filtered = list(set(bwd_inputs).intersection(ts_all))\n            debug_print(\"Using tensors %s\", ts_filtered)\n\n            # try two slightly different ways of getting bottlenecks tensors\n            # to checkpoint\n            for ts in [ts_filtered, ts_all]:\n\n                # get all bottlenecks in the graph\n                bottleneck_ts = []\n                for t in ts:\n                    b = set(ge.get_backward_walk_ops(t.op, inclusive=True, within_ops=fwd_ops))\n                    f = set(ge.get_forward_walk_ops(t.op, inclusive=False, within_ops=fwd_ops))\n                    # check that there are not shortcuts\n                    b_inp = set([inp for op in b for inp in op.inputs]).intersection(ts_all)\n                    f_inp = set([inp for op in f for inp in op.inputs]).intersection(ts_all)\n                    if not set(b_inp).intersection(f_inp) and len(b_inp)+len(f_inp) >= len(ts_all):\n                        bottleneck_ts.append(t)  # we have a bottleneck!\n                    else:\n                        debug_print(\"Rejected bottleneck candidate and ops %s\", [t] + list(set(ts_all) - set(b_inp) - set(f_inp)))\n\n                # success? or try again without filtering?\n                if len(bottleneck_ts) >= np.sqrt(len(ts_filtered)): # yes, enough bottlenecks found!\n                    break\n\n            if not bottleneck_ts:\n                raise Exception('unable to find bottleneck tensors! 
please provide checkpoint nodes manually, or use checkpoints=\"speed\".')\n\n            # sort the bottlenecks\n            bottlenecks_sorted_lists = tf_toposort(bottleneck_ts, within_ops=fwd_ops)\n            sorted_bottlenecks = [t for ts in bottlenecks_sorted_lists for t in ts]\n\n            # save an approximately optimal number ~ sqrt(N)\n            N = len(ts_filtered)\n            if len(bottleneck_ts) <= np.ceil(np.sqrt(N)):\n                checkpoints = sorted_bottlenecks\n            else:\n                step = int(np.ceil(len(bottleneck_ts) / np.sqrt(N)))\n                checkpoints = sorted_bottlenecks[step::step]\n\n        else:\n            raise Exception('%s is unsupported input for \"checkpoints\"' % (checkpoints,))\n\n    checkpoints = list(set(checkpoints).intersection(ts_all))\n\n    # at this point automatic selection happened and checkpoints is list of nodes\n    assert isinstance(checkpoints, list)\n\n    debug_print(\"Checkpoint nodes used: %s\", checkpoints)\n    # better error handling of special cases\n    # xs are already handled as checkpoint nodes, so no need to include them\n    xs_intersect_checkpoints = set(xs).intersection(set(checkpoints))\n    if xs_intersect_checkpoints:\n        debug_print(\"Warning, some input nodes are also checkpoint nodes: %s\",\n                    xs_intersect_checkpoints)\n    ys_intersect_checkpoints = set(ys).intersection(set(checkpoints))\n    debug_print(\"ys: %s, checkpoints: %s, intersect: %s\", ys, checkpoints,\n                ys_intersect_checkpoints)\n    # saving an output node (ys) gives no benefit in memory while creating\n    # new edge cases, exclude them\n    if ys_intersect_checkpoints:\n        debug_print(\"Warning, some output nodes are also checkpoints nodes: %s\",\n              format_ops(ys_intersect_checkpoints))\n\n    # remove initial and terminal nodes from checkpoints list if present\n    checkpoints = list(set(checkpoints) - set(ys) - set(xs))\n\n    # check that 
we have some nodes to checkpoint\n    # if not checkpoints:\n    #     raise Exception('no checkpoints nodes found or given as input! ')\n\n    # disconnect dependencies between checkpointed tensors\n    checkpoints_disconnected = {}\n    for x in checkpoints:\n        if x.op and x.op.name is not None:\n            grad_node = tf.stop_gradient(x, name=x.op.name+\"_sg\")\n        else:\n            grad_node = tf.stop_gradient(x)\n        checkpoints_disconnected[x] = grad_node\n\n    # partial derivatives to the checkpointed tensors and xs\n    ops_to_copy = fast_backward_ops(seed_ops=[y.op for y in ys],\n                                    stop_at_ts=checkpoints, within_ops=fwd_ops)\n    debug_print(\"Found %s ops to copy within fwd_ops %s, seed %s, stop_at %s\",\n                    len(ops_to_copy), fwd_ops, [r.op for r in ys], checkpoints)\n    debug_print(\"ops_to_copy = %s\", ops_to_copy)\n    debug_print(\"Processing list %s\", ys)\n    copied_sgv, info = ge.copy_with_input_replacements(ge.sgv(ops_to_copy), {})\n    for origin_op, op in info._transformed_ops.items():\n        op._set_device(origin_op.node_def.device)\n    copied_ops = info._transformed_ops.values()\n    debug_print(\"Copied %s to %s\", ops_to_copy, copied_ops)\n    ge.reroute_ts(checkpoints_disconnected.values(), checkpoints_disconnected.keys(), can_modify=copied_ops)\n    debug_print(\"Rewired %s in place of %s restricted to %s\",\n                checkpoints_disconnected.values(), checkpoints_disconnected.keys(), copied_ops)\n\n    # get gradients with respect to current boundary + original x's\n    copied_ys = [info._transformed_ops[y.op]._outputs[0] for y in ys]\n    boundary = list(checkpoints_disconnected.values())\n    dv = tf_gradients(ys=copied_ys, xs=boundary+xs, grad_ys=grad_ys, **kwargs)\n    debug_print(\"Got gradients %s\", dv)\n    debug_print(\"for %s\", copied_ys)\n    debug_print(\"with respect to %s\", boundary+xs)\n\n    inputs_to_do_before = [y.op for y in ys]\n    if 
grad_ys is not None:\n        inputs_to_do_before += grad_ys\n    wait_to_do_ops = list(copied_ops) + [g.op for g in dv if g is not None]\n    my_add_control_inputs(wait_to_do_ops, inputs_to_do_before)\n\n    # partial derivatives to the checkpointed nodes\n    # dictionary of \"node: backprop\" for nodes in the boundary\n    d_checkpoints = {r: dr for r,dr in zip(checkpoints_disconnected.keys(),\n                                        dv[:len(checkpoints_disconnected)])}\n    # partial derivatives to xs (usually the params of the neural net)\n    d_xs = dv[len(checkpoints_disconnected):]\n\n    # incorporate derivatives flowing through the checkpointed nodes\n    checkpoints_sorted_lists = tf_toposort(checkpoints, within_ops=fwd_ops)\n    for ts in checkpoints_sorted_lists[::-1]:\n        debug_print(\"Processing list %s\", ts)\n        checkpoints_other = [r for r in checkpoints if r not in ts]\n        checkpoints_disconnected_other = [checkpoints_disconnected[r] for r in checkpoints_other]\n\n        # copy part of the graph below current checkpoint node, stopping at\n        # other checkpoints nodes\n        ops_to_copy = fast_backward_ops(within_ops=fwd_ops, seed_ops=[r.op for r in ts], stop_at_ts=checkpoints_other)\n        debug_print(\"Found %s ops to copy within %s, seed %s, stop_at %s\",\n                    len(ops_to_copy), fwd_ops, [r.op for r in ts],\n                    checkpoints_other)\n        debug_print(\"ops_to_copy = %s\", ops_to_copy)\n        if not ops_to_copy: # we're done!\n            break\n        copied_sgv, info = ge.copy_with_input_replacements(ge.sgv(ops_to_copy), {})\n        for origin_op, op in info._transformed_ops.items():\n            op._set_device(origin_op.node_def.device)\n        copied_ops = info._transformed_ops.values()\n        debug_print(\"Copied %s to %s\", ops_to_copy, copied_ops)\n        ge.reroute_ts(checkpoints_disconnected_other, checkpoints_other, can_modify=copied_ops)\n        debug_print(\"Rewired %s 
in place of %s restricted to %s\",\n                    checkpoints_disconnected_other, checkpoints_other, copied_ops)\n\n        # gradient flowing through the checkpointed node\n        boundary = [info._transformed_ops[r.op]._outputs[0] for r in ts]\n        substitute_backprops = [d_checkpoints[r] for r in ts]\n        dv = tf_gradients(boundary,\n                          checkpoints_disconnected_other+xs,\n                          grad_ys=substitute_backprops, **kwargs)\n        debug_print(\"Got gradients %s\", dv)\n        debug_print(\"for %s\", boundary)\n        debug_print(\"with respect to %s\", checkpoints_disconnected_other+xs)\n        debug_print(\"with boundary backprop substitutions %s\", substitute_backprops)\n\n        inputs_to_do_before = [d_checkpoints[r].op for r in ts]\n        wait_to_do_ops = list(copied_ops) + [g.op for g in dv if g is not None]\n        my_add_control_inputs(wait_to_do_ops, inputs_to_do_before)\n\n        # partial derivatives to the checkpointed nodes\n        for r, dr in zip(checkpoints_other, dv[:len(checkpoints_other)]):\n            if dr is not None:\n                if d_checkpoints[r] is None:\n                    d_checkpoints[r] = dr\n                else:\n                    d_checkpoints[r] += dr\n        def _unsparsify(x):\n            if not isinstance(x, tf.IndexedSlices):\n                return x\n            assert x.dense_shape is not None, \"memory_saving_gradients encountered sparse gradients of unknown shape\"\n            indices = x.indices\n            while indices.shape.ndims < x.values.shape.ndims:\n                indices = tf.expand_dims(indices, -1)\n            return tf.scatter_nd(indices, x.values, x.dense_shape)\n\n        # partial derivatives to xs (usually the params of the neural net)\n        d_xs_new = dv[len(checkpoints_other):]\n        for j in range(len(xs)):\n            if d_xs_new[j] is not None:\n                if d_xs[j] is None:\n                    d_xs[j] = 
_unsparsify(d_xs_new[j])\n                else:\n                    d_xs[j] += _unsparsify(d_xs_new[j])\n\n\n    return d_xs\n\ndef tf_toposort(ts, within_ops=None):\n    all_ops = ge.get_forward_walk_ops([x.op for x in ts], within_ops=within_ops)\n\n    deps = {}\n    for op in all_ops:\n        for o in op.outputs:\n            deps[o] = set(op.inputs)\n    sorted_ts = toposort(deps)\n\n    # only keep the tensors from our original list\n    ts_sorted_lists = []\n    for l in sorted_ts:\n        keep = list(set(l).intersection(ts))\n        if keep:\n            ts_sorted_lists.append(keep)\n\n    return ts_sorted_lists\n\ndef fast_backward_ops(within_ops, seed_ops, stop_at_ts):\n    bwd_ops = set(ge.get_backward_walk_ops(seed_ops, stop_at_ts=stop_at_ts))\n    ops = bwd_ops.intersection(within_ops).difference([t.op for t in stop_at_ts])\n    return list(ops)\n\n@contextlib.contextmanager\ndef capture_ops():\n  \"\"\"Decorator to capture ops created in the block.\n  with capture_ops() as ops:\n    # create some ops\n  print(ops) # => prints ops created.\n  \"\"\"\n\n  micros = int(time.time()*10**6)\n  scope_name = str(micros)\n  op_list = []\n  with tf.name_scope(scope_name):\n    yield op_list\n\n  g = tf.get_default_graph()\n  op_list.extend(ge.select_ops(scope_name+\"/.*\", graph=g))\n\ndef _to_op(tensor_or_op):\n  if hasattr(tensor_or_op, \"op\"):\n    return tensor_or_op.op\n  return tensor_or_op\n\ndef _to_ops(iterable):\n  if not _is_iterable(iterable):\n    return iterable\n  return [_to_op(i) for i in iterable]\n\ndef _is_iterable(o):\n  try:\n    _ = iter(o)\n  except Exception:\n    return False\n  return True\n\nDEBUG_LOGGING=False\ndef debug_print(s, *args):\n  \"\"\"Like logger.log, but also replaces all TensorFlow ops/tensors with their\n  names. 
Sensitive to value of DEBUG_LOGGING, see enable_debug/disable_debug\n\n  Usage:\n    debug_print(\"see tensors %s for %s\", tensorlist, [1,2,3])\n  \"\"\"\n\n  if DEBUG_LOGGING:\n    formatted_args = [format_ops(arg) for arg in args]\n    print(\"DEBUG \"+s % tuple(formatted_args))\n\ndef format_ops(ops, sort_outputs=True):\n  \"\"\"Helper method for printing ops. Converts Tensor/Operation op to op.name,\n  rest to str(op).\"\"\"\n\n  if hasattr(ops, '__iter__') and not isinstance(ops, str):\n    l = [(op.name if hasattr(op, \"name\") else str(op)) for op in ops]\n    if sort_outputs:\n      return sorted(l)\n    return l\n  else:\n    return ops.name if hasattr(ops, \"name\") else str(ops)\n\ndef my_add_control_inputs(wait_to_do_ops, inputs_to_do_before):\n    for op in wait_to_do_ops:\n        ci = [i for i in inputs_to_do_before if op.control_inputs is None or i not in op.control_inputs]\n        ge.add_control_inputs(op, ci)"
  },
  {
    "path": "Chapter08/gpt-2-train_files/train.py",
    "content": "#!/usr/bin/env python3\n# Usage:\n#  PYTHONPATH=src ./train --dataset <file|directory|glob>\n\nimport argparse\nimport json\nimport os\nimport numpy as np\nimport tensorflow as tf\nimport time\nimport tqdm\nfrom tensorflow.core.protobuf import rewriter_config_pb2\n\nimport model, sample, encoder\nfrom load_dataset import load_dataset, Sampler\nfrom accumulate import AccumulatingOptimizer\nimport memory_saving_gradients\n\nCHECKPOINT_DIR = 'checkpoint'\nSAMPLE_DIR = 'samples'\n\n\nparser = argparse.ArgumentParser(\n    description='Fine-tune GPT-2 on your custom dataset.',\n    formatter_class=argparse.ArgumentDefaultsHelpFormatter)\n\nparser.add_argument('--dataset', metavar='PATH', type=str, required=True, help='Input file, directory, or glob pattern (utf-8 text, or preencoded .npz files).')\nparser.add_argument('--model_name', metavar='MODEL', type=str, default='117M', help='Pretrained model name')\nparser.add_argument('--combine', metavar='CHARS', type=int, default=50000, help='Concatenate input files with <|endoftext|> separator into chunks of this minimum size')\nparser.add_argument('--encoding', type=str, default='utf-8', help='Set the encoding for reading and writing files.')\n\nparser.add_argument('--batch_size', metavar='SIZE', type=int, default=1, help='Batch size')\nparser.add_argument('--learning_rate', metavar='LR', type=float, default=0.00002, help='Learning rate for Adam')\nparser.add_argument('--accumulate_gradients', metavar='N', type=int, default=1, help='Accumulate gradients across N minibatches.')\nparser.add_argument('--memory_saving_gradients', default=False, action='store_true', help='Use gradient checkpointing to reduce vram usage.')\nparser.add_argument('--only_train_transformer_layers', default=False, action='store_true', help='Restrict training to the transformer blocks.')\nparser.add_argument('--optimizer', type=str, default='adam', help='Optimizer. 
<adam|sgd>.')\nparser.add_argument('--noise', type=float, default=0.0, help='Add noise to input training data to regularize against typos.')\n\nparser.add_argument('--top_k', type=int, default=40, help='K for top-k sampling.')\nparser.add_argument('--top_p', type=float, default=0.0, help='P for top-p sampling. Overrides top_k if set > 0.')\n\nparser.add_argument('--restore_from', type=str, default='latest', help='Either \"latest\", \"fresh\", or a path to a checkpoint file')\nparser.add_argument('--run_name', type=str, default='run1', help='Run id. Name of subdirectory in checkpoint/ and samples/')\nparser.add_argument('--sample_every', metavar='N', type=int, default=100, help='Generate samples every N steps')\nparser.add_argument('--sample_length', metavar='TOKENS', type=int, default=1023, help='Sample this many tokens')\nparser.add_argument('--sample_num', metavar='N', type=int, default=1, help='Generate this many samples')\nparser.add_argument('--save_every', metavar='N', type=int, default=1000, help='Write a checkpoint every N steps')\n\nparser.add_argument('--val_dataset', metavar='PATH', type=str, default=None, help='Dataset for validation loss, defaults to --dataset.')\nparser.add_argument('--val_batch_size', metavar='SIZE', type=int, default=2, help='Batch size for validation.')\nparser.add_argument('--val_batch_count', metavar='N', type=int, default=40, help='Number of batches for validation.')\nparser.add_argument('--val_every', metavar='STEPS', type=int, default=0, help='Calculate validation loss every STEPS steps.')\n\n\ndef maketree(path):\n    try:\n        os.makedirs(path)\n    except:\n        pass\n\n\ndef randomize(context, hparams, p):\n    if p > 0:\n        mask = tf.random.uniform(shape=tf.shape(context)) < p\n        noise = tf.random.uniform(shape=tf.shape(context), minval=0, maxval=hparams.n_vocab, dtype=tf.int32)\n        return tf.where(mask, noise, context)\n    else:\n        return context\n\n\ndef main():\n    args = 
parser.parse_args()\n    models_dir='/content/gpt-2/src/models'\n    enc = encoder.get_encoder(args.model_name,models_dir)\n    hparams = model.default_hparams()\n    with open(os.path.join('models', args.model_name, 'hparams.json')) as f:\n        hparams.override_from_dict(json.load(f))\n\n    if args.sample_length > hparams.n_ctx:\n        raise ValueError(\n            \"Can't get samples longer than window size: %s\" % hparams.n_ctx)\n\n    if args.model_name == '345M':\n        args.memory_saving_gradients = True\n        if args.optimizer == 'adam':\n            args.only_train_transformer_layers = True\n\n    config = tf.ConfigProto()\n    config.gpu_options.allow_growth = True\n    config.graph_options.rewrite_options.layout_optimizer = rewriter_config_pb2.RewriterConfig.OFF\n    with tf.Session(config=config) as sess:\n        context = tf.placeholder(tf.int32, [args.batch_size, None])\n        context_in = randomize(context, hparams, args.noise)\n        output = model.model(hparams=hparams, X=context_in)\n        loss = tf.reduce_mean(\n            tf.nn.sparse_softmax_cross_entropy_with_logits(\n                labels=context[:, 1:], logits=output['logits'][:, :-1]))\n\n        if args.val_every > 0:\n            val_context = tf.placeholder(tf.int32, [args.val_batch_size, None])\n            val_output = model.model(hparams=hparams, X=val_context)\n            val_loss = tf.reduce_mean(\n                tf.nn.sparse_softmax_cross_entropy_with_logits(\n                    labels=val_context[:, 1:], logits=val_output['logits'][:, :-1]))\n            val_loss_summary = tf.summary.scalar('val_loss', val_loss)\n\n\n        tf_sample = sample.sample_sequence(\n            hparams=hparams,\n            length=args.sample_length,\n            context=context,\n            batch_size=args.batch_size,\n            temperature=1.0,\n            top_k=args.top_k,\n            top_p=args.top_p)\n\n        all_vars = [v for v in tf.trainable_variables() if 'model' 
in v.name]\n        train_vars = [v for v in all_vars if '/h' in v.name] if args.only_train_transformer_layers else all_vars\n\n        if args.optimizer == 'adam':\n            opt = tf.train.AdamOptimizer(learning_rate=args.learning_rate)\n        elif args.optimizer == 'sgd':\n            opt = tf.train.GradientDescentOptimizer(learning_rate=args.learning_rate)\n        else:\n            exit('Bad optimizer:', args.optimizer)\n\n        if args.accumulate_gradients > 1:\n            if args.memory_saving_gradients:\n                exit(\"Memory saving gradients are not implemented for gradient accumulation yet.\")\n            opt = AccumulatingOptimizer(\n                opt=opt,\n                var_list=train_vars)\n            opt_reset = opt.reset()\n            opt_compute = opt.compute_gradients(loss)\n            opt_apply = opt.apply_gradients()\n            summary_loss = tf.summary.scalar('loss', opt_apply)\n        else:\n            if args.memory_saving_gradients:\n                opt_grads = memory_saving_gradients.gradients(loss, train_vars)\n            else:\n                opt_grads = tf.gradients(loss, train_vars)\n            opt_grads = list(zip(opt_grads, train_vars))\n            opt_apply = opt.apply_gradients(opt_grads)\n            summary_loss = tf.summary.scalar('loss', loss)\n\n        summary_lr = tf.summary.scalar('learning_rate', args.learning_rate)\n        summaries = tf.summary.merge([summary_lr, summary_loss])\n\n        summary_log = tf.summary.FileWriter(\n            os.path.join(CHECKPOINT_DIR, args.run_name))\n\n        saver = tf.train.Saver(\n            var_list=all_vars,\n            max_to_keep=5,\n            keep_checkpoint_every_n_hours=2)\n        sess.run(tf.global_variables_initializer())\n\n        if args.restore_from == 'latest':\n            ckpt = tf.train.latest_checkpoint(\n                os.path.join(CHECKPOINT_DIR, args.run_name))\n            if ckpt is None:\n                # Get fresh GPT 
weights if new run.\n                ckpt = tf.train.latest_checkpoint(\n                    os.path.join('models', args.model_name))\n        elif args.restore_from == 'fresh':\n            ckpt = tf.train.latest_checkpoint(\n                os.path.join('models', args.model_name))\n        else:\n            ckpt = tf.train.latest_checkpoint(args.restore_from)\n        print('Loading checkpoint', ckpt)\n        saver.restore(sess, ckpt)\n\n        print('Loading dataset...')\n        chunks = load_dataset(enc, args.dataset, args.combine, encoding=args.encoding)\n        data_sampler = Sampler(chunks)\n        if args.val_every > 0:\n            if args.val_dataset:\n                val_chunks = load_dataset(enc, args.val_dataset, args.combine, encoding=args.encoding)\n            else:\n                val_chunks = chunks\n        print('dataset has', data_sampler.total_size, 'tokens')\n        print('Training...')\n\n        if args.val_every > 0:\n            # Sample from validation set once with fixed seed to make\n            # it deterministic during training as well as across runs.\n            val_data_sampler = Sampler(val_chunks, seed=1)\n            val_batches = [[val_data_sampler.sample(1024) for _ in range(args.val_batch_size)]\n                           for _ in range(args.val_batch_count)]\n\n        counter = 1\n        counter_path = os.path.join(CHECKPOINT_DIR, args.run_name, 'counter')\n        if os.path.exists(counter_path):\n            # Load the step number if we're resuming a run\n            # Add 1 so we don't immediately try to save again\n            with open(counter_path, 'r') as fp:\n                counter = int(fp.read()) + 1\n\n        def save():\n            maketree(os.path.join(CHECKPOINT_DIR, args.run_name))\n            print(\n                'Saving',\n                os.path.join(CHECKPOINT_DIR, args.run_name,\n                             'model-{}').format(counter))\n            saver.save(\n                sess,\n  
              os.path.join(CHECKPOINT_DIR, args.run_name, 'model'),\n                global_step=counter)\n            with open(counter_path, 'w') as fp:\n                fp.write(str(counter) + '\\n')\n\n        def generate_samples():\n            print('Generating samples...')\n            context_tokens = data_sampler.sample(1)\n            all_text = []\n            index = 0\n            while index < args.sample_num:\n                out = sess.run(\n                    tf_sample,\n                    feed_dict={context: args.batch_size * [context_tokens]})\n                for i in range(min(args.sample_num - index, args.batch_size)):\n                    text = enc.decode(out[i])\n                    text = '======== SAMPLE {} ========\\n{}\\n'.format(\n                        index + 1, text)\n                    all_text.append(text)\n                    index += 1\n            print(text)\n            maketree(os.path.join(SAMPLE_DIR, args.run_name))\n            with open(\n                    os.path.join(SAMPLE_DIR, args.run_name,\n                                 'samples-{}').format(counter), 'w', encoding=args.encoding) as fp:\n                fp.write('\\n'.join(all_text))\n\n        def validation():\n            print('Calculating validation loss...')\n            losses = []\n            for batch in tqdm.tqdm(val_batches):\n                losses.append(sess.run(val_loss, feed_dict={val_context: batch}))\n            v_val_loss = np.mean(losses)\n            v_summary = sess.run(val_loss_summary, feed_dict={val_loss: v_val_loss})\n            summary_log.add_summary(v_summary, counter)\n            summary_log.flush()\n            print(\n                '[{counter} | {time:2.2f}] validation loss = {loss:2.2f}'\n                .format(\n                    counter=counter,\n                    time=time.time() - start_time,\n                    loss=v_val_loss))\n\n        def sample_batch():\n            return [data_sampler.sample(1024) 
for _ in range(args.batch_size)]\n\n\n        avg_loss = (0.0, 0.0)\n        start_time = time.time()\n\n        try:\n            while True:\n                if counter % args.save_every == 0:\n                    save()\n                if counter % args.sample_every == 0:\n                    generate_samples()\n                if args.val_every > 0 and (counter % args.val_every == 0 or counter == 1):\n                    validation()\n\n                if args.accumulate_gradients > 1:\n                    sess.run(opt_reset)\n                    for _ in range(args.accumulate_gradients):\n                        sess.run(\n                            opt_compute, feed_dict={context: sample_batch()})\n                    (v_loss, v_summary) = sess.run((opt_apply, summaries))\n                else:\n                    (_, v_loss, v_summary) = sess.run(\n                        (opt_apply, loss, summaries),\n                        feed_dict={context: sample_batch()})\n\n                summary_log.add_summary(v_summary, counter)\n\n                avg_loss = (avg_loss[0] * 0.99 + v_loss,\n                            avg_loss[1] * 0.99 + 1.0)\n\n                print(\n                    '[{counter} | {time:2.2f}] loss={loss:2.2f} avg={avg:2.2f}'\n                    .format(\n                        counter=counter,\n                        time=time.time() - start_time,\n                        loss=v_loss,\n                        avg=avg_loss[0] / avg_loss[1]))\n\n                counter += 1\n        except KeyboardInterrupt:\n            print('interrupted')\n            save()\n\n\nif __name__ == '__main__':\n    main()\n"
  },
  {
    "path": "Chapter09/SRL.ipynb",
    "content": "{\n  \"nbformat\": 4,\n  \"nbformat_minor\": 0,\n  \"metadata\": {\n    \"colab\": {\n      \"name\": \"SRL.ipynb\",\n      \"provenance\": [],\n      \"collapsed_sections\": [],\n      \"toc_visible\": true\n    },\n    \"kernelspec\": {\n      \"name\": \"python3\",\n      \"display_name\": \"Python 3\"\n    }\n  },\n  \"cells\": [\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"bzRpKWxSgZmb\"\n      },\n      \"source\": [\n        \"# Semantic Role Labeling (SRL)\\n\",\n        \"\\n\",\n        \"The notebook is an implementation of the Allen Institute for AI BERT-based model. [Reference usage of the Notebook](https://demo.allennlp.org/semantic-role-labeling/MjE4NjI1Ng==)\\n\",\n        \"\\n\",\n        \"The BERT-based model is an implementation of [Peng Shi and Jimmy Lin, (2019), ‘Simple BERT Models for Relation Extraction and Semantic Role Labeling’]( https://arxiv.org/abs/1904.05255)\\n\"\n      ]\n    },\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"9aeqrxgQhKmE\"\n      },\n      \"source\": [\n        \"Installing AllenNLP\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"XAIkwYFaeBBD\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"d75870e6-02e6-4d6f-f165-c89bf32e17bb\"\n      },\n      \"source\": [\n        \"!pip install allennlp==1.0.0 allennlp-models==1.0.0\"\n      ],\n      \"execution_count\": 1,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Collecting allennlp==1.0.0\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/2c/49/bf0ec241496a82c9dd2f0b6ff6f8156b6b2b72b849df8c00a4f2bcf61485/allennlp-1.0.0-py3-none-any.whl (473kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 481kB 9.4MB/s \\n\",\n            
\"\\u001b[?25hCollecting allennlp-models==1.0.0\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/3d/d5/9ee1d0b8c217b6978e42e54fbab8bafe9e792f0f8262f381dde44cee44ae/allennlp_models-1.0.0-py3-none-any.whl (282kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 286kB 28.6MB/s \\n\",\n            \"\\u001b[?25hRequirement already satisfied: tqdm>=4.19 in /usr/local/lib/python3.6/dist-packages (from allennlp==1.0.0) (4.41.1)\\n\",\n            \"Requirement already satisfied: spacy<2.3,>=2.1.0 in /usr/local/lib/python3.6/dist-packages (from allennlp==1.0.0) (2.2.4)\\n\",\n            \"Requirement already satisfied: scikit-learn in /usr/local/lib/python3.6/dist-packages (from allennlp==1.0.0) (0.22.2.post1)\\n\",\n            \"Requirement already satisfied: scipy in /usr/local/lib/python3.6/dist-packages (from allennlp==1.0.0) (1.4.1)\\n\",\n            \"Requirement already satisfied: nltk in /usr/local/lib/python3.6/dist-packages (from allennlp==1.0.0) (3.2.5)\\n\",\n            \"Collecting torch<1.6.0,>=1.5.0\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/62/01/457b49d790b6c4b9720e6f9dbbb617692f6ce8afdaadf425c055c41a7416/torch-1.5.1-cp36-cp36m-manylinux1_x86_64.whl (753.2MB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 753.2MB 23kB/s \\n\",\n            \"\\u001b[?25hRequirement already satisfied: pytest in /usr/local/lib/python3.6/dist-packages (from allennlp==1.0.0) (3.6.4)\\n\",\n            \"Requirement already satisfied: dataclasses; python_version < \\\"3.7\\\" in /usr/local/lib/python3.6/dist-packages (from allennlp==1.0.0) (0.8)\\n\",\n            \"Collecting overrides==3.0.0\\n\",\n            \"  Downloading https://files.pythonhosted.org/packages/42/8d/caa729f809ecdf8e76fac3c1ff7d3f0b72c398c9dd8a6919927a30a873b3/overrides-3.0.0.tar.gz\\n\",\n            \"Requirement already satisfied: requests>=2.18 in 
/usr/local/lib/python3.6/dist-packages (from allennlp==1.0.0) (2.23.0)\\n\",\n            \"Collecting jsonpickle\\n\",\n            \"  Downloading https://files.pythonhosted.org/packages/ee/d5/1cc282dc23346a43aab461bf2e8c36593aacd34242bee1a13fa750db0cfe/jsonpickle-1.4.2-py2.py3-none-any.whl\\n\",\n            \"Collecting transformers<2.12,>=2.9\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/48/35/ad2c5b1b8f99feaaf9d7cdadaeef261f098c6e1a6a2935d4d07662a6b780/transformers-2.11.0-py3-none-any.whl (674kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 675kB 42.1MB/s \\n\",\n            \"\\u001b[?25hRequirement already satisfied: filelock<3.1,>=3.0 in /usr/local/lib/python3.6/dist-packages (from allennlp==1.0.0) (3.0.12)\\n\",\n            \"Collecting tensorboardX>=1.2\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/af/0c/4f41bcd45db376e6fe5c619c01100e9b7531c55791b7244815bac6eac32c/tensorboardX-2.1-py2.py3-none-any.whl (308kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 317kB 30.5MB/s \\n\",\n            \"\\u001b[?25hCollecting boto3\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/f6/bf/6fd00f2e8b2f9e8688b10b616c66be985a0053729cb1e92ac2f6e8ec1cd6/boto3-1.16.40-py2.py3-none-any.whl (130kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 133kB 47.7MB/s \\n\",\n            \"\\u001b[?25hRequirement already satisfied: numpy in /usr/local/lib/python3.6/dist-packages (from allennlp==1.0.0) (1.19.4)\\n\",\n            \"Requirement already satisfied: h5py in /usr/local/lib/python3.6/dist-packages (from allennlp==1.0.0) (2.10.0)\\n\",\n            \"Collecting jsonnet>=0.10.0; sys_platform != \\\"win32\\\"\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/42/40/6f16e5ac994b16fa71c24310f97174ce07d3a97b433275589265c6b94d2b/jsonnet-0.17.0.tar.gz 
(259kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 266kB 59.0MB/s \\n\",\n            \"\\u001b[?25hCollecting word2number>=1.1\\n\",\n            \"  Downloading https://files.pythonhosted.org/packages/4a/29/a31940c848521f0725f0df6b25dca8917f13a2025b0e8fcbe5d0457e45e6/word2number-1.1.zip\\n\",\n            \"Collecting conllu==3.0\\n\",\n            \"  Downloading https://files.pythonhosted.org/packages/66/0b/a8863b5c14aee200a13a0f8c28550fd0132e947ae88441c9f517eb84613b/conllu-3.0-py2.py3-none-any.whl\\n\",\n            \"Collecting py-rouge==1.1\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/9c/1d/0bdbaf559fb7afe32308ebc84a2028600988212d7eb7fb9f69c4e829e4a0/py_rouge-1.1-py3-none-any.whl (56kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 61kB 5.9MB/s \\n\",\n            \"\\u001b[?25hRequirement already satisfied: thinc==7.4.0 in /usr/local/lib/python3.6/dist-packages (from spacy<2.3,>=2.1.0->allennlp==1.0.0) (7.4.0)\\n\",\n            \"Requirement already satisfied: preshed<3.1.0,>=3.0.2 in /usr/local/lib/python3.6/dist-packages (from spacy<2.3,>=2.1.0->allennlp==1.0.0) (3.0.5)\\n\",\n            \"Requirement already satisfied: srsly<1.1.0,>=1.0.2 in /usr/local/lib/python3.6/dist-packages (from spacy<2.3,>=2.1.0->allennlp==1.0.0) (1.0.5)\\n\",\n            \"Requirement already satisfied: cymem<2.1.0,>=2.0.2 in /usr/local/lib/python3.6/dist-packages (from spacy<2.3,>=2.1.0->allennlp==1.0.0) (2.0.5)\\n\",\n            \"Requirement already satisfied: plac<1.2.0,>=0.9.6 in /usr/local/lib/python3.6/dist-packages (from spacy<2.3,>=2.1.0->allennlp==1.0.0) (1.1.3)\\n\",\n            \"Requirement already satisfied: blis<0.5.0,>=0.4.0 in /usr/local/lib/python3.6/dist-packages (from spacy<2.3,>=2.1.0->allennlp==1.0.0) (0.4.1)\\n\",\n            \"Requirement already satisfied: catalogue<1.1.0,>=0.0.7 in /usr/local/lib/python3.6/dist-packages (from 
spacy<2.3,>=2.1.0->allennlp==1.0.0) (1.0.0)\\n\",\n            \"Requirement already satisfied: setuptools in /usr/local/lib/python3.6/dist-packages (from spacy<2.3,>=2.1.0->allennlp==1.0.0) (50.3.2)\\n\",\n            \"Requirement already satisfied: wasabi<1.1.0,>=0.4.0 in /usr/local/lib/python3.6/dist-packages (from spacy<2.3,>=2.1.0->allennlp==1.0.0) (0.8.0)\\n\",\n            \"Requirement already satisfied: murmurhash<1.1.0,>=0.28.0 in /usr/local/lib/python3.6/dist-packages (from spacy<2.3,>=2.1.0->allennlp==1.0.0) (1.0.5)\\n\",\n            \"Requirement already satisfied: joblib>=0.11 in /usr/local/lib/python3.6/dist-packages (from scikit-learn->allennlp==1.0.0) (1.0.0)\\n\",\n            \"Requirement already satisfied: six in /usr/local/lib/python3.6/dist-packages (from nltk->allennlp==1.0.0) (1.15.0)\\n\",\n            \"Requirement already satisfied: future in /usr/local/lib/python3.6/dist-packages (from torch<1.6.0,>=1.5.0->allennlp==1.0.0) (0.16.0)\\n\",\n            \"Requirement already satisfied: attrs>=17.4.0 in /usr/local/lib/python3.6/dist-packages (from pytest->allennlp==1.0.0) (20.3.0)\\n\",\n            \"Requirement already satisfied: more-itertools>=4.0.0 in /usr/local/lib/python3.6/dist-packages (from pytest->allennlp==1.0.0) (8.6.0)\\n\",\n            \"Requirement already satisfied: pluggy<0.8,>=0.5 in /usr/local/lib/python3.6/dist-packages (from pytest->allennlp==1.0.0) (0.7.1)\\n\",\n            \"Requirement already satisfied: py>=1.5.0 in /usr/local/lib/python3.6/dist-packages (from pytest->allennlp==1.0.0) (1.10.0)\\n\",\n            \"Requirement already satisfied: atomicwrites>=1.0 in /usr/local/lib/python3.6/dist-packages (from pytest->allennlp==1.0.0) (1.4.0)\\n\",\n            \"Requirement already satisfied: idna<3,>=2.5 in /usr/local/lib/python3.6/dist-packages (from requests>=2.18->allennlp==1.0.0) (2.10)\\n\",\n            \"Requirement already satisfied: chardet<4,>=3.0.2 in /usr/local/lib/python3.6/dist-packages (from 
requests>=2.18->allennlp==1.0.0) (3.0.4)\\n\",\n            \"Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.6/dist-packages (from requests>=2.18->allennlp==1.0.0) (2020.12.5)\\n\",\n            \"Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in /usr/local/lib/python3.6/dist-packages (from requests>=2.18->allennlp==1.0.0) (1.24.3)\\n\",\n            \"Requirement already satisfied: importlib-metadata; python_version < \\\"3.8\\\" in /usr/local/lib/python3.6/dist-packages (from jsonpickle->allennlp==1.0.0) (3.3.0)\\n\",\n            \"Collecting tokenizers==0.7.0\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/14/e5/a26eb4716523808bb0a799fcfdceb6ebf77a18169d9591b2f46a9adb87d9/tokenizers-0.7.0-cp36-cp36m-manylinux1_x86_64.whl (3.8MB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 3.8MB 48.9MB/s \\n\",\n            \"\\u001b[?25hCollecting sentencepiece\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/e5/2d/6d4ca4bef9a67070fa1cac508606328329152b1df10bdf31fb6e4e727894/sentencepiece-0.1.94-cp36-cp36m-manylinux2014_x86_64.whl (1.1MB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 1.1MB 47.5MB/s \\n\",\n            \"\\u001b[?25hCollecting sacremoses\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/7d/34/09d19aff26edcc8eb2a01bed8e98f13a1537005d31e95233fd48216eed10/sacremoses-0.0.43.tar.gz (883kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 890kB 50.1MB/s \\n\",\n            \"\\u001b[?25hRequirement already satisfied: regex!=2019.12.17 in /usr/local/lib/python3.6/dist-packages (from transformers<2.12,>=2.9->allennlp==1.0.0) (2019.12.20)\\n\",\n            \"Requirement already satisfied: packaging in /usr/local/lib/python3.6/dist-packages (from transformers<2.12,>=2.9->allennlp==1.0.0) (20.8)\\n\",\n            \"Requirement already 
satisfied: protobuf>=3.8.0 in /usr/local/lib/python3.6/dist-packages (from tensorboardX>=1.2->allennlp==1.0.0) (3.12.4)\\n\",\n            \"Collecting jmespath<1.0.0,>=0.7.1\\n\",\n            \"  Downloading https://files.pythonhosted.org/packages/07/cb/5f001272b6faeb23c1c9e0acc04d48eaaf5c862c17709d20e3469c6e0139/jmespath-0.10.0-py2.py3-none-any.whl\\n\",\n            \"Collecting botocore<1.20.0,>=1.19.40\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/ea/77/13fc099a10c22d08d766e244412c114694e21982c04cafc1ade33d8a430c/botocore-1.19.40-py2.py3-none-any.whl (7.1MB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 7.1MB 34.3MB/s \\n\",\n            \"\\u001b[?25hCollecting s3transfer<0.4.0,>=0.3.0\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/69/79/e6afb3d8b0b4e96cefbdc690f741d7dd24547ff1f94240c997a26fa908d3/s3transfer-0.3.3-py2.py3-none-any.whl (69kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 71kB 7.6MB/s \\n\",\n            \"\\u001b[?25hRequirement already satisfied: typing-extensions>=3.6.4; python_version < \\\"3.8\\\" in /usr/local/lib/python3.6/dist-packages (from importlib-metadata; python_version < \\\"3.8\\\"->jsonpickle->allennlp==1.0.0) (3.7.4.3)\\n\",\n            \"Requirement already satisfied: zipp>=0.5 in /usr/local/lib/python3.6/dist-packages (from importlib-metadata; python_version < \\\"3.8\\\"->jsonpickle->allennlp==1.0.0) (3.4.0)\\n\",\n            \"Requirement already satisfied: click in /usr/local/lib/python3.6/dist-packages (from sacremoses->transformers<2.12,>=2.9->allennlp==1.0.0) (7.1.2)\\n\",\n            \"Requirement already satisfied: pyparsing>=2.0.2 in /usr/local/lib/python3.6/dist-packages (from packaging->transformers<2.12,>=2.9->allennlp==1.0.0) (2.4.7)\\n\",\n            \"Requirement already satisfied: python-dateutil<3.0.0,>=2.1 in /usr/local/lib/python3.6/dist-packages (from 
botocore<1.20.0,>=1.19.40->boto3->allennlp==1.0.0) (2.8.1)\\n\",\n            \"Building wheels for collected packages: overrides, jsonnet, word2number, sacremoses\\n\",\n            \"  Building wheel for overrides (setup.py) ... \\u001b[?25l\\u001b[?25hdone\\n\",\n            \"  Created wheel for overrides: filename=overrides-3.0.0-cp36-none-any.whl size=5669 sha256=6f80143088a78455dd287d78fdced3dd986f9bb4a23edb94ec3376db1b81df6f\\n\",\n            \"  Stored in directory: /root/.cache/pip/wheels/6f/1b/ec/6c71a1eb823df7f850d956b2d8c50a6d49c191e1063d73b9be\\n\",\n            \"  Building wheel for jsonnet (setup.py) ... \\u001b[?25l\\u001b[?25hdone\\n\",\n            \"  Created wheel for jsonnet: filename=jsonnet-0.17.0-cp36-cp36m-linux_x86_64.whl size=3387942 sha256=1418eb22c110c8535ec13a7e223704f7d00de2f9d27b395203bcc0a6044e03aa\\n\",\n            \"  Stored in directory: /root/.cache/pip/wheels/26/7a/37/7dbcc30a6b4efd17b91ad1f0128b7bbf84813bd4e1cfb8c1e3\\n\",\n            \"  Building wheel for word2number (setup.py) ... \\u001b[?25l\\u001b[?25hdone\\n\",\n            \"  Created wheel for word2number: filename=word2number-1.1-cp36-none-any.whl size=5588 sha256=6c711b492080c629e45bd6f594bbd082f84b77fb2859e691b07f8cc43e891868\\n\",\n            \"  Stored in directory: /root/.cache/pip/wheels/46/2f/53/5f5c1d275492f2fce1cdab9a9bb12d49286dead829a4078e0e\\n\",\n            \"  Building wheel for sacremoses (setup.py) ... 
\\u001b[?25l\\u001b[?25hdone\\n\",\n            \"  Created wheel for sacremoses: filename=sacremoses-0.0.43-cp36-none-any.whl size=893261 sha256=ca22850c9a27802373c980ccdce43020db3a1c30576474ebc27c849dd7a8374e\\n\",\n            \"  Stored in directory: /root/.cache/pip/wheels/29/3c/fd/7ce5c3f0666dab31a50123635e6fb5e19ceb42ce38d4e58f45\\n\",\n            \"Successfully built overrides jsonnet word2number sacremoses\\n\",\n            \"\\u001b[31mERROR: torchvision 0.8.1+cu101 has requirement torch==1.7.0, but you'll have torch 1.5.1 which is incompatible.\\u001b[0m\\n\",\n            \"\\u001b[31mERROR: botocore 1.19.40 has requirement urllib3<1.27,>=1.25.4; python_version != \\\"3.4\\\", but you'll have urllib3 1.24.3 which is incompatible.\\u001b[0m\\n\",\n            \"Installing collected packages: torch, overrides, jsonpickle, tokenizers, sentencepiece, sacremoses, transformers, tensorboardX, jmespath, botocore, s3transfer, boto3, jsonnet, allennlp, word2number, conllu, py-rouge, allennlp-models\\n\",\n            \"  Found existing installation: torch 1.7.0+cu101\\n\",\n            \"    Uninstalling torch-1.7.0+cu101:\\n\",\n            \"      Successfully uninstalled torch-1.7.0+cu101\\n\",\n            \"Successfully installed allennlp-1.0.0 allennlp-models-1.0.0 boto3-1.16.40 botocore-1.19.40 conllu-3.0 jmespath-0.10.0 jsonnet-0.17.0 jsonpickle-1.4.2 overrides-3.0.0 py-rouge-1.1 s3transfer-0.3.3 sacremoses-0.0.43 sentencepiece-0.1.94 tensorboardX-2.1 tokenizers-0.7.0 torch-1.5.1 transformers-2.11.0 word2number-1.1\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"EcSZJu8ohUv5\"\n      },\n      \"source\": [\n        \"Sample 1: Did Bob really think he could prepare a meal for 50 people in only a few hours?\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"1pziWuZKeMti\",\n        \"colab\": 
{\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"cef0cbfb-9256-45d6-9419-3cba6bd616c3\"\n      },\n      \"source\": [\n        \"!echo '{\\\"sentence\\\": \\\"Did Bob really think he could prepare a meal for 50 people in only a few hours?\\\"}' | \\\\\\n\",\n        \"allennlp predict https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz -\"\n      ],\n      \"execution_count\": 2,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"2020-12-20 09:07:37,371 - INFO - transformers.file_utils - PyTorch version 1.5.1 available.\\n\",\n            \"2020-12-20 09:07:37.529750: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library libcudart.so.10.1\\n\",\n            \"2020-12-20 09:07:39,325 - INFO - transformers.file_utils - TensorFlow version 2.4.0 available.\\n\",\n            \"[nltk_data] Downloading package punkt to /root/nltk_data...\\n\",\n            \"[nltk_data]   Unzipping tokenizers/punkt.zip.\\n\",\n            \"[nltk_data] Downloading package wordnet to /root/nltk_data...\\n\",\n            \"[nltk_data]   Unzipping corpora/wordnet.zip.\\n\",\n            \"2020-12-20 09:07:43,159 - INFO - allennlp.common.file_utils - checking cache for https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz at /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\\n\",\n            \"2020-12-20 09:07:43,159 - INFO - allennlp.common.file_utils - waiting to acquire lock on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\\n\",\n            \"2020-12-20 09:07:43,160 - INFO - filelock - Lock 140370276310488 acquired on 
/root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724.lock\\n\",\n            \"2020-12-20 09:07:43,161 - INFO - allennlp.common.file_utils - https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz not found in cache, downloading to /root/.allennlp/cache/tmpqkld8r7a.tmp\\n\",\n            \"100% 406056588/406056588 [00:06<00:00, 64834243.25B/s]\\n\",\n            \"2020-12-20 09:07:49,714 - INFO - allennlp.common.file_utils - Renaming temp file /root/.allennlp/cache/tmpqkld8r7a.tmp to cache at /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\\n\",\n            \"2020-12-20 09:07:49,714 - INFO - allennlp.common.file_utils - creating metadata file for /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\\n\",\n            \"2020-12-20 09:07:49,715 - INFO - filelock - Lock 140370276310488 released on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724.lock\\n\",\n            \"2020-12-20 09:07:49,715 - INFO - allennlp.models.archival - loading archive file https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz from cache at /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\\n\",\n            \"2020-12-20 09:07:49,715 - INFO - allennlp.models.archival - extracting archive file /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724 to temp dir /tmp/tmpaqcbgixa\\n\",\n            \"2020-12-20 09:07:54,008 - INFO - 
allennlp.common.params - type = from_instances\\n\",\n            \"2020-12-20 09:07:54,009 - INFO - allennlp.data.vocabulary - Loading token dictionary from /tmp/tmpaqcbgixa/vocabulary.\\n\",\n            \"2020-12-20 09:07:54,009 - INFO - filelock - Lock 140370276158376 acquired on /tmp/tmpaqcbgixa/vocabulary/.lock\\n\",\n            \"2020-12-20 09:07:54,036 - INFO - filelock - Lock 140370276158376 released on /tmp/tmpaqcbgixa/vocabulary/.lock\\n\",\n            \"2020-12-20 09:07:54,037 - INFO - allennlp.common.params - model.type = srl_bert\\n\",\n            \"2020-12-20 09:07:54,037 - INFO - allennlp.common.params - model.regularizer = None\\n\",\n            \"2020-12-20 09:07:54,037 - INFO - allennlp.common.params - model.bert_model = bert-base-uncased\\n\",\n            \"2020-12-20 09:07:54,037 - INFO - allennlp.common.params - model.embedding_dropout = 0.1\\n\",\n            \"2020-12-20 09:07:54,038 - INFO - allennlp.common.params - model.initializer = <allennlp.nn.initializers.InitializerApplicator object at 0x7faa80ddc7b8>\\n\",\n            \"2020-12-20 09:07:54,038 - INFO - allennlp.common.params - model.label_smoothing = None\\n\",\n            \"2020-12-20 09:07:54,038 - INFO - allennlp.common.params - model.ignore_span_metric = False\\n\",\n            \"2020-12-20 09:07:54,038 - INFO - allennlp.common.params - model.srl_eval_path = /usr/local/lib/python3.6/dist-packages/allennlp_models/structured_prediction/tools/srl-eval.pl\\n\",\n            \"2020-12-20 09:07:54,338 - INFO - filelock - Lock 140370267846920 acquired on /root/.cache/torch/transformers/4dad0251492946e18ac39290fcfe91b89d370fee250efe9521476438fe8ca185.7156163d5fdc189c3016baca0775ffce230789d7fa2a42ef516483e4ca884517.lock\\n\",\n            \"2020-12-20 09:07:54,339 - INFO - transformers.file_utils - https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-config.json not found in cache or force_download set to True, downloading to 
/root/.cache/torch/transformers/tmp9z3yyeli\\n\",\n            \"Downloading: 100% 433/433 [00:00<00:00, 297kB/s]\\n\",\n            \"2020-12-20 09:07:54,651 - INFO - transformers.file_utils - storing https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-config.json in cache at /root/.cache/torch/transformers/4dad0251492946e18ac39290fcfe91b89d370fee250efe9521476438fe8ca185.7156163d5fdc189c3016baca0775ffce230789d7fa2a42ef516483e4ca884517\\n\",\n            \"2020-12-20 09:07:54,651 - INFO - transformers.file_utils - creating metadata file for /root/.cache/torch/transformers/4dad0251492946e18ac39290fcfe91b89d370fee250efe9521476438fe8ca185.7156163d5fdc189c3016baca0775ffce230789d7fa2a42ef516483e4ca884517\\n\",\n            \"2020-12-20 09:07:54,651 - INFO - filelock - Lock 140370267846920 released on /root/.cache/torch/transformers/4dad0251492946e18ac39290fcfe91b89d370fee250efe9521476438fe8ca185.7156163d5fdc189c3016baca0775ffce230789d7fa2a42ef516483e4ca884517.lock\\n\",\n            \"2020-12-20 09:07:54,652 - INFO - transformers.configuration_utils - loading configuration file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-config.json from cache at /root/.cache/torch/transformers/4dad0251492946e18ac39290fcfe91b89d370fee250efe9521476438fe8ca185.7156163d5fdc189c3016baca0775ffce230789d7fa2a42ef516483e4ca884517\\n\",\n            \"2020-12-20 09:07:54,652 - INFO - transformers.configuration_utils - Model config BertConfig {\\n\",\n            \"  \\\"architectures\\\": [\\n\",\n            \"    \\\"BertForMaskedLM\\\"\\n\",\n            \"  ],\\n\",\n            \"  \\\"attention_probs_dropout_prob\\\": 0.1,\\n\",\n            \"  \\\"hidden_act\\\": \\\"gelu\\\",\\n\",\n            \"  \\\"hidden_dropout_prob\\\": 0.1,\\n\",\n            \"  \\\"hidden_size\\\": 768,\\n\",\n            \"  \\\"initializer_range\\\": 0.02,\\n\",\n            \"  \\\"intermediate_size\\\": 3072,\\n\",\n            \"  \\\"layer_norm_eps\\\": 
1e-12,\\n\",\n            \"  \\\"max_position_embeddings\\\": 512,\\n\",\n            \"  \\\"model_type\\\": \\\"bert\\\",\\n\",\n            \"  \\\"num_attention_heads\\\": 12,\\n\",\n            \"  \\\"num_hidden_layers\\\": 12,\\n\",\n            \"  \\\"pad_token_id\\\": 0,\\n\",\n            \"  \\\"type_vocab_size\\\": 2,\\n\",\n            \"  \\\"vocab_size\\\": 30522\\n\",\n            \"}\\n\",\n            \"\\n\",\n            \"2020-12-20 09:07:54,878 - INFO - filelock - Lock 140370267847312 acquired on /root/.cache/torch/transformers/f2ee78bdd635b758cc0a12352586868bef80e47401abe4c4fcc3832421e7338b.36ca03ab34a1a5d5fa7bc3d03d55c4fa650fed07220e2eeebc06ce58d0e9a157.lock\\n\",\n            \"2020-12-20 09:07:54,879 - INFO - transformers.file_utils - https://cdn.huggingface.co/bert-base-uncased-pytorch_model.bin not found in cache or force_download set to True, downloading to /root/.cache/torch/transformers/tmp2oj6r8r1\\n\",\n            \"Downloading: 100% 440M/440M [00:10<00:00, 43.8MB/s]\\n\",\n            \"2020-12-20 09:08:04,981 - INFO - transformers.file_utils - storing https://cdn.huggingface.co/bert-base-uncased-pytorch_model.bin in cache at /root/.cache/torch/transformers/f2ee78bdd635b758cc0a12352586868bef80e47401abe4c4fcc3832421e7338b.36ca03ab34a1a5d5fa7bc3d03d55c4fa650fed07220e2eeebc06ce58d0e9a157\\n\",\n            \"2020-12-20 09:08:04,981 - INFO - transformers.file_utils - creating metadata file for /root/.cache/torch/transformers/f2ee78bdd635b758cc0a12352586868bef80e47401abe4c4fcc3832421e7338b.36ca03ab34a1a5d5fa7bc3d03d55c4fa650fed07220e2eeebc06ce58d0e9a157\\n\",\n            \"2020-12-20 09:08:04,982 - INFO - filelock - Lock 140370267847312 released on /root/.cache/torch/transformers/f2ee78bdd635b758cc0a12352586868bef80e47401abe4c4fcc3832421e7338b.36ca03ab34a1a5d5fa7bc3d03d55c4fa650fed07220e2eeebc06ce58d0e9a157.lock\\n\",\n            \"2020-12-20 09:08:04,982 - INFO - transformers.modeling_utils - loading weights file 
https://cdn.huggingface.co/bert-base-uncased-pytorch_model.bin from cache at /root/.cache/torch/transformers/f2ee78bdd635b758cc0a12352586868bef80e47401abe4c4fcc3832421e7338b.36ca03ab34a1a5d5fa7bc3d03d55c4fa650fed07220e2eeebc06ce58d0e9a157\\n\",\n            \"2020-12-20 09:08:07,795 - INFO - allennlp.nn.initializers - Initializing parameters\\n\",\n            \"2020-12-20 09:08:07,796 - INFO - allennlp.nn.initializers - Done initializing parameters; the following parameters are using their default initialization from their code\\n\",\n            \"2020-12-20 09:08:07,796 - INFO - allennlp.nn.initializers -    bert_model.embeddings.LayerNorm.bias\\n\",\n            \"2020-12-20 09:08:07,796 - INFO - allennlp.nn.initializers -    bert_model.embeddings.LayerNorm.weight\\n\",\n            \"2020-12-20 09:08:07,796 - INFO - allennlp.nn.initializers -    bert_model.embeddings.position_embeddings.weight\\n\",\n            \"2020-12-20 09:08:07,796 - INFO - allennlp.nn.initializers -    bert_model.embeddings.token_type_embeddings.weight\\n\",\n            \"2020-12-20 09:08:07,796 - INFO - allennlp.nn.initializers -    bert_model.embeddings.word_embeddings.weight\\n\",\n            \"2020-12-20 09:08:07,796 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:08:07,796 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:08:07,796 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.output.dense.bias\\n\",\n            \"2020-12-20 09:08:07,796 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.output.dense.weight\\n\",\n            \"2020-12-20 09:08:07,796 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.self.key.bias\\n\",\n            \"2020-12-20 09:08:07,796 - INFO - allennlp.nn.initializers -    
bert_model.encoder.layer.0.attention.self.key.weight\\n\",\n            \"[... allennlp.nn.initializers log lines for the remaining bert_model.encoder layer 0-11 parameters elided for brevity ...]\\n\",\n            \"2020-12-20 09:08:07,807 - INFO - allennlp.nn.initializers -    bert_model.pooler.dense.bias\\n\",\n            \"2020-12-20 09:08:07,807 - INFO - allennlp.nn.initializers -    bert_model.pooler.dense.weight\\n\",\n            \"2020-12-20 09:08:07,807 - INFO - allennlp.nn.initializers -    
tag_projection_layer.bias\\n\",\n            \"2020-12-20 09:08:07,807 - INFO - allennlp.nn.initializers -    tag_projection_layer.weight\\n\",\n            \"2020-12-20 09:08:08,259 - INFO - allennlp.common.params - dataset_reader.type = srl\\n\",\n            \"2020-12-20 09:08:08,260 - INFO - allennlp.common.params - dataset_reader.lazy = False\\n\",\n            \"2020-12-20 09:08:08,260 - INFO - allennlp.common.params - dataset_reader.cache_directory = None\\n\",\n            \"2020-12-20 09:08:08,260 - INFO - allennlp.common.params - dataset_reader.max_instances = None\\n\",\n            \"2020-12-20 09:08:08,260 - INFO - allennlp.common.params - dataset_reader.manual_distributed_sharding = False\\n\",\n            \"2020-12-20 09:08:08,260 - INFO - allennlp.common.params - dataset_reader.manual_multi_process_sharding = False\\n\",\n            \"2020-12-20 09:08:08,260 - INFO - allennlp.common.params - dataset_reader.token_indexers = None\\n\",\n            \"2020-12-20 09:08:08,260 - INFO - allennlp.common.params - dataset_reader.domain_identifier = None\\n\",\n            \"2020-12-20 09:08:08,260 - INFO - allennlp.common.params - dataset_reader.bert_model_name = bert-base-uncased\\n\",\n            \"2020-12-20 09:08:08,568 - INFO - filelock - Lock 140370276307520 acquired on /root/.cache/torch/transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084.lock\\n\",\n            \"2020-12-20 09:08:08,568 - INFO - transformers.file_utils - https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt not found in cache or force_download set to True, downloading to /root/.cache/torch/transformers/tmp733mj1ki\\n\",\n            \"Downloading: 100% 232k/232k [00:00<00:00, 880kB/s]\\n\",\n            \"2020-12-20 09:08:09,117 - INFO - transformers.file_utils - storing https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt in cache at 
/root/.cache/torch/transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084\\n\",\n            \"2020-12-20 09:08:09,117 - INFO - transformers.file_utils - creating metadata file for /root/.cache/torch/transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084\\n\",\n            \"2020-12-20 09:08:09,117 - INFO - filelock - Lock 140370276307520 released on /root/.cache/torch/transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084.lock\\n\",\n            \"2020-12-20 09:08:09,118 - INFO - transformers.tokenization_utils - loading file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt from cache at /root/.cache/torch/transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084\\n\",\n            \"input 0:  {\\\"sentence\\\": \\\"Did Bob really think he could prepare a meal for 50 people in only a few hours?\\\"}\\n\",\n            \"prediction:  {\\\"verbs\\\": [{\\\"verb\\\": \\\"think\\\", \\\"description\\\": \\\"Did [ARG0: Bob] [ARGM-ADV: really] [V: think] [ARG1: he could prepare a meal for 50 people in only a few hours] ?\\\", \\\"tags\\\": [\\\"O\\\", \\\"B-ARG0\\\", \\\"B-ARGM-ADV\\\", \\\"B-V\\\", \\\"B-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"O\\\"]}, {\\\"verb\\\": \\\"could\\\", \\\"description\\\": \\\"Did Bob really think he [V: could] prepare a meal for 50 people in only a few hours ?\\\", \\\"tags\\\": [\\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"B-V\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", 
\\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\"]}, {\\\"verb\\\": \\\"prepare\\\", \\\"description\\\": \\\"Did Bob really think [ARG0: he] [ARGM-MOD: could] [V: prepare] [ARG1: a meal for 50 people] [ARGM-TMP: in only a few hours] ?\\\", \\\"tags\\\": [\\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"B-ARG0\\\", \\\"B-ARGM-MOD\\\", \\\"B-V\\\", \\\"B-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"B-ARGM-TMP\\\", \\\"I-ARGM-TMP\\\", \\\"I-ARGM-TMP\\\", \\\"I-ARGM-TMP\\\", \\\"I-ARGM-TMP\\\", \\\"O\\\"]}], \\\"words\\\": [\\\"Did\\\", \\\"Bob\\\", \\\"really\\\", \\\"think\\\", \\\"he\\\", \\\"could\\\", \\\"prepare\\\", \\\"a\\\", \\\"meal\\\", \\\"for\\\", \\\"50\\\", \\\"people\\\", \\\"in\\\", \\\"only\\\", \\\"a\\\", \\\"few\\\", \\\"hours\\\", \\\"?\\\"]}\\n\",\n            \"\\n\",\n            \"2020-12-20 09:08:10,277 - INFO - allennlp.models.archival - removing temporary unarchived model dir at /tmp/tmpaqcbgixa\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"hWHpOrNvZQ3m\"\n      },\n      \"source\": [\n        \"Sample 2: Mrs. and Mr. Tomaso went to Europe for vacation and visited Paris and first went to visit the Eiffel Tower.\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"yFKPLyqihrB_\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"e0f52798-903a-4d04-f388-d26a700b8cd7\"\n      },\n      \"source\": [\n        \"!echo '{\\\"sentence\\\": \\\"Mrs. and Mr. 
Tomaso went to Europe for vacation and visited Paris and first went to visit the Eiffel Tower.\\\"}' | \\\\\\n\",\n        \"allennlp predict https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz -\"\n      ],\n      \"execution_count\": 3,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"2020-12-20 09:08:12,622 - INFO - transformers.file_utils - PyTorch version 1.5.1 available.\\n\",\n            \"2020-12-20 09:08:12.774532: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library libcudart.so.10.1\\n\",\n            \"2020-12-20 09:08:14,547 - INFO - transformers.file_utils - TensorFlow version 2.4.0 available.\\n\",\n            \"2020-12-20 09:08:15,761 - INFO - allennlp.common.file_utils - checking cache for https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz at /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\\n\",\n            \"2020-12-20 09:08:15,761 - INFO - allennlp.common.file_utils - waiting to acquire lock on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\\n\",\n            \"2020-12-20 09:08:15,762 - INFO - filelock - Lock 139888470380440 acquired on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724.lock\\n\",\n            \"2020-12-20 09:08:15,763 - INFO - allennlp.common.file_utils - cache of https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz is up-to-date\\n\",\n            \"2020-12-20 09:08:15,763 - INFO - filelock - Lock 139888470380440 released on 
/root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724.lock\\n\",\n            \"2020-12-20 09:08:15,763 - INFO - allennlp.models.archival - loading archive file https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz from cache at /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\\n\",\n            \"2020-12-20 09:08:15,763 - INFO - allennlp.models.archival - extracting archive file /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724 to temp dir /tmp/tmp7jeuj77a\\n\",\n            \"2020-12-20 09:08:19,975 - INFO - allennlp.common.params - type = from_instances\\n\",\n            \"2020-12-20 09:08:19,976 - INFO - allennlp.data.vocabulary - Loading token dictionary from /tmp/tmp7jeuj77a/vocabulary.\\n\",\n            \"2020-12-20 09:08:19,976 - INFO - filelock - Lock 139888468747992 acquired on /tmp/tmp7jeuj77a/vocabulary/.lock\\n\",\n            \"2020-12-20 09:08:20,002 - INFO - filelock - Lock 139888468747992 released on /tmp/tmp7jeuj77a/vocabulary/.lock\\n\",\n            \"2020-12-20 09:08:20,003 - INFO - allennlp.common.params - model.type = srl_bert\\n\",\n            \"2020-12-20 09:08:20,004 - INFO - allennlp.common.params - model.regularizer = None\\n\",\n            \"2020-12-20 09:08:20,004 - INFO - allennlp.common.params - model.bert_model = bert-base-uncased\\n\",\n            \"2020-12-20 09:08:20,004 - INFO - allennlp.common.params - model.embedding_dropout = 0.1\\n\",\n            \"2020-12-20 09:08:20,004 - INFO - allennlp.common.params - model.initializer = <allennlp.nn.initializers.InitializerApplicator object at 0x7f3a52797748>\\n\",\n            \"2020-12-20 09:08:20,004 - INFO - allennlp.common.params - 
model.label_smoothing = None\\n\",\n            \"2020-12-20 09:08:20,004 - INFO - allennlp.common.params - model.ignore_span_metric = False\\n\",\n            \"2020-12-20 09:08:20,004 - INFO - allennlp.common.params - model.srl_eval_path = /usr/local/lib/python3.6/dist-packages/allennlp_models/structured_prediction/tools/srl-eval.pl\\n\",\n            \"2020-12-20 09:08:20,298 - INFO - transformers.configuration_utils - loading configuration file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-config.json from cache at /root/.cache/torch/transformers/4dad0251492946e18ac39290fcfe91b89d370fee250efe9521476438fe8ca185.7156163d5fdc189c3016baca0775ffce230789d7fa2a42ef516483e4ca884517\\n\",\n            \"2020-12-20 09:08:20,298 - INFO - transformers.configuration_utils - Model config BertConfig {\\n\",\n            \"  \\\"architectures\\\": [\\n\",\n            \"    \\\"BertForMaskedLM\\\"\\n\",\n            \"  ],\\n\",\n            \"  \\\"attention_probs_dropout_prob\\\": 0.1,\\n\",\n            \"  \\\"hidden_act\\\": \\\"gelu\\\",\\n\",\n            \"  \\\"hidden_dropout_prob\\\": 0.1,\\n\",\n            \"  \\\"hidden_size\\\": 768,\\n\",\n            \"  \\\"initializer_range\\\": 0.02,\\n\",\n            \"  \\\"intermediate_size\\\": 3072,\\n\",\n            \"  \\\"layer_norm_eps\\\": 1e-12,\\n\",\n            \"  \\\"max_position_embeddings\\\": 512,\\n\",\n            \"  \\\"model_type\\\": \\\"bert\\\",\\n\",\n            \"  \\\"num_attention_heads\\\": 12,\\n\",\n            \"  \\\"num_hidden_layers\\\": 12,\\n\",\n            \"  \\\"pad_token_id\\\": 0,\\n\",\n            \"  \\\"type_vocab_size\\\": 2,\\n\",\n            \"  \\\"vocab_size\\\": 30522\\n\",\n            \"}\\n\",\n            \"\\n\",\n            \"2020-12-20 09:08:20,492 - INFO - transformers.modeling_utils - loading weights file https://cdn.huggingface.co/bert-base-uncased-pytorch_model.bin from cache at 
/root/.cache/torch/transformers/f2ee78bdd635b758cc0a12352586868bef80e47401abe4c4fcc3832421e7338b.36ca03ab34a1a5d5fa7bc3d03d55c4fa650fed07220e2eeebc06ce58d0e9a157\\n\",\n            \"2020-12-20 09:08:23,170 - INFO - allennlp.nn.initializers - Initializing parameters\\n\",\n            \"2020-12-20 09:08:23,171 - INFO - allennlp.nn.initializers - Done initializing parameters; the following parameters are using their default initialization from their code\\n\",\n            \"2020-12-20 09:08:23,171 - INFO - allennlp.nn.initializers -    bert_model.embeddings.LayerNorm.bias\\n\",\n            \"2020-12-20 09:08:23,172 - INFO - allennlp.nn.initializers -    bert_model.embeddings.LayerNorm.weight\\n\",\n            \"2020-12-20 09:08:23,172 - INFO - allennlp.nn.initializers -    bert_model.embeddings.position_embeddings.weight\\n\",\n            \"2020-12-20 09:08:23,172 - INFO - allennlp.nn.initializers -    bert_model.embeddings.token_type_embeddings.weight\\n\",\n            \"2020-12-20 09:08:23,172 - INFO - allennlp.nn.initializers -    bert_model.embeddings.word_embeddings.weight\\n\",\n            \"2020-12-20 09:08:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:08:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:08:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.output.dense.bias\\n\",\n            \"2020-12-20 09:08:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.output.dense.weight\\n\",\n            \"2020-12-20 09:08:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.self.key.bias\\n\",\n            \"2020-12-20 09:08:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.self.key.weight\\n\",\n            \"2020-12-20 09:08:23,172 - INFO - 
allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.self.query.bias\\n\",\n            \"2020-12-20 09:08:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.self.query.weight\\n\",\n            \"2020-12-20 09:08:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.self.value.bias\\n\",\n            \"2020-12-20 09:08:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.self.value.weight\\n\",\n            \"2020-12-20 09:08:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.intermediate.dense.bias\\n\",\n            \"2020-12-20 09:08:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.intermediate.dense.weight\\n\",\n            \"2020-12-20 09:08:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:08:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:08:23,173 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.output.dense.bias\\n\",\n            \"2020-12-20 09:08:23,173 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.output.dense.weight\\n\",\n            \"  ... (identical per-parameter log lines for encoder layers 1-11 omitted) ...\\n\",\n            \"2020-12-20 09:08:23,211 - INFO - allennlp.nn.initializers -    bert_model.pooler.dense.bias\\n\",\n            \"2020-12-20 09:08:23,211 - INFO - allennlp.nn.initializers -    bert_model.pooler.dense.weight\\n\",\n            \"2020-12-20 09:08:23,211 - INFO - allennlp.nn.initializers -    tag_projection_layer.bias\\n\",\n            \"2020-12-20 09:08:23,212 - INFO - 
allennlp.nn.initializers -    tag_projection_layer.weight\\n\",\n            \"2020-12-20 09:08:23,664 - INFO - allennlp.common.params - dataset_reader.type = srl\\n\",\n            \"2020-12-20 09:08:23,664 - INFO - allennlp.common.params - dataset_reader.lazy = False\\n\",\n            \"2020-12-20 09:08:23,665 - INFO - allennlp.common.params - dataset_reader.cache_directory = None\\n\",\n            \"2020-12-20 09:08:23,665 - INFO - allennlp.common.params - dataset_reader.max_instances = None\\n\",\n            \"2020-12-20 09:08:23,665 - INFO - allennlp.common.params - dataset_reader.manual_distributed_sharding = False\\n\",\n            \"2020-12-20 09:08:23,665 - INFO - allennlp.common.params - dataset_reader.manual_multi_process_sharding = False\\n\",\n            \"2020-12-20 09:08:23,665 - INFO - allennlp.common.params - dataset_reader.token_indexers = None\\n\",\n            \"2020-12-20 09:08:23,665 - INFO - allennlp.common.params - dataset_reader.domain_identifier = None\\n\",\n            \"2020-12-20 09:08:23,665 - INFO - allennlp.common.params - dataset_reader.bert_model_name = bert-base-uncased\\n\",\n            \"2020-12-20 09:08:23,987 - INFO - transformers.tokenization_utils - loading file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt from cache at /root/.cache/torch/transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084\\n\",\n            \"input 0:  {\\\"sentence\\\": \\\"Mrs. and Mr. Tomaso went to Europe for vacation and visited Paris and first went to visit the Eiffel Tower.\\\"}\\n\",\n            \"prediction:  {\\\"verbs\\\": [{\\\"verb\\\": \\\"went\\\", \\\"description\\\": \\\"[ARG0: Mrs. and Mr. 
Tomaso] [V: went] [ARG4: to Europe] [ARGM-PRP: for vacation] and visited Paris and first went to visit the Eiffel Tower .\\\", \\\"tags\\\": [\\\"B-ARG0\\\", \\\"I-ARG0\\\", \\\"I-ARG0\\\", \\\"I-ARG0\\\", \\\"B-V\\\", \\\"B-ARG4\\\", \\\"I-ARG4\\\", \\\"B-ARGM-PRP\\\", \\\"I-ARGM-PRP\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\"]}, {\\\"verb\\\": \\\"visited\\\", \\\"description\\\": \\\"[ARG0: Mrs. and Mr. Tomaso] went to Europe for vacation and [V: visited] [ARG1: Paris] and first went to visit the Eiffel Tower .\\\", \\\"tags\\\": [\\\"B-ARG0\\\", \\\"I-ARG0\\\", \\\"I-ARG0\\\", \\\"I-ARG0\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"B-V\\\", \\\"B-ARG1\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\"]}, {\\\"verb\\\": \\\"went\\\", \\\"description\\\": \\\"[ARG0: Mrs. and Mr. Tomaso] went to Europe for vacation and visited Paris and [ARGM-TMP: first] [V: went] [ARGM-PRP: to visit the Eiffel Tower] .\\\", \\\"tags\\\": [\\\"B-ARG0\\\", \\\"I-ARG0\\\", \\\"I-ARG0\\\", \\\"I-ARG0\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"B-ARGM-TMP\\\", \\\"B-V\\\", \\\"B-ARGM-PRP\\\", \\\"I-ARGM-PRP\\\", \\\"I-ARGM-PRP\\\", \\\"I-ARGM-PRP\\\", \\\"I-ARGM-PRP\\\", \\\"O\\\"]}, {\\\"verb\\\": \\\"visit\\\", \\\"description\\\": \\\"[ARG0: Mrs. and Mr. 
Tomaso] went to Europe for vacation and visited Paris and first went to [V: visit] [ARG1: the Eiffel Tower] .\\\", \\\"tags\\\": [\\\"B-ARG0\\\", \\\"I-ARG0\\\", \\\"I-ARG0\\\", \\\"I-ARG0\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"B-V\\\", \\\"B-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"O\\\"]}], \\\"words\\\": [\\\"Mrs.\\\", \\\"and\\\", \\\"Mr.\\\", \\\"Tomaso\\\", \\\"went\\\", \\\"to\\\", \\\"Europe\\\", \\\"for\\\", \\\"vacation\\\", \\\"and\\\", \\\"visited\\\", \\\"Paris\\\", \\\"and\\\", \\\"first\\\", \\\"went\\\", \\\"to\\\", \\\"visit\\\", \\\"the\\\", \\\"Eiffel\\\", \\\"Tower\\\", \\\".\\\"]}\\n\",\n            \"\\n\",\n            \"2020-12-20 09:08:25,342 - INFO - allennlp.models.archival - removing temporary unarchived model dir at /tmp/tmp7jeuj77a\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"v45ooI5ReoXk\"\n      },\n      \"source\": [\n        \"Sample 3:John wanted to drink tea, Mary likes to drink coffee but Karim drank some cool water and Faiza would like to drink tomato juice.\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"Pz-jLVeAersa\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"13b50208-4caf-476d-abf8-627b21c65863\"\n      },\n      \"source\": [\n        \"!echo '{\\\"sentence\\\": \\\"John wanted to drink tea, Mary likes to drink coffee but Karim drank some cool water and Faiza would like to drink tomato juice.\\\"}' | \\\\\\n\",\n        \"allennlp predict https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz -\"\n      ],\n      \"execution_count\": 4,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            
\"2020-12-20 09:08:27,582 - INFO - transformers.file_utils - PyTorch version 1.5.1 available.\\n\",\n            \"2020-12-20 09:08:27.767124: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library libcudart.so.10.1\\n\",\n            \"2020-12-20 09:08:29,592 - INFO - transformers.file_utils - TensorFlow version 2.4.0 available.\\n\",\n            \"2020-12-20 09:08:30,797 - INFO - allennlp.common.file_utils - checking cache for https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz at /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\\n\",\n            \"2020-12-20 09:08:30,797 - INFO - allennlp.common.file_utils - waiting to acquire lock on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\\n\",\n            \"2020-12-20 09:08:30,799 - INFO - filelock - Lock 140584440236464 acquired on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724.lock\\n\",\n            \"2020-12-20 09:08:30,799 - INFO - allennlp.common.file_utils - cache of https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz is up-to-date\\n\",\n            \"2020-12-20 09:08:30,799 - INFO - filelock - Lock 140584440236464 released on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724.lock\\n\",\n            \"2020-12-20 09:08:30,799 - INFO - allennlp.models.archival - loading archive file https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz from cache at 
/root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\\n\",\n            \"2020-12-20 09:08:30,799 - INFO - allennlp.models.archival - extracting archive file /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724 to temp dir /tmp/tmpse7z902p\\n\",\n            \"2020-12-20 09:08:35,061 - INFO - allennlp.common.params - type = from_instances\\n\",\n            \"2020-12-20 09:08:35,061 - INFO - allennlp.data.vocabulary - Loading token dictionary from /tmp/tmpse7z902p/vocabulary.\\n\",\n            \"2020-12-20 09:08:35,062 - INFO - filelock - Lock 140584442130328 acquired on /tmp/tmpse7z902p/vocabulary/.lock\\n\",\n            \"2020-12-20 09:08:35,089 - INFO - filelock - Lock 140584442130328 released on /tmp/tmpse7z902p/vocabulary/.lock\\n\",\n            \"2020-12-20 09:08:35,089 - INFO - allennlp.common.params - model.type = srl_bert\\n\",\n            \"2020-12-20 09:08:35,090 - INFO - allennlp.common.params - model.regularizer = None\\n\",\n            \"2020-12-20 09:08:35,090 - INFO - allennlp.common.params - model.bert_model = bert-base-uncased\\n\",\n            \"2020-12-20 09:08:35,090 - INFO - allennlp.common.params - model.embedding_dropout = 0.1\\n\",\n            \"2020-12-20 09:08:35,090 - INFO - allennlp.common.params - model.initializer = <allennlp.nn.initializers.InitializerApplicator object at 0x7fdc5d9b87b8>\\n\",\n            \"2020-12-20 09:08:35,090 - INFO - allennlp.common.params - model.label_smoothing = None\\n\",\n            \"2020-12-20 09:08:35,090 - INFO - allennlp.common.params - model.ignore_span_metric = False\\n\",\n            \"2020-12-20 09:08:35,090 - INFO - allennlp.common.params - model.srl_eval_path = /usr/local/lib/python3.6/dist-packages/allennlp_models/structured_prediction/tools/srl-eval.pl\\n\",\n            \"2020-12-20 
09:08:35,400 - INFO - transformers.configuration_utils - loading configuration file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-config.json from cache at /root/.cache/torch/transformers/4dad0251492946e18ac39290fcfe91b89d370fee250efe9521476438fe8ca185.7156163d5fdc189c3016baca0775ffce230789d7fa2a42ef516483e4ca884517\\n\",\n            \"2020-12-20 09:08:35,400 - INFO - transformers.configuration_utils - Model config BertConfig {\\n\",\n            \"  \\\"architectures\\\": [\\n\",\n            \"    \\\"BertForMaskedLM\\\"\\n\",\n            \"  ],\\n\",\n            \"  \\\"attention_probs_dropout_prob\\\": 0.1,\\n\",\n            \"  \\\"hidden_act\\\": \\\"gelu\\\",\\n\",\n            \"  \\\"hidden_dropout_prob\\\": 0.1,\\n\",\n            \"  \\\"hidden_size\\\": 768,\\n\",\n            \"  \\\"initializer_range\\\": 0.02,\\n\",\n            \"  \\\"intermediate_size\\\": 3072,\\n\",\n            \"  \\\"layer_norm_eps\\\": 1e-12,\\n\",\n            \"  \\\"max_position_embeddings\\\": 512,\\n\",\n            \"  \\\"model_type\\\": \\\"bert\\\",\\n\",\n            \"  \\\"num_attention_heads\\\": 12,\\n\",\n            \"  \\\"num_hidden_layers\\\": 12,\\n\",\n            \"  \\\"pad_token_id\\\": 0,\\n\",\n            \"  \\\"type_vocab_size\\\": 2,\\n\",\n            \"  \\\"vocab_size\\\": 30522\\n\",\n            \"}\\n\",\n            \"\\n\",\n            \"2020-12-20 09:08:35,598 - INFO - transformers.modeling_utils - loading weights file https://cdn.huggingface.co/bert-base-uncased-pytorch_model.bin from cache at /root/.cache/torch/transformers/f2ee78bdd635b758cc0a12352586868bef80e47401abe4c4fcc3832421e7338b.36ca03ab34a1a5d5fa7bc3d03d55c4fa650fed07220e2eeebc06ce58d0e9a157\\n\",\n            \"2020-12-20 09:08:38,288 - INFO - allennlp.nn.initializers - Initializing parameters\\n\",\n            \"2020-12-20 09:08:38,289 - INFO - allennlp.nn.initializers - Done initializing parameters; the following parameters are using their 
default initialization from their code\\n\",\n            \"2020-12-20 09:08:38,289 - INFO - allennlp.nn.initializers -    bert_model.embeddings.LayerNorm.bias\\n\",\n            \"2020-12-20 09:08:38,289 - INFO - allennlp.nn.initializers -    bert_model.embeddings.LayerNorm.weight\\n\",\n            \"2020-12-20 09:08:38,289 - INFO - allennlp.nn.initializers -    bert_model.embeddings.position_embeddings.weight\\n\",\n            \"2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers -    bert_model.embeddings.token_type_embeddings.weight\\n\",\n            \"2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers -    bert_model.embeddings.word_embeddings.weight\\n\",\n            \"2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.output.dense.bias\\n\",\n            \"2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.output.dense.weight\\n\",\n            \"2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.self.key.bias\\n\",\n            \"2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.self.key.weight\\n\",\n            \"2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.self.query.bias\\n\",\n            \"2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.self.query.weight\\n\",\n            \"2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.self.value.bias\\n\",\n            \"2020-12-20 09:08:38,290 - INFO - 
allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.self.value.weight\\n\",\n            \"2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.intermediate.dense.bias\\n\",\n            \"2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.intermediate.dense.weight\\n\",\n            \"2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.output.dense.bias\\n\",\n            \"2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.output.dense.weight\\n\",\n            \"2020-12-20 09:08:38,376 - INFO - allennlp.nn.initializers -    bert_model.pooler.dense.bias\\n\",\n            \"2020-12-20 09:08:38,376 - INFO - allennlp.nn.initializers -    bert_model.pooler.dense.weight\\n\",\n            \"2020-12-20 09:08:38,376 - INFO - allennlp.nn.initializers -    tag_projection_layer.bias\\n\",\n            \"2020-12-20 09:08:38,376 - INFO - allennlp.nn.initializers -    tag_projection_layer.weight\\n\",\n            \"2020-12-20 09:08:38,830 - INFO - allennlp.common.params - dataset_reader.type = srl\\n\",\n            \"2020-12-20 09:08:38,831 - INFO - allennlp.common.params - dataset_reader.lazy = False\\n\",\n            \"2020-12-20 09:08:38,831 - INFO - allennlp.common.params - dataset_reader.cache_directory = None\\n\",\n
\"2020-12-20 09:08:38,831 - INFO - allennlp.common.params - dataset_reader.max_instances = None\\n\",\n            \"2020-12-20 09:08:38,831 - INFO - allennlp.common.params - dataset_reader.manual_distributed_sharding = False\\n\",\n            \"2020-12-20 09:08:38,831 - INFO - allennlp.common.params - dataset_reader.manual_multi_process_sharding = False\\n\",\n            \"2020-12-20 09:08:38,831 - INFO - allennlp.common.params - dataset_reader.token_indexers = None\\n\",\n            \"2020-12-20 09:08:38,831 - INFO - allennlp.common.params - dataset_reader.domain_identifier = None\\n\",\n            \"2020-12-20 09:08:38,831 - INFO - allennlp.common.params - dataset_reader.bert_model_name = bert-base-uncased\\n\",\n            \"2020-12-20 09:08:39,125 - INFO - transformers.tokenization_utils - loading file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt from cache at /root/.cache/torch/transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084\\n\",\n            \"input 0:  {\\\"sentence\\\": \\\"John wanted to drink tea, Mary likes to drink coffee but Karim drank some cool water and Faiza would like to drink tomato juice.\\\"}\\n\",\n            \"prediction:  {\\\"verbs\\\": [{\\\"verb\\\": \\\"wanted\\\", \\\"description\\\": \\\"[ARG0: John] [V: wanted] [ARG1: to drink tea] , Mary likes to drink coffee but Karim drank some cool water and Faiza would like to drink tomato juice .\\\", \\\"tags\\\": [\\\"B-ARG0\\\", \\\"B-V\\\", \\\"B-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\"]}, {\\\"verb\\\": \\\"drink\\\", \\\"description\\\": \\\"[ARG0: John] wanted to [V: drink] [ARG1: tea] , Mary likes to drink coffee but Karim 
drank some cool water and Faiza would like to drink tomato juice .\\\", \\\"tags\\\": [\\\"B-ARG0\\\", \\\"O\\\", \\\"O\\\", \\\"B-V\\\", \\\"B-ARG1\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\"]}, {\\\"verb\\\": \\\"likes\\\", \\\"description\\\": \\\"John wanted to drink tea , [ARG0: Mary] [V: likes] [ARG1: to drink coffee] but Karim drank some cool water and Faiza would like to drink tomato juice .\\\", \\\"tags\\\": [\\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"B-ARG0\\\", \\\"B-V\\\", \\\"B-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\"]}, {\\\"verb\\\": \\\"drink\\\", \\\"description\\\": \\\"John wanted to drink tea , [ARG0: Mary] likes to [V: drink] [ARG1: coffee] but Karim drank some cool water and Faiza would like to drink tomato juice .\\\", \\\"tags\\\": [\\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"B-ARG0\\\", \\\"O\\\", \\\"O\\\", \\\"B-V\\\", \\\"B-ARG1\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\"]}, {\\\"verb\\\": \\\"drank\\\", \\\"description\\\": \\\"John wanted to drink tea , Mary likes to drink coffee but [ARG0: Karim] [V: drank] [ARG1: some cool water and Faiza] would like to drink tomato juice .\\\", \\\"tags\\\": [\\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"B-ARG0\\\", \\\"B-V\\\", \\\"B-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\"]}, 
{\\\"verb\\\": \\\"would\\\", \\\"description\\\": \\\"John wanted to drink tea , Mary likes to drink coffee but Karim drank some cool water and Faiza [V: would] [ARGM-DIS: like] to drink tomato juice .\\\", \\\"tags\\\": [\\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"B-V\\\", \\\"B-ARGM-DIS\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\"]}, {\\\"verb\\\": \\\"like\\\", \\\"description\\\": \\\"John wanted to drink tea , Mary likes to drink coffee but Karim drank [ARG0: some cool water and Faiza] [ARGM-MOD: would] [V: like] [ARG1: to drink tomato juice] .\\\", \\\"tags\\\": [\\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"B-ARG0\\\", \\\"I-ARG0\\\", \\\"I-ARG0\\\", \\\"I-ARG0\\\", \\\"I-ARG0\\\", \\\"B-ARGM-MOD\\\", \\\"B-V\\\", \\\"B-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"O\\\"]}, {\\\"verb\\\": \\\"drink\\\", \\\"description\\\": \\\"John wanted to drink tea , Mary likes to drink coffee but Karim drank [ARG0: some cool water and Faiza] would like to [V: drink] [ARG1: tomato juice] .\\\", \\\"tags\\\": [\\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"B-ARG0\\\", \\\"I-ARG0\\\", \\\"I-ARG0\\\", \\\"I-ARG0\\\", \\\"I-ARG0\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"B-V\\\", \\\"B-ARG1\\\", \\\"I-ARG1\\\", \\\"O\\\"]}], \\\"words\\\": [\\\"John\\\", \\\"wanted\\\", \\\"to\\\", \\\"drink\\\", \\\"tea\\\", \\\",\\\", \\\"Mary\\\", \\\"likes\\\", \\\"to\\\", \\\"drink\\\", \\\"coffee\\\", \\\"but\\\", \\\"Karim\\\", \\\"drank\\\", \\\"some\\\", \\\"cool\\\", \\\"water\\\", \\\"and\\\", \\\"Faiza\\\", \\\"would\\\", \\\"like\\\", \\\"to\\\", \\\"drink\\\", 
\\\"tomato\\\", \\\"juice\\\", \\\".\\\"]}\\n\",\n            \"\\n\",\n            \"2020-12-20 09:08:40,852 - INFO - allennlp.models.archival - removing temporary unarchived model dir at /tmp/tmpse7z902p\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"k7QVm45YmxTt\"\n      },\n      \"source\": [\n        \"Sample 4: Alice, whose husband went jogging every Sunday, liked to go to a dancing class in the meantime.\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"mvm6zN7_m0GI\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"736f6e5d-0f14-4692-962d-696f3e884c5e\"\n      },\n      \"source\": [\n        \"!echo '{\\\"sentence\\\": \\\"Alice, whose husband went jogging every Sunday, liked to go to a dancing class in the meantime.\\\"}' | \\\\\\n\",\n        \"allennlp predict https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz -\"\n      ],\n      \"execution_count\": 5,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"2020-12-20 09:08:43,153 - INFO - transformers.file_utils - PyTorch version 1.5.1 available.\\n\",\n            \"2020-12-20 09:08:43.324358: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library libcudart.so.10.1\\n\",\n            \"2020-12-20 09:08:45,080 - INFO - transformers.file_utils - TensorFlow version 2.4.0 available.\\n\",\n            \"2020-12-20 09:08:46,294 - INFO - allennlp.common.file_utils - checking cache for https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz at /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\\n\",\n            
\"2020-12-20 09:08:46,294 - INFO - allennlp.common.file_utils - waiting to acquire lock on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\\n\",\n            \"2020-12-20 09:08:46,295 - INFO - filelock - Lock 139693451663232 acquired on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724.lock\\n\",\n            \"2020-12-20 09:08:46,295 - INFO - allennlp.common.file_utils - cache of https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz is up-to-date\\n\",\n            \"2020-12-20 09:08:46,295 - INFO - filelock - Lock 139693451663232 released on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724.lock\\n\",\n            \"2020-12-20 09:08:46,295 - INFO - allennlp.models.archival - loading archive file https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz from cache at /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\\n\",\n            \"2020-12-20 09:08:46,295 - INFO - allennlp.models.archival - extracting archive file /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724 to temp dir /tmp/tmpmnefy_l0\\n\",\n            \"2020-12-20 09:08:50,552 - INFO - allennlp.common.params - type = from_instances\\n\",\n            \"2020-12-20 09:08:50,552 - INFO - allennlp.data.vocabulary - Loading token dictionary from /tmp/tmpmnefy_l0/vocabulary.\\n\",\n            \"2020-12-20 09:08:50,552 - INFO - filelock - Lock 139692801159064 acquired on /tmp/tmpmnefy_l0/vocabulary/.lock\\n\",\n            \"2020-12-20 
09:08:50,580 - INFO - filelock - Lock 139692801159064 released on /tmp/tmpmnefy_l0/vocabulary/.lock\\n\",\n            \"2020-12-20 09:08:50,580 - INFO - allennlp.common.params - model.type = srl_bert\\n\",\n            \"2020-12-20 09:08:50,581 - INFO - allennlp.common.params - model.regularizer = None\\n\",\n            \"2020-12-20 09:08:50,581 - INFO - allennlp.common.params - model.bert_model = bert-base-uncased\\n\",\n            \"2020-12-20 09:08:50,581 - INFO - allennlp.common.params - model.embedding_dropout = 0.1\\n\",\n            \"2020-12-20 09:08:50,581 - INFO - allennlp.common.params - model.initializer = <allennlp.nn.initializers.InitializerApplicator object at 0x7f0cc3c6a7b8>\\n\",\n            \"2020-12-20 09:08:50,581 - INFO - allennlp.common.params - model.label_smoothing = None\\n\",\n            \"2020-12-20 09:08:50,581 - INFO - allennlp.common.params - model.ignore_span_metric = False\\n\",\n            \"2020-12-20 09:08:50,581 - INFO - allennlp.common.params - model.srl_eval_path = /usr/local/lib/python3.6/dist-packages/allennlp_models/structured_prediction/tools/srl-eval.pl\\n\",\n            \"2020-12-20 09:08:50,888 - INFO - transformers.configuration_utils - loading configuration file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-config.json from cache at /root/.cache/torch/transformers/4dad0251492946e18ac39290fcfe91b89d370fee250efe9521476438fe8ca185.7156163d5fdc189c3016baca0775ffce230789d7fa2a42ef516483e4ca884517\\n\",\n            \"2020-12-20 09:08:50,889 - INFO - transformers.configuration_utils - Model config BertConfig {\\n\",\n            \"  \\\"architectures\\\": [\\n\",\n            \"    \\\"BertForMaskedLM\\\"\\n\",\n            \"  ],\\n\",\n            \"  \\\"attention_probs_dropout_prob\\\": 0.1,\\n\",\n            \"  \\\"hidden_act\\\": \\\"gelu\\\",\\n\",\n            \"  \\\"hidden_dropout_prob\\\": 0.1,\\n\",\n            \"  \\\"hidden_size\\\": 768,\\n\",\n            \"  
\\\"initializer_range\\\": 0.02,\\n\",\n            \"  \\\"intermediate_size\\\": 3072,\\n\",\n            \"  \\\"layer_norm_eps\\\": 1e-12,\\n\",\n            \"  \\\"max_position_embeddings\\\": 512,\\n\",\n            \"  \\\"model_type\\\": \\\"bert\\\",\\n\",\n            \"  \\\"num_attention_heads\\\": 12,\\n\",\n            \"  \\\"num_hidden_layers\\\": 12,\\n\",\n            \"  \\\"pad_token_id\\\": 0,\\n\",\n            \"  \\\"type_vocab_size\\\": 2,\\n\",\n            \"  \\\"vocab_size\\\": 30522\\n\",\n            \"}\\n\",\n            \"\\n\",\n            \"2020-12-20 09:08:50,928 - INFO - transformers.modeling_utils - loading weights file https://cdn.huggingface.co/bert-base-uncased-pytorch_model.bin from cache at /root/.cache/torch/transformers/f2ee78bdd635b758cc0a12352586868bef80e47401abe4c4fcc3832421e7338b.36ca03ab34a1a5d5fa7bc3d03d55c4fa650fed07220e2eeebc06ce58d0e9a157\\n\",\n            \"2020-12-20 09:08:53,601 - INFO - allennlp.nn.initializers - Initializing parameters\\n\",\n            \"2020-12-20 09:08:53,602 - INFO - allennlp.nn.initializers - Done initializing parameters; the following parameters are using their default initialization from their code\\n\",\n            \"2020-12-20 09:08:53,602 - INFO - allennlp.nn.initializers -    bert_model.embeddings.LayerNorm.bias\\n\",\n            \"2020-12-20 09:08:53,602 - INFO - allennlp.nn.initializers -    bert_model.embeddings.LayerNorm.weight\\n\",\n            \"2020-12-20 09:08:53,602 - INFO - allennlp.nn.initializers -    bert_model.embeddings.position_embeddings.weight\\n\",\n            \"2020-12-20 09:08:53,602 - INFO - allennlp.nn.initializers -    bert_model.embeddings.token_type_embeddings.weight\\n\",\n            \"2020-12-20 09:08:53,602 - INFO - allennlp.nn.initializers -    bert_model.embeddings.word_embeddings.weight\\n\",\n            \"2020-12-20 09:08:53,602 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.output.LayerNorm.bias\\n\",\n     
       \"2020-12-20 09:08:53,602 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:08:53,602 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.output.dense.bias\\n\",\n            \"2020-12-20 09:08:53,602 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.output.dense.weight\\n\",\n            \"2020-12-20 09:08:53,602 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.self.key.bias\\n\",\n            \"2020-12-20 09:08:53,602 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.self.key.weight\\n\",\n            \"2020-12-20 09:08:53,602 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.self.query.bias\\n\",\n            \"2020-12-20 09:08:53,603 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.self.query.weight\\n\",\n            \"2020-12-20 09:08:53,603 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.self.value.bias\\n\",\n            \"2020-12-20 09:08:53,603 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.self.value.weight\\n\",\n            \"2020-12-20 09:08:53,603 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.intermediate.dense.bias\\n\",\n            \"2020-12-20 09:08:53,603 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.intermediate.dense.weight\\n\",\n            \"2020-12-20 09:08:53,603 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:08:53,603 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:08:53,603 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.output.dense.bias\\n\",\n            \"2020-12-20 09:08:53,603 - INFO - allennlp.nn.initializers -    
bert_model.encoder.layer.0.output.dense.weight\\n\",\n            \"2020-12-20 09:08:53,603 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.1.attention.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:08:53,603 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.1.attention.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:08:53,603 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.1.attention.output.dense.bias\\n\",\n            \"2020-12-20 09:08:53,603 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.1.attention.output.dense.weight\\n\",\n            \"2020-12-20 09:08:53,603 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.1.attention.self.key.bias\\n\",\n            \"2020-12-20 09:08:53,603 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.1.attention.self.key.weight\\n\",\n            \"2020-12-20 09:08:53,603 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.1.attention.self.query.bias\\n\",\n            \"2020-12-20 09:08:53,603 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.1.attention.self.query.weight\\n\",\n            \"2020-12-20 09:08:53,603 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.1.attention.self.value.bias\\n\",\n            \"2020-12-20 09:08:53,603 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.1.attention.self.value.weight\\n\",\n            \"2020-12-20 09:08:53,603 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.1.intermediate.dense.bias\\n\",\n            \"2020-12-20 09:08:53,603 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.1.intermediate.dense.weight\\n\",\n            \"2020-12-20 09:08:53,603 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.1.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:08:53,604 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.1.output.LayerNorm.weight\\n\",\n            
\"2020-12-20 09:08:53,604 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.1.output.dense.bias\\n\",\n            \"2020-12-20 09:08:53,604 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.1.output.dense.weight\\n\",\n            \"2020-12-20 09:08:53,604 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.10.attention.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:08:53,604 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.10.attention.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:08:53,604 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.10.attention.output.dense.bias\\n\",\n            \"2020-12-20 09:08:53,604 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.10.attention.output.dense.weight\\n\",\n            \"2020-12-20 09:08:53,604 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.10.attention.self.key.bias\\n\",\n            \"2020-12-20 09:08:53,604 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.10.attention.self.key.weight\\n\",\n            \"2020-12-20 09:08:53,604 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.10.attention.self.query.bias\\n\",\n            \"2020-12-20 09:08:53,604 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.10.attention.self.query.weight\\n\",\n            \"2020-12-20 09:08:53,604 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.10.attention.self.value.bias\\n\",\n            \"2020-12-20 09:08:53,604 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.10.attention.self.value.weight\\n\",\n            \"2020-12-20 09:08:53,604 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.10.intermediate.dense.bias\\n\",\n            \"2020-12-20 09:08:53,604 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.10.intermediate.dense.weight\\n\",\n            \"2020-12-20 09:08:53,604 - INFO - allennlp.nn.initializers -   
 bert_model.encoder.layer.10.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:08:53,604 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.10.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:08:53,604 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.10.output.dense.bias\\n\",\n            \"2020-12-20 09:08:53,604 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.10.output.dense.weight\\n\",\n            \"2020-12-20 09:08:53,604 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.11.attention.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:08:53,604 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.11.attention.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:08:53,604 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.11.attention.output.dense.bias\\n\",\n            \"2020-12-20 09:08:53,604 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.11.attention.output.dense.weight\\n\",\n            \"2020-12-20 09:08:53,605 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.11.attention.self.key.bias\\n\",\n            \"2020-12-20 09:08:53,605 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.11.attention.self.key.weight\\n\",\n            \"2020-12-20 09:08:53,605 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.11.attention.self.query.bias\\n\",\n            \"2020-12-20 09:08:53,605 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.11.attention.self.query.weight\\n\",\n            \"2020-12-20 09:08:53,605 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.11.attention.self.value.bias\\n\",\n            \"2020-12-20 09:08:53,605 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.11.attention.self.value.weight\\n\",\n            \"2020-12-20 09:08:53,605 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.11.intermediate.dense.bias\\n\",\n      
      \"2020-12-20 09:08:53,605 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.11.intermediate.dense.weight\\n\",\n            \"2020-12-20 09:08:53,605 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.11.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:08:53,605 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.11.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:08:53,605 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.11.output.dense.bias\\n\",\n            \"2020-12-20 09:08:53,605 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.11.output.dense.weight\\n\",\n            \"2020-12-20 09:08:53,605 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.2.attention.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:08:53,605 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.2.attention.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:08:53,605 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.2.attention.output.dense.bias\\n\",\n            \"2020-12-20 09:08:53,605 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.2.attention.output.dense.weight\\n\",\n            \"2020-12-20 09:08:53,605 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.2.attention.self.key.bias\\n\",\n            \"2020-12-20 09:08:53,605 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.2.attention.self.key.weight\\n\",\n            \"2020-12-20 09:08:53,605 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.2.attention.self.query.bias\\n\",\n            \"2020-12-20 09:08:53,605 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.2.attention.self.query.weight\\n\",\n            \"2020-12-20 09:08:53,605 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.2.attention.self.value.bias\\n\",\n            \"2020-12-20 09:08:53,605 - INFO - allennlp.nn.initializers -    
bert_model.encoder.layer.2.attention.self.value.weight\\n\",\n            \"2020-12-20 09:08:53,606 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.2.intermediate.dense.bias\\n\",\n            \"2020-12-20 09:08:53,606 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.2.intermediate.dense.weight\\n\",\n            \"2020-12-20 09:08:53,606 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.2.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:08:53,606 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.2.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:08:53,606 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.2.output.dense.bias\\n\",\n            \"2020-12-20 09:08:53,606 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.2.output.dense.weight\\n\",\n            \"2020-12-20 09:08:53,606 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.3.attention.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:08:53,606 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.3.attention.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:08:53,606 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.3.attention.output.dense.bias\\n\",\n            \"2020-12-20 09:08:53,606 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.3.attention.output.dense.weight\\n\",\n            \"2020-12-20 09:08:53,606 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.3.attention.self.key.bias\\n\",\n            \"2020-12-20 09:08:53,606 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.3.attention.self.key.weight\\n\",\n            \"2020-12-20 09:08:53,606 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.3.attention.self.query.bias\\n\",\n            \"2020-12-20 09:08:53,606 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.3.attention.self.query.weight\\n\",\n            
\"2020-12-20 09:08:53,606 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.3.attention.self.value.bias\\n\",\n            \"2020-12-20 09:08:53,606 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.3.attention.self.value.weight\\n\",\n            \"2020-12-20 09:08:53,606 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.3.intermediate.dense.bias\\n\",\n            \"2020-12-20 09:08:53,606 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.3.intermediate.dense.weight\\n\",\n            \"2020-12-20 09:08:53,606 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.3.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:08:53,606 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.3.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:08:53,606 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.3.output.dense.bias\\n\",\n            \"2020-12-20 09:08:53,606 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.3.output.dense.weight\\n\",\n            \"2020-12-20 09:08:53,607 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.4.attention.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:08:53,607 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.4.attention.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:08:53,607 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.4.attention.output.dense.bias\\n\",\n            \"2020-12-20 09:08:53,607 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.4.attention.output.dense.weight\\n\",\n            \"2020-12-20 09:08:53,607 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.4.attention.self.key.bias\\n\",\n            \"2020-12-20 09:08:53,607 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.4.attention.self.key.weight\\n\",\n            \"2020-12-20 09:08:53,607 - INFO - allennlp.nn.initializers -    
bert_model.encoder.layer.4.attention.self.query.bias\\n\",\n            \"2020-12-20 09:08:53,607 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.4.attention.self.query.weight\\n\",\n            \"2020-12-20 09:08:53,607 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.4.attention.self.value.bias\\n\",\n            \"2020-12-20 09:08:53,607 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.4.attention.self.value.weight\\n\",\n            \"2020-12-20 09:08:53,607 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.4.intermediate.dense.bias\\n\",\n            \"2020-12-20 09:08:53,607 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.4.intermediate.dense.weight\\n\",\n            \"2020-12-20 09:08:53,607 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.4.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:08:53,607 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.4.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:08:53,607 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.4.output.dense.bias\\n\",\n            \"2020-12-20 09:08:53,607 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.4.output.dense.weight\\n\",\n            \"2020-12-20 09:08:53,607 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.5.attention.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:08:53,607 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.5.attention.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:08:53,607 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.5.attention.output.dense.bias\\n\",\n            \"2020-12-20 09:08:53,607 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.5.attention.output.dense.weight\\n\",\n            \"2020-12-20 09:08:53,607 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.5.attention.self.key.bias\\n\",\n            
\"2020-12-20 09:08:53,607 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.5.attention.self.key.weight\\n\",\n            \"2020-12-20 09:08:53,607 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.5.attention.self.query.bias\\n\",\n            \"2020-12-20 09:08:53,608 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.5.attention.self.query.weight\\n\",\n            \"2020-12-20 09:08:53,608 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.5.attention.self.value.bias\\n\",\n            \"2020-12-20 09:08:53,608 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.5.attention.self.value.weight\\n\",\n            \"2020-12-20 09:08:53,608 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.5.intermediate.dense.bias\\n\",\n            \"2020-12-20 09:08:53,608 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.5.intermediate.dense.weight\\n\",\n            \"2020-12-20 09:08:53,608 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.5.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:08:53,608 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.5.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:08:53,608 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.5.output.dense.bias\\n\",\n            \"2020-12-20 09:08:53,608 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.5.output.dense.weight\\n\",\n            \"2020-12-20 09:08:53,608 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.attention.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:08:53,608 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.attention.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:08:53,608 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.attention.output.dense.bias\\n\",\n            \"2020-12-20 09:08:53,608 - INFO - allennlp.nn.initializers -    
bert_model.encoder.layer.6.attention.output.dense.weight\\n\",\n            \"2020-12-20 09:08:53,608 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.attention.self.key.bias\\n\",\n            \"2020-12-20 09:08:53,608 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.attention.self.key.weight\\n\",\n            \"2020-12-20 09:08:53,608 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.attention.self.query.bias\\n\",\n            \"2020-12-20 09:08:53,608 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.attention.self.query.weight\\n\",\n            \"2020-12-20 09:08:53,608 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.attention.self.value.bias\\n\",\n            \"2020-12-20 09:08:53,608 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.attention.self.value.weight\\n\",\n            \"2020-12-20 09:08:53,608 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.intermediate.dense.bias\\n\",\n            \"2020-12-20 09:08:53,608 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.intermediate.dense.weight\\n\",\n            \"2020-12-20 09:08:53,608 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:08:53,608 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:08:53,609 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.output.dense.bias\\n\",\n            \"2020-12-20 09:08:53,609 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.output.dense.weight\\n\",\n            \"2020-12-20 09:08:53,609 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.attention.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:08:53,609 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.attention.output.LayerNorm.weight\\n\",\n            
\"2020-12-20 09:08:53,609 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.attention.output.dense.bias\\n\",\n            \"2020-12-20 09:08:53,609 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.attention.output.dense.weight\\n\",\n            \"2020-12-20 09:08:53,609 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.attention.self.key.bias\\n\",\n            \"2020-12-20 09:08:53,609 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.attention.self.key.weight\\n\",\n            \"2020-12-20 09:08:53,609 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.attention.self.query.bias\\n\",\n            \"2020-12-20 09:08:53,609 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.attention.self.query.weight\\n\",\n            \"2020-12-20 09:08:53,609 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.attention.self.value.bias\\n\",\n            \"2020-12-20 09:08:53,683 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.attention.self.value.weight\\n\",\n            \"2020-12-20 09:08:53,683 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.intermediate.dense.bias\\n\",\n            \"2020-12-20 09:08:53,683 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.intermediate.dense.weight\\n\",\n            \"2020-12-20 09:08:53,683 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:08:53,683 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:08:53,683 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.output.dense.bias\\n\",\n            \"2020-12-20 09:08:53,683 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.output.dense.weight\\n\",\n            \"2020-12-20 09:08:53,683 - INFO - allennlp.nn.initializers -    
bert_model.encoder.layer.8.attention.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:08:53,684 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.attention.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:08:53,684 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.attention.output.dense.bias\\n\",\n            \"2020-12-20 09:08:53,684 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.attention.output.dense.weight\\n\",\n            \"2020-12-20 09:08:53,684 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.attention.self.key.bias\\n\",\n            \"2020-12-20 09:08:53,684 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.attention.self.key.weight\\n\",\n            \"2020-12-20 09:08:53,684 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.attention.self.query.bias\\n\",\n            \"2020-12-20 09:08:53,684 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.attention.self.query.weight\\n\",\n            \"2020-12-20 09:08:53,684 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.attention.self.value.bias\\n\",\n            \"2020-12-20 09:08:53,684 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.attention.self.value.weight\\n\",\n            \"2020-12-20 09:08:53,684 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.intermediate.dense.bias\\n\",\n            \"2020-12-20 09:08:53,684 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.intermediate.dense.weight\\n\",\n            \"2020-12-20 09:08:53,684 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:08:53,684 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:08:53,684 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.output.dense.bias\\n\",\n            
\"2020-12-20 09:08:53,684 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.output.dense.weight\\n\",\n            \"2020-12-20 09:08:53,684 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.attention.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:08:53,685 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.attention.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:08:53,685 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.attention.output.dense.bias\\n\",\n            \"2020-12-20 09:08:53,685 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.attention.output.dense.weight\\n\",\n            \"2020-12-20 09:08:53,685 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.attention.self.key.bias\\n\",\n            \"2020-12-20 09:08:53,685 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.attention.self.key.weight\\n\",\n            \"2020-12-20 09:08:53,685 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.attention.self.query.bias\\n\",\n            \"2020-12-20 09:08:53,685 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.attention.self.query.weight\\n\",\n            \"2020-12-20 09:08:53,685 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.attention.self.value.bias\\n\",\n            \"2020-12-20 09:08:53,685 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.attention.self.value.weight\\n\",\n            \"2020-12-20 09:08:53,685 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.intermediate.dense.bias\\n\",\n            \"2020-12-20 09:08:53,685 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.intermediate.dense.weight\\n\",\n            \"2020-12-20 09:08:53,685 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:08:53,685 - INFO - allennlp.nn.initializers -    
bert_model.encoder.layer.9.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:08:53,685 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.output.dense.bias\\n\",\n            \"2020-12-20 09:08:53,685 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.output.dense.weight\\n\",\n            \"2020-12-20 09:08:53,685 - INFO - allennlp.nn.initializers -    bert_model.pooler.dense.bias\\n\",\n            \"2020-12-20 09:08:53,685 - INFO - allennlp.nn.initializers -    bert_model.pooler.dense.weight\\n\",\n            \"2020-12-20 09:08:53,686 - INFO - allennlp.nn.initializers -    tag_projection_layer.bias\\n\",\n            \"2020-12-20 09:08:53,686 - INFO - allennlp.nn.initializers -    tag_projection_layer.weight\\n\",\n            \"2020-12-20 09:08:54,186 - INFO - allennlp.common.params - dataset_reader.type = srl\\n\",\n            \"2020-12-20 09:08:54,187 - INFO - allennlp.common.params - dataset_reader.lazy = False\\n\",\n            \"2020-12-20 09:08:54,187 - INFO - allennlp.common.params - dataset_reader.cache_directory = None\\n\",\n            \"2020-12-20 09:08:54,187 - INFO - allennlp.common.params - dataset_reader.max_instances = None\\n\",\n            \"2020-12-20 09:08:54,187 - INFO - allennlp.common.params - dataset_reader.manual_distributed_sharding = False\\n\",\n            \"2020-12-20 09:08:54,187 - INFO - allennlp.common.params - dataset_reader.manual_multi_process_sharding = False\\n\",\n            \"2020-12-20 09:08:54,187 - INFO - allennlp.common.params - dataset_reader.token_indexers = None\\n\",\n            \"2020-12-20 09:08:54,187 - INFO - allennlp.common.params - dataset_reader.domain_identifier = None\\n\",\n            \"2020-12-20 09:08:54,187 - INFO - allennlp.common.params - dataset_reader.bert_model_name = bert-base-uncased\\n\",\n            \"2020-12-20 09:08:54,497 - INFO - transformers.tokenization_utils - loading file 
https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt from cache at /root/.cache/torch/transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084\\n\",\n            \"input 0:  {\\\"sentence\\\": \\\"Alice, whose husband went jogging every Sunday, liked to go to a dancing class in the meantime.\\\"}\\n\",\n            \"prediction:  {\\\"verbs\\\": [{\\\"verb\\\": \\\"went\\\", \\\"description\\\": \\\"Alice , [ARG1: whose husband] [V: went] [ARG2: jogging] [ARGM-TMP: every Sunday] , liked to go to a dancing class in the meantime .\\\", \\\"tags\\\": [\\\"O\\\", \\\"O\\\", \\\"B-ARG1\\\", \\\"I-ARG1\\\", \\\"B-V\\\", \\\"B-ARG2\\\", \\\"B-ARGM-TMP\\\", \\\"I-ARGM-TMP\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\"]}, {\\\"verb\\\": \\\"jogging\\\", \\\"description\\\": \\\"Alice , [ARG0: whose husband] went [V: jogging] [ARGM-TMP: every Sunday] , liked to go to a dancing class in the meantime .\\\", \\\"tags\\\": [\\\"O\\\", \\\"O\\\", \\\"B-ARG0\\\", \\\"I-ARG0\\\", \\\"O\\\", \\\"B-V\\\", \\\"B-ARGM-TMP\\\", \\\"I-ARGM-TMP\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\"]}, {\\\"verb\\\": \\\"liked\\\", \\\"description\\\": \\\"[ARG0: Alice , whose husband went jogging every Sunday] , [V: liked] [ARG1: to go to a dancing class in the meantime] .\\\", \\\"tags\\\": [\\\"B-ARG0\\\", \\\"I-ARG0\\\", \\\"I-ARG0\\\", \\\"I-ARG0\\\", \\\"I-ARG0\\\", \\\"I-ARG0\\\", \\\"I-ARG0\\\", \\\"I-ARG0\\\", \\\"O\\\", \\\"B-V\\\", \\\"B-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"O\\\"]}, {\\\"verb\\\": \\\"go\\\", \\\"description\\\": \\\"[ARG0: Alice , whose husband went jogging every Sunday] , liked 
to [V: go] [ARG4: to a dancing class] [ARGM-TMP: in the meantime] .\\\", \\\"tags\\\": [\\\"B-ARG0\\\", \\\"I-ARG0\\\", \\\"I-ARG0\\\", \\\"I-ARG0\\\", \\\"I-ARG0\\\", \\\"I-ARG0\\\", \\\"I-ARG0\\\", \\\"I-ARG0\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"B-V\\\", \\\"B-ARG4\\\", \\\"I-ARG4\\\", \\\"I-ARG4\\\", \\\"I-ARG4\\\", \\\"B-ARGM-TMP\\\", \\\"I-ARGM-TMP\\\", \\\"I-ARGM-TMP\\\", \\\"O\\\"]}, {\\\"verb\\\": \\\"dancing\\\", \\\"description\\\": \\\"Alice , whose husband went jogging every Sunday , liked to go to a [V: dancing] [ARG0: class] in the meantime .\\\", \\\"tags\\\": [\\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"B-V\\\", \\\"B-ARG0\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\", \\\"O\\\"]}], \\\"words\\\": [\\\"Alice\\\", \\\",\\\", \\\"whose\\\", \\\"husband\\\", \\\"went\\\", \\\"jogging\\\", \\\"every\\\", \\\"Sunday\\\", \\\",\\\", \\\"liked\\\", \\\"to\\\", \\\"go\\\", \\\"to\\\", \\\"a\\\", \\\"dancing\\\", \\\"class\\\", \\\"in\\\", \\\"the\\\", \\\"meantime\\\", \\\".\\\"]}\\n\",\n            \"\\n\",\n            \"2020-12-20 09:08:55,842 - INFO - allennlp.models.archival - removing temporary unarchived model dir at /tmp/tmpmnefy_l0\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"Hog7HwIHzdm8\"\n      },\n      \"source\": [\n        \"Sample 5: The bright sun, the blue sky, the warm sand, the palm trees, everything round off.\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"6NFVmvYtzguX\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"48e86088-f681-4721-9ea4-42fd7476e6a8\"\n      },\n      \"source\": [\n        \"!echo '{\\\"sentence\\\": \\\"The bright sun, the blue sky, the warm sand, the palm trees, 
everything round off.\\\"}' | \\\\\\n\",\n        \"allennlp predict https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz -\"\n      ],\n      \"execution_count\": 6,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"2020-12-20 09:08:58,132 - INFO - transformers.file_utils - PyTorch version 1.5.1 available.\\n\",\n            \"2020-12-20 09:08:58.293529: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library libcudart.so.10.1\\n\",\n            \"2020-12-20 09:09:00,044 - INFO - transformers.file_utils - TensorFlow version 2.4.0 available.\\n\",\n            \"2020-12-20 09:09:01,253 - INFO - allennlp.common.file_utils - checking cache for https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz at /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\\n\",\n            \"2020-12-20 09:09:01,253 - INFO - allennlp.common.file_utils - waiting to acquire lock on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\\n\",\n            \"2020-12-20 09:09:01,255 - INFO - filelock - Lock 140218919610240 acquired on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724.lock\\n\",\n            \"2020-12-20 09:09:01,255 - INFO - allennlp.common.file_utils - cache of https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz is up-to-date\\n\",\n            \"2020-12-20 09:09:01,255 - INFO - filelock - Lock 140218919610240 released on 
/root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724.lock\\n\",\n            \"2020-12-20 09:09:01,255 - INFO - allennlp.models.archival - loading archive file https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz from cache at /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\\n\",\n            \"2020-12-20 09:09:01,256 - INFO - allennlp.models.archival - extracting archive file /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724 to temp dir /tmp/tmpn3jl1yco\\n\",\n            \"2020-12-20 09:09:05,507 - INFO - allennlp.common.params - type = from_instances\\n\",\n            \"2020-12-20 09:09:05,507 - INFO - allennlp.data.vocabulary - Loading token dictionary from /tmp/tmpn3jl1yco/vocabulary.\\n\",\n            \"2020-12-20 09:09:05,507 - INFO - filelock - Lock 140218269110168 acquired on /tmp/tmpn3jl1yco/vocabulary/.lock\\n\",\n            \"2020-12-20 09:09:05,535 - INFO - filelock - Lock 140218269110168 released on /tmp/tmpn3jl1yco/vocabulary/.lock\\n\",\n            \"2020-12-20 09:09:05,536 - INFO - allennlp.common.params - model.type = srl_bert\\n\",\n            \"2020-12-20 09:09:05,536 - INFO - allennlp.common.params - model.regularizer = None\\n\",\n            \"2020-12-20 09:09:05,536 - INFO - allennlp.common.params - model.bert_model = bert-base-uncased\\n\",\n            \"2020-12-20 09:09:05,536 - INFO - allennlp.common.params - model.embedding_dropout = 0.1\\n\",\n            \"2020-12-20 09:09:05,536 - INFO - allennlp.common.params - model.initializer = <allennlp.nn.initializers.InitializerApplicator object at 0x7f871c1b37b8>\\n\",\n            \"2020-12-20 09:09:05,537 - INFO - allennlp.common.params - 
model.label_smoothing = None\\n\",\n            \"2020-12-20 09:09:05,537 - INFO - allennlp.common.params - model.ignore_span_metric = False\\n\",\n            \"2020-12-20 09:09:05,537 - INFO - allennlp.common.params - model.srl_eval_path = /usr/local/lib/python3.6/dist-packages/allennlp_models/structured_prediction/tools/srl-eval.pl\\n\",\n            \"2020-12-20 09:09:05,837 - INFO - transformers.configuration_utils - loading configuration file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-config.json from cache at /root/.cache/torch/transformers/4dad0251492946e18ac39290fcfe91b89d370fee250efe9521476438fe8ca185.7156163d5fdc189c3016baca0775ffce230789d7fa2a42ef516483e4ca884517\\n\",\n            \"2020-12-20 09:09:05,837 - INFO - transformers.configuration_utils - Model config BertConfig {\\n\",\n            \"  \\\"architectures\\\": [\\n\",\n            \"    \\\"BertForMaskedLM\\\"\\n\",\n            \"  ],\\n\",\n            \"  \\\"attention_probs_dropout_prob\\\": 0.1,\\n\",\n            \"  \\\"hidden_act\\\": \\\"gelu\\\",\\n\",\n            \"  \\\"hidden_dropout_prob\\\": 0.1,\\n\",\n            \"  \\\"hidden_size\\\": 768,\\n\",\n            \"  \\\"initializer_range\\\": 0.02,\\n\",\n            \"  \\\"intermediate_size\\\": 3072,\\n\",\n            \"  \\\"layer_norm_eps\\\": 1e-12,\\n\",\n            \"  \\\"max_position_embeddings\\\": 512,\\n\",\n            \"  \\\"model_type\\\": \\\"bert\\\",\\n\",\n            \"  \\\"num_attention_heads\\\": 12,\\n\",\n            \"  \\\"num_hidden_layers\\\": 12,\\n\",\n            \"  \\\"pad_token_id\\\": 0,\\n\",\n            \"  \\\"type_vocab_size\\\": 2,\\n\",\n            \"  \\\"vocab_size\\\": 30522\\n\",\n            \"}\\n\",\n            \"\\n\",\n            \"2020-12-20 09:09:06,048 - INFO - transformers.modeling_utils - loading weights file https://cdn.huggingface.co/bert-base-uncased-pytorch_model.bin from cache at 
/root/.cache/torch/transformers/f2ee78bdd635b758cc0a12352586868bef80e47401abe4c4fcc3832421e7338b.36ca03ab34a1a5d5fa7bc3d03d55c4fa650fed07220e2eeebc06ce58d0e9a157\\n\",\n            \"2020-12-20 09:09:08,747 - INFO - allennlp.nn.initializers - Initializing parameters\\n\",\n            \"2020-12-20 09:09:08,748 - INFO - allennlp.nn.initializers - Done initializing parameters; the following parameters are using their default initialization from their code\\n\",\n            \"2020-12-20 09:09:08,748 - INFO - allennlp.nn.initializers -    bert_model.embeddings.LayerNorm.bias\\n\",\n            \"2020-12-20 09:09:08,748 - INFO - allennlp.nn.initializers -    bert_model.embeddings.LayerNorm.weight\\n\",\n            \"2020-12-20 09:09:08,748 - INFO - allennlp.nn.initializers -    bert_model.embeddings.position_embeddings.weight\\n\",\n            \"2020-12-20 09:09:08,748 - INFO - allennlp.nn.initializers -    bert_model.embeddings.token_type_embeddings.weight\\n\",\n            \"2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers -    bert_model.embeddings.word_embeddings.weight\\n\",\n            \"2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.output.dense.bias\\n\",\n            \"2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.output.dense.weight\\n\",\n            \"2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.self.key.bias\\n\",\n            \"2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.self.key.weight\\n\",\n            \"2020-12-20 09:09:08,749 - INFO - 
allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.self.query.bias\\n\",\n            \"... [repeated allennlp.nn.initializers INFO lines for the remaining bert_model.encoder layers truncated] ...\\n\",\n            \"2020-12-20 09:09:08,755 - INFO - allennlp.nn.initializers -    
bert_model.encoder.layer.6.attention.self.value.bias\\n\",\n            \"2020-12-20 09:09:08,755 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.attention.self.value.weight\\n\",\n            \"2020-12-20 09:09:08,755 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.intermediate.dense.bias\\n\",\n            \"2020-12-20 09:09:08,755 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.intermediate.dense.weight\\n\",\n            \"2020-12-20 09:09:08,755 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:09:08,755 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:09:08,755 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.output.dense.bias\\n\",\n            \"2020-12-20 09:09:08,756 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.output.dense.weight\\n\",\n            \"2020-12-20 09:09:08,756 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.attention.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:09:08,756 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.attention.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:09:08,756 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.attention.output.dense.bias\\n\",\n            \"2020-12-20 09:09:08,756 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.attention.output.dense.weight\\n\",\n            \"2020-12-20 09:09:08,756 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.attention.self.key.bias\\n\",\n            \"2020-12-20 09:09:08,756 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.attention.self.key.weight\\n\",\n            \"2020-12-20 09:09:08,756 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.attention.self.query.bias\\n\",\n            
\"2020-12-20 09:09:08,756 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.attention.self.query.weight\\n\",\n            \"2020-12-20 09:09:08,756 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.attention.self.value.bias\\n\",\n            \"2020-12-20 09:09:08,786 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.attention.self.value.weight\\n\",\n            \"2020-12-20 09:09:08,787 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.intermediate.dense.bias\\n\",\n            \"2020-12-20 09:09:08,787 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.intermediate.dense.weight\\n\",\n            \"2020-12-20 09:09:08,787 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:09:08,788 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:09:08,788 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.output.dense.bias\\n\",\n            \"2020-12-20 09:09:08,788 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.output.dense.weight\\n\",\n            \"2020-12-20 09:09:08,788 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.attention.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:09:08,788 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.attention.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:09:08,788 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.attention.output.dense.bias\\n\",\n            \"2020-12-20 09:09:08,788 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.attention.output.dense.weight\\n\",\n            \"2020-12-20 09:09:08,788 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.attention.self.key.bias\\n\",\n            \"2020-12-20 09:09:08,788 - INFO - allennlp.nn.initializers -    
bert_model.encoder.layer.8.attention.self.key.weight\\n\",\n            \"2020-12-20 09:09:08,788 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.attention.self.query.bias\\n\",\n            \"2020-12-20 09:09:08,788 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.attention.self.query.weight\\n\",\n            \"2020-12-20 09:09:08,788 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.attention.self.value.bias\\n\",\n            \"2020-12-20 09:09:08,788 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.attention.self.value.weight\\n\",\n            \"2020-12-20 09:09:08,788 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.intermediate.dense.bias\\n\",\n            \"2020-12-20 09:09:08,788 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.intermediate.dense.weight\\n\",\n            \"2020-12-20 09:09:08,788 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:09:08,789 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:09:08,789 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.output.dense.bias\\n\",\n            \"2020-12-20 09:09:08,789 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.output.dense.weight\\n\",\n            \"2020-12-20 09:09:08,789 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.attention.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:09:08,789 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.attention.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:09:08,789 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.attention.output.dense.bias\\n\",\n            \"2020-12-20 09:09:08,789 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.attention.output.dense.weight\\n\",\n            
\"2020-12-20 09:09:08,789 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.attention.self.key.bias\\n\",\n            \"2020-12-20 09:09:08,789 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.attention.self.key.weight\\n\",\n            \"2020-12-20 09:09:08,789 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.attention.self.query.bias\\n\",\n            \"2020-12-20 09:09:08,789 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.attention.self.query.weight\\n\",\n            \"2020-12-20 09:09:08,789 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.attention.self.value.bias\\n\",\n            \"2020-12-20 09:09:08,789 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.attention.self.value.weight\\n\",\n            \"2020-12-20 09:09:08,789 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.intermediate.dense.bias\\n\",\n            \"2020-12-20 09:09:08,789 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.intermediate.dense.weight\\n\",\n            \"2020-12-20 09:09:08,789 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:09:08,790 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:09:08,790 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.output.dense.bias\\n\",\n            \"2020-12-20 09:09:08,790 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.output.dense.weight\\n\",\n            \"2020-12-20 09:09:08,790 - INFO - allennlp.nn.initializers -    bert_model.pooler.dense.bias\\n\",\n            \"2020-12-20 09:09:08,790 - INFO - allennlp.nn.initializers -    bert_model.pooler.dense.weight\\n\",\n            \"2020-12-20 09:09:08,790 - INFO - allennlp.nn.initializers -    tag_projection_layer.bias\\n\",\n            \"2020-12-20 09:09:08,790 - INFO - 
allennlp.nn.initializers -    tag_projection_layer.weight\\n\",\n            \"2020-12-20 09:09:09,268 - INFO - allennlp.common.params - dataset_reader.type = srl\\n\",\n            \"2020-12-20 09:09:09,268 - INFO - allennlp.common.params - dataset_reader.lazy = False\\n\",\n            \"2020-12-20 09:09:09,268 - INFO - allennlp.common.params - dataset_reader.cache_directory = None\\n\",\n            \"2020-12-20 09:09:09,268 - INFO - allennlp.common.params - dataset_reader.max_instances = None\\n\",\n            \"2020-12-20 09:09:09,268 - INFO - allennlp.common.params - dataset_reader.manual_distributed_sharding = False\\n\",\n            \"2020-12-20 09:09:09,269 - INFO - allennlp.common.params - dataset_reader.manual_multi_process_sharding = False\\n\",\n            \"2020-12-20 09:09:09,269 - INFO - allennlp.common.params - dataset_reader.token_indexers = None\\n\",\n            \"2020-12-20 09:09:09,269 - INFO - allennlp.common.params - dataset_reader.domain_identifier = None\\n\",\n            \"2020-12-20 09:09:09,269 - INFO - allennlp.common.params - dataset_reader.bert_model_name = bert-base-uncased\\n\",\n            \"2020-12-20 09:09:09,561 - INFO - transformers.tokenization_utils - loading file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt from cache at /root/.cache/torch/transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084\\n\",\n            \"input 0:  {\\\"sentence\\\": \\\"The bright sun, the blue sky, the warm sand, the palm trees, everything round off.\\\"}\\n\",\n            \"prediction:  {\\\"verbs\\\": [], \\\"words\\\": [\\\"The\\\", \\\"bright\\\", \\\"sun\\\", \\\",\\\", \\\"the\\\", \\\"blue\\\", \\\"sky\\\", \\\",\\\", \\\"the\\\", \\\"warm\\\", \\\"sand\\\", \\\",\\\", \\\"the\\\", \\\"palm\\\", \\\"trees\\\", \\\",\\\", \\\"everything\\\", \\\"round\\\", \\\"off\\\", \\\".\\\"]}\\n\",\n            \"\\n\",\n    
        \"2020-12-20 09:09:10,283 - INFO - allennlp.models.archival - removing temporary unarchived model dir at /tmp/tmpn3jl1yco\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"T9UCG-qN018X\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"af4f9689-442f-4726-a908-7dba1a824036\"\n      },\n      \"source\": [\n        \"!echo '{\\\"sentence\\\": \\\"The bright sun, the blue sky, the warm sand, the palm trees, everything rounds off.\\\"}' | \\\\\\n\",\n        \"allennlp predict https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz -\"\n      ],\n      \"execution_count\": 7,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"2020-12-20 09:09:12,636 - INFO - transformers.file_utils - PyTorch version 1.5.1 available.\\n\",\n            \"2020-12-20 09:09:12.789933: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library libcudart.so.10.1\\n\",\n            \"2020-12-20 09:09:14,547 - INFO - transformers.file_utils - TensorFlow version 2.4.0 available.\\n\",\n            \"2020-12-20 09:09:15,750 - INFO - allennlp.common.file_utils - checking cache for https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz at /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\\n\",\n            \"2020-12-20 09:09:15,750 - INFO - allennlp.common.file_utils - waiting to acquire lock on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\\n\",\n            \"2020-12-20 09:09:15,751 - INFO - filelock - Lock 139884787906432 acquired on 
/root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724.lock\\n\",\n            \"2020-12-20 09:09:15,751 - INFO - allennlp.common.file_utils - cache of https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz is up-to-date\\n\",\n            \"2020-12-20 09:09:15,751 - INFO - filelock - Lock 139884787906432 released on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724.lock\\n\",\n            \"2020-12-20 09:09:15,751 - INFO - allennlp.models.archival - loading archive file https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz from cache at /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\\n\",\n            \"2020-12-20 09:09:15,751 - INFO - allennlp.models.archival - extracting archive file /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724 to temp dir /tmp/tmpuj2lb1i1\\n\",\n            \"2020-12-20 09:09:19,983 - INFO - allennlp.common.params - type = from_instances\\n\",\n            \"2020-12-20 09:09:19,983 - INFO - allennlp.data.vocabulary - Loading token dictionary from /tmp/tmpuj2lb1i1/vocabulary.\\n\",\n            \"2020-12-20 09:09:19,983 - INFO - filelock - Lock 139884137381784 acquired on /tmp/tmpuj2lb1i1/vocabulary/.lock\\n\",\n            \"2020-12-20 09:09:20,009 - INFO - filelock - Lock 139884137381784 released on /tmp/tmpuj2lb1i1/vocabulary/.lock\\n\",\n            \"2020-12-20 09:09:20,010 - INFO - allennlp.common.params - model.type = srl_bert\\n\",\n            \"2020-12-20 09:09:20,010 - INFO - allennlp.common.params - model.regularizer = None\\n\",\n            \"2020-12-20 09:09:20,010 
- INFO - allennlp.common.params - model.bert_model = bert-base-uncased\\n\",\n            \"2020-12-20 09:09:20,011 - INFO - allennlp.common.params - model.embedding_dropout = 0.1\\n\",\n            \"2020-12-20 09:09:20,011 - INFO - allennlp.common.params - model.initializer = <allennlp.nn.initializers.InitializerApplicator object at 0x7f39504e07b8>\\n\",\n            \"2020-12-20 09:09:20,011 - INFO - allennlp.common.params - model.label_smoothing = None\\n\",\n            \"2020-12-20 09:09:20,011 - INFO - allennlp.common.params - model.ignore_span_metric = False\\n\",\n            \"2020-12-20 09:09:20,011 - INFO - allennlp.common.params - model.srl_eval_path = /usr/local/lib/python3.6/dist-packages/allennlp_models/structured_prediction/tools/srl-eval.pl\\n\",\n            \"2020-12-20 09:09:20,306 - INFO - transformers.configuration_utils - loading configuration file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-config.json from cache at /root/.cache/torch/transformers/4dad0251492946e18ac39290fcfe91b89d370fee250efe9521476438fe8ca185.7156163d5fdc189c3016baca0775ffce230789d7fa2a42ef516483e4ca884517\\n\",\n            \"2020-12-20 09:09:20,307 - INFO - transformers.configuration_utils - Model config BertConfig {\\n\",\n            \"  \\\"architectures\\\": [\\n\",\n            \"    \\\"BertForMaskedLM\\\"\\n\",\n            \"  ],\\n\",\n            \"  \\\"attention_probs_dropout_prob\\\": 0.1,\\n\",\n            \"  \\\"hidden_act\\\": \\\"gelu\\\",\\n\",\n            \"  \\\"hidden_dropout_prob\\\": 0.1,\\n\",\n            \"  \\\"hidden_size\\\": 768,\\n\",\n            \"  \\\"initializer_range\\\": 0.02,\\n\",\n            \"  \\\"intermediate_size\\\": 3072,\\n\",\n            \"  \\\"layer_norm_eps\\\": 1e-12,\\n\",\n            \"  \\\"max_position_embeddings\\\": 512,\\n\",\n            \"  \\\"model_type\\\": \\\"bert\\\",\\n\",\n            \"  \\\"num_attention_heads\\\": 12,\\n\",\n            \"  \\\"num_hidden_layers\\\": 
12,\\n\",\n            \"  \\\"pad_token_id\\\": 0,\\n\",\n            \"  \\\"type_vocab_size\\\": 2,\\n\",\n            \"  \\\"vocab_size\\\": 30522\\n\",\n            \"}\\n\",\n            \"\\n\",\n            \"2020-12-20 09:09:20,499 - INFO - transformers.modeling_utils - loading weights file https://cdn.huggingface.co/bert-base-uncased-pytorch_model.bin from cache at /root/.cache/torch/transformers/f2ee78bdd635b758cc0a12352586868bef80e47401abe4c4fcc3832421e7338b.36ca03ab34a1a5d5fa7bc3d03d55c4fa650fed07220e2eeebc06ce58d0e9a157\\n\",\n            \"2020-12-20 09:09:23,169 - INFO - allennlp.nn.initializers - Initializing parameters\\n\",\n            \"2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers - Done initializing parameters; the following parameters are using their default initialization from their code\\n\",\n            \"2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers -    bert_model.embeddings.LayerNorm.bias\\n\",\n            \"2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers -    bert_model.embeddings.LayerNorm.weight\\n\",\n            \"2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers -    bert_model.embeddings.position_embeddings.weight\\n\",\n            \"2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers -    bert_model.embeddings.token_type_embeddings.weight\\n\",\n            \"2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers -    bert_model.embeddings.word_embeddings.weight\\n\",\n            \"2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.output.dense.bias\\n\",\n            \"2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers -    
bert_model.encoder.layer.0.attention.output.dense.weight\\n\",\n            \"2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.self.key.bias\\n\",\n            \"2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.self.key.weight\\n\",\n            \"2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.self.query.bias\\n\",\n            \"2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.self.query.weight\\n\",\n            \"2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.self.value.bias\\n\",\n            \"2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.attention.self.value.weight\\n\",\n            \"2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.intermediate.dense.bias\\n\",\n            \"2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.intermediate.dense.weight\\n\",\n            \"2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.output.dense.bias\\n\",\n            \"2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.0.output.dense.weight\\n\",\n            \"2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.1.attention.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.1.attention.output.LayerNorm.weight\\n\",\n            
\"2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.1.attention.output.dense.bias\\n\",\n            \"2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.1.attention.output.dense.weight\\n\",\n            \"2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.1.attention.self.key.bias\\n\",\n            \"2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.1.attention.self.key.weight\\n\",\n            \"2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.1.attention.self.query.bias\\n\",\n            \"2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.1.attention.self.query.weight\\n\",\n            \"2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.1.attention.self.value.bias\\n\",\n            \"2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.1.attention.self.value.weight\\n\",\n            \"2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.1.intermediate.dense.bias\\n\",\n            \"2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.1.intermediate.dense.weight\\n\",\n            \"2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.1.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.1.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.1.output.dense.bias\\n\",\n            \"2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.1.output.dense.weight\\n\",\n            \"2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers -    
bert_model.encoder.layer.10.attention.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.10.attention.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.10.attention.output.dense.bias\\n\",\n            \"2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.10.attention.output.dense.weight\\n\",\n            \"2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.10.attention.self.key.bias\\n\",\n            \"2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.10.attention.self.key.weight\\n\",\n            \"2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.10.attention.self.query.bias\\n\",\n            \"2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.10.attention.self.query.weight\\n\",\n            \"2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.10.attention.self.value.bias\\n\",\n            \"2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.10.attention.self.value.weight\\n\",\n            \"2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.10.intermediate.dense.bias\\n\",\n            \"2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.10.intermediate.dense.weight\\n\",\n            \"2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.10.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.10.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.10.output.dense.bias\\n\",\n 
           \"2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.10.output.dense.weight\\n\",\n            \"2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.11.attention.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.11.attention.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.11.attention.output.dense.bias\\n\",\n            \"2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.11.attention.output.dense.weight\\n\",\n            \"2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.11.attention.self.key.bias\\n\",\n            \"2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.11.attention.self.key.weight\\n\",\n            \"2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.11.attention.self.query.bias\\n\",\n            \"2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.11.attention.self.query.weight\\n\",\n            \"2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.11.attention.self.value.bias\\n\",\n            \"2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.11.attention.self.value.weight\\n\",\n            \"2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.11.intermediate.dense.bias\\n\",\n            \"2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.11.intermediate.dense.weight\\n\",\n            \"2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.11.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:09:23,173 - INFO - 
allennlp.nn.initializers -    bert_model.encoder.layer.11.output.LayerNorm.weight\\n\",\n            \"... [parameter-initialization INFO lines for the remaining bert_model encoder layers and the pooler truncated] ...\\n\",\n            \"2020-12-20 09:09:23,239 - INFO - allennlp.nn.initializers -    tag_projection_layer.bias\\n\",\n            \"2020-12-20 09:09:23,239 - INFO - allennlp.nn.initializers -    tag_projection_layer.weight\\n\",\n            \"2020-12-20 09:09:23,707 - INFO - allennlp.common.params - dataset_reader.type = srl\\n\",\n            \"2020-12-20 09:09:23,708 - INFO - allennlp.common.params - dataset_reader.lazy = False\\n\",\n            \"2020-12-20 09:09:23,708 - INFO - allennlp.common.params - dataset_reader.cache_directory = None\\n\",\n            \"2020-12-20 09:09:23,708 - INFO - allennlp.common.params - dataset_reader.max_instances = None\\n\",\n            \"2020-12-20 09:09:23,708 - INFO - allennlp.common.params - dataset_reader.manual_distributed_sharding = False\\n\",\n            \"2020-12-20 09:09:23,708 - INFO - allennlp.common.params - dataset_reader.manual_multi_process_sharding = False\\n\",\n            \"2020-12-20 09:09:23,708 - INFO - allennlp.common.params - dataset_reader.token_indexers = None\\n\",\n            \"2020-12-20 09:09:23,708 - INFO - allennlp.common.params - dataset_reader.domain_identifier = None\\n\",\n            \"2020-12-20 09:09:23,709 - INFO - allennlp.common.params - dataset_reader.bert_model_name = bert-base-uncased\\n\",\n            \"2020-12-20 09:09:23,994 - INFO - transformers.tokenization_utils - loading file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt from cache at /root/.cache/torch/transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084\\n\",\n            \"input 0:  {\\\"sentence\\\": \\\"The bright sun, the 
blue sky, the warm sand, the palm trees, everything rounds off.\\\"}\\n\",\n            \"prediction:  {\\\"verbs\\\": [{\\\"verb\\\": \\\"rounds\\\", \\\"description\\\": \\\"[ARG1: The bright sun , the blue sky , the warm sand , the palm trees] , [R-ARG1: everything] [V: rounds] off .\\\", \\\"tags\\\": [\\\"B-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"I-ARG1\\\", \\\"O\\\", \\\"B-R-ARG1\\\", \\\"B-V\\\", \\\"O\\\", \\\"O\\\"]}], \\\"words\\\": [\\\"The\\\", \\\"bright\\\", \\\"sun\\\", \\\",\\\", \\\"the\\\", \\\"blue\\\", \\\"sky\\\", \\\",\\\", \\\"the\\\", \\\"warm\\\", \\\"sand\\\", \\\",\\\", \\\"the\\\", \\\"palm\\\", \\\"trees\\\", \\\",\\\", \\\"everything\\\", \\\"rounds\\\", \\\"off\\\", \\\".\\\"]}\\n\",\n            \"\\n\",\n            \"2020-12-20 09:09:24,932 - INFO - allennlp.models.archival - removing temporary unarchived model dir at /tmp/tmpuj2lb1i1\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"cBrxUvrL3Sp4\"\n      },\n      \"source\": [\n        \"Sample 6: Ice pucks\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"rp77Vazw3QY8\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"c8e32eb0-1162-4acc-976e-a25ee428dbda\"\n      },\n      \"source\": [\n        \"!echo '{\\\"sentence\\\": \\\"Now, ice pucks guys!\\\"}' | \\\\\\n\",\n        \"allennlp predict https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz -\"\n      ],\n      \"execution_count\": 8,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"2020-12-20 09:09:27,286 - INFO - 
transformers.file_utils - PyTorch version 1.5.1 available.\\n\",\n            \"2020-12-20 09:09:27.438284: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library libcudart.so.10.1\\n\",\n            \"2020-12-20 09:09:29,226 - INFO - transformers.file_utils - TensorFlow version 2.4.0 available.\\n\",\n            \"2020-12-20 09:09:30,428 - INFO - allennlp.common.file_utils - checking cache for https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz at /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\\n\",\n            \"2020-12-20 09:09:30,428 - INFO - allennlp.common.file_utils - waiting to acquire lock on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\\n\",\n            \"2020-12-20 09:09:30,429 - INFO - filelock - Lock 139618002246904 acquired on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724.lock\\n\",\n            \"2020-12-20 09:09:30,429 - INFO - allennlp.common.file_utils - cache of https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz is up-to-date\\n\",\n            \"2020-12-20 09:09:30,429 - INFO - filelock - Lock 139618002246904 released on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724.lock\\n\",\n            \"2020-12-20 09:09:30,429 - INFO - allennlp.models.archival - loading archive file https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz from cache at 
/root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\\n\",\n            \"2020-12-20 09:09:30,430 - INFO - allennlp.models.archival - extracting archive file /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724 to temp dir /tmp/tmpewur_o27\\n\",\n            \"2020-12-20 09:09:34,712 - INFO - allennlp.common.params - type = from_instances\\n\",\n            \"2020-12-20 09:09:34,712 - INFO - allennlp.data.vocabulary - Loading token dictionary from /tmp/tmpewur_o27/vocabulary.\\n\",\n            \"2020-12-20 09:09:34,713 - INFO - filelock - Lock 139618002367880 acquired on /tmp/tmpewur_o27/vocabulary/.lock\\n\",\n            \"2020-12-20 09:09:34,741 - INFO - filelock - Lock 139618002367880 released on /tmp/tmpewur_o27/vocabulary/.lock\\n\",\n            \"2020-12-20 09:09:34,742 - INFO - allennlp.common.params - model.type = srl_bert\\n\",\n            \"2020-12-20 09:09:34,742 - INFO - allennlp.common.params - model.regularizer = None\\n\",\n            \"2020-12-20 09:09:34,742 - INFO - allennlp.common.params - model.bert_model = bert-base-uncased\\n\",\n            \"2020-12-20 09:09:34,742 - INFO - allennlp.common.params - model.embedding_dropout = 0.1\\n\",\n            \"2020-12-20 09:09:34,743 - INFO - allennlp.common.params - model.initializer = <allennlp.nn.initializers.InitializerApplicator object at 0x7efb596c07b8>\\n\",\n            \"2020-12-20 09:09:34,743 - INFO - allennlp.common.params - model.label_smoothing = None\\n\",\n            \"2020-12-20 09:09:34,743 - INFO - allennlp.common.params - model.ignore_span_metric = False\\n\",\n            \"2020-12-20 09:09:34,743 - INFO - allennlp.common.params - model.srl_eval_path = /usr/local/lib/python3.6/dist-packages/allennlp_models/structured_prediction/tools/srl-eval.pl\\n\",\n            \"2020-12-20 
09:09:35,046 - INFO - transformers.configuration_utils - loading configuration file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-config.json from cache at /root/.cache/torch/transformers/4dad0251492946e18ac39290fcfe91b89d370fee250efe9521476438fe8ca185.7156163d5fdc189c3016baca0775ffce230789d7fa2a42ef516483e4ca884517\\n\",\n            \"2020-12-20 09:09:35,047 - INFO - transformers.configuration_utils - Model config BertConfig {\\n\",\n            \"  \\\"architectures\\\": [\\n\",\n            \"    \\\"BertForMaskedLM\\\"\\n\",\n            \"  ],\\n\",\n            \"  \\\"attention_probs_dropout_prob\\\": 0.1,\\n\",\n            \"  \\\"hidden_act\\\": \\\"gelu\\\",\\n\",\n            \"  \\\"hidden_dropout_prob\\\": 0.1,\\n\",\n            \"  \\\"hidden_size\\\": 768,\\n\",\n            \"  \\\"initializer_range\\\": 0.02,\\n\",\n            \"  \\\"intermediate_size\\\": 3072,\\n\",\n            \"  \\\"layer_norm_eps\\\": 1e-12,\\n\",\n            \"  \\\"max_position_embeddings\\\": 512,\\n\",\n            \"  \\\"model_type\\\": \\\"bert\\\",\\n\",\n            \"  \\\"num_attention_heads\\\": 12,\\n\",\n            \"  \\\"num_hidden_layers\\\": 12,\\n\",\n            \"  \\\"pad_token_id\\\": 0,\\n\",\n            \"  \\\"type_vocab_size\\\": 2,\\n\",\n            \"  \\\"vocab_size\\\": 30522\\n\",\n            \"}\\n\",\n            \"\\n\",\n            \"2020-12-20 09:09:35,254 - INFO - transformers.modeling_utils - loading weights file https://cdn.huggingface.co/bert-base-uncased-pytorch_model.bin from cache at /root/.cache/torch/transformers/f2ee78bdd635b758cc0a12352586868bef80e47401abe4c4fcc3832421e7338b.36ca03ab34a1a5d5fa7bc3d03d55c4fa650fed07220e2eeebc06ce58d0e9a157\\n\",\n            \"2020-12-20 09:09:37,949 - INFO - allennlp.nn.initializers - Initializing parameters\\n\",\n            \"2020-12-20 09:09:37,949 - INFO - allennlp.nn.initializers - Done initializing parameters; the following parameters are using their 
default initialization from their code\\n\",\n            \"... [parameter-initialization INFO lines truncated] ...\\n\",\n            \"2020-12-20 
09:09:37,955 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.4.attention.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:09:37,955 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.4.attention.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:09:37,955 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.4.attention.output.dense.bias\\n\",\n            \"2020-12-20 09:09:37,955 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.4.attention.output.dense.weight\\n\",\n            \"2020-12-20 09:09:37,955 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.4.attention.self.key.bias\\n\",\n            \"2020-12-20 09:09:37,955 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.4.attention.self.key.weight\\n\",\n            \"2020-12-20 09:09:37,955 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.4.attention.self.query.bias\\n\",\n            \"2020-12-20 09:09:37,955 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.4.attention.self.query.weight\\n\",\n            \"2020-12-20 09:09:37,955 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.4.attention.self.value.bias\\n\",\n            \"2020-12-20 09:09:37,955 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.4.attention.self.value.weight\\n\",\n            \"2020-12-20 09:09:37,955 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.4.intermediate.dense.bias\\n\",\n            \"2020-12-20 09:09:37,956 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.4.intermediate.dense.weight\\n\",\n            \"2020-12-20 09:09:37,956 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.4.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:09:37,956 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.4.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:09:37,956 - INFO - allennlp.nn.initializers -    
bert_model.encoder.layer.4.output.dense.bias\\n\",\n            \"2020-12-20 09:09:37,956 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.4.output.dense.weight\\n\",\n            \"2020-12-20 09:09:37,956 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.5.attention.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:09:37,956 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.5.attention.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:09:37,956 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.5.attention.output.dense.bias\\n\",\n            \"2020-12-20 09:09:37,956 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.5.attention.output.dense.weight\\n\",\n            \"2020-12-20 09:09:37,956 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.5.attention.self.key.bias\\n\",\n            \"2020-12-20 09:09:37,956 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.5.attention.self.key.weight\\n\",\n            \"2020-12-20 09:09:37,956 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.5.attention.self.query.bias\\n\",\n            \"2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.5.attention.self.query.weight\\n\",\n            \"2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.5.attention.self.value.bias\\n\",\n            \"2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.5.attention.self.value.weight\\n\",\n            \"2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.5.intermediate.dense.bias\\n\",\n            \"2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.5.intermediate.dense.weight\\n\",\n            \"2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.5.output.LayerNorm.bias\\n\",\n            
\"2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.5.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.5.output.dense.bias\\n\",\n            \"2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.5.output.dense.weight\\n\",\n            \"2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.attention.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.attention.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.attention.output.dense.bias\\n\",\n            \"2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.attention.output.dense.weight\\n\",\n            \"2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.attention.self.key.bias\\n\",\n            \"2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.attention.self.key.weight\\n\",\n            \"2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.attention.self.query.bias\\n\",\n            \"2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.attention.self.query.weight\\n\",\n            \"2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.attention.self.value.bias\\n\",\n            \"2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.attention.self.value.weight\\n\",\n            \"2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.intermediate.dense.bias\\n\",\n            \"2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers -    
bert_model.encoder.layer.6.intermediate.dense.weight\\n\",\n            \"2020-12-20 09:09:37,958 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:09:37,958 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:09:37,958 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.output.dense.bias\\n\",\n            \"2020-12-20 09:09:37,958 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.6.output.dense.weight\\n\",\n            \"2020-12-20 09:09:37,958 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.attention.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:09:37,958 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.attention.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:09:37,958 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.attention.output.dense.bias\\n\",\n            \"2020-12-20 09:09:37,958 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.attention.output.dense.weight\\n\",\n            \"2020-12-20 09:09:37,958 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.attention.self.key.bias\\n\",\n            \"2020-12-20 09:09:37,958 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.attention.self.key.weight\\n\",\n            \"2020-12-20 09:09:37,958 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.attention.self.query.bias\\n\",\n            \"2020-12-20 09:09:37,958 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.attention.self.query.weight\\n\",\n            \"2020-12-20 09:09:37,958 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.attention.self.value.bias\\n\",\n            \"2020-12-20 09:09:37,968 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.attention.self.value.weight\\n\",\n            
\"2020-12-20 09:09:37,968 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.intermediate.dense.bias\\n\",\n            \"2020-12-20 09:09:37,968 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.intermediate.dense.weight\\n\",\n            \"2020-12-20 09:09:37,968 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:09:37,968 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:09:37,968 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.output.dense.bias\\n\",\n            \"2020-12-20 09:09:37,969 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.7.output.dense.weight\\n\",\n            \"2020-12-20 09:09:37,969 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.attention.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:09:37,969 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.attention.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:09:37,969 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.attention.output.dense.bias\\n\",\n            \"2020-12-20 09:09:37,969 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.attention.output.dense.weight\\n\",\n            \"2020-12-20 09:09:37,969 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.attention.self.key.bias\\n\",\n            \"2020-12-20 09:09:37,969 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.attention.self.key.weight\\n\",\n            \"2020-12-20 09:09:37,969 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.attention.self.query.bias\\n\",\n            \"2020-12-20 09:09:37,969 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.attention.self.query.weight\\n\",\n            \"2020-12-20 09:09:37,969 - INFO - allennlp.nn.initializers -    
bert_model.encoder.layer.8.attention.self.value.bias\\n\",\n            \"2020-12-20 09:09:37,969 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.attention.self.value.weight\\n\",\n            \"2020-12-20 09:09:37,969 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.intermediate.dense.bias\\n\",\n            \"2020-12-20 09:09:37,969 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.intermediate.dense.weight\\n\",\n            \"2020-12-20 09:09:37,969 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:09:37,970 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:09:37,970 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.output.dense.bias\\n\",\n            \"2020-12-20 09:09:37,970 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.8.output.dense.weight\\n\",\n            \"2020-12-20 09:09:37,970 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.attention.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:09:37,970 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.attention.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:09:37,970 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.attention.output.dense.bias\\n\",\n            \"2020-12-20 09:09:37,970 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.attention.output.dense.weight\\n\",\n            \"2020-12-20 09:09:37,970 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.attention.self.key.bias\\n\",\n            \"2020-12-20 09:09:37,970 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.attention.self.key.weight\\n\",\n            \"2020-12-20 09:09:37,970 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.attention.self.query.bias\\n\",\n            
\"2020-12-20 09:09:37,970 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.attention.self.query.weight\\n\",\n            \"2020-12-20 09:09:37,970 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.attention.self.value.bias\\n\",\n            \"2020-12-20 09:09:37,970 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.attention.self.value.weight\\n\",\n            \"2020-12-20 09:09:37,970 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.intermediate.dense.bias\\n\",\n            \"2020-12-20 09:09:37,970 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.intermediate.dense.weight\\n\",\n            \"2020-12-20 09:09:37,971 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.output.LayerNorm.bias\\n\",\n            \"2020-12-20 09:09:37,971 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.output.LayerNorm.weight\\n\",\n            \"2020-12-20 09:09:37,971 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.output.dense.bias\\n\",\n            \"2020-12-20 09:09:37,971 - INFO - allennlp.nn.initializers -    bert_model.encoder.layer.9.output.dense.weight\\n\",\n            \"2020-12-20 09:09:37,971 - INFO - allennlp.nn.initializers -    bert_model.pooler.dense.bias\\n\",\n            \"2020-12-20 09:09:37,971 - INFO - allennlp.nn.initializers -    bert_model.pooler.dense.weight\\n\",\n            \"2020-12-20 09:09:37,971 - INFO - allennlp.nn.initializers -    tag_projection_layer.bias\\n\",\n            \"2020-12-20 09:09:37,971 - INFO - allennlp.nn.initializers -    tag_projection_layer.weight\\n\",\n            \"2020-12-20 09:09:38,432 - INFO - allennlp.common.params - dataset_reader.type = srl\\n\",\n            \"2020-12-20 09:09:38,433 - INFO - allennlp.common.params - dataset_reader.lazy = False\\n\",\n            \"2020-12-20 09:09:38,433 - INFO - allennlp.common.params - dataset_reader.cache_directory = None\\n\",\n            
\"2020-12-20 09:09:38,433 - INFO - allennlp.common.params - dataset_reader.max_instances = None\\n\",\n            \"2020-12-20 09:09:38,433 - INFO - allennlp.common.params - dataset_reader.manual_distributed_sharding = False\\n\",\n            \"2020-12-20 09:09:38,433 - INFO - allennlp.common.params - dataset_reader.manual_multi_process_sharding = False\\n\",\n            \"2020-12-20 09:09:38,433 - INFO - allennlp.common.params - dataset_reader.token_indexers = None\\n\",\n            \"2020-12-20 09:09:38,433 - INFO - allennlp.common.params - dataset_reader.domain_identifier = None\\n\",\n            \"2020-12-20 09:09:38,433 - INFO - allennlp.common.params - dataset_reader.bert_model_name = bert-base-uncased\\n\",\n            \"2020-12-20 09:09:38,744 - INFO - transformers.tokenization_utils - loading file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt from cache at /root/.cache/torch/transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084\\n\",\n            \"input 0:  {\\\"sentence\\\": \\\"Now, ice pucks guys!\\\"}\\n\",\n            \"prediction:  {\\\"verbs\\\": [], \\\"words\\\": [\\\"Now\\\", \\\",\\\", \\\"ice\\\", \\\"pucks\\\", \\\"guys\\\", \\\"!\\\"]}\\n\",\n            \"\\n\",\n            \"2020-12-20 09:09:39,466 - INFO - allennlp.models.archival - removing temporary unarchived model dir at /tmp/tmpewur_o27\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    }\n  ]\n}"
  },
  {
    "path": "Chapter10/Haystack_QA_Pipeline.ipynb",
    "content": "{\n  \"nbformat\": 4,\n  \"nbformat_minor\": 0,\n  \"metadata\": {\n    \"kernelspec\": {\n      \"display_name\": \"Python 3\",\n      \"language\": \"python\",\n      \"name\": \"python3\"\n    },\n    \"language_info\": {\n      \"codemirror_mode\": {\n        \"name\": \"ipython\",\n        \"version\": 3\n      },\n      \"file_extension\": \".py\",\n      \"mimetype\": \"text/x-python\",\n      \"name\": \"python\",\n      \"nbconvert_exporter\": \"python\",\n      \"pygments_lexer\": \"ipython3\",\n      \"version\": \"3.7.6\"\n    },\n    \"colab\": {\n      \"name\": \"Haystack_QA_Pipeline.ipynb\",\n      \"provenance\": [],\n      \"collapsed_sections\": []\n    },\n    \"accelerator\": \"GPU\",\n    \"widgets\": {\n      \"application/vnd.jupyter.widget-state+json\": {\n        \"8e2aa2531c9a4890ad722171e4a51122\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HBoxModel\",\n          \"state\": {\n            \"_view_name\": \"HBoxView\",\n            \"_dom_classes\": [],\n            \"_model_name\": \"HBoxModel\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"box_style\": \"\",\n            \"layout\": \"IPY_MODEL_743ca8b7bb574fd99e3e0516ab60b7cc\",\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"children\": [\n              \"IPY_MODEL_4b361abfd25e4dfc86b96384753f6cdd\",\n              \"IPY_MODEL_f0da7f5b445a443ca0692396bdf54062\"\n            ]\n          }\n        },\n        \"743ca8b7bb574fd99e3e0516ab60b7cc\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": 
null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"4b361abfd25e4dfc86b96384753f6cdd\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"FloatProgressModel\",\n          \"state\": {\n            \"_view_name\": \"ProgressView\",\n            \"style\": \"IPY_MODEL_f480c548d8774a8abc61bd8b045fee29\",\n            \"_dom_classes\": [],\n            \"description\": \"Downloading: 100%\",\n            \"_model_name\": \"FloatProgressModel\",\n           
 \"bar_style\": \"success\",\n            \"max\": 571,\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": 571,\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"orientation\": \"horizontal\",\n            \"min\": 0,\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_8459fc383ca741a29ea1d53190b71457\"\n          }\n        },\n        \"f0da7f5b445a443ca0692396bdf54062\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HTMLModel\",\n          \"state\": {\n            \"_view_name\": \"HTMLView\",\n            \"style\": \"IPY_MODEL_d1b1591de71a4bd8a2909975ee82d998\",\n            \"_dom_classes\": [],\n            \"description\": \"\",\n            \"_model_name\": \"HTMLModel\",\n            \"placeholder\": \"​\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": \" 571/571 [00:16&lt;00:00, 34.8B/s]\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_afebd0dd7f3e4e44927220ad3fc13f17\"\n          }\n        },\n        \"f480c548d8774a8abc61bd8b045fee29\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"ProgressStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"ProgressStyleModel\",\n            \"description_width\": \"initial\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n    
        \"bar_color\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"8459fc383ca741a29ea1d53190b71457\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        
\"d1b1591de71a4bd8a2909975ee82d998\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"DescriptionStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"DescriptionStyleModel\",\n            \"description_width\": \"\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"afebd0dd7f3e4e44927220ad3fc13f17\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            
\"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"56c4b0d2d2654b1e9470d8f0b920ae16\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HBoxModel\",\n          \"state\": {\n            \"_view_name\": \"HBoxView\",\n            \"_dom_classes\": [],\n            \"_model_name\": \"HBoxModel\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"box_style\": \"\",\n            \"layout\": \"IPY_MODEL_d0cdfe65d369405a90a7abcf38c289c8\",\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"children\": [\n              \"IPY_MODEL_d66b73d1014848db81edf429c857f9ef\",\n              \"IPY_MODEL_dce3f0f8e8e24c73913617300da7e370\"\n            ]\n          }\n        },\n        \"d0cdfe65d369405a90a7abcf38c289c8\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            
\"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"d66b73d1014848db81edf429c857f9ef\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"FloatProgressModel\",\n          \"state\": {\n            \"_view_name\": \"ProgressView\",\n            \"style\": \"IPY_MODEL_2c095740976c4bdea8645e21d277283c\",\n            \"_dom_classes\": [],\n            \"description\": \"Downloading: 100%\",\n            \"_model_name\": \"FloatProgressModel\",\n            \"bar_style\": \"success\",\n            \"max\": 496313727,\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": 496313727,\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"orientation\": 
\"horizontal\",\n            \"min\": 0,\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_fbc7634200954b37b7c869050591022d\"\n          }\n        },\n        \"dce3f0f8e8e24c73913617300da7e370\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HTMLModel\",\n          \"state\": {\n            \"_view_name\": \"HTMLView\",\n            \"style\": \"IPY_MODEL_4a622f003ef24eb79fd13a08f423c665\",\n            \"_dom_classes\": [],\n            \"description\": \"\",\n            \"_model_name\": \"HTMLModel\",\n            \"placeholder\": \"​\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": \" 496M/496M [00:13&lt;00:00, 35.8MB/s]\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_db417b1034bc405fa3ac881361f943c4\"\n          }\n        },\n        \"2c095740976c4bdea8645e21d277283c\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"ProgressStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"ProgressStyleModel\",\n            \"description_width\": \"initial\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"bar_color\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"fbc7634200954b37b7c869050591022d\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            
\"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"4a622f003ef24eb79fd13a08f423c665\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"DescriptionStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"DescriptionStyleModel\",\n            
\"description_width\": \"\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"db417b1034bc405fa3ac881361f943c4\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            
\"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"ede7dfd59ae8455689373afda2771132\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HBoxModel\",\n          \"state\": {\n            \"_view_name\": \"HBoxView\",\n            \"_dom_classes\": [],\n            \"_model_name\": \"HBoxModel\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"box_style\": \"\",\n            \"layout\": \"IPY_MODEL_44ff19ca5bed4708aa3bf39032563b2e\",\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"children\": [\n              \"IPY_MODEL_9be9ce8699c14d47853d40e9cd8bf7d0\",\n              \"IPY_MODEL_9f3d806fbec84b179e9a49e49c905fa3\"\n            ]\n          }\n        },\n        \"44ff19ca5bed4708aa3bf39032563b2e\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            
\"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"9be9ce8699c14d47853d40e9cd8bf7d0\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"FloatProgressModel\",\n          \"state\": {\n            \"_view_name\": \"ProgressView\",\n            \"style\": \"IPY_MODEL_f532cd4dccc14741a9d4e506a72507ad\",\n            \"_dom_classes\": [],\n            \"description\": \"Downloading: 100%\",\n            \"_model_name\": \"FloatProgressModel\",\n            \"bar_style\": \"success\",\n            \"max\": 898822,\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": 898822,\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"orientation\": \"horizontal\",\n            \"min\": 0,\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_d810a6e15c1d43e0a6793c60622da504\"\n          }\n        },\n        \"9f3d806fbec84b179e9a49e49c905fa3\": {\n          
\"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HTMLModel\",\n          \"state\": {\n            \"_view_name\": \"HTMLView\",\n            \"style\": \"IPY_MODEL_4f711e53e0944446aa51e8a241017e8c\",\n            \"_dom_classes\": [],\n            \"description\": \"\",\n            \"_model_name\": \"HTMLModel\",\n            \"placeholder\": \"​\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": \" 899k/899k [00:00&lt;00:00, 933kB/s]\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_19b30696c5294a6b92cecb3533b10fed\"\n          }\n        },\n        \"f532cd4dccc14741a9d4e506a72507ad\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"ProgressStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"ProgressStyleModel\",\n            \"description_width\": \"initial\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"bar_color\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"d810a6e15c1d43e0a6793c60622da504\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            
\"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"4f711e53e0944446aa51e8a241017e8c\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"DescriptionStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"DescriptionStyleModel\",\n            \"description_width\": \"\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n     
   \"19b30696c5294a6b92cecb3533b10fed\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"7fb2356ed11344af950f52ebad7c57e1\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": 
\"HBoxModel\",\n          \"state\": {\n            \"_view_name\": \"HBoxView\",\n            \"_dom_classes\": [],\n            \"_model_name\": \"HBoxModel\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"box_style\": \"\",\n            \"layout\": \"IPY_MODEL_74e56a811e4646839645cc8e1cd2945e\",\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"children\": [\n              \"IPY_MODEL_9207a922be2c48b187d97e1575b3ac39\",\n              \"IPY_MODEL_7726ea4f836642ad8ab3e19a4f087919\"\n            ]\n          }\n        },\n        \"74e56a811e4646839645cc8e1cd2945e\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n  
          \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"9207a922be2c48b187d97e1575b3ac39\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"FloatProgressModel\",\n          \"state\": {\n            \"_view_name\": \"ProgressView\",\n            \"style\": \"IPY_MODEL_c11075b708e84ffabe5ce867f710dbb3\",\n            \"_dom_classes\": [],\n            \"description\": \"Downloading: 100%\",\n            \"_model_name\": \"FloatProgressModel\",\n            \"bar_style\": \"success\",\n            \"max\": 456318,\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": 456318,\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"orientation\": \"horizontal\",\n            \"min\": 0,\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_3596178648404646a454b7fb5aeb0742\"\n          }\n        },\n        \"7726ea4f836642ad8ab3e19a4f087919\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HTMLModel\",\n          \"state\": {\n            \"_view_name\": \"HTMLView\",\n            \"style\": \"IPY_MODEL_2b348979747342869ab83a3bb7d9f71a\",\n            \"_dom_classes\": [],\n            \"description\": \"\",\n            
\"_model_name\": \"HTMLModel\",\n            \"placeholder\": \"​\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": \" 456k/456k [00:02&lt;00:00, 215kB/s]\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_9b7811f75b0e42e4b7a4fa14e089668c\"\n          }\n        },\n        \"c11075b708e84ffabe5ce867f710dbb3\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"ProgressStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"ProgressStyleModel\",\n            \"description_width\": \"initial\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"bar_color\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"3596178648404646a454b7fb5aeb0742\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            
\"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"2b348979747342869ab83a3bb7d9f71a\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"DescriptionStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"DescriptionStyleModel\",\n            \"description_width\": \"\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"9b7811f75b0e42e4b7a4fa14e089668c\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": 
null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"9ffa302c8b604d56a4d9826fb783f786\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HBoxModel\",\n          \"state\": {\n            \"_view_name\": \"HBoxView\",\n            \"_dom_classes\": [],\n            \"_model_name\": \"HBoxModel\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n         
   \"_view_module_version\": \"1.5.0\",\n            \"box_style\": \"\",\n            \"layout\": \"IPY_MODEL_a37b422df0054a89a9b59d4233461b1b\",\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"children\": [\n              \"IPY_MODEL_3f9fa7a617c74f4fa4eda55e6d2c3f3d\",\n              \"IPY_MODEL_18e00a4457534250970b69f8146282e2\"\n            ]\n          }\n        },\n        \"a37b422df0054a89a9b59d4233461b1b\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n    
        \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"3f9fa7a617c74f4fa4eda55e6d2c3f3d\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"FloatProgressModel\",\n          \"state\": {\n            \"_view_name\": \"ProgressView\",\n            \"style\": \"IPY_MODEL_a23f46136c0e4373a9733cc7dba0c95e\",\n            \"_dom_classes\": [],\n            \"description\": \"Downloading: 100%\",\n            \"_model_name\": \"FloatProgressModel\",\n            \"bar_style\": \"success\",\n            \"max\": 772,\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": 772,\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"orientation\": \"horizontal\",\n            \"min\": 0,\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_26b1e8ea52e041ce8a1847e4d9483434\"\n          }\n        },\n        \"18e00a4457534250970b69f8146282e2\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HTMLModel\",\n          \"state\": {\n            \"_view_name\": \"HTMLView\",\n            \"style\": \"IPY_MODEL_db5bdb301f2b48c5987c8fcb5236cf6c\",\n            \"_dom_classes\": [],\n            \"description\": \"\",\n            \"_model_name\": \"HTMLModel\",\n            \"placeholder\": \"​\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": \" 772/772 [00:00&lt;00:00, 774B/s]\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n      
      \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_6064912c6a0e41d4b0a8389f3e56b1e9\"\n          }\n        },\n        \"a23f46136c0e4373a9733cc7dba0c95e\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"ProgressStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"ProgressStyleModel\",\n            \"description_width\": \"initial\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"bar_color\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"26b1e8ea52e041ce8a1847e4d9483434\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            
\"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"db5bdb301f2b48c5987c8fcb5236cf6c\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"DescriptionStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"DescriptionStyleModel\",\n            \"description_width\": \"\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"6064912c6a0e41d4b0a8389f3e56b1e9\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            
\"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"d5287e28a98749b5b2cd1560f157ff36\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HBoxModel\",\n          \"state\": {\n            \"_view_name\": \"HBoxView\",\n            \"_dom_classes\": [],\n            \"_model_name\": \"HBoxModel\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"box_style\": \"\",\n            \"layout\": \"IPY_MODEL_17ffd297995d442d86a05273b39ba7c0\",\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"children\": [\n              \"IPY_MODEL_38d8b60615fa4e649634802153ad09cd\",\n              
\"IPY_MODEL_c86d5ff0dd8b40a5a4ff92d16ffcb21a\"\n            ]\n          }\n        },\n        \"17ffd297995d442d86a05273b39ba7c0\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        
\"38d8b60615fa4e649634802153ad09cd\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"FloatProgressModel\",\n          \"state\": {\n            \"_view_name\": \"ProgressView\",\n            \"style\": \"IPY_MODEL_5de028a8c6494e5d9e2c0e1134080369\",\n            \"_dom_classes\": [],\n            \"description\": \"Downloading: 100%\",\n            \"_model_name\": \"FloatProgressModel\",\n            \"bar_style\": \"success\",\n            \"max\": 79,\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": 79,\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"orientation\": \"horizontal\",\n            \"min\": 0,\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_bb183a36134f4463a5453a043de77d93\"\n          }\n        },\n        \"c86d5ff0dd8b40a5a4ff92d16ffcb21a\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HTMLModel\",\n          \"state\": {\n            \"_view_name\": \"HTMLView\",\n            \"style\": \"IPY_MODEL_fbedf7b06f9846348e174fc2536b765f\",\n            \"_dom_classes\": [],\n            \"description\": \"\",\n            \"_model_name\": \"HTMLModel\",\n            \"placeholder\": \"​\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": \" 79.0/79.0 [00:00&lt;00:00, 216B/s]\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_cac99d9ef6f74c07937fc39a20157937\"\n          }\n        },\n        \"5de028a8c6494e5d9e2c0e1134080369\": {\n          \"model_module\": 
\"@jupyter-widgets/controls\",\n          \"model_name\": \"ProgressStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"ProgressStyleModel\",\n            \"description_width\": \"initial\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"bar_color\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"bb183a36134f4463a5453a043de77d93\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            
\"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"fbedf7b06f9846348e174fc2536b765f\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"DescriptionStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"DescriptionStyleModel\",\n            \"description_width\": \"\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"cac99d9ef6f74c07937fc39a20157937\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": 
null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        }\n      }\n    }\n  },\n  \"cells\": [\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"NyzjawOojIw-\"\n      },\n      \"source\": [\n        \"#Haystack Question-Answering Framework\\r\\n\",\n        \"\\r\\n\",\n        \"Notebook Author: [Malte Pietsch](https://www.linkedin.com/in/maltepietsch/)\\r\\n\",\n        \"\\r\\n\",\n        \"[Deepset AI Haystack GitHub Repository](https://github.com/deepset-ai/haystack/)\\r\\n\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"9E7CI3wONcSo\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"56fe1747-3bb5-46e0-b8f1-c55641473dcd\"\n      },\n      \"source\": [\n        \"# Install Haystack\\n\",\n        \"!pip install farm-haystack==0.6.0\\n\",\n        \"\\n\",\n        \"# Install specific versions of urllib and torch to avoid conflicts with preinstalled versions on Colab\\n\",\n        \"!pip install urllib3==1.25.4\\n\",\n        \"!pip install 
torch==1.6.0+cu101 -f https://download.pytorch.org/whl/torch_stable.html\\n\"\n      ],\n      \"execution_count\": 1,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Collecting farm-haystack==0.6.0\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/6d/c1/004081bfe50c20433718812321044b9d9dc7cf73bc5a63a2b335227bd21c/farm_haystack-0.6.0-py3-none-any.whl (104kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 112kB 8.1MB/s \\n\",\n            \"\\u001b[?25hCollecting uvloop; sys_platform != \\\"win32\\\" and sys_platform != \\\"cygwin\\\"\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/41/48/586225bbb02d3bdca475b17e4be5ce5b3f09da2d6979f359916c1592a687/uvloop-0.14.0-cp36-cp36m-manylinux2010_x86_64.whl (3.9MB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 3.9MB 13.7MB/s \\n\",\n            \"\\u001b[?25hRequirement already satisfied: coverage in /usr/local/lib/python3.6/dist-packages (from farm-haystack==0.6.0) (3.7.1)\\n\",\n            \"Requirement already satisfied: pandas in /usr/local/lib/python3.6/dist-packages (from farm-haystack==0.6.0) (1.1.5)\\n\",\n            \"Requirement already satisfied: nltk in /usr/local/lib/python3.6/dist-packages (from farm-haystack==0.6.0) (3.2.5)\\n\",\n            \"Collecting elasticsearch<=7.10,>=7.7\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/14/ba/f950bdd9164fb2bbbe5093700162234fbe61f446fe2300a8993761c132ca/elasticsearch-7.10.0-py2.py3-none-any.whl (321kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 327kB 49.8MB/s \\n\",\n            \"\\u001b[?25hCollecting farm==0.5.0\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/a3/e4/2f47c850732a1d729e74add867e967f058370f29a313da05dc871ff8465e/farm-0.5.0-py3-none-any.whl (207kB)\\n\",\n            
\"\\u001b[K     |████████████████████████████████| 215kB 56.3MB/s \\n\",\n            \"\\u001b[?25hCollecting fastapi\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/9f/33/1b643f650688ad368983bbaf3b0658438038ea84d775dd37393d826c3833/fastapi-0.63.0-py3-none-any.whl (50kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 51kB 8.0MB/s \\n\",\n            \"\\u001b[?25hCollecting python-multipart\\n\",\n            \"  Downloading https://files.pythonhosted.org/packages/46/40/a933ac570bf7aad12a298fc53458115cc74053474a72fbb8201d7dc06d3d/python-multipart-0.0.5.tar.gz\\n\",\n            \"Collecting langdetect\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/56/a3/8407c1e62d5980188b4acc45ef3d94b933d14a2ebc9ef3505f22cf772570/langdetect-1.0.8.tar.gz (981kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 983kB 53.3MB/s \\n\",\n            \"\\u001b[?25hCollecting psycopg2-binary; sys_platform != \\\"win32\\\" and sys_platform != \\\"cygwin\\\"\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/f2/1b/720b36697158113ca1b2221a8e96a470088ccf3770d182214689d1a96a07/psycopg2_binary-2.8.6-cp36-cp36m-manylinux1_x86_64.whl (3.0MB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 3.0MB 53.4MB/s \\n\",\n            \"\\u001b[?25hRequirement already satisfied: networkx in /usr/local/lib/python3.6/dist-packages (from farm-haystack==0.6.0) (2.5)\\n\",\n            \"Collecting faiss-cpu==1.6.3; sys_platform != \\\"win32\\\" and sys_platform != \\\"cygwin\\\"\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/1d/84/9de38703486d9f00b1a63590887a318d08c52f10f768968bd7626aee75da/faiss_cpu-1.6.3-cp36-cp36m-manylinux2010_x86_64.whl (7.2MB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 7.2MB 28.6MB/s \\n\",\n            \"\\u001b[?25hCollecting tika\\n\",\n       
     \"  Downloading https://files.pythonhosted.org/packages/96/07/244fbb9c74c0de8a3745cc9f3f496077a29f6418c7cbd90d68fd799574cb/tika-1.24.tar.gz\\n\",\n            \"Requirement already satisfied: sklearn in /usr/local/lib/python3.6/dist-packages (from farm-haystack==0.6.0) (0.0)\\n\",\n            \"Collecting uvicorn\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/2e/02/1e2520f6999e793d5bc5c15d8057b2e829d16a148e41199e0ae519653fa0/uvicorn-0.13.3-py3-none-any.whl (45kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 51kB 9.6MB/s \\n\",\n            \"\\u001b[?25hCollecting gunicorn\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/69/ca/926f7cd3a2014b16870086b2d0fdc84a9e49473c68a8dff8b57f7c156f43/gunicorn-20.0.4-py2.py3-none-any.whl (77kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 81kB 12.3MB/s \\n\",\n            \"\\u001b[?25hCollecting sqlalchemy-utils\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/14/68/e5301c4c960c79a32333b8805e52cb69d3d237aa869a773b4157ccb3eb26/SQLAlchemy-Utils-0.36.8.tar.gz (138kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 143kB 54.8MB/s \\n\",\n            \"\\u001b[?25hCollecting httptools\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/b1/a6/dc1e7e8f4049ab70d52c9690ec10652e268ab2542853033cc1d539594102/httptools-0.1.1-cp36-cp36m-manylinux1_x86_64.whl (216kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 225kB 48.2MB/s \\n\",\n            \"\\u001b[?25hRequirement already satisfied: more-itertools in /usr/local/lib/python3.6/dist-packages (from farm-haystack==0.6.0) (8.6.0)\\n\",\n            \"Collecting tox\\n\",\n            \"\\u001b[?25l  Downloading 
https://files.pythonhosted.org/packages/e0/79/5915b9dad867e89bb6495456acfe5d4e2287e74dfa29c059f7b127d5480e/tox-3.20.1-py2.py3-none-any.whl (83kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 92kB 13.6MB/s \\n\",\n            \"\\u001b[?25hCollecting python-docx\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/e4/83/c66a1934ed5ed8ab1dbb9931f1779079f8bca0f6bbc5793c06c4b5e7d671/python-docx-0.8.10.tar.gz (5.5MB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 5.5MB 41.0MB/s \\n\",\n            \"\\u001b[?25hCollecting elastic-apm\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/27/c4/7bc90b3398198ea87f4316b739055f0319a0871415e561aceb4682e30a73/elastic_apm-5.10.0-cp36-cp36m-manylinux2010_x86_64.whl (318kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 327kB 57.1MB/s \\n\",\n            \"\\u001b[?25hRequirement already satisfied: pytz>=2017.2 in /usr/local/lib/python3.6/dist-packages (from pandas->farm-haystack==0.6.0) (2018.9)\\n\",\n            \"Requirement already satisfied: python-dateutil>=2.7.3 in /usr/local/lib/python3.6/dist-packages (from pandas->farm-haystack==0.6.0) (2.8.1)\\n\",\n            \"Requirement already satisfied: numpy>=1.15.4 in /usr/local/lib/python3.6/dist-packages (from pandas->farm-haystack==0.6.0) (1.19.4)\\n\",\n            \"Requirement already satisfied: six in /usr/local/lib/python3.6/dist-packages (from nltk->farm-haystack==0.6.0) (1.15.0)\\n\",\n            \"Requirement already satisfied: certifi in /usr/local/lib/python3.6/dist-packages (from elasticsearch<=7.10,>=7.7->farm-haystack==0.6.0) (2020.12.5)\\n\",\n            \"Requirement already satisfied: urllib3<2,>=1.21.1 in /usr/local/lib/python3.6/dist-packages (from elasticsearch<=7.10,>=7.7->farm-haystack==0.6.0) (1.24.3)\\n\",\n            \"Collecting flask-cors\\n\",\n            \"  Downloading 
https://files.pythonhosted.org/packages/69/7f/d0aeaaafb5c3c76c8d2141dbe2d4f6dca5d6c31872d4e5349768c1958abc/Flask_Cors-3.0.9-py2.py3-none-any.whl\\n\",\n            \"Collecting dotmap==1.3.0\\n\",\n            \"  Downloading https://files.pythonhosted.org/packages/fa/eb/ee5f0358a9e0ede90308d8f34e697e122f191c2702dc4f614eca7770b1eb/dotmap-1.3.0-py3-none-any.whl\\n\",\n            \"Requirement already satisfied: setuptools in /usr/local/lib/python3.6/dist-packages (from farm==0.5.0->farm-haystack==0.6.0) (51.0.0)\\n\",\n            \"Collecting transformers==3.3.1\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/19/22/aff234f4a841f8999e68a7a94bdd4b60b4cebcfeca5d67d61cd08c9179de/transformers-3.3.1-py3-none-any.whl (1.1MB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 1.1MB 50.3MB/s \\n\",\n            \"\\u001b[?25hCollecting torch<1.7,>1.5\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/38/53/914885a93a44b96c0dd1c36f36ff10afe341f091230aad68f7228d61db1e/torch-1.6.0-cp36-cp36m-manylinux1_x86_64.whl (748.8MB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 748.8MB 24kB/s \\n\",\n            \"\\u001b[?25hCollecting boto3\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/1d/da/c6eaf4c1c8eec70fea402495ee34112824241bc96e20756c0c0c6f97feab/boto3-1.16.46-py2.py3-none-any.whl (130kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 133kB 59.5MB/s \\n\",\n            \"\\u001b[?25hRequirement already satisfied: scipy>=1.3.2 in /usr/local/lib/python3.6/dist-packages (from farm==0.5.0->farm-haystack==0.6.0) (1.4.1)\\n\",\n            \"Requirement already satisfied: dill in /usr/local/lib/python3.6/dist-packages (from farm==0.5.0->farm-haystack==0.6.0) (0.3.3)\\n\",\n            \"Requirement already satisfied: tqdm in /usr/local/lib/python3.6/dist-packages (from farm==0.5.0->farm-haystack==0.6.0) 
(4.41.1)\\n\",\n            \"Collecting seqeval==0.0.12\\n\",\n            \"  Downloading https://files.pythonhosted.org/packages/34/91/068aca8d60ce56dd9ba4506850e876aba5e66a6f2f29aa223224b50df0de/seqeval-0.0.12.tar.gz\\n\",\n            \"Requirement already satisfied: psutil in /usr/local/lib/python3.6/dist-packages (from farm==0.5.0->farm-haystack==0.6.0) (5.4.8)\\n\",\n            \"Collecting mlflow==1.0.0\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/01/ec/8c9448968d4662e8354b9c3a62e635f8929ed507a45af3d9fdb84be51270/mlflow-1.0.0-py3-none-any.whl (47.7MB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 47.7MB 143kB/s \\n\",\n            \"\\u001b[?25hCollecting flask-restplus\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/c2/a6/b17c848771f96ad039ad9e3ea275e842a16c39c4f3eb9f60ee330b20b6c2/flask_restplus-0.13.0-py2.py3-none-any.whl (2.5MB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 2.5MB 50.2MB/s \\n\",\n            \"\\u001b[?25hRequirement already satisfied: wheel in /usr/local/lib/python3.6/dist-packages (from farm==0.5.0->farm-haystack==0.6.0) (0.36.2)\\n\",\n            \"Collecting Werkzeug==0.16.1\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/c2/e4/a859d2fe516f466642fa5c6054fd9646271f9da26b0cac0d2f37fc858c8f/Werkzeug-0.16.1-py2.py3-none-any.whl (327kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 327kB 50.4MB/s \\n\",\n            \"\\u001b[?25hRequirement already satisfied: requests in /usr/local/lib/python3.6/dist-packages (from farm==0.5.0->farm-haystack==0.6.0) (2.23.0)\\n\",\n            \"Requirement already satisfied: flask in /usr/local/lib/python3.6/dist-packages (from farm==0.5.0->farm-haystack==0.6.0) (1.1.2)\\n\",\n            \"Collecting pydantic<2.0.0,>=1.0.0\\n\",\n            \"\\u001b[?25l  Downloading 
https://files.pythonhosted.org/packages/52/ea/fae9f69b6e56407961318e8c73e203097a97c7bd71b30bf1b4f5eb448f28/pydantic-1.7.3-cp36-cp36m-manylinux2014_x86_64.whl (9.2MB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 9.2MB 25.7MB/s \\n\",\n            \"\\u001b[?25hCollecting starlette==0.13.6\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/c5/a4/c9e228d7d47044ce4c83ba002f28ff479e542455f0499198a3f77c94f564/starlette-0.13.6-py3-none-any.whl (59kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 61kB 10.9MB/s \\n\",\n            \"\\u001b[?25hRequirement already satisfied: decorator>=4.3.0 in /usr/local/lib/python3.6/dist-packages (from networkx->farm-haystack==0.6.0) (4.4.2)\\n\",\n            \"Requirement already satisfied: scikit-learn in /usr/local/lib/python3.6/dist-packages (from sklearn->farm-haystack==0.6.0) (0.22.2.post1)\\n\",\n            \"Requirement already satisfied: typing-extensions; python_version < \\\"3.8\\\" in /usr/local/lib/python3.6/dist-packages (from uvicorn->farm-haystack==0.6.0) (3.7.4.3)\\n\",\n            \"Collecting h11>=0.8\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/b2/79/9c5f5cd738ec2a9b26453b3093915c0999f24454e2773921025c03b5509e/h11-0.11.0-py2.py3-none-any.whl (54kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 61kB 10.0MB/s \\n\",\n            \"\\u001b[?25hRequirement already satisfied: click==7.* in /usr/local/lib/python3.6/dist-packages (from uvicorn->farm-haystack==0.6.0) (7.1.2)\\n\",\n            \"Requirement already satisfied: SQLAlchemy>=1.0 in /usr/local/lib/python3.6/dist-packages (from sqlalchemy-utils->farm-haystack==0.6.0) (1.3.20)\\n\",\n            \"Requirement already satisfied: filelock>=3.0.0 in /usr/local/lib/python3.6/dist-packages (from tox->farm-haystack==0.6.0) (3.0.12)\\n\",\n            \"Requirement already satisfied: py>=1.4.17 in 
/usr/local/lib/python3.6/dist-packages (from tox->farm-haystack==0.6.0) (1.10.0)\\n\",\n            \"Requirement already satisfied: toml>=0.9.4 in /usr/local/lib/python3.6/dist-packages (from tox->farm-haystack==0.6.0) (0.10.2)\\n\",\n            \"Collecting virtualenv!=20.0.0,!=20.0.1,!=20.0.2,!=20.0.3,!=20.0.4,!=20.0.5,!=20.0.6,!=20.0.7,>=16.0.0\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/1a/c6/bb564f5eec616d241e85d741f00a07f5f50ea12989022ad49bc66876993c/virtualenv-20.2.2-py2.py3-none-any.whl (5.7MB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 5.7MB 58.4MB/s \\n\",\n            \"\\u001b[?25hRequirement already satisfied: packaging>=14 in /usr/local/lib/python3.6/dist-packages (from tox->farm-haystack==0.6.0) (20.8)\\n\",\n            \"Collecting importlib-metadata<3,>=0.12; python_version < \\\"3.8\\\"\\n\",\n            \"  Downloading https://files.pythonhosted.org/packages/98/b8/8ec57a8ef46fbe7f185318c7ff7df9a06c9df451d9a59a067bfa851bb828/importlib_metadata-2.1.1-py2.py3-none-any.whl\\n\",\n            \"Collecting pluggy>=0.12.0\\n\",\n            \"  Downloading https://files.pythonhosted.org/packages/a0/28/85c7aa31b80d150b772fbe4a229487bc6644da9ccb7e427dd8cc60cb8a62/pluggy-0.13.1-py2.py3-none-any.whl\\n\",\n            \"Requirement already satisfied: lxml>=2.3.2 in /usr/local/lib/python3.6/dist-packages (from python-docx->farm-haystack==0.6.0) (4.2.6)\\n\",\n            \"Collecting tokenizers==0.8.1.rc2\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/80/83/8b9fccb9e48eeb575ee19179e2bdde0ee9a1904f97de5f02d19016b8804f/tokenizers-0.8.1rc2-cp36-cp36m-manylinux1_x86_64.whl (3.0MB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 3.0MB 46.4MB/s \\n\",\n            \"\\u001b[?25hRequirement already satisfied: regex!=2019.12.17 in /usr/local/lib/python3.6/dist-packages (from transformers==3.3.1->farm==0.5.0->farm-haystack==0.6.0) 
(2019.12.20)\\n\",\n            \"Requirement already satisfied: dataclasses; python_version < \\\"3.7\\\" in /usr/local/lib/python3.6/dist-packages (from transformers==3.3.1->farm==0.5.0->farm-haystack==0.6.0) (0.8)\\n\",\n            \"Collecting sentencepiece!=0.1.92\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/e5/2d/6d4ca4bef9a67070fa1cac508606328329152b1df10bdf31fb6e4e727894/sentencepiece-0.1.94-cp36-cp36m-manylinux2014_x86_64.whl (1.1MB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 1.1MB 55.2MB/s \\n\",\n            \"\\u001b[?25hCollecting sacremoses\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/7d/34/09d19aff26edcc8eb2a01bed8e98f13a1537005d31e95233fd48216eed10/sacremoses-0.0.43.tar.gz (883kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 890kB 57.7MB/s \\n\",\n            \"\\u001b[?25hRequirement already satisfied: future in /usr/local/lib/python3.6/dist-packages (from torch<1.7,>1.5->farm==0.5.0->farm-haystack==0.6.0) (0.16.0)\\n\",\n            \"Collecting botocore<1.20.0,>=1.19.46\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/5c/48/8151aad820996a46373a4ffa2268a7209c10518d1b3eb48bbd0010c5b6a3/botocore-1.19.46-py2.py3-none-any.whl (7.2MB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 7.2MB 54.2MB/s \\n\",\n            \"\\u001b[?25hCollecting s3transfer<0.4.0,>=0.3.0\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/69/79/e6afb3d8b0b4e96cefbdc690f741d7dd24547ff1f94240c997a26fa908d3/s3transfer-0.3.3-py2.py3-none-any.whl (69kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 71kB 11.4MB/s \\n\",\n            \"\\u001b[?25hCollecting jmespath<1.0.0,>=0.7.1\\n\",\n            \"  Downloading 
https://files.pythonhosted.org/packages/07/cb/5f001272b6faeb23c1c9e0acc04d48eaaf5c862c17709d20e3469c6e0139/jmespath-0.10.0-py2.py3-none-any.whl\\n\",\n            \"Requirement already satisfied: Keras>=2.2.4 in /usr/local/lib/python3.6/dist-packages (from seqeval==0.0.12->farm==0.5.0->farm-haystack==0.6.0) (2.4.3)\\n\",\n            \"Collecting databricks-cli>=0.8.0\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/40/88/ae1f78cf582b707c605c77df49b4c8786a4465edc51adb25d2f98ef4c4de/databricks-cli-0.14.1.tar.gz (54kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 61kB 11.6MB/s \\n\",\n            \"\\u001b[?25hCollecting gitpython>=2.1.0\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/24/d1/a7f8fe3df258549b303415157328bfcc63e9b11d06a7ad7a3327f3d32606/GitPython-3.1.11-py3-none-any.whl (159kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 163kB 59.8MB/s \\n\",\n            \"\\u001b[?25hRequirement already satisfied: cloudpickle in /usr/local/lib/python3.6/dist-packages (from mlflow==1.0.0->farm==0.5.0->farm-haystack==0.6.0) (1.3.0)\\n\",\n            \"Collecting simplejson\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/73/96/1e6b19045375890068d7342cbe280dd64ae73fd90b9735b5efb8d1e044a1/simplejson-3.17.2-cp36-cp36m-manylinux2010_x86_64.whl (127kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 133kB 58.1MB/s \\n\",\n            \"\\u001b[?25hRequirement already satisfied: protobuf>=3.6.0 in /usr/local/lib/python3.6/dist-packages (from mlflow==1.0.0->farm==0.5.0->farm-haystack==0.6.0) (3.12.4)\\n\",\n            \"Requirement already satisfied: entrypoints in /usr/local/lib/python3.6/dist-packages (from mlflow==1.0.0->farm==0.5.0->farm-haystack==0.6.0) (0.3)\\n\",\n            \"Requirement already satisfied: sqlparse in /usr/local/lib/python3.6/dist-packages (from 
mlflow==1.0.0->farm==0.5.0->farm-haystack==0.6.0) (0.4.1)\\n\",\n            \"Collecting querystring-parser\\n\",\n            \"  Downloading https://files.pythonhosted.org/packages/88/6b/572b2590fd55114118bf08bde63c0a421dcc82d593700f3e2ad89908a8a9/querystring_parser-1.2.4-py2.py3-none-any.whl\\n\",\n            \"Collecting alembic\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/12/aa/c261dfd7f4ba6ce4701846a2689a46e2a172e012171de4378fc2926e3bf0/alembic-1.4.3-py2.py3-none-any.whl (159kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 163kB 50.7MB/s \\n\",\n            \"\\u001b[?25hRequirement already satisfied: pyyaml in /usr/local/lib/python3.6/dist-packages (from mlflow==1.0.0->farm==0.5.0->farm-haystack==0.6.0) (3.13)\\n\",\n            \"Collecting docker>=3.6.0\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/9f/a5/eec74d8d1016e6c2042ba31ca6fba3bba520e27d8a061e82bccd36bd64ef/docker-4.4.1-py2.py3-none-any.whl (146kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 153kB 64.8MB/s \\n\",\n            \"\\u001b[?25hCollecting aniso8601>=0.82\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/93/4e/760c0aaf32034e2da98e1ac6d83b6ffc6d1301132af54c3950ee07785bfa/aniso8601-8.1.0-py2.py3-none-any.whl (44kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 51kB 10.1MB/s \\n\",\n            \"\\u001b[?25hRequirement already satisfied: jsonschema in /usr/local/lib/python3.6/dist-packages (from flask-restplus->farm==0.5.0->farm-haystack==0.6.0) (2.6.0)\\n\",\n            \"Requirement already satisfied: idna<3,>=2.5 in /usr/local/lib/python3.6/dist-packages (from requests->farm==0.5.0->farm-haystack==0.6.0) (2.10)\\n\",\n            \"Requirement already satisfied: chardet<4,>=3.0.2 in /usr/local/lib/python3.6/dist-packages (from requests->farm==0.5.0->farm-haystack==0.6.0) (3.0.4)\\n\",\n        
    \"Requirement already satisfied: Jinja2>=2.10.1 in /usr/local/lib/python3.6/dist-packages (from flask->farm==0.5.0->farm-haystack==0.6.0) (2.11.2)\\n\",\n            \"Requirement already satisfied: itsdangerous>=0.24 in /usr/local/lib/python3.6/dist-packages (from flask->farm==0.5.0->farm-haystack==0.6.0) (1.1.0)\\n\",\n            \"Requirement already satisfied: joblib>=0.11 in /usr/local/lib/python3.6/dist-packages (from scikit-learn->sklearn->farm-haystack==0.6.0) (1.0.0)\\n\",\n            \"Requirement already satisfied: importlib-resources>=1.0; python_version < \\\"3.7\\\" in /usr/local/lib/python3.6/dist-packages (from virtualenv!=20.0.0,!=20.0.1,!=20.0.2,!=20.0.3,!=20.0.4,!=20.0.5,!=20.0.6,!=20.0.7,>=16.0.0->tox->farm-haystack==0.6.0) (3.3.0)\\n\",\n            \"Collecting distlib<1,>=0.3.1\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/f5/0a/490fa011d699bb5a5f3a0cf57de82237f52a6db9d40f33c53b2736c9a1f9/distlib-0.3.1-py2.py3-none-any.whl (335kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 337kB 60.5MB/s \\n\",\n            \"\\u001b[?25hCollecting appdirs<2,>=1.4.3\\n\",\n            \"  Downloading https://files.pythonhosted.org/packages/3b/00/2344469e2084fb287c2e0b57b72910309874c3245463acd6cf5e3db69324/appdirs-1.4.4-py2.py3-none-any.whl\\n\",\n            \"Requirement already satisfied: pyparsing>=2.0.2 in /usr/local/lib/python3.6/dist-packages (from packaging>=14->tox->farm-haystack==0.6.0) (2.4.7)\\n\",\n            \"Requirement already satisfied: zipp>=0.5 in /usr/local/lib/python3.6/dist-packages (from importlib-metadata<3,>=0.12; python_version < \\\"3.8\\\"->tox->farm-haystack==0.6.0) (3.4.0)\\n\",\n            \"Requirement already satisfied: h5py in /usr/local/lib/python3.6/dist-packages (from Keras>=2.2.4->seqeval==0.0.12->farm==0.5.0->farm-haystack==0.6.0) (2.10.0)\\n\",\n            \"Requirement already satisfied: tabulate>=0.7.7 in 
/usr/local/lib/python3.6/dist-packages (from databricks-cli>=0.8.0->mlflow==1.0.0->farm==0.5.0->farm-haystack==0.6.0) (0.8.7)\\n\",\n            \"Collecting gitdb<5,>=4.0.1\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/48/11/d1800bca0a3bae820b84b7d813ad1eff15a48a64caea9c823fc8c1b119e8/gitdb-4.0.5-py3-none-any.whl (63kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 71kB 1.9MB/s \\n\",\n            \"\\u001b[?25hCollecting Mako\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/a6/37/0e706200d22172eb8fa17d68a7ae22dec7631a0a92266634fb518a88a5b2/Mako-1.1.3-py2.py3-none-any.whl (75kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 81kB 14.0MB/s \\n\",\n            \"\\u001b[?25hCollecting python-editor>=0.3\\n\",\n            \"  Downloading https://files.pythonhosted.org/packages/c6/d3/201fc3abe391bbae6606e6f1d598c15d367033332bd54352b12f35513717/python_editor-1.0.4-py3-none-any.whl\\n\",\n            \"Collecting websocket-client>=0.32.0\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/4c/5f/f61b420143ed1c8dc69f9eaec5ff1ac36109d52c80de49d66e0c36c3dfdf/websocket_client-0.57.0-py2.py3-none-any.whl (200kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 204kB 58.5MB/s \\n\",\n            \"\\u001b[?25hRequirement already satisfied: MarkupSafe>=0.23 in /usr/local/lib/python3.6/dist-packages (from Jinja2>=2.10.1->flask->farm==0.5.0->farm-haystack==0.6.0) (1.1.1)\\n\",\n            \"Collecting smmap<4,>=3.0.1\\n\",\n            \"  Downloading https://files.pythonhosted.org/packages/b0/9a/4d409a6234eb940e6a78dfdfc66156e7522262f5f2fecca07dc55915952d/smmap-3.0.4-py2.py3-none-any.whl\\n\",\n            \"Building wheels for collected packages: python-multipart, langdetect, tika, sqlalchemy-utils, python-docx, seqeval, sacremoses, databricks-cli\\n\",\n            \"  Building wheel for 
python-multipart (setup.py) ... \\u001b[?25l\\u001b[?25hdone\\n\",\n            \"  Created wheel for python-multipart: filename=python_multipart-0.0.5-cp36-none-any.whl size=31671 sha256=0a59f1ef7b4b3ef62324c163a33aa3ff58decbbc951b2021021425f040d24ec6\\n\",\n            \"  Stored in directory: /root/.cache/pip/wheels/f0/e6/66/14a866a3cbd6a0cabfbef91f7edf40aa03595ef6c88d6d1be4\\n\",\n            \"  Building wheel for langdetect (setup.py) ... \\u001b[?25l\\u001b[?25hdone\\n\",\n            \"  Created wheel for langdetect: filename=langdetect-1.0.8-cp36-none-any.whl size=993194 sha256=21c74a416e30b2e1d7048d4363953fdd57ded891678346ef6ba4f2711b0281a5\\n\",\n            \"  Stored in directory: /root/.cache/pip/wheels/8d/b3/aa/6d99de9f3841d7d3d40a60ea06e6d669e8e5012e6c8b947a57\\n\",\n            \"  Building wheel for tika (setup.py) ... \\u001b[?25l\\u001b[?25hdone\\n\",\n            \"  Created wheel for tika: filename=tika-1.24-cp36-none-any.whl size=32885 sha256=82254f4e17038471246232ba8f657133b805be2c90379e2aa425ab5321a3e8ff\\n\",\n            \"  Stored in directory: /root/.cache/pip/wheels/73/9c/f5/0b1b738442fc2a2862bef95b908b374f8e80215550fb2a8975\\n\",\n            \"  Building wheel for sqlalchemy-utils (setup.py) ... \\u001b[?25l\\u001b[?25hdone\\n\",\n            \"  Created wheel for sqlalchemy-utils: filename=SQLAlchemy_Utils-0.36.8-py2.py3-none-any.whl size=93220 sha256=6d55e4d4f1adef609d0f145b405ab7b028b6866b68a0e6130d492d14b2eb9607\\n\",\n            \"  Stored in directory: /root/.cache/pip/wheels/68/31/b6/a96bf6868f42753696d647846c9a0f8e51bd99295790d07660\\n\",\n            \"  Building wheel for python-docx (setup.py) ... 
\\u001b[?25l\\u001b[?25hdone\\n\",\n            \"  Created wheel for python-docx: filename=python_docx-0.8.10-cp36-none-any.whl size=184491 sha256=f3dfbdb09cc4716346ce7fc0a371d29b748d9e96a1a700ee4946a4933ab8ece2\\n\",\n            \"  Stored in directory: /root/.cache/pip/wheels/18/0b/a0/1dd62ff812c857c9e487f27d80d53d2b40531bec1acecfa47b\\n\",\n            \"  Building wheel for seqeval (setup.py) ... \\u001b[?25l\\u001b[?25hdone\\n\",\n            \"  Created wheel for seqeval: filename=seqeval-0.0.12-cp36-none-any.whl size=7424 sha256=cf33a62c02c6d373cbd8c3841bd5085356d5da2a9a4064af58cac1aebb35748c\\n\",\n            \"  Stored in directory: /root/.cache/pip/wheels/4f/32/0a/df3b340a82583566975377d65e724895b3fad101a3fb729f68\\n\",\n            \"  Building wheel for sacremoses (setup.py) ... \\u001b[?25l\\u001b[?25hdone\\n\",\n            \"  Created wheel for sacremoses: filename=sacremoses-0.0.43-cp36-none-any.whl size=893261 sha256=4d768e9af78825f19673b2d4b776e31a897b760396ebc71399d8b484886871fc\\n\",\n            \"  Stored in directory: /root/.cache/pip/wheels/29/3c/fd/7ce5c3f0666dab31a50123635e6fb5e19ceb42ce38d4e58f45\\n\",\n            \"  Building wheel for databricks-cli (setup.py) ... 
\\u001b[?25l\\u001b[?25hdone\\n\",\n            \"  Created wheel for databricks-cli: filename=databricks_cli-0.14.1-cp36-none-any.whl size=100579 sha256=69e07ef2dad31db2c9dae884c55407849e95145d192c4d68512092c26079bbbe\\n\",\n            \"  Stored in directory: /root/.cache/pip/wheels/82/91/ac/5d417ee5ccbb76c8cca096cf4cfb9ed9d49d889d1d1ca0fc39\\n\",\n            \"Successfully built python-multipart langdetect tika sqlalchemy-utils python-docx seqeval sacremoses databricks-cli\\n\",\n            \"\\u001b[31mERROR: torchvision 0.8.1+cu101 has requirement torch==1.7.0, but you'll have torch 1.6.0 which is incompatible.\\u001b[0m\\n\",\n            \"\\u001b[31mERROR: pytest 3.6.4 has requirement pluggy<0.8,>=0.5, but you'll have pluggy 0.13.1 which is incompatible.\\u001b[0m\\n\",\n            \"\\u001b[31mERROR: datascience 0.10.6 has requirement folium==0.2.1, but you'll have folium 0.8.3 which is incompatible.\\u001b[0m\\n\",\n            \"\\u001b[31mERROR: botocore 1.19.46 has requirement urllib3<1.27,>=1.25.4; python_version != \\\"3.4\\\", but you'll have urllib3 1.24.3 which is incompatible.\\u001b[0m\\n\",\n            \"Installing collected packages: uvloop, elasticsearch, flask-cors, dotmap, tokenizers, sentencepiece, sacremoses, transformers, torch, jmespath, botocore, s3transfer, boto3, seqeval, gunicorn, databricks-cli, smmap, gitdb, gitpython, simplejson, querystring-parser, Mako, python-editor, alembic, websocket-client, docker, mlflow, aniso8601, flask-restplus, Werkzeug, farm, pydantic, starlette, fastapi, python-multipart, langdetect, psycopg2-binary, faiss-cpu, tika, h11, uvicorn, sqlalchemy-utils, httptools, distlib, appdirs, importlib-metadata, virtualenv, pluggy, tox, python-docx, elastic-apm, farm-haystack\\n\",\n            \"  Found existing installation: torch 1.7.0+cu101\\n\",\n            \"    Uninstalling torch-1.7.0+cu101:\\n\",\n            \"      Successfully uninstalled torch-1.7.0+cu101\\n\",\n            \"  Found existing 
installation: Werkzeug 1.0.1\\n\",\n            \"    Uninstalling Werkzeug-1.0.1:\\n\",\n            \"      Successfully uninstalled Werkzeug-1.0.1\\n\",\n            \"  Found existing installation: importlib-metadata 3.3.0\\n\",\n            \"    Uninstalling importlib-metadata-3.3.0:\\n\",\n            \"      Successfully uninstalled importlib-metadata-3.3.0\\n\",\n            \"  Found existing installation: pluggy 0.7.1\\n\",\n            \"    Uninstalling pluggy-0.7.1:\\n\",\n            \"      Successfully uninstalled pluggy-0.7.1\\n\",\n            \"Successfully installed Mako-1.1.3 Werkzeug-0.16.1 alembic-1.4.3 aniso8601-8.1.0 appdirs-1.4.4 boto3-1.16.46 botocore-1.19.46 databricks-cli-0.14.1 distlib-0.3.1 docker-4.4.1 dotmap-1.3.0 elastic-apm-5.10.0 elasticsearch-7.10.0 faiss-cpu-1.6.3 farm-0.5.0 farm-haystack-0.6.0 fastapi-0.63.0 flask-cors-3.0.9 flask-restplus-0.13.0 gitdb-4.0.5 gitpython-3.1.11 gunicorn-20.0.4 h11-0.11.0 httptools-0.1.1 importlib-metadata-2.1.1 jmespath-0.10.0 langdetect-1.0.8 mlflow-1.0.0 pluggy-0.13.1 psycopg2-binary-2.8.6 pydantic-1.7.3 python-docx-0.8.10 python-editor-1.0.4 python-multipart-0.0.5 querystring-parser-1.2.4 s3transfer-0.3.3 sacremoses-0.0.43 sentencepiece-0.1.94 seqeval-0.0.12 simplejson-3.17.2 smmap-3.0.4 sqlalchemy-utils-0.36.8 starlette-0.13.6 tika-1.24 tokenizers-0.8.1rc2 torch-1.6.0 tox-3.20.1 transformers-3.3.1 uvicorn-0.13.3 uvloop-0.14.0 virtualenv-20.2.2 websocket-client-0.57.0\\n\",\n            \"Collecting urllib3==1.25.4\\n\",\n            \"\\u001b[?25l  Downloading https://files.pythonhosted.org/packages/91/0d/7777358f672a14b7ae0dfcd29f949f409f913e0578190d6bfa68eb55864b/urllib3-1.25.4-py2.py3-none-any.whl (125kB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 133kB 7.5MB/s \\n\",\n            \"\\u001b[31mERROR: datascience 0.10.6 has requirement folium==0.2.1, but you'll have folium 0.8.3 which is incompatible.\\u001b[0m\\n\",\n            \"\\u001b[?25hInstalling 
collected packages: urllib3\\n\",\n            \"  Found existing installation: urllib3 1.24.3\\n\",\n            \"    Uninstalling urllib3-1.24.3:\\n\",\n            \"      Successfully uninstalled urllib3-1.24.3\\n\",\n            \"Successfully installed urllib3-1.25.4\\n\",\n            \"Looking in links: https://download.pytorch.org/whl/torch_stable.html\\n\",\n            \"Collecting torch==1.6.0+cu101\\n\",\n            \"\\u001b[?25l  Downloading https://download.pytorch.org/whl/cu101/torch-1.6.0%2Bcu101-cp36-cp36m-linux_x86_64.whl (708.0MB)\\n\",\n            \"\\u001b[K     |████████████████████████████████| 708.0MB 10kB/s \\n\",\n            \"\\u001b[?25hRequirement already satisfied: future in /usr/local/lib/python3.6/dist-packages (from torch==1.6.0+cu101) (0.16.0)\\n\",\n            \"Requirement already satisfied: numpy in /usr/local/lib/python3.6/dist-packages (from torch==1.6.0+cu101) (1.19.4)\\n\",\n            \"\\u001b[31mERROR: torchvision 0.8.1+cu101 has requirement torch==1.7.0, but you'll have torch 1.6.0+cu101 which is incompatible.\\u001b[0m\\n\",\n            \"Installing collected packages: torch\\n\",\n            \"  Found existing installation: torch 1.6.0\\n\",\n            \"    Uninstalling torch-1.6.0:\\n\",\n            \"      Successfully uninstalled torch-1.6.0\\n\",\n            \"Successfully installed torch-1.6.0+cu101\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"R8aBneh68pIJ\"\n      },\n      \"source\": [\n        \"# Extractive QA in a closed domain (single text)\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"pycharm\": {\n          \"is_executing\": false\n        },\n        \"id\": \"m7G3G4BjNcSz\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 790,\n          \"referenced_widgets\": [\n            
\"8e2aa2531c9a4890ad722171e4a51122\",\n            \"743ca8b7bb574fd99e3e0516ab60b7cc\",\n            \"4b361abfd25e4dfc86b96384753f6cdd\",\n            \"f0da7f5b445a443ca0692396bdf54062\",\n            \"f480c548d8774a8abc61bd8b045fee29\",\n            \"8459fc383ca741a29ea1d53190b71457\",\n            \"d1b1591de71a4bd8a2909975ee82d998\",\n            \"afebd0dd7f3e4e44927220ad3fc13f17\",\n            \"56c4b0d2d2654b1e9470d8f0b920ae16\",\n            \"d0cdfe65d369405a90a7abcf38c289c8\",\n            \"d66b73d1014848db81edf429c857f9ef\",\n            \"dce3f0f8e8e24c73913617300da7e370\",\n            \"2c095740976c4bdea8645e21d277283c\",\n            \"fbc7634200954b37b7c869050591022d\",\n            \"4a622f003ef24eb79fd13a08f423c665\",\n            \"db417b1034bc405fa3ac881361f943c4\",\n            \"ede7dfd59ae8455689373afda2771132\",\n            \"44ff19ca5bed4708aa3bf39032563b2e\",\n            \"9be9ce8699c14d47853d40e9cd8bf7d0\",\n            \"9f3d806fbec84b179e9a49e49c905fa3\",\n            \"f532cd4dccc14741a9d4e506a72507ad\",\n            \"d810a6e15c1d43e0a6793c60622da504\",\n            \"4f711e53e0944446aa51e8a241017e8c\",\n            \"19b30696c5294a6b92cecb3533b10fed\",\n            \"7fb2356ed11344af950f52ebad7c57e1\",\n            \"74e56a811e4646839645cc8e1cd2945e\",\n            \"9207a922be2c48b187d97e1575b3ac39\",\n            \"7726ea4f836642ad8ab3e19a4f087919\",\n            \"c11075b708e84ffabe5ce867f710dbb3\",\n            \"3596178648404646a454b7fb5aeb0742\",\n            \"2b348979747342869ab83a3bb7d9f71a\",\n            \"9b7811f75b0e42e4b7a4fa14e089668c\",\n            \"9ffa302c8b604d56a4d9826fb783f786\",\n            \"a37b422df0054a89a9b59d4233461b1b\",\n            \"3f9fa7a617c74f4fa4eda55e6d2c3f3d\",\n            \"18e00a4457534250970b69f8146282e2\",\n            \"a23f46136c0e4373a9733cc7dba0c95e\",\n            \"26b1e8ea52e041ce8a1847e4d9483434\",\n            \"db5bdb301f2b48c5987c8fcb5236cf6c\",\n            
\"6064912c6a0e41d4b0a8389f3e56b1e9\",\n            \"d5287e28a98749b5b2cd1560f157ff36\",\n            \"17ffd297995d442d86a05273b39ba7c0\",\n            \"38d8b60615fa4e649634802153ad09cd\",\n            \"c86d5ff0dd8b40a5a4ff92d16ffcb21a\",\n            \"5de028a8c6494e5d9e2c0e1134080369\",\n            \"bb183a36134f4463a5453a043de77d93\",\n            \"fbedf7b06f9846348e174fc2536b765f\",\n            \"cac99d9ef6f74c07937fc39a20157937\"\n          ]\n        },\n        \"outputId\": \"2490aed8-2175-4c60-f39e-3711f21d6f1a\"\n      },\n      \"source\": [\n        \"# Load a  local model or any of the QA models on Hugging Face's model hub (https://huggingface.co/models)\\n\",\n        \"from haystack.reader.farm import FARMReader\\n\",\n        \"\\n\",\n        \"reader = FARMReader(model_name_or_path=\\\"deepset/roberta-base-squad2\\\", use_gpu=True, no_ans_boost=0, return_no_answer=False)\\n\",\n        \"\\n\",\n        \"\\n\",\n        \"# Create document which the model should scan for answers.\\n\",\n        \"from haystack import Document\\n\",\n        \"\\n\",\n        \"text = \\\"The traffic began to slow down on Pioneer Boulevard in Los Angeles, making it difficult to get out of the city. However, WBGO was playing some cool jazz, and the weather was cool, making it rather pleasant to be making it out of the city on this Friday afternoon. Nat King Cole was singing as Jo and Maria slowly made their way out of LA and drove toward Barstow. 
They planned to get to Las Vegas early enough in the evening to have a nice dinner and go see a show.\\\"\\n\",\n        \"doc = Document(text=text)\"\n      ],\n      \"execution_count\": 2,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"12/31/2020 16:02:14 - INFO - faiss -   Loading faiss with AVX2 support.\\n\",\n            \"12/31/2020 16:02:14 - INFO - faiss -   Loading faiss.\\n\",\n            \"12/31/2020 16:02:15 - INFO - farm.utils -   device: cuda n_gpu: 1, distributed training: False, automatic mixed precision training: None\\n\",\n            \"12/31/2020 16:02:15 - INFO - farm.infer -   Could not find `deepset/roberta-base-squad2` locally. Try to download from model hub ...\\n\",\n            \"12/31/2020 16:02:15 - INFO - filelock -   Lock 139851960964880 acquired on /root/.cache/torch/transformers/f7d4b9379a9c487fa03ccf3d8e00058faa9d664cf01fc03409138246f48760da.6060f348ba2b58d6d30b5324910152ffc512e7c3891ed13f22844f1a9b5c0d0f.lock\\n\"\n          ],\n          \"name\": \"stderr\"\n        },\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"application/vnd.jupyter.widget-view+json\": {\n              \"model_id\": \"8e2aa2531c9a4890ad722171e4a51122\",\n              \"version_minor\": 0,\n              \"version_major\": 2\n            },\n            \"text/plain\": [\n              \"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=571.0, style=ProgressStyle(description_…\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"12/31/2020 16:02:16 - INFO - filelock -   Lock 139851960964880 released on /root/.cache/torch/transformers/f7d4b9379a9c487fa03ccf3d8e00058faa9d664cf01fc03409138246f48760da.6060f348ba2b58d6d30b5324910152ffc512e7c3891ed13f22844f1a9b5c0d0f.lock\\n\"\n         
 ],\n          \"name\": \"stderr\"\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"12/31/2020 16:02:16 - INFO - filelock -   Lock 139849313884144 acquired on /root/.cache/torch/transformers/8c0c8b6371111ac5fbc176aefcf9dbe129db7be654c569b8375dd3712fc4dc67.a851909c96149f062acca04d647da88d0dcd3a52cd5a8c7169e89fc6e5971c7b.lock\\n\"\n          ],\n          \"name\": \"stderr\"\n        },\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"application/vnd.jupyter.widget-view+json\": {\n              \"model_id\": \"56c4b0d2d2654b1e9470d8f0b920ae16\",\n              \"version_minor\": 0,\n              \"version_major\": 2\n            },\n            \"text/plain\": [\n              \"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=496313727.0, style=ProgressStyle(descri…\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"12/31/2020 16:02:30 - INFO - filelock -   Lock 139849313884144 released on /root/.cache/torch/transformers/8c0c8b6371111ac5fbc176aefcf9dbe129db7be654c569b8375dd3712fc4dc67.a851909c96149f062acca04d647da88d0dcd3a52cd5a8c7169e89fc6e5971c7b.lock\\n\"\n          ],\n          \"name\": \"stderr\"\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Some weights of RobertaModel were not initialized from the model checkpoint at deepset/roberta-base-squad2 and are newly initialized: ['roberta.pooler.dense.weight', 'roberta.pooler.dense.bias']\\n\",\n 
           \"You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\\n\",\n            \"12/31/2020 16:02:32 - WARNING - farm.modeling.language_model -   Could not automatically detect from language model name what language it is. \\n\",\n            \"\\t We guess it's an *ENGLISH* model ... \\n\",\n            \"\\t If not: Init the language model by supplying the 'language' param.\\n\",\n            \"12/31/2020 16:02:44 - INFO - filelock -   Lock 139849313883528 acquired on /root/.cache/torch/transformers/1e3af82648d7190d959a9d76d727ef629b1ca51b3da6ad04039122453cb56307.6a4061e8fc00057d21d80413635a86fdcf55b6e7594ad9e25257d2f99a02f4be.lock\\n\"\n          ],\n          \"name\": \"stderr\"\n        },\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"application/vnd.jupyter.widget-view+json\": {\n              \"model_id\": \"ede7dfd59ae8455689373afda2771132\",\n              \"version_minor\": 0,\n              \"version_major\": 2\n            },\n            \"text/plain\": [\n              \"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=898822.0, style=ProgressStyle(descripti…\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"12/31/2020 16:02:45 - INFO - filelock -   Lock 139849313883528 released on /root/.cache/torch/transformers/1e3af82648d7190d959a9d76d727ef629b1ca51b3da6ad04039122453cb56307.6a4061e8fc00057d21d80413635a86fdcf55b6e7594ad9e25257d2f99a02f4be.lock\\n\"\n          ],\n          \"name\": \"stderr\"\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"12/31/2020 16:02:45 - INFO - 
filelock -   Lock 139849296850280 acquired on /root/.cache/torch/transformers/b901c69e8e7da4a24c635ad81d016d274f174261f4f5c144e43f4b00e242c3b0.70bec105b4158ed9a1747fea67a43f5dee97855c64d62b6ec3742f4cfdb5feda.lock\\n\"\n          ],\n          \"name\": \"stderr\"\n        },\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"application/vnd.jupyter.widget-view+json\": {\n              \"model_id\": \"7fb2356ed11344af950f52ebad7c57e1\",\n              \"version_minor\": 0,\n              \"version_major\": 2\n            },\n            \"text/plain\": [\n              \"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=456318.0, style=ProgressStyle(descripti…\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"12/31/2020 16:02:46 - INFO - filelock -   Lock 139849296850280 released on /root/.cache/torch/transformers/b901c69e8e7da4a24c635ad81d016d274f174261f4f5c144e43f4b00e242c3b0.70bec105b4158ed9a1747fea67a43f5dee97855c64d62b6ec3742f4cfdb5feda.lock\\n\"\n          ],\n          \"name\": \"stderr\"\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"12/31/2020 16:02:46 - INFO - filelock -   Lock 139849313883528 acquired on /root/.cache/torch/transformers/2d9b03b59a8af464bf4238025a3cf0e5a340b9d0ba77400011e23c130b452510.6e217123a3ada61145de1f20b1443a1ec9aac93492a4bd1ce6a695935f0fd97a.lock\\n\"\n          ],\n          \"name\": \"stderr\"\n        },\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"application/vnd.jupyter.widget-view+json\": {\n              \"model_id\": \"9ffa302c8b604d56a4d9826fb783f786\",\n        
      \"version_minor\": 0,\n              \"version_major\": 2\n            },\n            \"text/plain\": [\n              \"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=772.0, style=ProgressStyle(description_…\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"12/31/2020 16:02:47 - INFO - filelock -   Lock 139849313883528 released on /root/.cache/torch/transformers/2d9b03b59a8af464bf4238025a3cf0e5a340b9d0ba77400011e23c130b452510.6e217123a3ada61145de1f20b1443a1ec9aac93492a4bd1ce6a695935f0fd97a.lock\\n\"\n          ],\n          \"name\": \"stderr\"\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"12/31/2020 16:02:47 - INFO - filelock -   Lock 139849313883528 acquired on /root/.cache/torch/transformers/507984f2e28c7dfed5db9a20acd68beb969c7f2833abc9e582e967fa0291f3dc.ec06af3e1b426682955dab3bd553eaf178b6eafac9079fc133925e0e2654213e.lock\\n\"\n          ],\n          \"name\": \"stderr\"\n        },\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"application/vnd.jupyter.widget-view+json\": {\n              \"model_id\": \"d5287e28a98749b5b2cd1560f157ff36\",\n              \"version_minor\": 0,\n              \"version_major\": 2\n            },\n            \"text/plain\": [\n              \"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=79.0, style=ProgressStyle(description_w…\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"12/31/2020 16:02:47 - INFO - filelock -   Lock 
139849313883528 released on /root/.cache/torch/transformers/507984f2e28c7dfed5db9a20acd68beb969c7f2833abc9e582e967fa0291f3dc.ec06af3e1b426682955dab3bd553eaf178b6eafac9079fc133925e0e2654213e.lock\\n\"\n          ],\n          \"name\": \"stderr\"\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"12/31/2020 16:02:48 - INFO - farm.utils -   device: cuda n_gpu: 1, distributed training: False, automatic mixed precision training: None\\n\",\n            \"12/31/2020 16:02:48 - INFO - farm.infer -   Got ya 1 parallel workers to do inference ...\\n\",\n            \"12/31/2020 16:02:48 - INFO - farm.infer -    0 \\n\",\n            \"12/31/2020 16:02:48 - INFO - farm.infer -   /w\\\\\\n\",\n            \"12/31/2020 16:02:48 - INFO - farm.infer -   /'\\\\\\n\",\n            \"12/31/2020 16:02:48 - INFO - farm.infer -   \\n\"\n          ],\n          \"name\": \"stderr\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"om3NX21XPiu1\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"f7dec3f6-68c2-49a2-d4ab-1a5e7448d050\"\n      },\n      \"source\": [\n        \"# Some questions that \\\"work\\\":\\n\",\n        \"reader.predict(query=\\\"Where is Pioneer Boulevard located?\\\", documents=[doc])\"\n      ],\n      \"execution_count\": 3,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 22.86 Batches/s]\\n\"\n          ],\n          \"name\": \"stderr\"\n        },\n        {\n          \"output_type\": \"execute_result\",\n          \"data\": {\n            \"text/plain\": [\n              \"{'answers': [{'answer': 'Los Angeles',\\n\",\n  
            \"   'context': 'The traffic began to slow down on Pioneer Boulevard in Los Angeles, making it difficult to get out of the city. However, WBGO was playing some cool ja',\\n\",\n              \"   'document_id': '4fa8dd28-9694-47cb-bc5a-19a74f357403',\\n\",\n              \"   'offset_end': 66,\\n\",\n              \"   'offset_end_in_doc': 66,\\n\",\n              \"   'offset_start': 55,\\n\",\n              \"   'offset_start_in_doc': 55,\\n\",\n              \"   'probability': 0.8022719840448774,\\n\",\n              \"   'score': 11.204442024230957}],\\n\",\n              \" 'no_ans_gap': 10.05622935295105,\\n\",\n              \" 'query': 'Where is Pioneer Boulevard located?'}\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          },\n          \"execution_count\": 3\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"id\": \"vKtgTCpRCtYd\",\n        \"outputId\": \"7256a926-11f3-4f11-947b-68c406682c62\"\n      },\n      \"source\": [\n        \"reader.predict(query=\\\"Who drove to Las Vegas?\\\", documents=[doc])\\n\"\n      ],\n      \"execution_count\": 4,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 34.52 Batches/s]\\n\"\n          ],\n          \"name\": \"stderr\"\n        },\n        {\n          \"output_type\": \"execute_result\",\n          \"data\": {\n            \"text/plain\": [\n              \"{'answers': [{'answer': 'Jo and Maria',\\n\",\n              \"   'context': 't of the city on this Friday afternoon. Nat King Cole was singing as Jo and Maria slowly made their way out of LA and drove toward Barstow. 
They plann',\\n\",\n              \"   'document_id': '4fa8dd28-9694-47cb-bc5a-19a74f357403',\\n\",\n              \"   'offset_end': 81,\\n\",\n              \"   'offset_end_in_doc': 305,\\n\",\n              \"   'offset_start': 69,\\n\",\n              \"   'offset_start_in_doc': 293,\\n\",\n              \"   'probability': 0.8081116565023317,\\n\",\n              \"   'score': 11.50229263305664}],\\n\",\n              \" 'no_ans_gap': 3.7832298278808594,\\n\",\n              \" 'query': 'Who drove to Las Vegas?'}\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          },\n          \"execution_count\": 4\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"id\": \"6FErhiqfC2kz\",\n        \"outputId\": \"d7726229-fb30-4e59-843d-5fa991c80071\"\n      },\n      \"source\": [\n        \"reader.predict(query=\\\"Who is singing?\\\", documents=[doc])\\n\"\n      ],\n      \"execution_count\": 5,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 34.25 Batches/s]\\n\"\n          ],\n          \"name\": \"stderr\"\n        },\n        {\n          \"output_type\": \"execute_result\",\n          \"data\": {\n            \"text/plain\": [\n              \"{'answers': [{'answer': 'Nat King Cole',\\n\",\n              \"   'context': 'r pleasant to be making it out of the city on this Friday afternoon. 
Nat King Cole was singing as Jo and Maria slowly made their way out of LA and dro',\\n\",\n              \"   'document_id': '4fa8dd28-9694-47cb-bc5a-19a74f357403',\\n\",\n              \"   'offset_end': 82,\\n\",\n              \"   'offset_end_in_doc': 277,\\n\",\n              \"   'offset_start': 69,\\n\",\n              \"   'offset_start_in_doc': 264,\\n\",\n              \"   'probability': 0.8818636635368704,\\n\",\n              \"   'score': 16.081584930419922}],\\n\",\n              \" 'no_ans_gap': 12.141630411148071,\\n\",\n              \" 'query': 'Who is singing?'}\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          },\n          \"execution_count\": 5\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"id\": \"bUasgTO7DFN2\",\n        \"outputId\": \"f533b27f-7d6c-4160-e93e-deaf30e50b11\"\n      },\n      \"source\": [\n        \"reader.predict(query=\\\"What is the plan for the night?\\\", documents=[doc])\\n\"\n      ],\n      \"execution_count\": 6,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 32.32 Batches/s]\\n\"\n          ],\n          \"name\": \"stderr\"\n        },\n        {\n          \"output_type\": \"execute_result\",\n          \"data\": {\n            \"text/plain\": [\n              \"{'answers': [{'answer': 'They planned to get to Las Vegas early enough in the evening to have a nice dinner and go see a show',\\n\",\n              \"   'context': 'de their way out of LA and drove toward Barstow. 
They planned to get to Las Vegas early enough in the evening to have a nice dinner and go see a show.',\\n\",\n              \"   'document_id': '4fa8dd28-9694-47cb-bc5a-19a74f357403',\\n\",\n              \"   'offset_end': 149,\\n\",\n              \"   'offset_end_in_doc': 464,\\n\",\n              \"   'offset_start': 49,\\n\",\n              \"   'offset_start_in_doc': 364,\\n\",\n              \"   'probability': 0.7315710454025786,\\n\",\n              \"   'score': 8.020864486694336}],\\n\",\n              \" 'no_ans_gap': 6.077347040176392,\\n\",\n              \" 'query': 'What is the plan for the night?'}\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          },\n          \"execution_count\": 6\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"id\": \"EAbgAUqe9wmN\",\n        \"outputId\": \"e88a40ec-2e79-45a9-b934-1be2bf25e801\"\n      },\n      \"source\": [\n        \"# Some questions where the answer is not in the text (and the model therefore cannot find it)\\n\",\n        \"# If you inspect the results, you will see that the value \\\"no_ans_gap\\\" is negative for all these questions and actually indicates that the likelihood of \\\"no answer\\\" is higher than the best textual answer\\n\",\n        \"questions = [\\\"Where is Los Angeles located?\\\",\\\"Where is LA located?\\\",\\\"Where is Barstow located?\\\",\\\"Where is Las Vegas located ?\\\"]\\n\",\n        \"for q in questions:\\n\",\n        \"  result = reader.predict(query=q, documents=[doc])\\n\",\n        \"  print(result)\\n\",\n        \"  print(\\\"\\\\n\\\")\"\n      ],\n      \"execution_count\": 7,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 32.07 Batches/s]\\n\",\n            
\"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 31.49 Batches/s]\\n\"\n          ],\n          \"name\": \"stderr\"\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"{'query': 'Where is Los Angeles located?', 'no_ans_gap': -0.41483497619628906, 'answers': [{'answer': 'Pioneer Boulevard in Los Angeles, making it difficult to get out of the city. However, WBGO was playing some cool jazz, and the weather was cool, making it rather pleasant to be making it out of the city on this Friday afternoon. Nat King Cole was singing as Jo and Maria slowly made their way out of LA and drove toward Barstow', 'score': 1.0702476501464844, 'probability': 0.5333954464343146, 'context': 'Pioneer Boulevard in Los Angeles, making it difficult to get out of the city. However, WBGO was playing some cool jazz, and the weather was cool, making it rather pleasant to be making it out of the city on this Friday afternoon. Nat King Cole was singing as Jo and Maria slowly made their way out of LA and drove toward Barstow', 'offset_start': 0, 'offset_end': 328, 'offset_start_in_doc': 34, 'offset_end_in_doc': 362, 'document_id': '4fa8dd28-9694-47cb-bc5a-19a74f357403'}]}\\n\",\n            \"\\n\",\n            \"\\n\",\n            \"{'query': 'Where is LA located?', 'no_ans_gap': -0.19409167766571045, 'answers': [{'answer': 'Pioneer Boulevard in Los Angeles, making it difficult to get out of the city. However, WBGO was playing some cool jazz, and the weather was cool, making it rather pleasant to be making it out of the city on this Friday afternoon. Nat King Cole was singing as Jo and Maria slowly made their way out of LA and drove toward Barstow', 'score': 1.6217641830444336, 'probability': 0.5505072801165964, 'context': 'Pioneer Boulevard in Los Angeles, making it difficult to get out of the city. 
However, WBGO was playing some cool jazz, and the weather was cool, making it rather pleasant to be making it out of the city on this Friday afternoon. Nat King Cole was singing as Jo and Maria slowly made their way out of LA and drove toward Barstow', 'offset_start': 0, 'offset_end': 328, 'offset_start_in_doc': 34, 'offset_end_in_doc': 362, 'document_id': '4fa8dd28-9694-47cb-bc5a-19a74f357403'}]}\\n\",\n            \"\\n\",\n            \"\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 30.91 Batches/s]\\n\",\n            \"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 32.63 Batches/s]\"\n          ],\n          \"name\": \"stderr\"\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"{'query': 'Where is Barstow located?', 'no_ans_gap': -1.593643844127655, 'answers': [{'answer': 'Las Vegas', 'score': 0.7261489033699036, 'probability': 0.522676586113031, 'context': 'de their way out of LA and drove toward Barstow. They planned to get to Las Vegas early enough in the evening to have a nice dinner and go see a show.', 'offset_start': 72, 'offset_end': 81, 'offset_start_in_doc': 387, 'offset_end_in_doc': 396, 'document_id': '4fa8dd28-9694-47cb-bc5a-19a74f357403'}]}\\n\",\n            \"\\n\",\n            \"\\n\",\n            \"{'query': 'Where is Las Vegas located ?', 'no_ans_gap': -2.1370767652988434, 'answers': [{'answer': 'Los Angeles', 'score': -0.025329262018203735, 'probability': 0.49920846122316637, 'context': 'The traffic began to slow down on Pioneer Boulevard in Los Angeles, making it difficult to get out of the city. 
However, WBGO was playing some cool ja', 'offset_start': 55, 'offset_end': 66, 'offset_start_in_doc': 55, 'offset_end_in_doc': 66, 'document_id': '4fa8dd28-9694-47cb-bc5a-19a74f357403'}]}\\n\",\n            \"\\n\",\n            \"\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\n\"\n          ],\n          \"name\": \"stderr\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"id\": \"F174S7VV-xRZ\",\n        \"outputId\": \"45eacf23-6e02-43af-d11a-7c3e2cf9400b\"\n      },\n      \"source\": [\n        \"# We can also directly make use of this \\\"no answer\\\" option and allow our reader to return \\\"no answer\\\" (indicated via \\\"answer: None\\\" in the results) by enabling the arg in the FARMreader:\\n\",\n        \"reader = FARMReader(model_name_or_path=\\\"deepset/roberta-base-squad2\\\", use_gpu=True, no_ans_boost=0, return_no_answer=True)\\n\",\n        \"for q in questions:\\n\",\n        \"  result = reader.predict(query=q, documents=[doc])\\n\",\n        \"  print(result)\\n\",\n        \"  print(\\\"\\\\n\\\")\"\n      ],\n      \"execution_count\": 8,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"12/31/2020 16:02:49 - INFO - farm.utils -   device: cuda n_gpu: 1, distributed training: False, automatic mixed precision training: None\\n\",\n            \"12/31/2020 16:02:49 - INFO - farm.infer -   Could not find `deepset/roberta-base-squad2` locally. 
Try to download from model hub ...\\n\",\n            \"Some weights of RobertaModel were not initialized from the model checkpoint at deepset/roberta-base-squad2 and are newly initialized: ['roberta.pooler.dense.weight', 'roberta.pooler.dense.bias']\\n\",\n            \"You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\\n\",\n            \"12/31/2020 16:02:53 - WARNING - farm.modeling.language_model -   Could not automatically detect from language model name what language it is. \\n\",\n            \"\\t We guess it's an *ENGLISH* model ... \\n\",\n            \"\\t If not: Init the language model by supplying the 'language' param.\\n\",\n            \"12/31/2020 16:03:00 - INFO - farm.utils -   device: cuda n_gpu: 1, distributed training: False, automatic mixed precision training: None\\n\",\n            \"12/31/2020 16:03:00 - INFO - farm.infer -   Got ya 1 parallel workers to do inference ...\\n\",\n            \"12/31/2020 16:03:00 - INFO - farm.infer -    0 \\n\",\n            \"12/31/2020 16:03:00 - INFO - farm.infer -   /w\\\\\\n\",\n            \"12/31/2020 16:03:00 - INFO - farm.infer -   /'\\\\\\n\",\n            \"12/31/2020 16:03:00 - INFO - farm.infer -   \\n\",\n            \"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 30.13 Batches/s]\\n\",\n            \"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 33.39 Batches/s]\\n\"\n          ],\n          \"name\": \"stderr\"\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"{'query': 'Where is Los Angeles located?', 'no_ans_gap': -0.41483497619628906, 'answers': [{'answer': None, 'score': 1.4850826263427734, 'probability': 0.5462760172072342, 'context': None, 'offset_start': 0, 'offset_end': 0, 'document_id': None, 'meta': None}, {'answer': 'Pioneer Boulevard in Los Angeles, making it difficult to get out of the city. 
However, WBGO was playing some cool jazz, and the weather was cool, making it rather pleasant to be making it out of the city on this Friday afternoon. Nat King Cole was singing as Jo and Maria slowly made their way out of LA and drove toward Barstow', 'score': 1.0702476501464844, 'probability': 0.5333954464343146, 'context': 'Pioneer Boulevard in Los Angeles, making it difficult to get out of the city. However, WBGO was playing some cool jazz, and the weather was cool, making it rather pleasant to be making it out of the city on this Friday afternoon. Nat King Cole was singing as Jo and Maria slowly made their way out of LA and drove toward Barstow', 'offset_start': 0, 'offset_end': 328, 'offset_start_in_doc': 34, 'offset_end_in_doc': 362, 'document_id': '4fa8dd28-9694-47cb-bc5a-19a74f357403'}]}\\n\",\n            \"\\n\",\n            \"\\n\",\n            \"{'query': 'Where is LA located?', 'no_ans_gap': -0.19409167766571045, 'answers': [{'answer': None, 'score': 1.815855860710144, 'probability': 0.5565031131376446, 'context': None, 'offset_start': 0, 'offset_end': 0, 'document_id': None, 'meta': None}, {'answer': 'Pioneer Boulevard in Los Angeles, making it difficult to get out of the city. However, WBGO was playing some cool jazz, and the weather was cool, making it rather pleasant to be making it out of the city on this Friday afternoon. Nat King Cole was singing as Jo and Maria slowly made their way out of LA and drove toward Barstow', 'score': 1.6217641830444336, 'probability': 0.5505072801165964, 'context': 'Pioneer Boulevard in Los Angeles, making it difficult to get out of the city. However, WBGO was playing some cool jazz, and the weather was cool, making it rather pleasant to be making it out of the city on this Friday afternoon. 
Nat King Cole was singing as Jo and Maria slowly made their way out of LA and drove toward Barstow', 'offset_start': 0, 'offset_end': 328, 'offset_start_in_doc': 34, 'offset_end_in_doc': 362, 'document_id': '4fa8dd28-9694-47cb-bc5a-19a74f357403'}]}\\n\",\n            \"\\n\",\n            \"\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 33.63 Batches/s]\\n\",\n            \"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 32.21 Batches/s]\"\n          ],\n          \"name\": \"stderr\"\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"{'query': 'Where is Barstow located?', 'no_ans_gap': -1.593643844127655, 'answers': [{'answer': None, 'score': 2.3197927474975586, 'probability': 0.5719897905641838, 'context': None, 'offset_start': 0, 'offset_end': 0, 'document_id': None, 'meta': None}, {'answer': 'Las Vegas', 'score': 0.7261489033699036, 'probability': 0.522676586113031, 'context': 'de their way out of LA and drove toward Barstow. They planned to get to Las Vegas early enough in the evening to have a nice dinner and go see a show.', 'offset_start': 72, 'offset_end': 81, 'offset_start_in_doc': 387, 'offset_end_in_doc': 396, 'document_id': '4fa8dd28-9694-47cb-bc5a-19a74f357403'}]}\\n\",\n            \"\\n\",\n            \"\\n\",\n            \"{'query': 'Where is Las Vegas located ?', 'no_ans_gap': -2.1370767652988434, 'answers': [{'answer': None, 'score': 2.1370767652988434, 'probability': 0.5663893175959525, 'context': None, 'offset_start': 0, 'offset_end': 0, 'document_id': None, 'meta': None}, {'answer': 'Los Angeles', 'score': -0.025329262018203735, 'probability': 0.49920846122316637, 'context': 'The traffic began to slow down on Pioneer Boulevard in Los Angeles, making it difficult to get out of the city. 
However, WBGO was playing some cool ja', 'offset_start': 55, 'offset_end': 66, 'offset_start_in_doc': 55, 'offset_end_in_doc': 66, 'document_id': '4fa8dd28-9694-47cb-bc5a-19a74f357403'}]}\\n\",\n            \"\\n\",\n            \"\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\n\"\n          ],\n          \"name\": \"stderr\"\n        }\n      ]\n    }\n  ]\n}"
  },
  {
    "path": "Chapter10/QA.ipynb",
    "content": "{\n  \"nbformat\": 4,\n  \"nbformat_minor\": 0,\n  \"metadata\": {\n    \"kernelspec\": {\n      \"display_name\": \"Python 3\",\n      \"language\": \"python\",\n      \"name\": \"python3\"\n    },\n    \"language_info\": {\n      \"codemirror_mode\": {\n        \"name\": \"ipython\",\n        \"version\": 3\n      },\n      \"file_extension\": \".py\",\n      \"mimetype\": \"text/x-python\",\n      \"name\": \"python\",\n      \"nbconvert_exporter\": \"python\",\n      \"pygments_lexer\": \"ipython3\",\n      \"version\": \"3.7.6\"\n    },\n    \"pycharm\": {\n      \"stem_cell\": {\n        \"cell_type\": \"raw\",\n        \"source\": [],\n        \"metadata\": {\n          \"collapsed\": false\n        }\n      }\n    },\n    \"colab\": {\n      \"name\": \"QA.ipynb\",\n      \"provenance\": [],\n      \"collapsed_sections\": []\n    },\n    \"widgets\": {\n      \"application/vnd.jupyter.widget-state+json\": {\n        \"ec5480ed053b46cdb517d77899900a2f\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HBoxModel\",\n          \"state\": {\n            \"_view_name\": \"HBoxView\",\n            \"_dom_classes\": [],\n            \"_model_name\": \"HBoxModel\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"box_style\": \"\",\n            \"layout\": \"IPY_MODEL_85a1571687b84d19ae442c5f81f26f7a\",\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"children\": [\n              \"IPY_MODEL_e2eb771a977f44c180f99f87ca99fd77\",\n              \"IPY_MODEL_cb3ee57a490d4a9592e4b122d0d81948\"\n            ]\n          }\n        },\n        \"85a1571687b84d19ae442c5f81f26f7a\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            
\"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"e2eb771a977f44c180f99f87ca99fd77\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"FloatProgressModel\",\n          \"state\": {\n            \"_view_name\": \"ProgressView\",\n            \"style\": \"IPY_MODEL_3da87e6c040b440988d93c43ac3a2c09\",\n            
\"_dom_classes\": [],\n            \"description\": \"Downloading: 100%\",\n            \"_model_name\": \"FloatProgressModel\",\n            \"bar_style\": \"success\",\n            \"max\": 463,\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": 463,\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"orientation\": \"horizontal\",\n            \"min\": 0,\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_dceac99c8e52482e9f62a5a55898641e\"\n          }\n        },\n        \"cb3ee57a490d4a9592e4b122d0d81948\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HTMLModel\",\n          \"state\": {\n            \"_view_name\": \"HTMLView\",\n            \"style\": \"IPY_MODEL_a07d84c9acd24a909d65f8b16f85fbe9\",\n            \"_dom_classes\": [],\n            \"description\": \"\",\n            \"_model_name\": \"HTMLModel\",\n            \"placeholder\": \"​\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": \" 463/463 [00:00&lt;00:00, 982B/s]\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_1f9e54937bcc49858ae9a938d899379b\"\n          }\n        },\n        \"3da87e6c040b440988d93c43ac3a2c09\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"ProgressStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"ProgressStyleModel\",\n            \"description_width\": \"initial\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n 
           \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"bar_color\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"dceac99c8e52482e9f62a5a55898641e\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            
\"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"a07d84c9acd24a909d65f8b16f85fbe9\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"DescriptionStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"DescriptionStyleModel\",\n            \"description_width\": \"\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"1f9e54937bcc49858ae9a938d899379b\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n     
       \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"27a07215928f497db5e317b82e9e5922\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HBoxModel\",\n          \"state\": {\n            \"_view_name\": \"HBoxView\",\n            \"_dom_classes\": [],\n            \"_model_name\": \"HBoxModel\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"box_style\": \"\",\n            \"layout\": \"IPY_MODEL_7c2581de98954b79b19ee3c6a2259ba7\",\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"children\": [\n              \"IPY_MODEL_7b616c40df534e3fa921221fe620a3d9\",\n              \"IPY_MODEL_23afbc66e2dd43eb84d3d731c46263f1\"\n            ]\n          }\n        },\n        \"7c2581de98954b79b19ee3c6a2259ba7\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n    
        \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"7b616c40df534e3fa921221fe620a3d9\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"FloatProgressModel\",\n          \"state\": {\n            \"_view_name\": \"ProgressView\",\n            \"style\": \"IPY_MODEL_e084d2d75eb14fe3aacedbd5ecf711bc\",\n            \"_dom_classes\": [],\n            \"description\": \"Downloading: 100%\",\n            \"_model_name\": \"FloatProgressModel\",\n            \"bar_style\": \"success\",\n            \"max\": 54236116,\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            
\"value\": 54236116,\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"orientation\": \"horizontal\",\n            \"min\": 0,\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_a4b121058c934cf081e8af64e893b913\"\n          }\n        },\n        \"23afbc66e2dd43eb84d3d731c46263f1\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HTMLModel\",\n          \"state\": {\n            \"_view_name\": \"HTMLView\",\n            \"style\": \"IPY_MODEL_b7c0ba54037049e7a9ce91a85c41c580\",\n            \"_dom_classes\": [],\n            \"description\": \"\",\n            \"_model_name\": \"HTMLModel\",\n            \"placeholder\": \"​\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": \" 54.2M/54.2M [00:02&lt;00:00, 21.2MB/s]\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_d7d6eb6e2945450fb1c0590a506222e5\"\n          }\n        },\n        \"e084d2d75eb14fe3aacedbd5ecf711bc\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"ProgressStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"ProgressStyleModel\",\n            \"description_width\": \"initial\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"bar_color\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"a4b121058c934cf081e8af64e893b913\": {\n      
    \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"b7c0ba54037049e7a9ce91a85c41c580\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"DescriptionStyleModel\",\n          \"state\": 
{\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"DescriptionStyleModel\",\n            \"description_width\": \"\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"d7d6eb6e2945450fb1c0590a506222e5\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            
\"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"a7f35783ec6249be8ccfba1c83ed0e9f\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HBoxModel\",\n          \"state\": {\n            \"_view_name\": \"HBoxView\",\n            \"_dom_classes\": [],\n            \"_model_name\": \"HBoxModel\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"box_style\": \"\",\n            \"layout\": \"IPY_MODEL_c9bd7a41ada546e88509a60576dcbd81\",\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"children\": [\n              \"IPY_MODEL_dad65f326c614a58a0c13c94accab562\",\n              \"IPY_MODEL_f853299184d54861874b30a6087c6e3b\"\n            ]\n          }\n        },\n        \"c9bd7a41ada546e88509a60576dcbd81\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": 
null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"dad65f326c614a58a0c13c94accab562\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"FloatProgressModel\",\n          \"state\": {\n            \"_view_name\": \"ProgressView\",\n            \"style\": \"IPY_MODEL_863fcf057e7d4fe984c531d6b1291814\",\n            \"_dom_classes\": [],\n            \"description\": \"Downloading: 100%\",\n            \"_model_name\": \"FloatProgressModel\",\n            \"bar_style\": \"success\",\n            \"max\": 231508,\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": 231508,\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"orientation\": \"horizontal\",\n            \"min\": 0,\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": 
\"IPY_MODEL_058caba18b6d4a1e8fb52d573465255e\"\n          }\n        },\n        \"f853299184d54861874b30a6087c6e3b\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HTMLModel\",\n          \"state\": {\n            \"_view_name\": \"HTMLView\",\n            \"style\": \"IPY_MODEL_6234b7e5ca9c4067ae359a31f9b38e27\",\n            \"_dom_classes\": [],\n            \"description\": \"\",\n            \"_model_name\": \"HTMLModel\",\n            \"placeholder\": \"​\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": \" 232k/232k [00:00&lt;00:00, 436kB/s]\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_a475b921ef3d4315990e07512ff759a3\"\n          }\n        },\n        \"863fcf057e7d4fe984c531d6b1291814\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"ProgressStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"ProgressStyleModel\",\n            \"description_width\": \"initial\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"bar_color\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"058caba18b6d4a1e8fb52d573465255e\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": 
\"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"6234b7e5ca9c4067ae359a31f9b38e27\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"DescriptionStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"DescriptionStyleModel\",\n            \"description_width\": \"\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            
\"_view_module_version\": \"1.2.0\",\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"a475b921ef3d4315990e07512ff759a3\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n       
 \"469aaef964d644198b9cf9b878c56178\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HBoxModel\",\n          \"state\": {\n            \"_view_name\": \"HBoxView\",\n            \"_dom_classes\": [],\n            \"_model_name\": \"HBoxModel\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"box_style\": \"\",\n            \"layout\": \"IPY_MODEL_045cce4661714b078350aa8c12f86680\",\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"children\": [\n              \"IPY_MODEL_4f43fcc08b664d0d9a9c5edbec57dfe2\",\n              \"IPY_MODEL_a61f3ca504574a4db912806d920daad9\"\n            ]\n          }\n        },\n        \"045cce4661714b078350aa8c12f86680\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            
\"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"4f43fcc08b664d0d9a9c5edbec57dfe2\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"FloatProgressModel\",\n          \"state\": {\n            \"_view_name\": \"ProgressView\",\n            \"style\": \"IPY_MODEL_da55a1ee4c1547c180fc2fa62e8908d2\",\n            \"_dom_classes\": [],\n            \"description\": \"Downloading: 100%\",\n            \"_model_name\": \"FloatProgressModel\",\n            \"bar_style\": \"success\",\n            \"max\": 466062,\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": 466062,\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"orientation\": \"horizontal\",\n            \"min\": 0,\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_9d754136f43641b4996383d99a3163c3\"\n          }\n        },\n        \"a61f3ca504574a4db912806d920daad9\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HTMLModel\",\n          \"state\": {\n            \"_view_name\": \"HTMLView\",\n            \"style\": 
\"IPY_MODEL_fd4c6744a60340b4b004710cf2e9c96c\",\n            \"_dom_classes\": [],\n            \"description\": \"\",\n            \"_model_name\": \"HTMLModel\",\n            \"placeholder\": \"​\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": \" 466k/466k [00:00&lt;00:00, 1.40MB/s]\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_3a196782e8624fca9baf0561b48cf0b8\"\n          }\n        },\n        \"da55a1ee4c1547c180fc2fa62e8908d2\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"ProgressStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"ProgressStyleModel\",\n            \"description_width\": \"initial\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"bar_color\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"9d754136f43641b4996383d99a3163c3\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n       
     \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"fd4c6744a60340b4b004710cf2e9c96c\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"DescriptionStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"DescriptionStyleModel\",\n            \"description_width\": \"\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"3a196782e8624fca9baf0561b48cf0b8\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": 
\"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        }\n      }\n    }\n  },\n  \"cells\": [\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"eSCTm5d8MhJA\"\n      },\n      \"source\": [\n        \"Question Answering Transformers with Hugging Face\\n\",\n        \"\\n\",\n        \"Copyright 2020 Denis Rothman\\n\",\n        \"\\n\",\n        
\"[Hugging Face notebook Resources and Documentation](https://huggingface.co/)\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"pycharm\": {\n          \"name\": \"#%% code\\n\"\n        },\n        \"id\": \"4maAknWNrl_N\"\n      },\n      \"source\": [\n        \"!pip install -q transformers==4.0.0\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"pycharm\": {\n          \"is_executing\": false,\n          \"name\": \"#%% code \\n\"\n        },\n        \"id\": \"uKaqzCh6rl_V\"\n      },\n      \"source\": [\n        \"from transformers import pipeline\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"4VYUquAoa2eT\"\n      },\n      \"source\": [\n        \"nlp_qa = pipeline('question-answering')\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"ZxKBah-9iYF7\"\n      },\n      \"source\": [\n        \"Sample 1:The traffic began to slow down on Pioneer Boulevard in Los Angeles, making it difficult to get out of the city. However, WBGO was playing some cool jazz, and the weather was cool, making it rather pleasant to be making it out of the city on this Friday afternoon. Nat King Cole was singing as Jo and Maria slowly made their way out of LA and drove toward Barstow. They planned to get to Las Vegas early enough in the evening to have a nice dinner and go see a show.\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"MqvL7FP6bhzv\"\n      },\n      \"source\": [\n        \"sequence = \\\"The traffic began to slow down on Pioneer Boulevard in Los Angeles, making it difficult to get out of the city. 
However, WBGO was playing some cool jazz, and the weather was cool, making it rather pleasant to be making it out of the city on this Friday afternoon. Nat King Cole was singing as Jo and Maria slowly made their way out of LA and drove toward Barstow. They planned to get to Las Vegas early enough in the evening to have a nice dinner and go see a show.\\\"\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"syYGx5ZF6rkL\"\n      },\n      \"source\": [\n        \"Question-Answering\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"04tFdSHTbsFQ\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"4a720d1b-2764-4868-dbb8-e4ed662915ac\"\n      },\n      \"source\": [\n        \"nlp_qa(context=sequence, question='Where is Pioneer Boulevard ?')\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"execute_result\",\n          \"data\": {\n            \"text/plain\": [\n              \"{'answer': 'Los Angeles', 'end': 66, 'score': 0.9879737496376038, 'start': 55}\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          },\n          \"execution_count\": 14\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"Mqt2Z8qN6vNz\"\n      },\n      \"source\": [\n        \"Named Entity Recognition (NER)\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"V5GJSN_ui3J6\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"2f21cb0f-4d86-4851-bef4-003a6f67ecab\"\n      },\n      \"source\": [\n        \"nlp_ner = pipeline(\\\"ner\\\")\\n\",\n        \"print(nlp_ner(sequence))\"\n      ],\n      
\"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"[{'word': 'Pioneer', 'score': 0.9735257029533386, 'entity': 'I-LOC', 'index': 8}, {'word': 'Boulevard', 'score': 0.9944824576377869, 'entity': 'I-LOC', 'index': 9}, {'word': 'Los', 'score': 0.9995775818824768, 'entity': 'I-LOC', 'index': 11}, {'word': 'Angeles', 'score': 0.9995693564414978, 'entity': 'I-LOC', 'index': 12}, {'word': 'W', 'score': 0.991984486579895, 'entity': 'I-ORG', 'index': 26}, {'word': '##B', 'score': 0.990750253200531, 'entity': 'I-ORG', 'index': 27}, {'word': '##G', 'score': 0.9884582161903381, 'entity': 'I-ORG', 'index': 28}, {'word': '##O', 'score': 0.9722681641578674, 'entity': 'I-ORG', 'index': 29}, {'word': 'Nat', 'score': 0.9966881275177002, 'entity': 'I-PER', 'index': 59}, {'word': 'King', 'score': 0.997648298740387, 'entity': 'I-PER', 'index': 60}, {'word': 'Cole', 'score': 0.9986170530319214, 'entity': 'I-PER', 'index': 61}, {'word': 'Jo', 'score': 0.9978788495063782, 'entity': 'I-PER', 'index': 65}, {'word': 'Maria', 'score': 0.9988164901733398, 'entity': 'I-PER', 'index': 67}, {'word': 'LA', 'score': 0.998134434223175, 'entity': 'I-LOC', 'index': 74}, {'word': 'Bar', 'score': 0.9970266819000244, 'entity': 'I-LOC', 'index': 78}, {'word': '##sto', 'score': 0.8573915958404541, 'entity': 'I-LOC', 'index': 79}, {'word': '##w', 'score': 0.9920249581336975, 'entity': 'I-LOC', 'index': 80}, {'word': 'Las', 'score': 0.9993551969528198, 'entity': 'I-LOC', 'index': 87}, {'word': 'Vegas', 'score': 0.9989539384841919, 'entity': 'I-LOC', 'index': 88}]\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"ye1D9aYaun7y\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"e49f9baa-b5e4-401c-d051-951ff090a209\"\n      },\n      \"source\": [\n        
\"nlp_qa = pipeline('question-answering')\\n\",\n        \"print(\\\"Question 1.\\\",nlp_qa(context=sequence, question='Where is Pioneer Boulevard ?'))\\n\",\n        \"print(\\\"Question 2.\\\",nlp_qa(context=sequence, question='Where is Los Angeles located?'))\\n\",\n        \"print(\\\"Question 3.\\\",nlp_qa(context=sequence, question='Where is LA ?'))\\n\",\n        \"print(\\\"Question 4.\\\",nlp_qa(context=sequence, question='Where is Barstow ?'))\\n\",\n        \"print(\\\"Question 5.\\\",nlp_qa(context=sequence, question='Where is Las Vegas located ?'))\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Question 1. {'score': 0.9879737496376038, 'start': 55, 'end': 66, 'answer': 'Los Angeles'}\\n\",\n            \"Question 2. {'score': 0.9875388741493225, 'start': 34, 'end': 51, 'answer': 'Pioneer Boulevard'}\\n\",\n            \"Question 3. {'score': 0.5090540647506714, 'start': 55, 'end': 66, 'answer': 'Los Angeles'}\\n\",\n            \"Question 4. {'score': 0.3695431649684906, 'start': 387, 'end': 396, 'answer': 'Las Vegas'}\\n\",\n            \"Question 5. 
{'score': 0.21839778125286102, 'start': 355, 'end': 362, 'answer': 'Barstow'}\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"TPd42T7TrhVH\"\n      },\n      \"source\": [\n        \"Question-answering applied to NER person entities\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"6yQyrSjsv6dJ\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"4780dda5-4485-417e-c0e1-1c4ca8bd9cb5\"\n      },\n      \"source\": [\n        \"nlp_qa = pipeline('question-answering')\\n\",\n        \"nlp_qa(context=sequence, question='Who was singing ?')\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"execute_result\",\n          \"data\": {\n            \"text/plain\": [\n              \"{'answer': 'Nat King Cole',\\n\",\n              \" 'end': 277,\\n\",\n              \" 'score': 0.9653680324554443,\\n\",\n              \" 'start': 264}\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          },\n          \"execution_count\": 17\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"CfOlUtS0wapC\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"4cce8650-5d46-4374-d987-8111c6e81cbc\"\n      },\n      \"source\": [\n        \"nlp_qa(context=sequence, question='Who was going to Las Vegas ?')\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"execute_result\",\n          \"data\": {\n            \"text/plain\": [\n              \"{'answer': 'Nat King Cole',\\n\",\n              \" 'end': 277,\\n\",\n              \" 'score': 0.4316245913505554,\\n\",\n              \" 'start': 264}\"\n    
        ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          },\n          \"execution_count\": 18\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"DI_8OcAdx7Rp\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"212ae5d4-23e1-4016-e846-fe74cef78a26\"\n      },\n      \"source\": [\n        \"nlp_qa(context=sequence, question='Who are they?')\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"execute_result\",\n          \"data\": {\n            \"text/plain\": [\n              \"{'answer': 'Jo and Maria',\\n\",\n              \" 'end': 305,\\n\",\n              \" 'score': 0.8486908078193665,\\n\",\n              \" 'start': 293}\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          },\n          \"execution_count\": 19\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"Oc3Pe7CByyhc\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"5625a1f6-1b12-4ab4-813b-74a0fa3f727d\"\n      },\n      \"source\": [\n        \"nlp_qa(context=sequence, question='Who drove to Las Vegas?')\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"execute_result\",\n          \"data\": {\n            \"text/plain\": [\n              \"{'answer': 'Nat King Cole was singing as Jo and Maria',\\n\",\n              \" 'end': 305,\\n\",\n              \" 'score': 0.35941559076309204,\\n\",\n              \" 'start': 264}\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          },\n          \"execution_count\": 20\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": 
\"TmF96wthzwWT\"\n      },\n      \"source\": [\n        \"Description of the Default Model\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"_EMgV9dnz60s\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"86ed2cfb-d7e1-4038-fd6a-daba48a80414\"\n      },\n      \"source\": [\n        \"print(nlp_qa.model)\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"DistilBertForQuestionAnswering(\\n\",\n            \"  (distilbert): DistilBertModel(\\n\",\n            \"    (embeddings): Embeddings(\\n\",\n            \"      (word_embeddings): Embedding(28996, 768, padding_idx=0)\\n\",\n            \"      (position_embeddings): Embedding(512, 768)\\n\",\n            \"      (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n            \"      (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"    )\\n\",\n            \"    (transformer): Transformer(\\n\",\n            \"      (layer): ModuleList(\\n\",\n            \"        (0): TransformerBlock(\\n\",\n            \"          (attention): MultiHeadSelfAttention(\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            (q_lin): Linear(in_features=768, out_features=768, bias=True)\\n\",\n            \"            (k_lin): Linear(in_features=768, out_features=768, bias=True)\\n\",\n            \"            (v_lin): Linear(in_features=768, out_features=768, bias=True)\\n\",\n            \"            (out_lin): Linear(in_features=768, out_features=768, bias=True)\\n\",\n            \"          )\\n\",\n            \"          (sa_layer_norm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n            \"          (ffn): FFN(\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n       
     \"            (lin1): Linear(in_features=768, out_features=3072, bias=True)\\n\",\n            \"            (lin2): Linear(in_features=3072, out_features=768, bias=True)\\n\",\n            \"          )\\n\",\n            \"          (output_layer_norm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n            \"        )\\n\",\n            \"        (1): TransformerBlock(\\n\",\n            \"          (attention): MultiHeadSelfAttention(\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            (q_lin): Linear(in_features=768, out_features=768, bias=True)\\n\",\n            \"            (k_lin): Linear(in_features=768, out_features=768, bias=True)\\n\",\n            \"            (v_lin): Linear(in_features=768, out_features=768, bias=True)\\n\",\n            \"            (out_lin): Linear(in_features=768, out_features=768, bias=True)\\n\",\n            \"          )\\n\",\n            \"          (sa_layer_norm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n            \"          (ffn): FFN(\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            (lin1): Linear(in_features=768, out_features=3072, bias=True)\\n\",\n            \"            (lin2): Linear(in_features=3072, out_features=768, bias=True)\\n\",\n            \"          )\\n\",\n            \"          (output_layer_norm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n            \"        )\\n\",\n            \"        (2): TransformerBlock(\\n\",\n            \"          (attention): MultiHeadSelfAttention(\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            (q_lin): Linear(in_features=768, out_features=768, bias=True)\\n\",\n            \"            (k_lin): Linear(in_features=768, out_features=768, bias=True)\\n\",\n            \"            (v_lin): Linear(in_features=768, 
out_features=768, bias=True)\\n\",\n            \"            (out_lin): Linear(in_features=768, out_features=768, bias=True)\\n\",\n            \"          )\\n\",\n            \"          (sa_layer_norm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n            \"          (ffn): FFN(\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            (lin1): Linear(in_features=768, out_features=3072, bias=True)\\n\",\n            \"            (lin2): Linear(in_features=3072, out_features=768, bias=True)\\n\",\n            \"          )\\n\",\n            \"          (output_layer_norm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n            \"        )\\n\",\n            \"        (3): TransformerBlock(\\n\",\n            \"          (attention): MultiHeadSelfAttention(\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            (q_lin): Linear(in_features=768, out_features=768, bias=True)\\n\",\n            \"            (k_lin): Linear(in_features=768, out_features=768, bias=True)\\n\",\n            \"            (v_lin): Linear(in_features=768, out_features=768, bias=True)\\n\",\n            \"            (out_lin): Linear(in_features=768, out_features=768, bias=True)\\n\",\n            \"          )\\n\",\n            \"          (sa_layer_norm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n            \"          (ffn): FFN(\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            (lin1): Linear(in_features=768, out_features=3072, bias=True)\\n\",\n            \"            (lin2): Linear(in_features=3072, out_features=768, bias=True)\\n\",\n            \"          )\\n\",\n            \"          (output_layer_norm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n            \"        )\\n\",\n            \"        (4): TransformerBlock(\\n\",\n            \" 
         (attention): MultiHeadSelfAttention(\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            (q_lin): Linear(in_features=768, out_features=768, bias=True)\\n\",\n            \"            (k_lin): Linear(in_features=768, out_features=768, bias=True)\\n\",\n            \"            (v_lin): Linear(in_features=768, out_features=768, bias=True)\\n\",\n            \"            (out_lin): Linear(in_features=768, out_features=768, bias=True)\\n\",\n            \"          )\\n\",\n            \"          (sa_layer_norm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n            \"          (ffn): FFN(\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            (lin1): Linear(in_features=768, out_features=3072, bias=True)\\n\",\n            \"            (lin2): Linear(in_features=3072, out_features=768, bias=True)\\n\",\n            \"          )\\n\",\n            \"          (output_layer_norm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n            \"        )\\n\",\n            \"        (5): TransformerBlock(\\n\",\n            \"          (attention): MultiHeadSelfAttention(\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            (q_lin): Linear(in_features=768, out_features=768, bias=True)\\n\",\n            \"            (k_lin): Linear(in_features=768, out_features=768, bias=True)\\n\",\n            \"            (v_lin): Linear(in_features=768, out_features=768, bias=True)\\n\",\n            \"            (out_lin): Linear(in_features=768, out_features=768, bias=True)\\n\",\n            \"          )\\n\",\n            \"          (sa_layer_norm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n            \"          (ffn): FFN(\\n\",\n            \"            (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \"            (lin1): 
Linear(in_features=768, out_features=3072, bias=True)\\n\",\n            \"            (lin2): Linear(in_features=3072, out_features=768, bias=True)\\n\",\n            \"          )\\n\",\n            \"          (output_layer_norm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\\n\",\n            \"        )\\n\",\n            \"      )\\n\",\n            \"    )\\n\",\n            \"  )\\n\",\n            \"  (qa_outputs): Linear(in_features=768, out_features=2, bias=True)\\n\",\n            \"  (dropout): Dropout(p=0.1, inplace=False)\\n\",\n            \")\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"vAlWY5E6TfKL\"\n      },\n      \"source\": [\n        \"Question-Answering with ELECTRA\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"BFNSvGN0znq9\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 386,\n          \"referenced_widgets\": [\n            \"ec5480ed053b46cdb517d77899900a2f\",\n            \"85a1571687b84d19ae442c5f81f26f7a\",\n            \"e2eb771a977f44c180f99f87ca99fd77\",\n            \"cb3ee57a490d4a9592e4b122d0d81948\",\n            \"3da87e6c040b440988d93c43ac3a2c09\",\n            \"dceac99c8e52482e9f62a5a55898641e\",\n            \"a07d84c9acd24a909d65f8b16f85fbe9\",\n            \"1f9e54937bcc49858ae9a938d899379b\",\n            \"27a07215928f497db5e317b82e9e5922\",\n            \"7c2581de98954b79b19ee3c6a2259ba7\",\n            \"7b616c40df534e3fa921221fe620a3d9\",\n            \"23afbc66e2dd43eb84d3d731c46263f1\",\n            \"e084d2d75eb14fe3aacedbd5ecf711bc\",\n            \"a4b121058c934cf081e8af64e893b913\",\n            \"b7c0ba54037049e7a9ce91a85c41c580\",\n            \"d7d6eb6e2945450fb1c0590a506222e5\",\n            \"a7f35783ec6249be8ccfba1c83ed0e9f\",\n            
\"c9bd7a41ada546e88509a60576dcbd81\",\n            \"dad65f326c614a58a0c13c94accab562\",\n            \"f853299184d54861874b30a6087c6e3b\",\n            \"863fcf057e7d4fe984c531d6b1291814\",\n            \"058caba18b6d4a1e8fb52d573465255e\",\n            \"6234b7e5ca9c4067ae359a31f9b38e27\",\n            \"a475b921ef3d4315990e07512ff759a3\",\n            \"469aaef964d644198b9cf9b878c56178\",\n            \"045cce4661714b078350aa8c12f86680\",\n            \"4f43fcc08b664d0d9a9c5edbec57dfe2\",\n            \"a61f3ca504574a4db912806d920daad9\",\n            \"da55a1ee4c1547c180fc2fa62e8908d2\",\n            \"9d754136f43641b4996383d99a3163c3\",\n            \"fd4c6744a60340b4b004710cf2e9c96c\",\n            \"3a196782e8624fca9baf0561b48cf0b8\"\n          ]\n        },\n        \"outputId\": \"1f9b1b3e-f51d-48dd-a97e-d5f43d3207c9\"\n      },\n      \"source\": [\n        \"nlp_qa = pipeline('question-answering', model='google/electra-small-generator', tokenizer='google/electra-small-generator')\\n\",\n        \"nlp_qa(context=sequence, question='Who drove to Las Vegas ?')\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"application/vnd.jupyter.widget-view+json\": {\n              \"model_id\": \"ec5480ed053b46cdb517d77899900a2f\",\n              \"version_minor\": 0,\n              \"version_major\": 2\n            },\n            \"text/plain\": [\n              \"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=463.0, style=ProgressStyle(description_…\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            
\"application/vnd.jupyter.widget-view+json\": {\n              \"model_id\": \"27a07215928f497db5e317b82e9e5922\",\n              \"version_minor\": 0,\n              \"version_major\": 2\n            },\n            \"text/plain\": [\n              \"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=54236116.0, style=ProgressStyle(descrip…\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"application/vnd.jupyter.widget-view+json\": {\n              \"model_id\": \"a7f35783ec6249be8ccfba1c83ed0e9f\",\n              \"version_minor\": 0,\n              \"version_major\": 2\n            },\n            \"text/plain\": [\n              \"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=231508.0, style=ProgressStyle(descripti…\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"application/vnd.jupyter.widget-view+json\": {\n              \"model_id\": \"469aaef964d644198b9cf9b878c56178\",\n              \"version_minor\": 0,\n              \"version_major\": 2\n            },\n            \"text/plain\": [\n              \"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=466062.0, style=ProgressStyle(descripti…\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            
\"\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"Some weights of the model checkpoint at google/electra-small-generator were not used when initializing ElectraForQuestionAnswering: ['generator_predictions.LayerNorm.weight', 'generator_predictions.LayerNorm.bias', 'generator_predictions.dense.weight', 'generator_predictions.dense.bias', 'generator_lm_head.weight', 'generator_lm_head.bias']\\n\",\n            \"- This IS expected if you are initializing ElectraForQuestionAnswering from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).\\n\",\n            \"- This IS NOT expected if you are initializing ElectraForQuestionAnswering from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\\n\",\n            \"Some weights of ElectraForQuestionAnswering were not initialized from the model checkpoint at google/electra-small-generator and are newly initialized: ['qa_outputs.weight', 'qa_outputs.bias']\\n\",\n            \"You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\\n\"\n          ],\n          \"name\": \"stderr\"\n        },\n        {\n          \"output_type\": \"execute_result\",\n          \"data\": {\n            \"text/plain\": [\n              \"{'answer': 'rather pleasant to be making it out of the city on this',\\n\",\n              \" 'end': 245,\\n\",\n              \" 'score': 0.00034621506347320974,\\n\",\n              \" 'start': 190}\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          },\n          \"execution_count\": 22\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n   
     \"id\": \"49PDRpKHsc41\"\n      },\n      \"source\": [\n        \"Question Answering with default Model and SRL\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"W8kGz5ihz96g\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"393a5a3e-ea75-4c7e-84f3-9f9930edd164\"\n      },\n      \"source\": [\n        \"nlp_qa = pipeline('question-answering')\\n\",\n        \"nlp_qa(context=sequence, question='What was slow?')\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"execute_result\",\n          \"data\": {\n            \"text/plain\": [\n              \"{'answer': 'The traffic', 'end': 11, 'score': 0.46530455350875854, 'start': 0}\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          },\n          \"execution_count\": 23\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"4mycOJhdugbL\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n        \"outputId\": \"aeeb1e97-51e7-4658-da75-cbeba382521b\"\n      },\n      \"source\": [\n        \"nlp_qa = pipeline('question-answering')\\n\",\n        \"nlp_qa(context=sequence, question='What was playing')\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"execute_result\",\n          \"data\": {\n            \"text/plain\": [\n              \"{'answer': 'cool jazz', 'end': 152, 'score': 0.3511938154697418, 'start': 143}\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          },\n          \"execution_count\": 24\n        }\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"bniJUNoxwtiw\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\"\n        },\n  
      \"outputId\": \"b3ea4b39-8c69-4cef-9a57-401aefd9065e\"\n      },\n      \"source\": [\n        \"nlp_qa = pipeline('question-answering')\\n\",\n        \"nlp_qa(context=sequence, question='Who sees a show?')\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"execute_result\",\n          \"data\": {\n            \"text/plain\": [\n              \"{'answer': 'Nat King Cole',\\n\",\n              \" 'end': 277,\\n\",\n              \" 'score': 0.5588219165802002,\\n\",\n              \" 'start': 264}\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          },\n          \"execution_count\": 25\n        }\n      ]\n    }\n  ]\n}"
  },
  {
    "path": "Chapter11/SentimentAnalysis.ipynb",
    "content": "{\n  \"nbformat\": 4,\n  \"nbformat_minor\": 0,\n  \"metadata\": {\n    \"kernelspec\": {\n      \"display_name\": \"Python 3\",\n      \"language\": \"python\",\n      \"name\": \"python3\"\n    },\n    \"language_info\": {\n      \"codemirror_mode\": {\n        \"name\": \"ipython\",\n        \"version\": 3\n      },\n      \"file_extension\": \".py\",\n      \"mimetype\": \"text/x-python\",\n      \"name\": \"python\",\n      \"nbconvert_exporter\": \"python\",\n      \"pygments_lexer\": \"ipython3\",\n      \"version\": \"3.7.6\"\n    },\n    \"pycharm\": {\n      \"stem_cell\": {\n        \"cell_type\": \"raw\",\n        \"source\": [],\n        \"metadata\": {\n          \"collapsed\": false\n        }\n      }\n    },\n    \"colab\": {\n      \"name\": \"SentimentAnalysis.ipynb\",\n      \"provenance\": [],\n      \"collapsed_sections\": []\n    },\n    \"widgets\": {\n      \"application/vnd.jupyter.widget-state+json\": {\n        \"491c9ee2f443495dba7465ab25a7ba70\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HBoxModel\",\n          \"state\": {\n            \"_view_name\": \"HBoxView\",\n            \"_dom_classes\": [],\n            \"_model_name\": \"HBoxModel\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"box_style\": \"\",\n            \"layout\": \"IPY_MODEL_ea8709edb3204155b74956ae66fd9d78\",\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"children\": [\n              \"IPY_MODEL_d7677297d78940de840ebfe90e20aac8\",\n              \"IPY_MODEL_b95468d3abe24608997d4ea2a26a6449\"\n            ]\n          }\n        },\n        \"ea8709edb3204155b74956ae66fd9d78\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n     
       \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"d7677297d78940de840ebfe90e20aac8\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"FloatProgressModel\",\n          \"state\": {\n            \"_view_name\": \"ProgressView\",\n            \"style\": \"IPY_MODEL_52d2aae401344ddea76b1032e10582a4\",\n       
     \"_dom_classes\": [],\n            \"description\": \"Downloading: 100%\",\n            \"_model_name\": \"FloatProgressModel\",\n            \"bar_style\": \"success\",\n            \"max\": 230,\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": 230,\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"orientation\": \"horizontal\",\n            \"min\": 0,\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_99579edca7fd4e5aa93491630aac72c0\"\n          }\n        },\n        \"b95468d3abe24608997d4ea2a26a6449\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"HTMLModel\",\n          \"state\": {\n            \"_view_name\": \"HTMLView\",\n            \"style\": \"IPY_MODEL_1f0b1e493558418b9ff5558cfc7baae3\",\n            \"_dom_classes\": [],\n            \"description\": \"\",\n            \"_model_name\": \"HTMLModel\",\n            \"placeholder\": \"​\",\n            \"_view_module\": \"@jupyter-widgets/controls\",\n            \"_model_module_version\": \"1.5.0\",\n            \"value\": \" 230/230 [00:00&lt;00:00, 541B/s]\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.5.0\",\n            \"description_tooltip\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\",\n            \"layout\": \"IPY_MODEL_bc86f7386abf4e1fb2aba758a2982039\"\n          }\n        },\n        \"52d2aae401344ddea76b1032e10582a4\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"ProgressStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"ProgressStyleModel\",\n            \"description_width\": \"initial\",\n            \"_view_module\": 
\"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"bar_color\": null,\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"99579edca7fd4e5aa93491630aac72c0\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n    
        \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        },\n        \"1f0b1e493558418b9ff5558cfc7baae3\": {\n          \"model_module\": \"@jupyter-widgets/controls\",\n          \"model_name\": \"DescriptionStyleModel\",\n          \"state\": {\n            \"_view_name\": \"StyleView\",\n            \"_model_name\": \"DescriptionStyleModel\",\n            \"description_width\": \"\",\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"_model_module_version\": \"1.5.0\",\n            \"_view_count\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"_model_module\": \"@jupyter-widgets/controls\"\n          }\n        },\n        \"bc86f7386abf4e1fb2aba758a2982039\": {\n          \"model_module\": \"@jupyter-widgets/base\",\n          \"model_name\": \"LayoutModel\",\n          \"state\": {\n            \"_view_name\": \"LayoutView\",\n            \"grid_template_rows\": null,\n            \"right\": null,\n            \"justify_content\": null,\n            \"_view_module\": \"@jupyter-widgets/base\",\n            \"overflow\": null,\n            \"_model_module_version\": \"1.2.0\",\n            \"_view_count\": null,\n            \"flex_flow\": null,\n            \"width\": null,\n            \"min_width\": null,\n            \"border\": null,\n            \"align_items\": null,\n            \"bottom\": null,\n            \"_model_module\": \"@jupyter-widgets/base\",\n            \"top\": null,\n            \"grid_column\": null,\n            \"overflow_y\": null,\n            \"overflow_x\": null,\n            \"grid_auto_flow\": null,\n            \"grid_area\": null,\n            \"grid_template_columns\": null,\n            \"flex\": null,\n            \"_model_name\": \"LayoutModel\",\n            \"justify_items\": null,\n            \"grid_row\": null,\n            \"max_height\": null,\n            \"align_content\": 
null,\n            \"visibility\": null,\n            \"align_self\": null,\n            \"height\": null,\n            \"min_height\": null,\n            \"padding\": null,\n            \"grid_auto_rows\": null,\n            \"grid_gap\": null,\n            \"max_width\": null,\n            \"order\": null,\n            \"_view_module_version\": \"1.2.0\",\n            \"grid_template_areas\": null,\n            \"object_position\": null,\n            \"object_fit\": null,\n            \"grid_auto_columns\": null,\n            \"margin\": null,\n            \"display\": null,\n            \"left\": null\n          }\n        }\n      }\n    }\n  },\n  \"cells\": [\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"rFCzxMzfG2Jh\",\n        \"colab_type\": \"text\"\n      },\n      \"source\": [\n        \"#Sentiment Analysis with Hugging Face Models and AllenNLP\\n\",\n        \"\\n\",\n        \"Copyright 2020, Denis Rothman\\n\",\n        \"\\n\",\n        \"Resources\\n\",\n        \"\\n\",\n        \"[AllenNLP](https://demo.allennlp.org/sentiment-analysis)\\n\",\n        \"\\n\",\n        \"[Hugging Face Pipelines](https://huggingface.co/transformers/main_classes/pipelines.html)\\n\",\n        \"\\n\",\n        \"[Hugging Face Models](https://huggingface.co/models)\\n\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"TssYtycqPQSW\",\n        \"colab_type\": \"code\",\n        \"colab\": {}\n      },\n      \"source\": [\n        \"!pip install allennlp==1.0.0 allennlp-models==1.0.0\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"4-nbsdFAQyVj\",\n        \"colab_type\": \"code\",\n        \"colab\": {}\n      },\n      \"source\": [\n        \"!echo '{\\\"sentence\\\": \\\"Whether or not you're enlightened by any of Derrida's lectures on the other and the self, Derrida is 
an undeniably fascinating and playful fellow.\\\"}' | \\\\\\n\",\n        \"allennlp predict https://storage.googleapis.com/allennlp-public-models/sst-roberta-large-2020.06.08.tar.gz -\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"pycharm\": {\n          \"name\": \"#%% code\\n\"\n        },\n        \"id\": \"4maAknWNrl_N\",\n        \"colab_type\": \"code\",\n        \"colab\": {}\n      },\n      \"source\": [\n        \"!pip install -q transformers\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"pycharm\": {\n          \"is_executing\": false,\n          \"name\": \"#%% code \\n\"\n        },\n        \"id\": \"uKaqzCh6rl_V\",\n        \"colab_type\": \"code\",\n        \"colab\": {}\n      },\n      \"source\": [\n        \"from transformers import pipeline\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"CRUQAGAzA1Vr\",\n        \"colab_type\": \"code\",\n        \"colab\": {}\n      },\n      \"source\": [\n        \"def classify(sequence,M):\\n\",\n        \"   #DistilBertForSequenceClassification(default model)\\n\",\n        \"    nlp_cls = pipeline('sentiment-analysis') \\n\",\n        \"    if M==1:\\n\",\n        \"      print(nlp_cls.model.config)\\n\",\n        \"    return nlp_cls(sequence)\\n\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"BitkJ4tM5C9p\",\n        \"colab_type\": \"code\",\n        \"colab\": {\n          \"base_uri\": \"https://localhost:8080/\",\n          \"height\": 100,\n          \"referenced_widgets\": [\n            \"491c9ee2f443495dba7465ab25a7ba70\",\n            \"ea8709edb3204155b74956ae66fd9d78\",\n   
         \"d7677297d78940de840ebfe90e20aac8\",\n            \"b95468d3abe24608997d4ea2a26a6449\",\n            \"52d2aae401344ddea76b1032e10582a4\",\n            \"99579edca7fd4e5aa93491630aac72c0\",\n            \"1f0b1e493558418b9ff5558cfc7baae3\",\n            \"bc86f7386abf4e1fb2aba758a2982039\"\n          ]\n        },\n        \"outputId\": \"75b4010f-686f-4e1d-f5d8-dc9d286af660\"\n      },\n      \"source\": [\n        \"seq=3\\n\",\n        \"if seq==1:\\n\",\n        \"  sequence=\\\"The battery on my Model9X phone doesn't last more than 6 hours and I'm unhappy about that.\\\"\\n\",\n        \"if seq==2:\\n\",\n        \"  sequence=\\\"The battery on my Model9X phone doesn't last more than 6 hours and I'm unhappy about that. I was really mad! I bought a Model10x and things seem to be better. I'm super satisfied now.\\\"\\n\",\n        \"if seq==3:\\n\",\n        \"  sequence=\\\"The customer was very unhappy\\\"\\n\",\n        \"if seq==4:\\n\",\n        \"  sequence=\\\"The customer was very satisfied\\\"\\n\",\n        \"print(sequence)\\n\",\n        \"M=0 #display model configuration=1, default=0\\n\",\n        \"CS=classify(sequence,M) \\n\",\n        \"print(CS)\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": [\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"The customer was very unhappy\\n\"\n          ],\n          \"name\": \"stdout\"\n        },\n        {\n          \"output_type\": \"display_data\",\n          \"data\": {\n            \"application/vnd.jupyter.widget-view+json\": {\n              \"model_id\": \"491c9ee2f443495dba7465ab25a7ba70\",\n              \"version_minor\": 0,\n              \"version_major\": 2\n            },\n            \"text/plain\": [\n              \"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=230.0, style=ProgressStyle(description_…\"\n            ]\n          },\n          \"metadata\": {\n            \"tags\": []\n          }\n
        },\n        {\n          \"output_type\": \"stream\",\n          \"text\": [\n            \"\\n\",\n            \"[{'label': 'NEGATIVE', 'score': 0.9997098445892334}]\\n\"\n          ],\n          \"name\": \"stdout\"\n        }\n      ]\n    }\n  ]\n}"
  },
  {
    "path": "Chapter12/Fake_News.ipynb",
    "content": "{\n  \"nbformat\": 4,\n  \"nbformat_minor\": 0,\n  \"metadata\": {\n    \"kernelspec\": {\n      \"display_name\": \"Python 3\",\n      \"language\": \"python\",\n      \"name\": \"python3\"\n    },\n    \"language_info\": {\n      \"codemirror_mode\": {\n        \"name\": \"ipython\",\n        \"version\": 3\n      },\n      \"file_extension\": \".py\",\n      \"mimetype\": \"text/x-python\",\n      \"name\": \"python\",\n      \"nbconvert_exporter\": \"python\",\n      \"pygments_lexer\": \"ipython3\",\n      \"version\": \"3.7.6\"\n    },\n    \"pycharm\": {\n      \"stem_cell\": {\n        \"cell_type\": \"raw\",\n        \"source\": [],\n        \"metadata\": {\n          \"collapsed\": false\n        }\n      }\n    },\n    \"colab\": {\n      \"name\": \"Fake_News.ipynb\",\n      \"provenance\": [],\n      \"collapsed_sections\": []\n    }\n  },\n  \"cells\": [\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"rFCzxMzfG2Jh\",\n        \"colab_type\": \"text\"\n      },\n      \"source\": [\n        \"#Fake News\\n\",\n        \"\\n\",\n        \"Copyright 2020, Denis Rothman\\n\",\n        \"\\n\",\n        \"## Notebook resources:\\n\",\n        \"\\n\",\n        \"[Hugging Face](https://huggingface.co)\\n\",\n        \"\\n\",\n        \"[The Allen Institute for AI](https://allennlp.org/)<br>\\n\",\n        \"Some of the AllenNLP resources come from Hugging Face\\n\",\n        \"\\n\"\n      ]\n    },\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"_UlmKCHMttiY\",\n        \"colab_type\": \"text\"\n      },\n      \"source\": [\n        \"## Fake News from an emotional perspective \\n\",\n        \"\\n\",\n        \"Allen NLP Sentiment Analysis with RoBERTa-large \\n\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"TssYtycqPQSW\",\n        \"colab_type\": \"code\",\n        \"colab\": {}\n      },\n      \"source\": [\n
   \"!pip install allennlp==1.0.0 allennlp-models==1.0.0\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"4-nbsdFAQyVj\",\n        \"colab_type\": \"code\",\n        \"colab\": {}\n      },\n      \"source\": [\n        \"!echo '{\\\"sentence\\\":\\\"Climate change is bogus. It’s a plot by the liberals to take the economy down.\\\"}' | \\\\\\n\",\n        \"allennlp predict https://storage.googleapis.com/allennlp-public-models/sst-roberta-large-2020.06.08.tar.gz -\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"colab_type\": \"code\",\n        \"id\": \"jFgnbGsEwXUe\",\n        \"colab\": {}\n      },\n      \"source\": [\n        \"!echo '{\\\"sentence\\\":\\\"I am a Republican and think that climate change consciousness is a great thing!\\\"}' | \\\\\\n\",\n        \"allennlp predict https://storage.googleapis.com/allennlp-public-models/sst-roberta-large-2020.06.08.tar.gz -\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"markdown\",\n      \"metadata\": {\n        \"id\": \"5TWl4dpzbGe6\",\n        \"colab_type\": \"text\"\n      },\n      \"source\": [\n        \"## GUN CONTROL\"\n      ]\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"colab_type\": \"code\",\n        \"id\": \"pDX7NjFVa31H\",\n        \"colab\": {}\n      },\n      \"source\": [\n        \"!echo '{\\\"sentence\\\":\\\"I have had rifles and guns for years and never had a problem. 
I raised my kids right so they have guns too and never hurt anything except rabbits.\\\"}' | \\\\\\n\",\n        \"allennlp predict https://storage.googleapis.com/allennlp-public-models/sst-roberta-large-2020.06.08.tar.gz -\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"colab_type\": \"code\",\n        \"id\": \"PpdecpJPfTl6\",\n        \"colab\": {}\n      },\n      \"source\": [\n        \"!echo '{\\\"sentence\\\":\\\"I have heard gunshots all my life in my neighborhood, have lost many friends, and am afraid to go out at night.\\\"}' | \\\\\\n\",\n        \"allennlp predict https://storage.googleapis.com/allennlp-public-models/sst-roberta-large-2020.06.08.tar.gz -\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"pycharm\": {\n          \"name\": \"#%% code\\n\"\n        },\n        \"id\": \"4maAknWNrl_N\",\n        \"colab_type\": \"code\",\n        \"colab\": {}\n      },\n      \"source\": [\n        \"!pip install -q transformers\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"pycharm\": {\n          \"is_executing\": false,\n          \"name\": \"#%% code \\n\"\n        },\n        \"id\": \"uKaqzCh6rl_V\",\n        \"colab_type\": \"code\",\n        \"colab\": {}\n      },\n      \"source\": [\n        \"from transformers import pipeline\\n\",\n        \"from transformers import AutoTokenizer, AutoModelForSequenceClassification,AutoModel\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"6bWOwVgIh9Ai\",\n        \"colab_type\": \"code\",\n        \"colab\": {}\n      },\n      \"source\": [\n        \"nlp_token_class = pipeline('ner')\\n\",\n        
\"nlp_token_class('I have had rifles and guns for years and never had a problem. I raised my kids right so they have guns too and never hurt anything except rabbits.')\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"colab_type\": \"code\",\n        \"id\": \"69Az4owA2UQv\",\n        \"colab\": {}\n      },\n      \"source\": [\n        \"nlp_token_class = pipeline('ner')\\n\",\n        \"nlp_token_class('I have had rifles and guns for years and never had a problem. I raised my kids right so they have guns too and never hurt anything except rabbits.')\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"colab_type\": \"code\",\n        \"id\": \"VmJw_cAmkI43\",\n        \"colab\": {}\n      },\n      \"source\": [\n        \"nlp_token_class.model.config\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    },\n    {\n      \"cell_type\": \"code\",\n      \"metadata\": {\n        \"id\": \"IE-TRXfppsKJ\",\n        \"colab_type\": \"code\",\n        \"colab\": {}\n      },\n      \"source\": [\n        \"!echo '{\\\"sentence\\\": \\\"I have heard gunshots all my life in my neighborhood, have lost many friends, and am afraid to go out at night.\\\"}' | \\\\\\n\",\n        \"allennlp predict https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz -\"\n      ],\n      \"execution_count\": null,\n      \"outputs\": []\n    }\n  ]\n}"
  },
  {
    "path": "README.md",
    "content": "\n\n---\n\n## Join Our Newsletters 📬\n\n### DataPro  \n*The future of AI is unfolding. Don’t fall behind.*\n\n<p><a href=\"https://landing.packtpub.com/subscribe-datapronewsletter/?link_from_packtlink=yes\"><img src=\"https://static.packt-cdn.com/assets/images/DataPro NL QR Code.png\" alt=\"DataPro QR\" width=\"150\"/></a></p>\n\nStay ahead with [**DataPro**](https://landing.packtpub.com/subscribe-datapronewsletter/?link_from_packtlink=yes), the free weekly newsletter for data scientists, AI/ML researchers, and data engineers.  \nFrom trending tools like **PyTorch**, **scikit-learn**, **XGBoost**, and **BentoML** to hands-on insights on **database optimization** and real-world **ML workflows**, you’ll get what matters, fast.\n\n> Stay sharp with [DataPro](https://landing.packtpub.com/subscribe-datapronewsletter/?link_from_packtlink=yes). Join **115K+ data professionals** who never miss a beat.\n\n---\n\n### BIPro  \n*Business runs on data. Make sure yours tells the right story.*\n\n<p><a href=\"https://landing.packtpub.com/subscribe-bipro-newsletter/?link_from_packtlink=yes\"><img src=\"https://static.packt-cdn.com/assets/images/BIPro NL QR Code.png\" alt=\"BIPro QR\" width=\"150\"/></a></p>\n\n[**BIPro**](https://landing.packtpub.com/subscribe-bipro-newsletter/?link_from_packtlink=yes) is your free weekly newsletter for BI professionals, analysts, and data leaders.  \nGet practical tips on **dashboarding**, **data visualization**, and **analytics strategy** with tools like **Power BI**, **Tableau**, **Looker**, **SQL**, and **dbt**.\n\n> Get smarter with [BIPro](https://landing.packtpub.com/subscribe-bipro-newsletter/?link_from_packtlink=yes). 
Trusted by **35K+ BI professionals**, see what you’re missing.\n\n\n\n\n# Transformers for Natural Language Processing\nThis is the code repository for [Transformers for Natural Language Processing](https://www.packtpub.com/product/transformers-for-natural-language-processing/9781800565791), published by [Packt](https://www.packtpub.com/?utm_source=github). It contains all the supporting project files necessary to work through the book from start to finish.\n\n* **Paperback**: 384 pages\n* **ISBN-13**: 9781800565791\n* **Date Of Publication**: January 2021\n\n[<img src=\"./.other/cover.png\" width=\"248\">](https://www.amazon.com/Transformers-Natural-Language-Processing-architectures-ebook/dp/B08S977X8K/)\n\n## Links\n\n* [Amazon](https://www.amazon.com/Transformers-Natural-Language-Processing-architectures-ebook/dp/B08S977X8K/)\n\n* [Packt Publishing](https://www.packtpub.com/product/transformers-for-natural-language-processing/9781800565791)\n\n## About the Book\nTransformers for Natural Language Processing investigates, in great detail, deep learning for machine translation, speech-to-text, text-to-speech, language modeling, question answering, and many more NLP domains in the context of transformers.\n\nThe book takes you through natural language processing with Python and examines various eminent transformer models and datasets created by pioneers such as Google, Facebook, Microsoft, OpenAI, Hugging Face, and other contributors.\n\n
The book trains you in three stages. The first stage introduces you to transformer architectures, starting with the original Transformer, before moving on to RoBERTa, BERT, and DistilBERT models. You will discover training methods for smaller Transformers that can outperform GPT-3 in some cases. In the second stage, you will apply Transformers for Natural Language Understanding (NLU) and Generation. Finally, the third stage will help you grasp advanced language understanding techniques such as optimizing social network datasets and identifying fake news.\n\nBy the end of this NLP book, you will understand transformers from a cognitive science perspective and be proficient in applying pre-trained transformer models by tech giants to various datasets.\n\n
## Things you will learn\n* Use the latest pretrained transformer models\n* Grasp the workings of the original Transformer, GPT-2, BERT, T5, and other transformer models\n* Create language understanding Python programs using concepts that outperform classical deep learning models\n* Use a variety of NLP platforms, including Hugging Face, Trax, and AllenNLP\n* Apply Python, TensorFlow, and Keras programs to sentiment analysis, text summarization, speech recognition, machine translation, and more\n* Measure the productivity of key transformers to define their scope, potential, and limits in production\n\n
## Instructions and Navigation\nAll of the code is organized into folders that are named chapter-wise, for example: `Chapter02`.\n\nThe code will look like the following:\n```python\n#@title Activating the GPU\n# Main menu->Runtime->Change Runtime Type\nimport tensorflow as tf\ndevice_name = tf.test.gpu_device_name()\nif device_name != '/device:GPU:0':\n  raise SystemError('GPU device not found')\nprint('Found GPU at: {}'.format(device_name))\n```\n\n
## Software Requirements\n\nCheck this file for the hardware and software requirements: [technical_requirements.md](./.other/technical_requirements.md)\n\n## Related Products\n\n* [Python Machine Learning - Third Edition](https://www.packtpub.com/product/python-machine-learning-third-edition/9781789955750)\n* [Hands-On Explainable AI (XAI) with Python - Second Edition](https://www.packtpub.com/product/hands-on-explainable-ai-xai-with-python/9781800208131)\n\n### Download a free PDF\n\n <i>If you have already purchased a print or Kindle version of this book, you can get a DRM-free PDF version at no cost.<br>Simply click on the link to claim your free PDF.</i>\n<p align=\"center\"> <a href=\"https://packt.link/free-ebook/9781800565791\">https://packt.link/free-ebook/9781800565791 </a> </p>"
  }
]